Science.gov

Sample records for operating basis earthquake

  1. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  2. [Autism after an earthquake: the experience of L'Aquila (Central Italy) as a basis for an operative guideline].

    PubMed

    Valenti, Marco; Di Giovanni, Chiara; Mariano, Melania; Pino, Maria Chiara; Sconci, Vittorio; Mazza, Monica

    2016-01-01

    People with autism, their families, and their specialised caregivers are a social group at high health risk after a disruptive earthquake. They need emergency assistance and immediate structured support according to well-defined protocols and quality standards. We recommend establishing national guidelines for the care of people with autism after an earthquake. The adaptive behaviour of participants with autism declined dramatically in the first months after the earthquake in all the dimensions examined (i.e., communication, daily living, socialisation, and motor skills). After relatively stable conditions returned, and with immediate and intensive post-disaster intervention, children and adolescents with autism showed a trend towards partial recovery of adaptive functioning. As to the impact on services, this study indicates the need to support exposed caregivers, who are at high risk of burnout over the first two years after the disaster, and to reorganise person-tailored services immediately. PMID:27291209

  4. The potential uses of operational earthquake forecasting

    USGS Publications Warehouse

    Field, Ned; Jordan, Thomas; Jones, Lucille; Michael, Andrew; Blanpied, Michael L.

    2016-01-01

    This article reports on a workshop held to explore the potential uses of operational earthquake forecasting (OEF). We discuss the current status of OEF in the United States and elsewhere, the types of products that could be generated, the various potential users and uses of OEF, and the need for carefully crafted communication protocols. Although operationalization challenges remain, there was clear consensus among the stakeholders at the workshop that OEF could be useful.

  5. Operational Earthquake Forecasting of Aftershocks for New England

    NASA Astrophysics Data System (ADS)

    Ebel, J.; Fadugba, O. I.

    2015-12-01

    Although the forecasting of mainshocks is not possible, recent research demonstrates that probabilistic forecasts of expected aftershock activity following moderate and strong earthquakes are possible. Previous work has shown that aftershock sequences in intraplate regions behave similarly to those in California, and thus the operational aftershock forecasting methods currently employed in California can be adopted for use in areas of the eastern U.S. such as New England. In our application, immediately after a felt earthquake in New England, a forecast of expected aftershock activity for the next 7 days will be generated from a generic aftershock activity model. Approximately 24 hours after the mainshock, the parameters of the aftershock model will be updated using the aftershock activity observed up to that point in time, and a new forecast of expected aftershock activity for the next 7 days will be issued. The forecast will estimate the average number of weak, felt aftershocks and the average expected number of aftershocks based on the aftershock statistics of past New England earthquakes. The forecast will also estimate the probability that an earthquake stronger than the mainshock will take place during the next 7 days. The aftershock forecast will specify the expected aftershock locations as well as the areas over which aftershocks of different magnitudes could be felt. The system will use web pages, email, and text messages to distribute the aftershock forecasts. For protracted aftershock sequences, new forecasts will be issued on a regular basis, such as weekly. Initially, the distribution system for the aftershock forecasts will be limited, but it will be expanded as experience with and confidence in the system grow.
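    The generic-model-then-update workflow described above resembles the Reasenberg-Jones approach used operationally in California. A minimal sketch is given below; the parameter values (a, b, c, p) are generic California-style defaults chosen for illustration, not values calibrated for New England or taken from the abstract.

```python
import math

def expected_aftershocks(m_main, m_min, t_start, t_end,
                         a=-1.67, b=0.91, c=0.05, p=1.08):
    """Expected number of aftershocks with magnitude >= m_min in the
    window [t_start, t_end] (days after the mainshock), using the
    Reasenberg-Jones rate model:
        lambda(t, M) = 10**(a + b*(m_main - M)) * (t + c)**(-p)
    integrated over time."""
    rate_factor = 10 ** (a + b * (m_main - m_min))
    if abs(p - 1.0) < 1e-9:
        # p = 1 requires the logarithmic form of the integral
        time_integral = math.log((t_end + c) / (t_start + c))
    else:
        time_integral = ((t_end + c) ** (1 - p)
                         - (t_start + c) ** (1 - p)) / (1 - p)
    return rate_factor * time_integral

# 7-day forecast issued 24 hours after a hypothetical M5.5 mainshock
n = expected_aftershocks(5.5, 3.0, 1.0, 8.0)
prob_one_or_more = 1 - math.exp(-n)  # Poisson probability of >= 1 M>=3 event
```

    In an operational setting the generic defaults would be replaced, about a day into the sequence, by parameters re-fit to the observed aftershocks, exactly as the abstract describes.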

  6. Linking earthquakes and hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2013-01-01

    Hydraulic fracturing, also known as fracking, which is used to extract oil and gas from rock, has been a controversial but increasingly common practice; some studies have linked it to groundwater contamination and induced earthquakes. Scientists discussed several studies on the connection between fracking and earthquakes at the AGU Fall Meeting in San Francisco in December.

  7. The Bender-Dunne basis operators as Hilbert space operators

    SciTech Connect

    Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph

    2014-02-15

    The Bender-Dunne basis operators, T_{-m,n} = 2^{-n} \sum_{k=0}^{n} \binom{n}{k} q^k p^{-m} q^{n-k}, where q and p are the position and momentum operators, respectively, are formal integral operators in the position representation on the entire real line R for positive integers n and m. We show, by explicit construction of a dense domain, that the operators T_{-m,n} are densely defined operators in the Hilbert space L^2(R).

  8. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  9. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.

  10. Minimization of Basis Risk in Parametric Earthquake Cat Bonds

    NASA Astrophysics Data System (ADS)

    Franco, G.

    2009-12-01

    A catastrophe (cat) bond is an instrument used by insurance and reinsurance companies, by governments, or by groups of nations to cede catastrophic risk to the financial markets, which are capable of supplying cover for highly destructive events that surpass the typical capacity of traditional reinsurance contracts. Parametric cat bonds, a specific type of cat bond, use trigger mechanisms or indices that depend on physical event parameters published by respected third parties to determine whether part or all of the bond principal is to be paid for a certain event. First-generation cat bonds, or cat-in-a-box bonds, use a trigger mechanism consisting of a set of geographic zones in which certain conditions need to be met by an earthquake's magnitude and depth in order to trigger payment of the bond principal. Second-generation cat bonds use an index formulation that typically consists of a sum of products of a set of weights and a polynomial function of the ground-motion variables reported by a geographically distributed seismic network. These instruments are especially appealing to developing countries with incipient insurance industries wishing to cede catastrophic losses to the financial markets, because the payment trigger mechanism is transparent and does not involve the parties ceding or accepting the risk, significantly reducing moral hazard. In order to be successful in the market, however, parametric cat bonds have typically been required to specify relatively simple trigger conditions. The consequence of such simplifications is increased basis risk. This risk represents the possibility that the trigger mechanism fails to accurately capture the actual losses of a catastrophic event: either it does not trigger for a highly destructive event or, vice versa, payment of the bond principal is triggered by an event that produced insignificant losses. The first case disfavors the sponsor who was seeking cover for its losses while the
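    The second-generation index formulation described in this abstract (a weighted sum of a polynomial function of reported ground motions) can be sketched as follows. The weights, exponent, and attachment/exhaustion levels are purely illustrative assumptions, not terms of any actual bond.

```python
def parametric_index(station_pga, weights, exponent=2.0):
    """Second-generation parametric trigger index: a weighted sum, over
    reporting stations, of a polynomial function of the ground-motion
    variable (here peak ground acceleration, PGA, in g).
    The quadratic exponent and the weights are illustrative only."""
    return sum(w * pga ** exponent
               for w, pga in zip(weights, station_pga))

def payout_fraction(index, attachment, exhaustion):
    """Fraction of the bond principal paid: zero below the attachment
    level, full principal above the exhaustion level, linear between."""
    if index <= attachment:
        return 0.0
    if index >= exhaustion:
        return 1.0
    return (index - attachment) / (exhaustion - attachment)

# Hypothetical event: PGA reported at two stations, with elicited weights
idx = parametric_index([0.2, 0.4], [1.0, 2.0])
frac = payout_fraction(idx, attachment=0.1, exhaustion=1.0)
```

    Basis risk, as the abstract notes, is the mismatch between `frac` computed from such an index and the sponsor's actual losses.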

  11. Retrospective tests of hybrid operational earthquake forecasting models for Canterbury

    NASA Astrophysics Data System (ADS)

    Rhoades, D. A.; Liukis, M.; Christophersen, A.; Gerstenberger, M. C.

    2016-01-01

    The Canterbury, New Zealand, earthquake sequence, which began in September 2010, occurred in a region of low crustal deformation and previously low seismicity. Because the ensuing seismicity in the region is likely to remain above previous levels for many years, a hybrid operational earthquake forecasting model for Canterbury was developed to inform decisions on building standards and urban planning for the rebuilding of Christchurch. The model estimates occurrence probabilities for magnitudes M ≥ 5.0 in the Canterbury region for each of the next 50 yr. It combines two short-term, two medium-term and four long-term forecasting models. The weight accorded to each individual model in the operational hybrid was determined by an expert elicitation process. A retrospective test of the operational hybrid model and of an earlier, informally developed hybrid model in the whole New Zealand region has been carried out. The individual and hybrid models were installed in the New Zealand Earthquake Forecast Testing Centre and used to make retrospective annual forecasts of earthquakes with magnitude M > 4.95 from 1986 on, for time-lags up to 25 yr. All models underpredict the number of earthquakes because of an abnormally large number of earthquakes in the testing period since 2008 compared to the learning period. However, the operational hybrid model is more informative than any of the individual time-varying models for nearly all time-lags. Its information gain relative to a reference model of least information decreases as the time-lag increases, becoming zero at a time-lag of about 20 yr. An optimal hybrid model with the same mathematical form as the operational hybrid model was computed for each time-lag from the 26-yr test period. The time-varying component of the optimal hybrid is dominated by the medium-term models for time-lags up to 12 yr and has hardly any impact on the optimal hybrid model for greater time-lags. The optimal hybrid model is considerably more

  12. Operational earthquake forecasting in the South Iceland Seismic Zone: improving the earthquake catalogue

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Vogfjörd, Kristin; Zechar, J. Douglas; Eberhard, David

    2014-05-01

    A major earthquake sequence is ongoing in the South Iceland Seismic Zone (SISZ), where experts expect earthquakes of up to MW = 7.1 in the coming years to decades. The historical seismicity in this region is well known, and many major faults here and on the Reykjanes Peninsula (RP) have already been mapped. The faults are predominantly N-S with right-lateral strike-slip motion, while the overall motion in the SISZ is E-W oriented left-lateral motion. The area that we propose for operational earthquake forecasting (OEF) contains both the SISZ and the RP. The earthquake catalogue considered for OEF, called the SIL catalogue, spans the period from 1991 until September 2013 and contains more than 200,000 earthquakes. Some of these events have a large azimuthal gap between stations, and some have large horizontal and vertical uncertainties. We are interested in building seismicity models using high-quality data, so we filter the catalogue using the criteria proposed by Gomberg et al. (1990) and Bondar et al. (2004). The resulting filtered catalogue contains around 130,000 earthquakes. Magnitude estimates in the Iceland catalogue also require special attention. The SIL system uses two methods to estimate magnitude. The first method is based on an empirical local magnitude (ML) relationship. The other magnitude scale is a so-called "local moment magnitude" (MLW), originally constructed by Slunga et al. (1984) to agree with local magnitude scales in Sweden. In the SIL catalogue, there are two main problems with the magnitude estimates, and consequently it is not immediately possible to convert MLW to moment magnitude (MW). These problems are: (i) immediate aftershocks of large events are assigned magnitudes that are too high; and (ii) the seismic moment of large earthquakes is underestimated. For this reason the magnitude values in the catalogue must be corrected before developing an OEF system. To obtain a reliable MW estimate, we calibrate a magnitude relationship based on

  13. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  14. 78 FR 39781 - Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a U.S...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ... earthquake that occurred off the coast of Japan on March 11, 2011. That earthquake did not result in any... COMMISSION Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a U.S... comment, titled Consequence Study of a Beyond- Design-Basis Earthquake Affecting the Spent Fuel Pool for...

  15. The doctrinal basis for medical stability operations.

    PubMed

    Baker, Jay B

    2010-01-01

    This article describes possible roles for the military in the health sector during stability operations, which exist primarily when security conditions do not permit the free movement of civilian actors. This article reviews the new U.S. Army Field Manuals (FMs) 3-24, Counterinsurgency and FM 3-07, Stability Operations, in the context of the health sector. Essential tasks in medical stability operations are identified for various logical lines of operation including information operations, civil security, civil control, support to governance, support to economic development, and restoration of essential services. Restoring essential services is addressed in detail including coordination, assessment, actions, and metrics in the health sector. Coordination by the military with other actors in the health sector including host nation medical officials, other United States governmental agencies, international governmental organizations (IGOs), and nongovernment organizations (NGOs) is key to success in medical stability operations. PMID:20108837

  16. The Establishment of an Operational Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Lombardi, Anna Maria; Casarotti, Emanuele

    2014-05-01

    Just after the Mw 6.2 earthquake that hit L'Aquila on April 6, 2009, the Italian Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF) that paved the way for the development of Operational Earthquake Forecasting (OEF), defined as the "procedures for gathering and disseminating authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes". In this paper we introduce the first official OEF system in Italy, developed by the newly established Centro di Pericolosità Sismica at the Istituto Nazionale di Geofisica e Vulcanologia. The system provides a daily update of the weekly probabilities of ground shaking over the whole Italian territory. In this presentation, we describe in detail the philosophy behind the system, the scientific details, and the output format, which has been preliminarily defined in agreement with Civil Protection. To our knowledge, this is the first operational system that fully satisfies the ICEF guidelines. Probably the most sensitive issue is the communication of this kind of message to the population. Acknowledging this inherent difficulty, in agreement with Civil Protection we are planning pilot tests to be carried out in a few selected areas in Italy; the purpose of these tests is to check the effectiveness of the message and to receive feedback.

  17. FB Line Basis for Interim Operation

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The safety analysis of the FB-Line Facility indicates that the operation of FB-Line to support the current mission does not present undue risk to the facility and co-located workers, general public, or the environment.

  18. Design basis for the NRC Operations Center

    SciTech Connect

    Lindell, M.K.; Wise, J.A.; Griffin, B.N.; Desrosiers, A.E.; Meitzler, W.D.

    1983-05-01

    This report documents the development of a design for a new NRC Operations Center (NRCOC). The project was conducted in two phases: organizational analysis and facility design. In order to control the amount of traffic, congestion and noise within the facility, it is recommended that information flow in the new NRCOC be accomplished by means of an electronic Status Information Management System. Functional requirements and a conceptual design for this system are described. An idealized architectural design and a detailed design program are presented that provide the appropriate amount of space for operations, equipment and circulation within team areas. The overall layout provides controlled access to the facility and, through the use of a zoning concept, provides each team within the NRCOC the appropriate balance of ready access and privacy determined from the organizational analyses conducted during the initial phase of the project.

  19. Solid waste retrieval. Phase 1, Operational basis

    SciTech Connect

    Johnson, D.M.

    1994-09-30

    This document describes the operational requirements, procedures, and options for retrieving the waste containers placed in buried storage in Burial Ground 218W-4C, Trench 04, as TRU waste or suspect TRU waste under the activity levels defining this waste that were in effect at the time of placement. Trench 04 in Burial Ground 218W-4C is dedicated entirely to storage of retrievable TRU waste containers or retrievable suspect TRU waste containers and has not been used for any other purpose.

  20. Lessons of L'Aquila for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. 
Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers, who must weigh the benefits of protective actions against the costs of false alarms.

  1. Scientific and non-scientific challenges for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2015-12-01

    Tracking the time evolution of seismic hazard in time windows shorter than the usual 50 years of long-term hazard models may offer additional opportunities to reduce seismic risk. This is the target of operational earthquake forecasting (OEF). During the development of OEF in Italy we identified several challenges that range from pure science to the more practical interface of science with society. From a scientific point of view, although earthquake clustering is the clearest empirical evidence about earthquake occurrence, and OEF clustering models are the most (successfully) tested hazard models in seismology, we note that some seismologists are still reluctant to accept their scientific reliability. After exploring the motivations behind these scientific doubts, we also look into an issue that is often overlooked in this discussion: in any kind of hazard analysis, we do not use a model because it is the true one, but because it is better than anything else we can think of. The non-scientific aspects are mostly related to the fact that OEF usually provides weekly probabilities of large earthquakes smaller than 1%. These probabilities are considered by some seismologists to be too small to be of interest or use. However, in a recent collaboration with engineers we show that such earthquake probabilities may lead to an intolerable individual risk of death. Interestingly, this debate calls for a better definition of the still-fuzzy boundaries among the different kinds of expertise required for the whole risk-mitigation process. The last and probably most pressing challenge is communication to the public. In fact, a wrong message could be useless or even counterproductive. Here we describe some progress that we have made in this field while working with communication experts in Italy.

  2. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  3. Is there a basis for preferring characteristic earthquakes over a Gutenberg–Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, Thomas E.; Geist, Eric L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
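    The extrapolation the authors weigh against the characteristic model can be illustrated with the Gutenberg-Richter relation, log10 N(>=M) = a - b*M, applied to an individual fault zone. The a and b values below are hypothetical, chosen only to show the arithmetic; they are not from the paper.

```python
def gr_rate(m, a, b):
    """Cumulative annual rate of earthquakes with magnitude >= m under
    a Gutenberg-Richter law: log10 N(>=m) = a - b*m."""
    return 10 ** (a - b * m)

# Hypothetical fault-zone values: a = 3.0, b = 1.0 imply one M >= 6
# event per ~1000 yr on this zone.
rate_m6 = gr_rate(6.0, 3.0, 1.0)   # events per year
recurrence = 1.0 / rate_m6         # mean recurrence interval, years
```

    Fitting a and b to an instrumental-plus-historical catalog for the zone, as the abstract suggests, yields large-earthquake rates with uncertainty inherited directly from the catalog fit.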

  5. PFP total operating efficiency calculation and basis of estimate

    SciTech Connect

    SINCLAIR, J.C.

    1999-05-03

    The purpose of the Plutonium Finishing Plant (PFP) Total Operating Efficiency Calculation and Basis of Estimate document is to provide the calculated value and basis of estimate for the Total Operating Efficiency (TOE) for the material stabilization operations to be conducted in 234-52 Building. This information will be used to support both the planning and execution of the Plutonium Finishing Plant (PFP) Stabilization and Deactivation Project's (hereafter called the Project) resource-loaded, integrated schedule.

  6. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  7. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  8. Ground motion following selection of SRS design basis earthquake and associated deterministic approach. Final report: Revision 1

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  9. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  10. Earthquake Response Modeling for a Parked and Operating Megawatt-Scale Wind Turbine

    SciTech Connect

    Prowell, I.; Elgamal, A.; Romanowitz, H.; Duggan, J. E.; Jonkman, J.

    2010-10-01

    Demand parameters for turbines, such as tower moment demand, are primarily driven by wind excitation and dynamics associated with operation. For that purpose, computational simulation platforms have been developed, such as FAST, maintained by the National Renewable Energy Laboratory (NREL). For seismically active regions, building codes also require the consideration of earthquake loading. Historically, it has been common to use simple building code approaches to estimate the structural demand from earthquake shaking, as an independent loading scenario. Currently, International Electrotechnical Commission (IEC) design requirements include the consideration of earthquake shaking while the turbine is operating. Numerical and analytical tools used to consider earthquake loads for buildings and other static civil structures are not well suited for modeling simultaneous wind and earthquake excitation in conjunction with operational dynamics. Through the addition of seismic loading capabilities to FAST, it is possible to simulate earthquake shaking in the time domain, which allows consideration of non-linear effects such as structural nonlinearities, aerodynamic hysteresis, control system influence, and transients. This paper presents a FAST model of a modern 900-kW wind turbine, which is calibrated based on field vibration measurements. With this calibrated model, both coupled and uncoupled simulations are conducted looking at the structural demand for the turbine tower. Response is compared under the conditions of normal operation and potential emergency shutdown due to earthquake-induced vibrations. The results highlight the availability of a numerical tool for conducting such studies, and provide insights into the combined wind-earthquake loading mechanism.

  11. Post Test Analysis of a PCCV Model Dynamically Tested Under Simulated Design-Basis Earthquakes

    SciTech Connect

    Cherry, J.; Chokshi, N.; James, R.J.; Rashid, Y.R.; Tsurumaki, S.; Zhang, L.

    1998-11-09

    In a collaborative program between the United States Nuclear Regulatory Commission (USNRC) and the Nuclear Power Engineering Corporation (NUPEC) of Japan under sponsorship of the Ministry of International Trade and Industry, the seismic behavior of Prestressed Concrete Containment Vessels (PCCV) is being investigated. A 1:10 scale PCCV model has been constructed by NUPEC and subjected to seismic simulation tests using the high performance shaking table at the Tadotsu Engineering Laboratory. A primary objective of the testing program is to demonstrate the capability of the PCCV to withstand design basis earthquakes with a significant safety margin against major damage or failure. As part of the collaborative program, Sandia National Laboratories (SNL) is conducting research in state-of-the-art analytical methods for predicting the seismic behavior of PCCV structures, with the eventual goal of understanding, validating, and improving calculations related to containment structure performance under design and severe seismic events. With the increased emphasis on risk-informed regulatory focus, more accurate characterization (less uncertainty) of containment structural and functional integrity is desirable. This paper presents results of post-test calculations conducted at ANATECH to simulate the design level scale model tests.

  12. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  13. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  14. Complete basis for power suppressed collinear-ultrasoft operators

    NASA Astrophysics Data System (ADS)

    Pirjol, Dan; Stewart, Iain W.

    2003-05-01

    We construct operators that describe power corrections in mixed collinear-ultrasoft processes in QCD. We treat the ultrasoft-collinear Lagrangian to O(λ²) and heavy-to-light currents involving collinear quarks to O(λ), including new three body currents. A complete gauge invariant basis is derived which has a full reduction in Dirac structures and is valid for matching at any order in αs. The full set of reparametrization invariance (RPI) constraints is included, and is found to restrict the number of parameters appearing in Wilson coefficients and to rule out some classes of operators. The QCD ultrasoft-collinear Lagrangian has two O(λ²) operators in its gauge invariant form. For the O(λ) heavy-to-light currents there are (4,4,14,14,21) subleading (scalar, pseudoscalar, vector, axial-vector, tensor) currents, where (1,1,4,4,7) have coefficients that are not determined by RPI. In a frame where v⊥ = 0 and n·v = 1 the total number of currents reduces to (2,2,8,8,13), but the number of undetermined coefficients is the same. The role of these operators and universality of jet functions in the factorization theorem for heavy-to-light form factors is discussed.

  15. Earthquake Early Warning using a Seismogeodetic Approach: An operational plan for Cascadia

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Bodin, P.; Vidale, J. E.; Schmidt, D. A.; Melbourne, T. I.; Scrivner, C. W.; Santillan, V. M.; Szeliga, W. M.; Minson, S. E.; Bock, Y.; Melgar, D.

    2013-12-01

    We present an operational plan for implementing combined seismic and geodetic time series in an earthquake early warning system for Cascadia. The Cascadia subduction zone presents one of the greatest risks for a megaquake in the continental United States. Ascertaining the full magnitude and extent of large earthquakes is problematic for earthquake early warning systems due to instability when double integrating strong-motion records to ground displacement. This problem can be mitigated by augmenting earthquake early warning systems with real-time GPS data, allowing for the progression and spatial extent of large earthquakes to be better resolved due to GPS's ability to measure both dynamic and permanent displacements. The Pacific Northwest Seismic Network (PNSN) at the University of Washington is implementing an integrated seismogeodetic approach to earthquake early warning. Regional GPS data are provided by the Pacific Northwest Geodetic Array (PANGA) at Central Washington University. Precise Point Positioning (PPP) solutions are sent from PANGA to the PNSN through JSON-formatted streams and processed with a Python-based quality control (QC) module. The QC module also ingests accelerations from PNSN seismic stations through the Earthworm seismic acquisition and processing system for the purpose of detecting outliers and Kalman filtering when collocated instruments exist. The QC module outputs time aligned and cleaned displacement waveforms to ActiveMQ, an XML-based messaging broker that is currently used in seismic early warning architecture. Earthquake characterization modules read displacement information from ActiveMQ when triggered by warnings from the ElarmS earthquake early warning algorithm. Peak ground displacement and P-wave scaling relationships from Kalman filtered waveforms provide initial magnitude estimates. Additional modules perform more complex source modeling such as centroid moment tensors and slip inversions that characterize the full size and
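    The Kalman filtering of collocated GPS and accelerometer records mentioned above can be illustrated with a minimal one-dimensional sketch; the noise parameters and synthetic signal are hypothetical placeholders, not PNSN values, and the actual QC module is certainly more elaborate:

```python
import numpy as np

def seismogeodetic_kalman(gps_disp, accel, dt, q=0.1, r=0.01):
    """Minimal 1-D Kalman filter fusing an accelerometer (process input)
    with GPS displacement (measurement). State is [displacement, velocity];
    q and r are hypothetical process/measurement noise levels."""
    x = np.zeros(2)
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])                 # state transition
    B = np.array([0.5 * dt**2, dt])                       # acceleration input
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])                   # process noise
    H = np.array([[1.0, 0.0]])                            # observe displacement
    R = np.array([[r]])                                   # GPS noise variance
    out = []
    for z, a in zip(gps_disp, accel):
        x = F @ x + B * a                                 # predict
        P = F @ P @ F.T + Q
        y = z - H @ x                                     # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
        x = x + (K @ y).ravel()                           # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Synthetic coseismic step offset seen by a noisy "GPS", zero acceleration.
t = np.arange(0, 10, 0.1)
true_disp = np.where(t > 5, 0.2, 0.0)
rng = np.random.default_rng(0)
gps = true_disp + rng.normal(0, 0.02, t.size)
filtered = seismogeodetic_kalman(gps, np.zeros(t.size), dt=0.1)
```

    In the real system the accelerometer channel supplies high-rate dynamic motion while the GPS channel anchors the permanent offset; the filter above captures only that basic division of labor.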

  16. M6.0 South Napa Earthquake Forecasting on the basis of jet stream precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.

    2014-12-01

    Current earthquake prediction research methods can be divided into crustal deformation, radon concentration, well water level, animal behavior, very high frequency (VHF) signals, GPS/TEC ionospheric variations, and thermal infrared radiation (TIR) anomalies. Before major earthquakes (M > 6) occur, the jet stream in the epicenter area will interrupt or its velocity flow lines will cross; that is, before the earthquake happens, atmospheric pressure at high altitude suddenly drops for 6~12 hours (Wu & Tikhonov, 2014). This technique has been used to predict strong earthquakes in real time, pre-registered on the website: for example, the M6.0 Northern California earthquake on 2014/08/24 (figure 1) and the M6.6 Russia earthquake on 2013/10/12 (figure 2). For the 2014/08/24 earthquake in CA, USA, the front end of the 60-knot speed line was at S.F. on 2014/06/16 12:00, and 69 days later the M6.1 earthquake happened. On 2014/07/16 we predicted a magnitude larger than 5.5, but the alarm period was only 30 days. The deviation of the predicted point was about 70 km. A lithosphere-atmosphere-ionosphere (LAI) coupling model may explain this phenomenon: ionization of the air is produced by an increased emanation of radon at the epicenter; water molecules in the air react with these ions and release heat; the heat results in a temperature rise in the air, accompanied by a large-scale change in atmospheric pressure and jet stream morphology. We obtain satisfactory accuracy in estimating the epicenter location, and we define a short alarm period; these are the positive aspects of our forecast. However, jet-based estimates of magnitude contain a big uncertainty. Reference: H.C. Wu, I.N. Tikhonov, 2014, "Jet streams anomalies as possible short-term precursors of earthquakes with M>6.0", Research in Geophysics, doi:10.4081/rg.2014.4939, http://www.pagepress.org/journals/index.php/rg/article/view/rg.2014.4939

  17. PBO Southwest Region: Baja Earthquake Response and Network Operations

    NASA Astrophysics Data System (ADS)

    Walls, C. P.; Basset, A.; Mann, D.; Lawrence, S.; Jarvis, C.; Feaux, K.; Jackson, M. E.

    2011-12-01

    The SW region of the Plate Boundary Observatory consists of 455 continuously operating GPS stations located principally along the transform system of the San Andreas fault and Eastern California Shear Zone. In the past year network uptime exceeded an average of 97% with greater than 99% data acquisition. Communications range from CDMA modem (307), radio (92), Vsat (30), DSL/T1/other (25) to manual downloads (1). Sixty-three stations stream 1 Hz data over the VRS3Net typically with <0.5 second latency. Over 620 maintenance activities were performed during 316 onsite visits out of approximately 368 engineer field days. Within the past year there have been 7 incidences of minor (attempted theft) to moderate vandalism (solar panel stolen) with one total loss of receiver and communications gear. Security was enhanced at these sites through fencing and more secure station configurations. In the past 12 months, 4 new stations were installed to replace removed stations or to augment the network at strategic locations. Following the M7.2 El Mayor-Cucapah earthquake CGPS station P796, a deep-drilled braced monument, was constructed in San Luis, AZ along the border within 5 weeks of the event. In addition, UNAVCO participated in a successful University of Arizona-led RAPID proposal for the installation of six continuous GPS stations for post-seismic observations. Six stations are installed and telemetered through a UNAM relay at the Sierra San Pedro Martir. Four of these stations have Vaisala WXT520 meteorological sensors. An additional site in the Sierra Cucapah (PTAX) that was built by CICESE, an Associate UNAVCO Member institution in Mexico, and Caltech has been integrated into PBO dataflow. The stations will be maintained as part of the PBO network in coordination with CICESE. UNAVCO is working with NOAA to upgrade PBO stations with WXT520 meteorological sensors and communications systems capable of streaming real-time GPS and met data. The real-time GPS and

  18. Operational earthquake forecasting in California: A prototype system combining UCERF3 and CyberShake

    NASA Astrophysics Data System (ADS)

    Milner, K. R.; Jordan, T. H.; Field, E. H.

    2014-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about time-dependent earthquake probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To attain this goal, OEF must provide a complete description of the seismic hazard—ground motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic hazard analysis. We have combined the Third Uniform California Earthquake Rupture Forecast (UCERF3) of the Working Group on California Earthquake Probabilities (Field et al., 2014) with the CyberShake ground-motion model of the Southern California Earthquake Center (Graves et al., 2011; Callaghan et al., this meeting) into a prototype OEF system for generating time-dependent hazard maps. UCERF3 represents future earthquake activity in terms of fault-rupture probabilities, incorporating both Reid-type renewal models and Omori-type clustering models. The current CyberShake model comprises approximately 415,000 earthquake rupture variations to represent the conditional probability of future shaking at 285 geographic sites in the Los Angeles region (~236 million horizontal-component seismograms). This combination provides significant probability gains relative to OEF models based on empirical ground-motion prediction equations (GMPEs), primarily because the physics-based CyberShake simulations account for the rupture directivity, basin effects, and directivity-basin coupling that are not represented by the GMPEs.
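    The combination the abstract describes, time-dependent fault-rupture probabilities in concert with conditional ground-motion exceedance probabilities, can be sketched in its simplest form as follows; the rupture list is hypothetical, and independence between ruptures is assumed:

```python
def site_exceedance_prob(ruptures):
    """Probability that shaking at a site exceeds a threshold in the
    forecast window, combining per-rupture occurrence probabilities with
    conditional exceedance probabilities, assuming independent ruptures:
        P(exceed) = 1 - prod_r [1 - P(rupture_r) * P(IM > x | rupture_r)]
    """
    p_no_exceed = 1.0
    for p_rup, p_im_given_rup in ruptures:
        p_no_exceed *= 1.0 - p_rup * p_im_given_rup
    return 1.0 - p_no_exceed

# Hypothetical forecast for one site:
# (P(rupture in window), P(shaking > threshold | rupture)) per rupture.
ruptures = [(0.02, 0.6), (0.005, 0.9), (0.10, 0.1)]
print(f"P(exceedance) = {site_exceedance_prob(ruptures):.4f}")
```

    In the prototype system, the first factor would come from UCERF3 and the second from the CyberShake simulation library; the simple product-form combination here is only the probabilistic skeleton.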

  19. Circuit breaker operation and potential failure modes during an earthquake: a preliminary investigation

    SciTech Connect

    Lambert, H.E.

    1984-04-09

    This study addresses the effect of a strong-motion earthquake on circuit breaker operation. It focuses on the loss of offsite power (LOSP) transient caused by a strong-motion earthquake at the Zion Nuclear Power Plant. This report also describes the operator action necessary to prevent core melt if the above circuit breaker failure modes occur simultaneously on three 4.16 KV buses. Numerous circuit breakers important to plant safety, such as circuit breakers to diesel generators and engineered safety systems, (ESS), must open and/or close during this transient while strong motion is occurring. Nearly 500 electrical drawings were examined to address the effects of earthquakes on circuit breaker operation. Due to the complexity of the problem, this study is not intended to be definitive but serves as a focusing tool for further work. 5 references, 9 figures, 3 tables.

  20. Development of Design Basis Earthquake Parameters for TMI-2 Independent Spent Fuel Storage Installation at the INEEL

    SciTech Connect

    URS Greiner Woodward Clyde Federal Services; Geomatrix Consultants; Pacific Engineering and Analysis; S. M. Payne

    1999-11-01

    Probabilistically-based Design Basis Earthquake (DBE) ground motion parameters have been developed for the TMI-2 Independent Spent Fuel Storage Installation (ISFSI) located at the Idaho Nuclear Technology and Engineering Center (INTEC), Idaho National Engineering and Environmental Laboratory. The probabilistic seismic hazard at INTEC has been recomputed using ground motion attenuation relationships more appropriate for extensional tectonic regimes. The empirical attenuation relationships used in this analysis were adjusted for extensional tectonic regimes as part of the Yucca Mountain Project. Seismic hazard curves and uniform hazard spectra for rock produced using the revised attenuation relationships result in lower ground motions when compared to the results of the 1996 INEEL site-wide seismic hazard evaluation. The DBE ground motions for rock and soil have been developed to be applicable to the TMI-2 ISFSI and the entire INTEC site by incorporating variations in the rock and soil properties over the INTEC area. The DBE rock and soil ground motions presented in the report are recommended for use in developing final design earthquake parameters. Peer reviewers of this report support this recommendation. Because the Nuclear Regulatory Commission regulations have recently evolved to incorporate probabilistically-based seismic design for independent fuel storage facilities, a deterministic Maximum Credible Earthquake analysis performed for INTEC earlier in this study is also presented in this report.
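    A probabilistically-based DBE of the kind described above amounts to reading a ground-motion level off a seismic hazard curve at a target return period. A minimal sketch with a hypothetical hazard curve (the values below are illustrative, not from the INTEC study):

```python
import math

def dbe_ground_motion(hazard_curve, return_period=2500.0):
    """Interpolate a hazard curve, given as (PGA in g, annual frequency of
    exceedance) pairs in order of increasing PGA, to the target return
    period using log-log interpolation."""
    target = 1.0 / return_period
    for (g1, f1), (g2, f2) in zip(hazard_curve, hazard_curve[1:]):
        if f2 <= target <= f1:
            w = (math.log(target) - math.log(f1)) / (math.log(f2) - math.log(f1))
            return math.exp(math.log(g1) + w * (math.log(g2) - math.log(g1)))
    raise ValueError("target frequency outside hazard curve range")

# Hypothetical rock hazard curve: (PGA [g], annual frequency of exceedance).
curve = [(0.05, 1e-2), (0.10, 3e-3), (0.20, 8e-4), (0.40, 1e-4)]
pga_dbe = dbe_ground_motion(curve, 2500.0)   # 4e-4 annual frequency
print(f"2,500-yr DBE PGA ~ {pga_dbe:.3f} g")
```

    The same lookup repeated across spectral periods yields a uniform hazard spectrum of the kind the report develops for rock and soil.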

  1. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases, and the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and data bases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds very many combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology.
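    The report's key finding, that with very many minimal cut sets at least one is nearly certain to occur, follows from elementary probability; a sketch assuming independent cut sets with hypothetical per-set probabilities:

```python
def prob_any_cut_set(cut_set_probs):
    """Probability that at least one minimal cut set occurs, assuming
    the cut sets are independent: 1 - prod(1 - p_i)."""
    p_none = 1.0
    for p in cut_set_probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

# Even if each relay-chatter combination is individually unlikely
# (hypothetical p = 1e-3), thousands of combinations make at least one
# occurrence nearly certain, of the order of unity as the report states.
probs = [1e-3] * 5000
print(f"P(at least one cut set) = {prob_any_cut_set(probs):.3f}")
```

    Real cut sets share components and are not independent, so the actual quantification uses the fragility curves and response assumptions the abstract mentions; this sketch only shows why the likelihood approaches unity.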

  2. Plutonium uranium extraction (PUREX) end state basis for interim operation (BIO) for surveillance and maintenance

    SciTech Connect

    DODD, E.N.

    1999-05-12

    This Basis for Interim Operation (BIO) was developed for the PUREX end state condition following completion of the deactivation project. The deactivation project has removed or stabilized the hazardous materials within the facility structure and equipment to reduce the hazards posed by the facility during the surveillance and maintenance (S and M) period, and to reduce the costs associated with the S and M. This document serves as the authorization basis for the PUREX facility, excluding the storage tunnels, railroad cut, and associated tracks, for the deactivated end state condition during the S and M period. The storage tunnels, and associated systems and areas, are addressed in WHC-SD-HS-SAR-001, Rev. 1, PUREX Final Safety Analysis Report. During S and M, the mission of the facility is to maintain the conditions and equipment in a manner that ensures the safety of the workers, environment, and the public. The S and M phase will continue until the final decontamination and decommissioning (D and D) project and activities are begun. Based on the methodology of DOE-STD-1027-92, Hazards Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports, the final facility hazards category is identified as hazards category 2. This considers the remaining material inventories, form and distribution of the material, and the energies present to initiate events of concern. Given the current facility configuration, conditions, and authorized S and M activities, there are no operational events identified resulting in significant hazard to any of the target receptor groups (e.g., workers, public, environment). The only accident scenarios identified with consequences to the onsite co-located workers were based on external natural phenomena, specifically an earthquake. The dose consequences of these events are within the current risk evaluation guidelines and are consistent with the expectations for a hazards category 2

  3. The G-FAST Geodetic Earthquake Early Warning System: Operational Performance and Synthetic Testing

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Schmidt, D. A.; Bodin, P.; Vidale, J. E.; Melbourne, T. I.; Santillan, V. M.

    2015-12-01

    The G-FAST (Geodetic First Approximation of Size and TIming) earthquake early warning module is part of a joint seismic and geodetic earthquake early warning system currently under development at the Pacific Northwest Seismic Network (PNSN). Our two-stage approach to earthquake early warning includes: (1) initial detection and characterization from PNSN strong-motion and broadband data with the ElarmS package within ShakeAlert, and then (2) modeling of GPS data from the Pacific Northwest Geodetic Array (PANGA). The two geodetic modeling modules are (1) a fast peak-ground-displacement magnitude and depth estimate and (2) a CMT-based finite fault inversion that utilizes coseismic offsets to compute earthquake extent, slip and magnitude. The seismic and geodetic source estimates are then combined in a decision module currently under development. In this presentation, we first report on the operational performance during the first several months that G-FAST has been live with respect to magnitude estimates, timing information, and stability. Secondly, we report on the performance of the G-FAST test system using simulated displacements from plausible Cascadia earthquake scenarios. The test system permits us to: (1) replay segments of actual seismic waveform data recorded from the PNSN and neighboring networks to investigate both earthquakes and noise conditions, and (2) broadcast synthetic data into the system to simulate signals we anticipate from earthquakes for which we have no actual ground motion recordings. The test system also lets us simulate various error conditions (latent and/or out-of-sequence data, telemetry drop-outs, etc.) in order to explore how best to mitigate them. For example, we show for a replay of the 2001 M6.8 Nisqually earthquake that telemetry drop-outs create the largest variability and biases in magnitude and depth estimates, whereas latency only causes some variability towards the beginning of the recordings before quickly stabilizing.

  4. Development of Site-Specific Soil Design Basis Earthquake (DBE) Parameters for the Integrated Waste Treatment Unit (IWTU)

    SciTech Connect

    Payne, Suzette

    2008-08-01

    Horizontal and vertical PC 3 (2,500 yr) Soil Design Basis Earthquake (DBE) 5% damped spectra, corresponding time histories, and strain-compatible soil properties were developed for the Integrated Waste Treatment Unit (IWTU). The IWTU is located at the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Laboratory (INL). Mean and 84th percentile horizontal DBE spectra derived from site-specific site response analyses were evaluated for the IWTU. The horizontal and vertical PC 3 (2,500 yr) Soil DBE 5% damped spectra at the 84th percentile were selected for Soil Structure Interaction (SSI) analyses at IWTU. The site response analyses were performed consistent with applicable Department of Energy (DOE) Standards, recommended guidance of the Nuclear Regulatory Commission (NRC), American Society of Civil Engineers (ASCE) Standards, and recommendations of the Blue Ribbon Panel (BRP) and Defense Nuclear Facilities Safety Board (DNFSB).

  5. Dermatology aboard the USNS COMFORT: Disaster relief operations in Haiti after the 2010 earthquake.

    PubMed

    Galeckas, Kenneth

    2011-01-01

    On the 12th of January 2010, Haiti was struck by a 7.0 Richter magnitude earthquake that devastated its already fragile capital region. Approximately 230,000 people died immediately or during ensuing weeks, mostly due to acute trauma. Countless others suffered significant life- or limb-threatening injuries. As a part of the United States' response to this tragedy, eventually named Operation Unified Response, the United States Navy deployed hundreds of physicians and other medical response individuals on a hospital ship. Operation Unified Response was a military joint task force operation augmented by governmental and nongovernmental organizations. Its mission was to bring medical and logistical support to the region.

  6. The Earthquake Prediction Experiment on the Basis of the Jet Stream's Precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.; Tikhonov, I. N.

    2014-12-01

    Simultaneous analysis of jet stream maps and earthquake data for M > 6.0 events has been made. 58 cases of EQs occurring in 2006-2010 were studied. It has been found that an interruption or crossing of velocity flow lines above the epicenter of an EQ takes place 1-70 days prior to the event, with a duration of 6-12 hours. The assumption is that the jet stream goes up or down near an epicenter. In 45 cases the distance between the epicenter and the jet stream precursor did not exceed 90 km. The success rate for forecasts within 30 days before the EQ was 66.1% (Wu and Tikhonov, 2014). This technique has been used to predict strong EQs, pre-registered on the website: for example, the 23 October 2011 M 7.2 EQ (Turkey); the 20 May 2012 M 6.1 EQ (Italy); the 16 April 2013 M 7.8 EQ (Iran); the 12 November 2013 M 6.6 EQ (Russia); the 03 March 2014 M 6.7 Ryukyu EQ (Japan); and the 21 July 2014 M 6.2 Kuril EQ. We obtain satisfactory accuracy for the epicenter location, and we define a short alarm period; these are the positive aspects of the forecast. However, estimates of magnitude contain a big uncertainty. Reference: Wu, H.C., Tikhonov, I.N., 2014. Jet streams anomalies as possible short-term precursors of earthquakes with M > 6.0. Research in Geophysics, Special Issue on Earthquake Precursors. Vol. 4. No 1. doi:10.4081/rg.2014.4939. The precursor of the M9.0 Japan EQ of 2011/03/11 (fig1). A. M6.1 Italy EQ (2012/05/20, 44.80 N, 11.19 E, H = 5.1 km). Prediction: 2012/03/20~2012/04/20 (45.6 N, 10.5 E), M > 5.5 (fig2). http://ireport.cnn.com/docs/DOC-764800 B. M7.8 Iran EQ (2013/04/16, 28.11 N, 62.05 E, H = 82.0 km). Prediction: 2013/01/14~2013/02/04 (28.0 N, 61.3 E), M > 6.0 (fig3). http://ireport.cnn.com/docs/DOC-910919 C. M6.6 Russia EQ (2013/11/12, 54.68 N, 162.29 E, H = 47.2 km). Prediction: 2013/10/27~2013/11/13 (56.0 N, 162.9 E), M > 5.5. http://ireport.cnn.com/docs/DOC-1053599 D. M6.7 Japan EQ (2014/03/03, 27.41 N, 127.34 E, H = 111.2 km). Prediction: 2013/12/02~2014/01/15 (26.7 N, 128.1 E), M > 6.5 (fig4). http

  7. [How to solve hospital operating problems based on the experience of the Hanshin earthquake?].

    PubMed

    Yoshikawa, J

    1995-06-01

    Immediately after the Hanshin (Kobe-Osaka) earthquake, electricity, water and gas supplies were discontinued at Kobe General Hospital, causing difficulties with many important hospital functions including the water cooled independent electric power plant, respirators, and sterilizers. A large water storage facility is needed to keep the water cooled type independent power plant operating. Alternative plans including the introduction of an air cooled independent power plant and a sea water to fresh water exchange system should be considered. Portable compressors are needed to retain the function of respirators in the absence of a water supply. The emergency use of propane gas should also be considered for sterilization and cooking facilities. There were very great problems in communication after the earthquake. The only method was the use of public phones, which have priority over private lines. Therefore, each hospital should have phones with similar priority. In addition, the use of personal computers and/or computer network methods should be considered to preserve a high level of communication after an earthquake. Otherwise, a hospital should be equipped with wireless phones. Immediately after the earthquake, the care of critically ill patients could not be achieved. Therefore, 20 cardiac patients requiring intensive care had to be transferred to other heart centers. The best method for the transfer is by helicopter. Kobe City suffered a transport crisis which occurred immediately after the earthquake and continued up to the end of March. A big helicopter or a special bus guided by a police car should be considered for hospital staff transport.(ABSTRACT TRUNCATED AT 250 WORDS)

  8. Planning a Preliminary program for Earthquake Loss Estimation and Emergency Operation by Three-dimensional Structural Model of Active Faults

    NASA Astrophysics Data System (ADS)

    Ke, M. C.

    2015-12-01

    Large-scale earthquakes often cause serious economic losses and many deaths, and the magnitude, time, and location of earthquakes still cannot be predicted. Pre-disaster risk modeling and post-disaster operations are therefore important for reducing earthquake damage. To understand the disaster risk of earthquakes, earthquake simulation is usually used to build earthquake scenarios, with point sources, fault-line sources, and fault-plane sources commonly used as the seismic sources of these scenarios. The assessments made from these different models serve risk assessment and emergency operations well, but their accuracy can still be improved. This program invites experts and scholars from Taiwan University, National Central University, and National Cheng Kung University and uses historical earthquake records, geological data, and geophysical data to build three-dimensional structural planes of active faults, with the aim of replacing projected fault planes with underground fault planes closer to the true geometry. The analytical accuracy of earthquake-prevention work can be upgraded by this database, and the three-dimensional data will be applied to different stages of disaster prevention. Before a disaster, earthquake risk analyses based on the three-dimensional fault planes are closer to the real damage; during a disaster, the three-dimensional fault-plane data can help estimate the aftershock distribution and the most seriously damaged areas. The program used 14 geological profiles to build the three-dimensional data of the Hsinchu and Hsincheng faults in 2015; other active faults will be completed in 2018 and applied to earthquake disaster prevention.

  9. Planning Matters: Response Operations following the 30 September 2009 Sumatran Earthquake

    NASA Astrophysics Data System (ADS)

    Comfort, L. K.; Cedillos, V.; Rahayu, H.

    2009-12-01

    Response operations following the 9/30/2009 West Sumatra earthquake tested extensive planning that had been done in Indonesia since the 26 December 2004 Sumatran Earthquake and Tsunami. After massive destruction in Aceh Province in 2004, the Indonesian National Government revised its national disaster management plans. A key component was to select six Indonesian cities exposed to significant risk and make a focused investment of resources, planning activities, and public education to reduce the risk of major disasters. Padang City was selected as the national “showcase” for disaster preparedness, planning, and response. The question is whether planning improved governmental performance and coordination in practice. There is substantial evidence that the disaster preparedness planning and training initiated over the preceding four years had a positive effect on disaster risk reduction in Padang. The National Disaster Management Agency (BNPB, 10/28/09) reported the following casualties: Padang City: deaths, 383; severe injuries, 431; minor injuries, 771. Province of West Sumatra: deaths, 1209; severe injuries, 67; minor injuries, 1179. These figures contrast markedly with the estimated losses following the 2004 Earthquake and Tsunami, before any training had been done: Banda Aceh: deaths, 118,000; Aceh Province: dead/missing, 236,169 (ID Health Ministry 2/22/05). Although the 2004 events were more severe, the relative scale of loss was significantly lower in the 9/30/09 earthquake. Three factors contributed to reducing disaster risk in Padang and West Sumatra. First, annual training exercises for tsunami warning and evacuation had been organized by national agencies since 2004. In 2008, all exercises and training activities were placed under the newly established BNPB. The exercise held in Padang in February 2009 served as an organizing framework for response operations in the 9/30/09 earthquake. Public officers with key responsibilities for emergency operations

  10. TECHNICAL BASIS FOR VENTILATION REQUIREMENTS IN TANK FARMS OPERATING SPECIFICATIONS DOCUMENTS

    SciTech Connect

    BERGLIN, E J

    2003-06-23

    This report provides the technical basis for high-efficiency particulate air (HEPA) filter requirements for Hanford tank farm ventilation systems (sometimes known as heating, ventilation, and air conditioning [HVAC] systems) to support limits defined in Process Engineering Operating Specification Documents (OSDs). This technical basis includes a review of the older technical basis and provides clarifications, as necessary, to technical basis limit revisions or justifications. The document provides an updated technical basis for tank farm ventilation systems related to OSDs for double-shell tanks (DSTs), single-shell tanks (SSTs), double-contained receiver tanks (DCRTs), catch tanks, and various other miscellaneous facilities.

  11. Theoretical basis for operational ensemble forecasting of coronal mass ejections

    NASA Astrophysics Data System (ADS)

    Pizzo, V. J.; Koning, C.; Cash, M.; Millward, G.; Biesecker, D. A.; Puga, L.; Codrescu, M.; Odstrcil, D.

    2015-10-01

    We lay out the theoretical underpinnings for the application of the Wang-Sheeley-Arge-Enlil modeling system to ensemble forecasting of coronal mass ejections (CMEs) in an operational environment. In such models, there is no magnetic cloud component, so our results pertain only to CME front properties, such as transit time to Earth. Within this framework, we find no evidence that the propagation is chaotic, and therefore, CME forecasting calls for different tactics than employed for terrestrial weather or hurricane forecasting. We explore a broad range of CME cone inputs and ambient states to flesh out differing CME evolutionary behavior in the various dynamical domains (e.g., large, fast CMEs launched into a slow ambient, and the converse, plus numerous permutations in between). CME propagation in both uniform and highly structured ambient flows is considered to assess how much the solar wind background affects the CME front properties at 1 AU. Graphical and analytic tools pertinent to an ensemble approach are developed to enable uncertainties in forecasting CME impact at Earth to be realistically estimated. We discuss how uncertainties in CME pointing relative to the Sun-Earth line affect the reliability of a forecast and how glancing blows become an issue for CME off-points greater than about the half width of the estimated input CME. While the basic results appear consistent with established impressions of CME behavior, the next step is to use existing records of well-observed CMEs at both Sun and Earth to verify that real events appear to follow the systematic tendencies presented in this study.
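
    The ensemble idea above can be illustrated with a deliberately minimal kinematic sketch (this is not the WSA-Enlil model: constant CME speed is assumed, drag and the structured ambient wind are ignored, and all parameter values are illustrative). Perturbing the uncertain cone-model input speed and propagating each member to 1 AU yields a spread of forecast arrival times:

```python
import random
import statistics

AU_KM = 1.496e8  # 1 astronomical unit in km

def transit_time_hours(v_km_s):
    """Constant-speed transit time from Sun to 1 AU (drag neglected)."""
    return AU_KM / v_km_s / 3600.0

def ensemble_arrival(v0=800.0, v_sigma=100.0, n=1000, seed=42):
    """Perturb the input CME speed to mimic measurement uncertainty and
    return the mean and spread of the forecast arrival time at Earth."""
    rng = random.Random(seed)
    times = [transit_time_hours(max(rng.gauss(v0, v_sigma), 200.0))
             for _ in range(n)]
    return statistics.mean(times), statistics.stdev(times)

mean_h, spread_h = ensemble_arrival()
```

The spread grows with the input speed uncertainty, which is the quantity an operational ensemble would report as the forecast error bar.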

  12. Compressed plane waves yield a compactly supported multiresolution basis for the Laplace operator.

    PubMed

    Ozoliņš, Vidvuds; Lai, Rongjie; Caflisch, Russel; Osher, Stanley

    2014-02-01

    This paper describes an L1 regularized variational framework for developing a spatially localized basis, compressed plane waves, that spans the eigenspace of a differential operator, for instance, the Laplace operator. Our approach generalizes the concept of plane waves to an orthogonal real-space basis with multiresolution capabilities.

  13. A century of oilfield operations and earthquakes in the greater Los Angeles Basin, southern California

    USGS Publications Warehouse

    Hauksson, Egill; Goebel, Thomas; Ampuero, Jean-Paul; Cochran, Elizabeth S.

    2015-01-01

    Most of the seismicity in the Los Angeles Basin (LA Basin) occurs at depth below the sediments and is caused by transpressional tectonics related to the big bend in the San Andreas fault. However, some of the seismicity could be associated with fluid extraction or injection in oil fields that have been in production for almost a century and cover ∼ 17% of the basin. In a recent study, first the influence of industry operations was evaluated by analyzing seismicity characteristics, including normalized seismicity rates, focal depths, and b-values, but no significant difference was found in seismicity characteristics inside and outside the oil fields. In addition, to identify possible temporal correlations, the seismicity and available monthly fluid extraction and injection volumes since 1977 were analyzed. Second, the production and deformation history of the Wilmington oil field were used to evaluate whether other oil fields are likely to experience similar surface deformation in the future. Third, the maximum earthquake magnitudes of events within the perimeters of the oil fields were analyzed to see whether they correlate with total net injected volumes, as suggested by previous studies. Similarly, maximum magnitudes were examined to see whether they exhibit an increase with net extraction volume. Overall, no obvious previously unidentified induced earthquakes were found, and the management of balanced production and injection of fluids appears to reduce the risk of induced-earthquake activity in the oil fields.
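
    For reference, b-values like those compared above are conventionally estimated with Aki's maximum-likelihood formula; a minimal sketch on a synthetic Gutenberg-Richter catalog (the completeness magnitude and the catalog are illustrative, not the study's data):

```python
import math
import random

def b_value_mle(mags, m_c, dm=0.0):
    """Aki (1965) maximum-likelihood b-value with Utsu's binning correction
    (dm is the magnitude bin width; dm = 0 for continuous magnitudes):
    b = log10(e) / (mean(M) - (m_c - dm/2))."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# synthetic Gutenberg-Richter catalog with a known b-value of 1.0:
# magnitudes above m_c are exponentially distributed with rate b*ln(10)
rng = random.Random(7)
beta = 1.0 * math.log(10.0)
catalog = [2.0 + rng.expovariate(beta) for _ in range(20000)]
b_est = b_value_mle(catalog, m_c=2.0)
```

On a real catalog the estimate is only meaningful above the completeness magnitude, which is why the study computes it separately inside and outside the oil fields.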

  14. Dermatology aboard the USNS COMFORT: Disaster relief operations in Haiti after the 2010 earthquake.

    PubMed

    Galeckas, Kenneth

    2011-01-01

    On the 12th of January 2010, Haiti was struck by a magnitude 7.0 earthquake that devastated its already fragile capital region. Approximately 230,000 people died immediately or during the ensuing weeks, mostly due to acute trauma. Countless others suffered significant life- or limb-threatening injuries. As part of the United States' response to this tragedy, eventually named Operation Unified Response, the United States Navy deployed hundreds of physicians and other medical personnel aboard a hospital ship. Operation Unified Response was a military joint task force operation augmented by governmental and nongovernmental organizations. Its mission was to bring medical and logistical support to the region. PMID:21095523

  15. Flammable gas deflagration consequence calculations for the tank waste remediation system basis for interim operation

    SciTech Connect

    Van Vleet, R.J., Westinghouse Hanford

    1996-09-23

    This paper calculates the radiological dose consequences and the toxic exposures for deflagration accidents at various Tank Waste Remediation System facilities. These will be used in support of the Tank Waste Remediation System Basis for Interim Operation.

  16. Basis for Interim Operation for the K-Reactor in Cold Standby

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The Basis for Interim Operation (BIO) document for K Reactor in Cold Standby and the L- and P-Reactor Disassembly Basins was prepared in accordance with the draft DOE standard for BIO preparation (dated October 26, 1993).

  17. Basis for Resource Allocation: Analysis of Operations in a Large Library System

    ERIC Educational Resources Information Center

    Martin, Gordon P.; West, Martha W.

    1975-01-01

    Reports on the efforts of the California State University and Colleges to develop a series of cost studies to identify and analyze specific library operations as a basis for decision-making among alternatives. (Author/PF)

  18. Real-time earthquake alert system for the greater San Francisco Bay Area: a prototype design to address operational issues

    SciTech Connect

    Harben, P.E.; Jarpe, S.; Hunter, S.

    1996-05-29

    This paper describes a prototype real-time earthquake alert system (EAS) for the Bay Area. The approach is pragmatic, attempting to establish a prototype system quickly and at low cost. A real-time warning system can protect the public and mitigate earthquake damage. The proposed system is a distributed network of real-time strong-motion monitoring stations that telemeter data to a central analysis facility, which can transmit earthquake parameter information to an area before the elastic wave energy arrives. Upgrades, and issues that should be resolved before an operational EAS can be established, are listed.

  19. Anthropogenic triggering of large earthquakes.

    PubMed

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor "foreshocks", since the induction may occur with a delay up to several years.
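
    The paper derives nucleation conditions from the Tresca-Von Mises criterion in a poroelastic framework; the simpler Coulomb failure criterion below (with assumed, illustrative stress values, not the paper's calculation) conveys the same point, that a sub-0.1 MPa overpressure can tip a critically stressed fault into failure:

```python
def coulomb_failure(shear_mpa, normal_mpa, pore_pressure_mpa,
                    mu=0.6, cohesion_mpa=0.0):
    """Coulomb criterion with pore pressure: failure occurs when
    shear >= cohesion + mu * (normal - pore_pressure)."""
    return shear_mpa >= cohesion_mpa + mu * (normal_mpa - pore_pressure_mpa)

# a critically stressed fault: 0.05 MPa of shear strength to spare
# at zero overpressure (all values are hypothetical)
SHEAR, NORMAL = 59.95, 100.0
stable = coulomb_failure(SHEAR, NORMAL, 0.0)       # no overpressure: holds
triggered = coulomb_failure(SHEAR, NORMAL, 0.1)    # 0.1 MPa overpressure: fails
```

The closer a fault sits to its failure envelope, the smaller the overpressure needed, which is why the in-situ stress measurements cited above matter.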

  2. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2014-07-01 2014-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  3. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2013-07-01 2013-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  4. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2012-07-01 2012-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  5. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2011-07-01 2011-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  6. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2010-07-01 2010-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  7. On the Physical Basis of Rate Law Formulations for River Evolution, and their Applicability to the Simulation of Evolution after Earthquakes

    NASA Astrophysics Data System (ADS)

    An, C.; Parker, G.; Fu, X.

    2015-12-01

    River morphology evolves in response to trade-offs among a series of environmental forcing factors, and this evolution is disturbed when such environmental factors change. One example is the intense river evolution after earthquakes in the mountain areas of southwest China. When simulating river morphological response to environmental disturbance, an exponential rate law with a specified characteristic response time is often regarded as a practical tool for quantification. As conceptual models, empirical rate law formulations can describe broad-brush morphological response, but their physical basis is not solid, in that they do not consider the details of morphodynamic processes. Meanwhile, river evolution can also be simulated with physically based morphodynamic models that conserve sediment via the Exner equation. Here we study the links between the rate law formalism and the Exner equation by solving the Exner equation mathematically and numerically. The results show that, when a very simplified relation for bedload transport is implemented, the Exner equation reduces to the diffusion equation, whose solution is a Gaussian function. This solution coincides with the solution associated with rate laws, thus providing a physical basis for such formulations. However, when the complexities of a natural river are considered, the solution of the Exner equation is no longer a simple Gaussian function. Under such circumstances, the rate law becomes invalid, and a full understanding of the response of rivers to earthquakes requires a complete morphodynamic model.
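
    The reduction described above can be checked numerically. A minimal sketch (a linearized bedload flux q_b = -D dη/dx with the porosity factor absorbed into D, and an explicit FTCS scheme, are assumptions for illustration) evolves the Exner equation and compares it against the analytic Gaussian solution of the diffusion equation:

```python
import numpy as np

def diffusion_gaussian(x, t, D):
    """Analytic solution of eta_t = D * eta_xx for a unit point-like
    initial bed perturbation: a spreading Gaussian."""
    return np.exp(-x**2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

def evolve_exner_linearized(eta0, dx, dt, nsteps, D):
    """Explicit (FTCS) integration of the linearized Exner equation.
    With bedload flux q_b = -D * d(eta)/dx, sediment conservation
    eta_t = -d(q_b)/dx reduces to the diffusion equation."""
    eta = eta0.copy()
    for _ in range(nsteps):
        eta[1:-1] += D * dt / dx**2 * (eta[2:] - 2 * eta[1:-1] + eta[:-2])
    return eta

# evolve a resolved Gaussian bed bump from t0 to t1 and compare
x = np.linspace(-50.0, 50.0, 501)
dx = x[1] - x[0]
D, t0, t1 = 1.0, 5.0, 20.0
dt = 0.2 * dx**2 / D                     # well inside the FTCS stability limit
nsteps = int(round((t1 - t0) / dt))
eta_num = evolve_exner_linearized(diffusion_gaussian(x, t0, D), dx, dt, nsteps, D)
err = float(np.max(np.abs(eta_num - diffusion_gaussian(x, t1, D))))
```

The numerical profile tracks the analytic Gaussian closely; with a realistic nonlinear transport relation the match breaks down, which is the paper's point about the limits of rate laws.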

  8. Real-time operative earthquake forecasting: the case of L'Aquila sequence

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Lombardi, A.

    2009-12-01

    A reliable earthquake forecast is one of the fundamental components required for reducing seismic risk. Despite very recent efforts devoted to testing the validity of available models, the present skill at forecasting the evolution of seismicity is still largely unknown. The recent Mw 6.3 earthquake - which struck near the city of L'Aquila, Italy on April 6, 2009, causing hundreds of deaths and vast damage - offered scientists a unique opportunity to test this forecasting capability for the first time in a real-time application. Here, we describe the results of this first prospective experiment. Immediately following the large event, we began producing daily one-day earthquake forecasts for the region, and we provided these forecasts to Civil Protection - the agency responsible for managing the emergency. The forecasts are based on a stochastic model that combines the Gutenberg-Richter distribution of earthquake magnitudes and power-law decay in space and time of triggered earthquakes. The results from the first month following the L'Aquila earthquake exhibit a good fit between forecasts and observations, indicating that accurate earthquake forecasting is now a realistic goal. Our experience with this experiment demonstrates an urgent need for a connection between probabilistic forecasts and decision-making in order to establish - before crises - quantitative and transparent protocols for decision support.
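
    A stochastic model of this kind combines Omori-style power-law decay in time with a Gutenberg-Richter magnitude distribution. The sketch below (not the authors' operational model; all parameter values are illustrative) shows how the two pieces combine into an expected event count for a one-day forecast window:

```python
def omori_rate(t, K=100.0, c=0.05, p=1.1):
    """Modified Omori law: rate (events/day, M >= m_min) t days after the mainshock."""
    return K / (t + c) ** p

def expected_events(t_start, t_end, m_min, m_target, b=1.0,
                    K=100.0, c=0.05, p=1.1, n=10000):
    """Expected number of M >= m_target events in [t_start, t_end] (days):
    a midpoint-rule time integral of the Omori rate, scaled by the
    Gutenberg-Richter fraction of events reaching m_target."""
    frac = 10.0 ** (-b * (m_target - m_min))   # G-R: N(>=m) ~ 10^(-b*m)
    dt = (t_end - t_start) / n
    total = sum(omori_rate(t_start + (i + 0.5) * dt, K, c, p)
                for i in range(n)) * dt
    return total * frac

# one-day forecasts: the day after the mainshock vs. a month later
day1 = expected_events(1.0, 2.0, m_min=2.0, m_target=4.0)
day30 = expected_events(30.0, 31.0, m_min=2.0, m_target=4.0)
```

The expected count decays as the sequence ages, which is why the daily forecasts described above must be regenerated each day from the updated catalog.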

  9. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, which has an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and with approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure, and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification of the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers, and development of a claim-processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would instead be calculated directly on the basis of indexed ground-motion levels and damage.
    The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  10. Lessons Learned from Eight Years' Experience of Actual Operation, and Future Prospects of JMA Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Nishimae, Y.

    2015-12-01

    Since 2007, the Japan Meteorological Agency (JMA) has gained experience in the actual operation of EEW. During this period, we have learned lessons from many M6- and M7-class earthquakes and from the Mw9.0 Tohoku earthquake. During the Mw9.0 Tohoku earthquake, the JMA system functioned well: it issued a warning message more than 15 s before strong ground shaking in the Tohoku district (relatively close to the epicenter). However, it was not perfect: in addition to the problem of the large extent of the fault rupture, some false warning messages were issued because the system was confused by multiple simultaneous aftershocks occurring across the wide rupture area. To address these problems, JMA will introduce two new methods into the operational system this year to begin testing them, aiming at practical operation within a couple of years. One is the Integrated Particle Filter (IPF) method, an integrated algorithm of multiple hypocenter determination techniques with Bayesian estimation, in which amplitude information is also used for hypocenter determination. The other is the Propagation of Local Undamped Motion (PLUM) method, in which a warning message is issued when strong ground shaking is detected at nearby stations around the target site (e.g., within 30 km); no hypocenter or magnitude is required in PLUM. Aiming at application several years from now, we are investigating a new approach in which the current wavefield is estimated in real time and the future wavefield is then predicted time-evolutionally from the current situation using the physics of wave propagation. Here, hypocenter and magnitude are not necessarily required, but real-time observation of ground shaking is necessary. JMA also plans to predict long-period ground motion (up to 8 s) with the EEW system for earthquake damage mitigation in high-rise buildings. Its testing will start on the operational system in the near future.
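
    The PLUM decision rule lends itself to a compact sketch (the station layout, intensity values, and thresholds here are illustrative, not JMA's operational settings): warn a target site when any station within a given radius already observes strong shaking, with no hypocenter or magnitude estimate involved:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def plum_warn(target, stations, radius_km=30.0, intensity_threshold=4.0):
    """PLUM-style decision: warn the target site if any station within
    radius_km already observes intensity >= intensity_threshold."""
    lat, lon = target
    return any(
        haversine_km(lat, lon, s["lat"], s["lon"]) <= radius_km
        and s["intensity"] >= intensity_threshold
        for s in stations
    )

# two synthetic stations: one shaking strongly, one quiet
stations = [
    {"lat": 35.0, "lon": 139.0, "intensity": 5.0},
    {"lat": 36.0, "lon": 140.0, "intensity": 2.0},
]
```

Because the rule uses only observed shaking, it is robust to the simultaneous-aftershock confusion described above, at the cost of shorter warning times.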

  11. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is necessary to alert also the California Department of Water Resources, California Division of Mines and Geology, U.S. Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  12. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of a logic program is defined as a function (Tp: I → I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single-step operators of logic programming in radial basis function neural networks. To do so, we proposed a new technique to generate the training data sets of single-step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  13. Computing single step operators of logic programming in radial basis function neural networks

    SciTech Connect

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of a logic program is defined as a function (Tp: I → I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single-step operators of logic programming in radial basis function neural networks. To do so, we proposed a new technique to generate the training data sets of single-step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
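
    The RBF network training itself is not reproduced here, but the single-step (immediate consequence) operator Tp that the networks are trained to emulate can be sketched directly for a propositional program (the clause representation below is an illustrative choice):

```python
def tp(program, interpretation):
    """Single-step (immediate consequence) operator T_P: an atom is true in
    T_P(I) iff some clause  head :- body  has every body atom true in I."""
    return {head for head, body in program
            if all(a in interpretation for a in body)}

def fixed_point(program):
    """Iterate T_P from the empty interpretation until it stops changing
    (the least fixed point for a definite program)."""
    current = set()
    while True:
        nxt = tp(program, current)
        if nxt == current:
            return current
        current = nxt

# tiny definite program:  p.   q :- p.   r :- p, q.
program = [("p", []), ("q", ["p"]), ("r", ["p", "q"])]
```

The recurrent RBF network described above plays the role of the `fixed_point` loop: it is run to its steady state, which corresponds to the fixed point of Tp.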

  14. The investigation of the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of the injuries due to occupational accidents.

    PubMed

    Hekimoglu, Yavuz; Dursun, Recep; Karadas, Sevdegul; Asirdizer, Mahmut

    2015-10-01

    The purpose of this study is to identify the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of injuries due to occupational accidents. In this study, we evaluated 245 patients injured in occupational accidents who were admitted to emergency services of Van city hospitals during the 1-year periods before and after the earthquake. We determined that there was a 63.4% (P < 0.05) increase in work-related accidents in the post-earthquake period compared to the pre-earthquake period. Also, in the post-earthquake period compared to the pre-earthquake period, injuries due to occupational accidents increased 211% (P < 0.05) in the construction industry, the rate of injuries due to falls from height increased 168% (P < 0.05), and the rates of trauma to the head and upper limbs increased 200% (P < 0.05) and 130% (P < 0.05), respectively. We determined that the disregard of occupational health and safety measures by employers and employees during the rapid construction and post-earthquake restoration work undertaken to remove the effects of the earthquake increased the number of work accidents. In this study, the impact of disasters such as earthquakes on accidents at work was evaluated in a way we have not seen in the literature. This study emphasizes that governments should establish regulations and processes for post-disaster work before a disaster emerges, taking into account the factors that may increase work-related accidents.

  15. The Mixed Waste Management Facility. Design basis integrated operations plan (Title I design)

    SciTech Connect

    1994-12-01

    The Mixed Waste Management Facility (MWMF) will be a fully integrated, pilot-scale facility for the demonstration of low-level, organic-matrix mixed waste treatment technologies. It will provide the bridge from bench-scale demonstrated technologies to the deployment and operation of full-scale treatment facilities. The MWMF is a key element in reducing the risk in deployment of effective and environmentally acceptable treatment processes for organic mixed-waste streams. The MWMF will provide the engineering test data, formal evaluation, and operating experience that will be required for these demonstration systems to become accepted by EPA and deployable in waste treatment facilities. The deployment will also demonstrate how to approach the permitting process with the regulatory agencies and how to operate and maintain the processes in a safe manner. This document describes, at a high level, how the facility will be designed and operated to achieve this mission. It frequently refers the reader to additional documentation that provides more detail in specific areas. Effective evaluation of a technology consists of a variety of informal and formal demonstrations involving individual technology systems or subsystems, integrated technology system combinations, or complete integrated treatment trains. Informal demonstrations will typically be used to gather general operating information and to establish a basis for development of formal demonstration plans. Formal demonstrations consist of a specific series of tests that are used to rigorously demonstrate the operation or performance of a specific system configuration.

  16. Power systems after the Northridge earthquake: Emergency operations and changes in seismic equipment specifications, practice, and system configuration

    SciTech Connect

    Schiff, A.J.; Tognazzini, R.; Ostrom, D.

    1995-12-31

    The Northridge earthquake caused extensive damage to high-voltage substation equipment and, for the first time, the failure of transmission towers. Power was lost to much of the earthquake-impacted area, and service to 93% of the customers was restored within 24 hours. To restore service, damaged monitoring, communication, and protective equipment, such as current-voltage transformers, wave traps, and lightning arresters, was removed or bypassed and operation restored. To improve performance, some porcelain members are being replaced with composite materials for bushings, current-voltage transformers, and lightning arresters. Interim seismic specifications for equipment have been instituted. Some substations are being re-configured, and rigid bus and conductors are being replaced with flexible conductors. Non-load-carrying conductors, such as those used on lightning arresters, are being reduced in size to reduce potential interaction problems. Better methods of documenting damage and repair costs are being considered.

  17. Representation of discrete Steklov-Poincare operator arising in domain decomposition methods in wavelet basis

    SciTech Connect

    Jemcov, A.; Matovic, M.D.

    1996-12-31

    This paper examines the sparse representation and preconditioning of a discrete Steklov-Poincare operator which arises in domain decomposition methods. A non-overlapping domain decomposition method is applied to a second-order self-adjoint elliptic operator (Poisson equation), with homogeneous boundary conditions, as a model problem. It is shown that the discrete Steklov-Poincare operator allows sparse representation with a bounded condition number in a wavelet basis if the transformation is followed by thresholding and rescaling. These two steps combined enable the effective use of Krylov subspace methods as an iterative solution procedure for the system of linear equations. Finding the solution of an interface problem in domain decomposition methods, known as a Schur complement problem, has been shown to be equivalent to the discrete form of the Steklov-Poincare operator. A common way to obtain the Schur complement matrix is to order the matrix of the discrete differential operator by subdomain node groups and then block-eliminate the interior nodes. The result is a dense matrix which corresponds to the interface problem. This is equivalent to reducing the original problem to several smaller differential problems and one boundary integral equation problem on the subdomain interface.
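    The block-elimination construction of the Schur complement described above can be illustrated with a minimal 1D Poisson example. This is an assumed toy problem (two subdomains, one interface node), not the paper's wavelet-based setup:

```python
import numpy as np

# Illustrative sketch: the Schur complement (discrete Steklov-Poincare operator)
# for a 1D Poisson problem split into two subdomains joined at one interface node.
n = 5                                    # interior unknowns per subdomain
N = 2 * n + 1                            # total unknowns; the middle node is the interface
A = (np.diag(2.0 * np.ones(N))
     - np.diag(np.ones(N - 1), 1)
     - np.diag(np.ones(N - 1), -1))      # standard 1D Laplacian stencil [-1, 2, -1]

interior = list(range(n)) + list(range(n + 1, N))  # subdomain interior nodes
gamma = [n]                                        # interface node

A_II = A[np.ix_(interior, interior)]
A_IG = A[np.ix_(interior, gamma)]
A_GI = A[np.ix_(gamma, interior)]
A_GG = A[np.ix_(gamma, gamma)]

# Block elimination of the interior unknowns leaves the (dense) interface problem:
S = A_GG - A_GI @ np.linalg.solve(A_II, A_IG)
print(S.shape)   # (1, 1): a single interface unknown
```

    For this chain of unit "springs," eliminating n interior nodes on each side leaves an effective interface stiffness of 2/(n+1), which is the kind of dense, well-conditioned operator the paper then compresses in a wavelet basis.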

  18. Operant self-administration models for testing the neuropharmacological basis of ethanol consumption in rats.

    PubMed

    June, Harry L; Gilpin, Nicholas W

    2010-04-01

    Operant self-administration procedures are used to assess the neural basis of ethanol-seeking behavior under a wide range of experimental conditions. In general, rats do not spontaneously self-administer ethanol in pharmacologically meaningful amounts. This unit provides a step-by-step guide for training rats to self-administer quantities of ethanol that produce moderate to high blood-alcohol content. Different protocols are used for rats that are genetically heterogeneous versus rats that are selectively bred for high alcohol preference. Also, these protocols have different sets of advantages and disadvantages in terms of the ability to control for caloric intake and taste of solutions in operant testing. Basic self-administration protocols can also be altered to focus on different aspects of the motivational properties of ethanol (for example, those related to dependence). This unit provides multiple protocols that lead to alcohol intake in rats, which can be pharmacologically probed relative to a variety of control conditions.

  19. Real-time earthquake alert system for the greater San Francisco Bay Area: a prototype design to address operational issues

    SciTech Connect

    Harben, P.E.; Jarpe, S.; Hunter, S.

    1996-12-10

    The purpose of the earthquake alert system (EAS) is to outrun the seismic energy released in a large earthquake using a geographically distributed network of strong motion sensors that telemeter data to a rapid CPU-processing station, which then issues an area-wide warning to a region before strong motion occurs. The warning times involved are short, from 0 to 30 seconds or so; consequently, most responses must be automated. The San Francisco Bay Area is particularly well suited for an EAS because (1) large earthquakes have relatively shallow hypocenters (10- to 20-kilometer depth), giving favorable ray-path geometries and larger warning times than deeper earthquakes would, and (2) the active faults are few in number and well characterized, which means far fewer geographically distributed strong motion sensors are needed (about 50 in this region). An EAS prototype is being implemented in the San Francisco Bay Area. The system consists of four distinct subsystems: (1) a distributed strong motion seismic network, (2) a central processing station, (3) a warning communications system, and (4) user receiver and response systems. We have designed a simple, reliable, and inexpensive strong motion monitoring station that consists of a three-component Analog Devices ADXL05 accelerometer sensing unit, a vertical-component weak motion sensor for system testing, a 16-bit digitizer with multiplexing, and communication output ports for RS232 modem or radio telemetry. The unit is battery-powered and will be sited in fire stations. The prototype central computer analysis system consists of a PC data-acquisition platform that pipes the incoming strong motion data via Ethernet to Unix-based workstations for data processing. Simple real-time algorithms, particularly for magnitude estimation, are implemented to give estimates of the time since the earthquake's onset, its hypocenter location, its magnitude, and the reliability of the estimate. These parameters are calculated and transmitted
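    For a rough sense of the warning-time arithmetic behind the "0 to 30 seconds" figure above, here is a back-of-the-envelope sketch. The wave speeds, hypocenter depth, processing delay, and single-sensor straight-ray geometry are all assumed illustrative values, not the prototype's actual parameters:

```python
import math

# Hypothetical warning-time estimate: alert is issued after a P wave reaches a
# sensor near the epicenter plus a processing delay; damage arrives with the S wave.
vp, vs = 6.0, 3.5        # assumed crustal P- and S-wave speeds, km/s
depth = 15.0             # assumed hypocenter depth, km (shallow Bay Area event)
processing = 5.0         # assumed detection + processing + telemetry delay, s

def warning_time(epicentral_km):
    """Seconds between the alert and S-wave arrival at a site (0 if too close)."""
    hyp = math.hypot(epicentral_km, depth)   # straight-ray path length to the site
    t_detect = depth / vp + processing       # P wave to a sensor above the source
    t_s = hyp / vs                           # S wave to the site
    return max(0.0, t_s - t_detect)

for d in (20, 50, 100):
    print(f"{d:3d} km: {warning_time(d):5.1f} s warning")
```

    Under these assumptions, sites within about 20 km get no warning while sites 100 km away get roughly 20 seconds, consistent with the short warning windows the abstract describes.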

  20. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters is arrival-time stacking algorithms. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problems of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.

  1. Real Time Earthquake Information System in Japan

    NASA Astrophysics Data System (ADS)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of tsunami forecasts to the public, locating an earthquake and estimating its magnitude as quickly as possible. Years later, a system for prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that concerned governmental organizations could decide whether they needed to launch an emergency response. At present, JMA issues the following kinds of information in succession when a large earthquake occurs: 1) a prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, about two minutes after the earthquake; 2) a tsunami forecast in around three minutes; 3) information on expected arrival times and maximum heights of tsunami waves in around five minutes; and 4) information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example, automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  2. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    SciTech Connect

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and their respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequences (i.e., the estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.

  3. Sensitivity Analysis of Ordered Weighted Averaging Operator in Earthquake Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Moradi, M.; Delavar, M. R.; Moshiri, B.

    2013-09-01

    The main objective of this research is to find the extent to which the minimal variability Ordered Weighted Averaging (OWA) model of seismic vulnerability assessment is sensitive to variation of the optimism degree. A variety of models have been proposed for seismic vulnerability assessment. In order to examine the efficiency of seismic vulnerability assessment models, the stability of their results can be analysed. Seismic vulnerability assessment is done to estimate the probable losses in a future earthquake. Multi-Criteria Decision Making (MCDM) methods have been applied by a number of researchers to estimate the human, physical, and financial losses in urban areas. The study area of this research is the Tehran Metropolitan Area (TMA), which has more than eight million inhabitants. This paper assumes that the North Tehran Fault (NTF) is activated and causes an earthquake in the TMA. 1996 census data are used to extract the attribute values for six effective criteria in seismic vulnerability assessment. The results demonstrate that the minimal variability OWA model of Seismic Loss Estimation (SLE) is more stable where the aggregated seismic vulnerability degree has a lower value. Moreover, minimal variability OWA is very sensitive to the optimism degree in northern areas of Tehran. A number of statistical units in southern areas of the city also show considerable sensitivity to the optimism degree due to numerous non-standard buildings. In addition, the change of seismic vulnerability degree caused by variation of the optimism degree does not exceed 25% of the original value, which means that the overall accuracy of the model is acceptable.
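    A generic OWA aggregation, with the "orness" measure commonly used as the optimism degree, can be sketched as follows. The criterion scores and weight vectors are hypothetical illustrations, not the paper's calibrated minimal-variability weights:

```python
import numpy as np

# Generic OWA aggregation sketch: the weights apply to the *sorted* values,
# so the weight vector encodes a degree of optimism rather than criterion importance.
def owa(values, weights):
    """Ordered Weighted Averaging over values sorted in descending order."""
    return float(np.dot(weights, np.sort(values)[::-1]))

def orness(weights):
    """Optimism degree in [0, 1]: 1 recovers max, 0 recovers min, 0.5 the mean."""
    n = len(weights)
    return float(sum((n - 1 - i) * w for i, w in enumerate(weights)) / (n - 1))

vuln = [0.9, 0.4, 0.7]            # hypothetical criterion scores for one zone
w_optimistic = [1.0, 0.0, 0.0]    # all weight on the largest score
w_neutral = [1/3, 1/3, 1/3]       # plain arithmetic mean

print(owa(vuln, w_optimistic), orness(w_optimistic))                  # 0.9 1.0
print(round(owa(vuln, w_neutral), 4), round(orness(w_neutral), 4))    # 0.6667 0.5
```

    Varying the weights between these extremes (at fixed orness, the minimal-variability model picks the least-dispersed weights) is what the sensitivity analysis above probes.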

  4. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1995-01-01

    Incineration as a method of treating radioactive or mixed waste is attractive because of volume reduction, but may result in high concentrations of some hazardous components. For safety reasons during operation, and because of the environmental impact of the plant, it is important to know how these materials partition between the furnace slag, the fly ash, and the stack emission. The chemistry of about 50 elements is discussed and, through consideration of high-temperature thermodynamic equilibria, an attempt is made to provide a basis for predicting how various radionuclides and heavy metals behave in a typical incinerator. The chemistry of the individual elements is first considered, and a prediction of the most stable chemical species in the typical incinerator atmosphere is made. The treatment emphasizes volatility, and the parameters considered are temperature; acidity; oxygen, sulfur, and halogen content; and the presence of several other key non-radioactive elements. A computer model is used to calculate equilibrium concentrations of many species in several systems at temperatures ranging from 500 to 1600 K. It is suggested that deliberate addition of various feed chemicals can have a major impact on the fate of many radionuclides and heavy metals. Several problems concerning limitations and application of the data are considered.

  5. Rethinking first-principles electron transport theories with projection operators: the problems caused by partitioning the basis set.

    PubMed

    Reuter, Matthew G; Harrison, Robert J

    2013-09-21

    We revisit the derivation of electron transport theories with a focus on the projection operators chosen to partition the system. The prevailing choice of assigning each computational basis function to a region causes two problems. First, this choice generally results in oblique projection operators, which are non-Hermitian and violate implicit assumptions in the derivation. Second, these operators are defined with the physically insignificant basis set and, as such, preclude a well-defined basis set limit. We thus advocate for the selection of physically motivated, orthogonal projection operators (which are Hermitian) and present an operator-based derivation of electron transport theories. Unlike the conventional, matrix-based approaches, this derivation requires no knowledge of the computational basis set. In this process, we also find that common transport formalisms for nonorthogonal basis sets improperly decouple the exterior regions, leading to a short circuit through the system. We finally discuss the implications of these results for first-principles calculations of electron transport.
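    The distinction the authors draw can be seen in a tiny numerical example (an assumed 2-D illustration, not their transport formalism): partitioning along a nonorthogonal basis yields an oblique projector that is idempotent but not Hermitian, while the orthogonal projector onto the same subspace is both:

```python
import numpy as np

# Assumed toy example: project onto span(a) "along" a nonorthogonal partner b.
a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0]) / np.sqrt(2)   # nonorthogonal second basis vector

B = np.column_stack([a, b])
# Oblique projector onto span(a) along span(b): keep only the 'a' coordinate
# in the (nonorthogonal) basis {a, b}.
P_oblique = B @ np.diag([1.0, 0.0]) @ np.linalg.inv(B)
# Orthogonal (Hermitian) projector onto span(a).
P_orth = np.outer(a, a) / (a @ a)

print(np.allclose(P_oblique @ P_oblique, P_oblique))   # True: idempotent
print(np.allclose(P_oblique, P_oblique.T))             # False: not Hermitian
print(np.allclose(P_orth, P_orth.T))                   # True
```

    Assigning basis functions to regions implicitly builds projectors of the first kind, which is the source of the non-Hermiticity the abstract criticizes.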

  6. Transient Fluid Flow Along Basement Faults and Rupture Mechanics: Can We Expect Injection-Induced Earthquake Behavior to Correspond Directly With Injection Operations?

    NASA Astrophysics Data System (ADS)

    Norbeck, J. H.; Horne, R. N.

    2015-12-01

    We explored injection-induced earthquake behavior in geologic settings where basement faults are connected hydraulically to overlying saline aquifers targeted for wastewater disposal. Understanding how the interaction between natural geology and injection well operations affects the behavior of injection-induced earthquake sequences has important implications for characterizing seismic hazard risk. Numerical experiments were performed to investigate the extent to which seismicity is influenced by the migration of pressure perturbations along fault zones. Two distinct behaviors were observed: a) earthquake ruptures that were confined to the pressurized region of the fault and b) sustained earthquake ruptures that propagated far beyond the pressure front. These two faulting mechanisms have important implications for assessing the manner in which seismicity can be expected to respond to injection well operations. Based upon observations from the numerical experiments, we developed a criterion that can be used to classify the expected faulting behavior near wastewater disposal sites. The faulting criterion depends on the state of stress, the initial fluid pressure, the orientation of the fault, and the dynamic friction coefficient of the fault. If the initial ratio of shear to effective normal stress resolved on the fault (the prestress ratio) is less than the fault's dynamic friction coefficient, then earthquake ruptures will tend to be limited by the extent of the pressure front. In this case, parameters that affect seismic hazard assessment, like the maximum earthquake magnitude or earthquake recurrence interval, could correlate with injection well operational parameters. For example, the maximum earthquake magnitude might be expected to grow over time in a systematic manner as larger patches of the fault are exposed to significant pressure changes. In contrast, if the prestress ratio is greater than dynamic friction, a stress drop can occur outside of the pressurized
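    The classification criterion described in this abstract reduces to a one-line comparison; the stress values and the 0.7 dynamic friction coefficient below are hypothetical illustration values, not numbers from the study:

```python
# Sketch of the prestress-ratio criterion: compare the initial ratio of shear
# to effective normal stress on the fault against the dynamic friction coefficient.
def rupture_mode(shear_stress, normal_stress, pore_pressure, mu_dynamic):
    """Classify expected induced-earthquake behavior from the prestress ratio."""
    prestress = shear_stress / (normal_stress - pore_pressure)  # tau / sigma_n_eff
    if prestress < mu_dynamic:
        return "pressure-limited"   # ruptures confined to the pressurized region
    return "runaway"                # ruptures can outrun the pressure front

# Illustrative stresses in MPa (hypothetical, for demonstration only):
print(rupture_mode(shear_stress=30.0, normal_stress=70.0,
                   pore_pressure=20.0, mu_dynamic=0.7))   # 30/50 = 0.6 < 0.7
```

    Only in the "pressure-limited" regime would maximum magnitude and recurrence be expected to track injection operations directly, which is the question posed in the title.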

  7. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  8. Report on disaster medical operations with acupuncture/massage therapy after the great East Japan earthquake.

    PubMed

    Takayama, Shin; Kamiya, Tetsuharu; Watanabe, Masashi; Hirano, Atsushi; Matsuda, Ayane; Monma, Yasutake; Numata, Takehiro; Kusuyama, Hiroko; Yaegashi, Nobuo

    2012-01-01

    The Great East Japan Earthquake and the consequent tsunami inflicted immense damage over a wide area of eastern Japan. The Department of Traditional Asian Medicine, Tohoku University, started providing medical assistance to the disaster-stricken regions mainly employing traditional Asian therapies. We visited seven evacuation centers in Miyagi and Fukushima Prefectures and provided acupuncture/massage therapy. While massage therapy was performed manually, filiform needles and press tack needles were used to administer acupuncture. In total, 553 people were treated (mean age, 54.0 years; 206 men, 347 women). Assessment by interview showed that the most common complaint was shoulder/back stiffness. The rate of therapy satisfaction was 92.3%. Many people answered that they experienced not only physical but also psychological relief. At the time of a disaster, acupuncture/massage therapy, which has both mental and physical soothing effects, may be a therapeutic approach that can be effectively used in combination with Western medical practices. PMID:22563235

  9. LLNL earthquake impact analysis committee report on the Livermore, California, earthquakes of January 24 and 26, 1980

    SciTech Connect

    Not Available

    1980-07-15

    The overall effects of the earthquakes of January 24 and 26, 1980, at the Lawrence Livermore National Laboratory in northern California are outlined. The damage caused by those earthquakes and how employees responded are discussed. The immediate emergency actions taken by management and the subsequent measures to resume operations are summarized. Long-range plans for recovery and repair, the seismic history of the Livermore Valley region, various investigations concerning the design-basis earthquake (DBE), and seismic criteria for structures are reviewed. Following an analysis of the Laboratory's earthquake preparedness, emergency response, and related matters, a series of conclusions and recommendations is presented. Appendixes provide additional information, such as persons interviewed, seismic and site maps, and a summary of the estimated costs incurred from the earthquakes.

  10. Duration and predictors of emergency surgical operations - basis for medical management of mass casualty incidents

    PubMed Central

    2009-01-01

    Background Hospitals have a critically important role in the management of mass casualty incidents (MCI), yet there is little information to assist emergency planners. A significantly limiting factor of a hospital's capability to treat those affected is its surgical capacity. We therefore intended to provide data about the duration and predictors of life-saving operations. Methods The data of 20,815 predominantly blunt trauma patients recorded in the Trauma Registry of the German Trauma Society was retrospectively analyzed to calculate the duration of life-saving operations as well as their predictors. Inclusion criteria were an ISS ≥ 16 and the performance of relevant ICPM-coded procedures within 6 h of admission. Results From 1,228 patients fulfilling the inclusion criteria, 1,793 operations could be identified as life-saving operations. Acute injuries to the abdomen accounted for 54.1%, followed by head injuries (26.3%), pelvic injuries (11.5%), thoracic injuries (5.0%), and major amputations (3.1%). The mean cut-to-suture time was 130 min (IQR 65-165 min). Logistic regression revealed 8 variables associated with an emergency operation: AIS of abdomen ≥ 3 (OR 4.00), ISS ≥ 35 (OR 2.94), hemoglobin level ≤ 8 mg/dL (OR 1.40), pulse rate on hospital admission < 40 or > 120/min (OR 1.39), blood pressure on hospital admission < 90 mmHg (OR 1.35), prehospital infusion volume ≥ 2000 ml (OR 1.34), GCS ≤ 8 (OR 1.32), and anisocoria (OR 1.28) on-scene. Conclusions The mean operation time of 130 min calculated for emergency life-saving surgical operations provides a realistic guideline for the prospective treatment capacity, which can be estimated and projected into an actual incident admission capacity. Knowledge of predictive factors for life-saving emergency operations helps to identify those patients who need the most urgent operative treatment in case of a blunt MCI. PMID:20149987
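    As a reminder of how the reported odds ratios relate to logistic-regression coefficients, here is an illustrative sketch; the abbreviated predictor names and the multiplicative-independence assumption are ours, not the registry analysis:

```python
import math

# Odds ratios from the abstract; each logistic-regression coefficient is ln(OR).
odds_ratios = {
    "AIS abdomen >= 3": 4.00,
    "ISS >= 35": 2.94,
    "Hb <= 8 mg/dL": 1.40,
}
betas = {name: math.log(orr) for name, orr in odds_ratios.items()}

def emergency_op_odds_multiplier(present_predictors):
    """Multiplicative change in the odds of an emergency operation when the
    listed predictors are present (assumes no interaction terms)."""
    return math.exp(sum(betas[p] for p in present_predictors))

m = emergency_op_odds_multiplier(["AIS abdomen >= 3", "ISS >= 35"])
print(round(m, 2))   # 4.00 * 2.94 = 11.76
```

    This illustrates why severe abdominal injury combined with a very high ISS dominates triage for the most urgent operative treatment.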

  11. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-04-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for concepts of operations (ConOps) in advanced small modular reactor (aSMR) designs. In support of this objective, three important research areas were included: operating principles of multi-modular plants; functional allocation models and strategies that would affect the development of new, non-traditional concepts of operations; and the requirements for human performance, based upon work domain analysis and current regulatory requirements. As part of the approach for this report, we outline potential functions, including the theoretical and operational foundations for the development of a new functional allocation model and the identification of specific regulatory requirements that will influence the development of future concepts of operations. The report also highlights changes in research strategy prompted by confirmation of the importance of applying the work domain analysis methodology to a reference aSMR design. It describes how this methodology will enrich the findings from this phase of the project in the subsequent phases and help in the identification of metrics and focused studies for the determination of human performance criteria that can be used to support the design process.

  12. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-08-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for Concepts of Operations in advanced small modular reactor (AdvSMR) designs. AdvSMRs are nuclear power plants (NPPs), but unlike conventional large NPPs that are constructed on site, AdvSMR systems and components will be fabricated in a factory and then assembled on site. AdvSMRs will also use advanced digital instrumentation and control systems, and make greater use of automation. Some AdvSMR designs also propose to be operated in a multi-unit configuration with a single central control room as a way to be more cost-competitive with existing NPPs. These differences from conventional NPPs not only pose technical and operational challenges, but they will undoubtedly also have regulatory compliance implications, especially with respect to staffing requirements and safety standards.

  13. A probabilistic risk assessment of the LLNL Plutonium Facility's evaluation basis fire operational accident. Revision 1

    SciTech Connect

    Brumburgh, G. P.

    1995-02-27

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous programmatic activities involving plutonium, including device fabrication, development of improved and/or unique fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed in July 1994 to address operational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  14. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  15. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  16. Construction and operation of a system for secure and precise medical material distribution in disaster areas after Wenchuan earthquake.

    PubMed

    Cheng, Yongzhong; Xu, Jiankang; Ma, Jian; Cheng, Shusen; Shi, Yingkang

    2009-11-01

    After the Wenchuan Earthquake on May 12, 2008, under the strong leadership of the Sichuan Provincial Party Committee, the People's Government of Sichuan Province, and the Ministry of Health of the People's Republic of China, the Medical Security Team working at the Sichuan Provincial Headquarters for Wenchuan Earthquake and Disaster Relief Work constructed a secure medical material distribution system through coordination and interaction among and between regions, systems, and departments.

  17. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1994-09-01

    For waste containing small amounts of radioactivity, rad waste (RW), or mixed waste (MW) containing both radioactive and chemically hazardous components, incineration is a logical management candidate because of inherent safety, waste volume reduction, and low costs. Successful operation requires that the facility is properly designed and operated to protect workers and to limit releases of hazardous materials. The large decrease in waste volume achieved by incineration also results in a higher concentration of most of the radionuclides and nonradioactive heavy metals in the ash products. These concentrations impact subsequent treatment and disposal. The various constituents (chemical elements) are not equal in concentration in the various incinerator feed materials, nor are they equal in their contribution to health risks on subsequent handling or accidental release. Thus, for management of the wastes it is important to be able to predict how the nuclides partition between the primary combustion residue, which may be an ash or a fused slag; the fine particulates or fly ash that is trapped in the burner off-gas by several different techniques; and the airborne fraction that escapes to the atmosphere. The objective of this report is to provide an estimate of how different elements of concern may behave in the chemical environment of the incinerator. The study briefly examines published incinerator operation data, then considers the properties of the elements of concern and employs thermodynamic calculations to help predict the fate of these RW and MW constituents. Many types and configurations of incinerators have been designed and tested.

  18. Diagnostics of PF-1000 Facility Operation and Plasma Concentration on the Basis of Spectral Measurements

    SciTech Connect

    Skladnik-Sadowska, E.; Malinowski, K.; Sadowski, M. J.; Scholz, M.; Tsarenko, A. V.

    2006-01-15

    The paper concerns the monitoring of high-current pulse discharges and the determination of the plasma concentration within dense magnetized plasma by means of optical spectroscopy methods. In experiments with the large PF-1000 facility operated at IPPLM in Warsaw, Poland, attention was paid to the determination of the operational mode and electron concentration under different experimental conditions. To measure the visible radiation (VR), use was made of a MECHELLE® 900 spectrometer equipped with a CCD readout. The VR emission, observed at 65° to the z-axis, originated from part of the electrode surfaces, the collapsing current-sheath layer, and the dense plasma pinch region (40-50 mm from the electrode ends). Considerable differences were found in the optical spectra recorded for so-called 'good shots' and for cases of some failures. Estimates of the electron concentration, performed with different spectroscopic techniques, showed that it ranged from 5.56×10¹⁸ cm⁻³ to 4.8×10¹⁹ cm⁻³, depending on experimental conditions. The correlation of the fusion-neutron yield and the plasma density was demonstrated.

  19. 14 CFR 331.35 - What is the basis upon which operators or providers will be reimbursed through the set-aside...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATIONS PROCEDURES FOR REIMBURSEMENT OF GENERAL AVIATION OPERATORS AND SERVICE PROVIDERS IN THE WASHINGTON, DC AREA Set-Aside for Operators or Providers at Certain Airports § 331.35 What is the basis...

  20. Representation of the Translation Operator in a Soft-Slater Basis: Application to the Translation of Coulomb Sturmians

    NASA Astrophysics Data System (ADS)

    Weatherford, Charles; Red Wynn, Eddie, III

    2003-05-01

    The unitary parallel translation operator is represented in a basis of Soft-Slaters as opposed to Coulomb Sturmians [Weatherford Bull. of Am. Phys. Soc. 47, 73 (2002)]. The algorithm involves breaking the translation operator up into a concatenation of small parallel translations, each of which may be represented with a relatively small basis. This algorithm is then used to translate Coulomb Sturmians. Orthonormal Soft-Slaters are constructed using the method of Designer Polynomials [Weatherford, Red, and Wynn IJQC 90, 1289 (2002)]. Soft-Slaters are used for translation because they are analytic throughout their domain and the introduction of an artificial cusp at the translated center is avoided thereby allowing for a pointwise convergent, single-range, addition theorem. This is in contrast to using Coulomb Sturmians to represent the translation operator - these functions are not analytic at their origin and any single-range addition theorem (via the Shibuya-Wulfman matrix) is inevitably weakly convergent and places an artificial cusp at the translated center which can only be removed in the limit of an infinite expansion. Numerical examples will be given. Supported by NSF CREST grant HRD-9707076, NASA grant NAG5-10148 and by the Army High Performance Computing Research Center and the US Army, Army Research Laboratory (DAAH04-95-2-0003/ contract number DAAH04-95-C-0008).

  1. Waste Encapsulation and Storage Facility (WESF) Basis for Interim Operation (BIO)

    SciTech Connect

    COVEY, L.I.

    2000-11-28

    The Waste Encapsulation and Storage Facility (WESF) is located in the 200 East Area adjacent to B Plant on the Hanford Site north of Richland, Washington. The current WESF mission is to receive and store the cesium and strontium capsules that were manufactured at WESF, safely and in compliance with all applicable rules and regulations. The scope of WESF operations is currently limited to receipt, inspection, decontamination, storage, and surveillance of capsules, in addition to facility maintenance activities. The capsules are expected to be stored at WESF until the year 2017, by which time they will have been transferred for ultimate disposition. The WESF facility was designed and constructed to process, encapsulate, and store the extracted long-lived radionuclides ⁹⁰Sr and ¹³⁷Cs from wastes generated during the chemical processing of defense fuel on the Hanford Site, thus ensuring isolation of hazardous radioisotopes from the environment. Construction of WESF started in 1971 and was completed in 1973. Some of the ¹³⁷Cs capsules were leased by private irradiators or transferred to other programs. All leased capsules have been returned to WESF. Capsules transferred to other programs will not be returned, except for the seven powder-and-pellet Type W overpacks already stored at WESF.

  2. Modeling of the Reactor Core Isolation Cooling Response to Beyond Design Basis Operations - Interim Report

    SciTech Connect

    Ross, Kyle; Cardoni, Jeffrey N.; Wilson, Chisom Shawn; Morrow, Charles; Osborn, Douglas; Gauntt, Randall O.

    2015-12-01

    Efforts are being pursued to develop and qualify a system-level model of a reactor core isolation cooling (RCIC) steam-turbine-driven pump. The model is being developed with the intent of employing it to inform the design of experimental configurations for full-scale RCIC testing, and is expected to be especially valuable in sizing the equipment needed in the testing. An additional intent is to use the model to understand more fully how RCIC apparently managed to operate far outside its design envelope in the Fukushima Daiichi Unit 2 accident. RCIC modeling is proceeding along two avenues that are expected to complement each other well. The first avenue is the continued development of the system-level RCIC model that will serve in simulating a full reactor system, or full experimental configuration, of which a RCIC system is part. The model reasonably represents a RCIC system today, especially at design operating conditions, but lacks specifics that are likely important in representing the off-design conditions a RCIC system might experience in an emergency such as a loss of all electrical power. A known gap in the system model, for example, is the efficiency at which a flashing slug of water (as opposed to a concentrated jet of steam) could propel the rotating drive wheel of a RCIC turbine. To address this, the second avenue is being pursued, wherein computational fluid dynamics (CFD) analyses of such a jet are being carried out. The results of the CFD analyses will complement and inform the system modeling. The system modeling will, in turn, complement the CFD analysis by providing the system information needed to impose appropriate boundary conditions on the CFD simulations. The system model will be used to inform the selection of configurations and equipment best suited to supporting planned RCIC experimental testing. Preliminary investigations with the RCIC model indicate that liquid water ingestion by the turbine

  3. [Anesthesia and analgesia in addicts: basis for establishing a standard operating procedure].

    PubMed

    Jage, J; Heid, F

    2006-06-01

    Addicts have elevated organic and psychological comorbidity, and in cases of major operations or polytrauma they are classified as high-risk patients. Additional perioperative problems are a higher analgesic requirement, craving, physical and/or psychological withdrawal symptoms, hyperalgesia, and tolerance; however, the clinical expression depends on the substance abused. For a better understanding of the necessary perioperative measures, it is helpful to classify the substances into central nervous system depressants (e.g. heroin, alcohol, sedatives, hypnotics), stimulants (e.g. cocaine, amphetamines, designer drugs), and other psychotropic substances (e.g. cannabis, hallucinogens, inhalants). The perioperative period is not the time to treat the addiction itself; on the contrary, the characteristics of this chronic disease must be accepted. Anesthesia and analgesia must be generously stress-protective and sufficiently analgesic. Equally important perioperative treatment principles are stabilization of physical dependence by substitution with methadone (for heroin addicts) or benzodiazepines/clonidine (for alcohol, sedative, and hypnotic addiction), avoidance of stress and craving, thorough intraoperative and postoperative stress relief using regional techniques or systemically higher-than-normal dosages of anesthetics and opioids, strict avoidance of underdosing analgesics, postoperative optimization of regional or systemic analgesia with non-opioids and co-analgesics, and consideration of the complex physical and psychological characteristics and comorbidities. Even in cases of abstinence ("clean") an inadequate dosage must be avoided, as it is underdosing, and not adequate pain therapy (sometimes even with strong opioids), that can potentially reactivate addiction. A protracted abstinence syndrome after withdrawal of opioids can lead to an increased response to administered opioids (e.g. analgesia, side effects). PMID:16775729

  4. Non-perturbative renormalization of the complete basis of four-fermion operators and B-parameters

    NASA Astrophysics Data System (ADS)

    Conti, L.; Donini, A.; Gimenez, V.; Martinelli, G.; Talevi, M.; Vladikas, A.

    1998-04-01

    We present results on the B-parameters B_K, B_7^(3/2), and B_8^(3/2), at β = 6.0, with the tree-level Clover action. The renormalization of the complete basis of dimension-six four-fermion operators has been performed non-perturbatively. Our results for B_K and B_7^(3/2) are in reasonable agreement with those obtained with the (unimproved) Wilson action. This is not the case for B_8^(3/2). We also discuss some subtleties arising from a recently proposed modified definition of the B-parameters.

  5. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small; 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in south-east Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake-engineering shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a unit (and consequently its insurance premium) is assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.
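As a rough check of the figures quoted above, the flat-rate premium arithmetic can be sketched as follows. This is illustrative only: `tcip_premium` and `tcip_payout` are hypothetical names, not TCIP's actual tariff code. At the quoted 0.13% rate, a US$90 annual premium corresponds to a sum insured of roughly US$69,000, consistent with the 90,000 USD coverage cap.

```python
def tcip_premium(sum_insured_usd, rate=0.0013):
    """Annual premium = tariff rate x sum insured (flat-rate sketch).

    The 0.13% default is the average rate for reinforced concrete
    buildings cited in the abstract.
    """
    return sum_insured_usd * rate

def tcip_payout(loss_usd, sum_insured_usd, deductible=0.02):
    """Claim payout after the 2% deductible, capped at the sum insured."""
    deductible_amount = deductible * sum_insured_usd
    return max(0.0, min(loss_usd - deductible_amount, sum_insured_usd))
```

For a sum insured of US$69,000, `tcip_premium(69000)` gives about US$89.70, matching the ~US$90 premium quoted in the text.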

  6. Response to "Comment on 'Rethinking first-principles electron transport theories with projection operators: the problems caused by partitioning the basis set'" [J. Chem. Phys. 140, 177103 (2014)].

    PubMed

    Reuter, Matthew G; Harrison, Robert J

    2014-05-01

    The thesis of Brandbyge's comment [J. Chem. Phys. 140, 177103 (2014)] is that our operator decoupling condition is immaterial to transport theories, and it appeals to discussions of nonorthogonal basis sets in transport calculations in its arguments. We maintain that the operator condition is to be preferred over the usual matrix conditions and subsequently detail problems in the existing approaches. From this operator perspective, we conclude that nonorthogonal projectors cannot be used and that the projectors must be selected to satisfy the operator decoupling condition. Because these conclusions pertain to operators, the choice of basis set is not germane.

  7. Technical Basis for Safe Operations with Pu-239 in NMS and S Facilities (F and H Areas)

    SciTech Connect

    Bronikowski, M.G.

    1999-03-18

    Plutonium-239 is now being processed in HB-Line and H-Canyon as well as FB-Line and F-Canyon. As part of the effort to upgrade the Authorization Basis for H Area facilities relative to nuclear criticality, a literature review of Pu polymer characteristics was conducted to establish a more quantitative, rather than qualitative, technical basis for safe operations. The results are also applicable to processing in F Area facilities. The chemistry of Pu polymer formation, precipitation, and depolymerization is complex. Establishing limits on acid concentrations of solutions, or changing the valence to Pu(III) or Pu(VI), can prevent plutonium polymer formation in tanks in the B lines and canyons. For Pu(IV) solutions of 7 g/L or less, 0.22 M HNO3 prevents polymer formation at ambient temperature. This concentration should remain the minimum acid limit for the canyons and B lines when processing Pu-239 solutions. If the minimum acid concentration is compromised, the solution may need to be sampled and tested for the presence of polymer. If polymer is not detected, processing may proceed. If polymer is detected, adding HNO3 to a final concentration above 4 M is the safest method for handling the solution. The solution could also be heated to speed up the depolymerization process. Heating with >4 M HNO3 will depolymerize the solution for further processing. Adsorption of Pu(IV) polymer onto the steel walls of canyon and B line tanks is likely to be 11 mg/cm², a literature value for unpolished steel. This value will be confirmed by experimental work. Tank-to-tank transfers via steam jets are not expected to produce Pu(IV) polymer unless a larger than normal dilution occurs (e.g., >3 percent) at acidities below 0.4 M.
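The acid limits above amount to a simple decision rule, sketched below for illustration only. This is not a safety-basis calculation; the function name and its return strings are hypothetical, and only the numeric limits (7 g/L, 0.22 M, 4 M) come from the text.

```python
def polymer_risk(pu_g_per_l, hno3_molarity, oxidation_state=4):
    """Rule-of-thumb check of the Pu polymer limits from the abstract.

    Pu(IV) solutions at <= 7 g/L need >= 0.22 M HNO3 at ambient
    temperature to prevent polymer formation; Pu(III) and Pu(VI)
    do not form polymer; >4 M HNO3 depolymerizes existing polymer.
    """
    if oxidation_state != 4:
        return "no polymer risk (non-Pu(IV) valence)"
    if pu_g_per_l <= 7.0 and hno3_molarity >= 0.22:
        return "acid limit satisfied"
    if hno3_molarity > 4.0:
        return "depolymerizing conditions (>4 M HNO3)"
    return "sample and test for polymer"
```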

  8. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and the response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures.
Useful alerts should
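The dual alert criteria described above can be sketched in code. This is a minimal illustration: the function name and interface are hypothetical, not part of the PAGER system; only the thresholds come from the text.

```python
def eis_alert(estimated_fatalities=None, estimated_loss_usd=None):
    """Map loss/fatality estimates to an EIS-style alert color.

    Thresholds follow the abstract: fatality alerts trigger at
    1 / 100 / 1,000 (yellow / orange / red); economic alerts at
    $1M / $100M / $1B.
    """
    def level(value, thresholds):
        yellow, orange, red = thresholds
        if value >= red:
            return "red"
        if value >= orange:
            return "orange"
        if value >= yellow:
            return "yellow"
        return "green"

    alerts = []
    if estimated_fatalities is not None:
        alerts.append(level(estimated_fatalities, (1, 100, 1_000)))
    if estimated_loss_usd is not None:
        alerts.append(level(estimated_loss_usd, (1e6, 1e8, 1e9)))
    # When both criteria are available, the more severe one governs.
    order = ["green", "yellow", "orange", "red"]
    return max(alerts, key=order.index) if alerts else "green"
```

For example, an event with an estimated 150 fatalities maps to an orange (national-scale) alert regardless of the damage estimate.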

  9. Application of Long-period Ground Motion Prediction using Earthquake Early Warning System to Elevator Emergency Operation Control System of a High-Rise Building

    NASA Astrophysics Data System (ADS)

    Kubo, Tomohiro; Hisada, Yoshiaki; Horiuchi, Shigeki; Yamamoto, Shunroku

    We propose a method of elevator operation control for long-period ground motion using the Earthquake Early Warning System (EEWS) and apply it to the elevator operation control system of the 29-story building of Kogakuin University in Shinjuku, downtown Tokyo, Japan. First, we estimate the velocity of the surface waves traveling through the crust by a theoretical method, estimate the long-period ground motion using Green's functions, and calculate the response of a lumped-mass model to the estimated long-period ground motion. From these results we develop the trigger condition for stopping the elevator: when an EEWS message is received, the trigger condition is evaluated and the elevator is stopped if it is met. We then apply the proposed method to the Kogakuin University building. Comparing the long-period ground motion estimated by wavenumber integration with observation data, the estimated waves between 2 s and 4 s correspond well with the observed waves, but the estimated waves between 4 s and 6 s underestimate the observations because of the 3D effects of the Kanto sedimentary basin. Because EEWS provides only the location and magnitude of an earthquake, we therefore estimate the long-period ground motion conservatively, given the assumptions of the source model. We confirm that the proposed method is able to control the elevator for long-period ground motion.
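A minimal sketch of the trigger logic described above, under stated assumptions: EEWS supplies only magnitude and epicentral distance, so the predicted long-period response here comes from a placeholder attenuation-style relation. All coefficient values, the threshold, and the function names are hypothetical; the actual method uses the Green's-function estimate and the lumped-mass model response.

```python
import math

def predicted_long_period_response(magnitude, distance_km):
    # Placeholder attenuation-style relation (hypothetical coefficients):
    # response grows with magnitude and decays with distance. A real
    # system would use the precomputed Green's-function / lumped-mass
    # model estimate described in the abstract.
    return 10 ** (0.5 * magnitude
                  - 1.5 * math.log10(max(distance_km, 1.0))
                  - 1.0)

def should_stop_elevator(magnitude, distance_km, threshold=1.0):
    # Conservative trigger: stop whenever the predicted response
    # reaches the (hypothetical) stop threshold.
    return predicted_long_period_response(magnitude, distance_km) >= threshold
```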

  10. Hidden earthquakes

    SciTech Connect

    Stein, R.S.; Yeats, R.S.

    1989-06-01

    Seismologists generally look for earthquakes to happen along visible fault lines, e.g., the San Andreas fault. The authors maintain that another source of dangerous quakes has been overlooked: the release of stress along a fault hidden under a fold in the earth's crust. The paper describes the differences between an earthquake that occurs on a visible fault and one that occurs under an anticline, and warns that Los Angeles's greatest earthquake threat may come from a small quake originating under downtown Los Angeles rather than from a larger quake occurring 50 miles away on the San Andreas fault.

  11. [Efficient OP management. Suggestions for optimisation of organisation and administration as a basis for establishing statutes for operating theatres].

    PubMed

    Geldner, G; Eberhart, L H J; Trunk, S; Dahmen, K G; Reissmann, T; Weiler, T; Bach, A

    2002-09-01

    Economic aspects have gained increasing importance in recent years. The operating room (OR) is the most cost-intensive sector and determines the turnover process of a surgical patient within the hospital; optimisation of workflow processes is therefore of particular interest for health care providers. If the results of surgery are viewed as a product, everything associated with surgery can be evaluated analogously to a manufacturing process. All steps involved in producing the end result can and should be analysed with the goal of producing an efficient, economical, and quality product. The leadership that physicians can provide to manage this process is important and leads to the introduction of a specialised "OR manager". This position must have the authority to issue directives to all other members of the OR team, and OR management reports directly to the administration of the hospital. By integrating and improving management of the various elements of the surgical process, health care institutions are able to rationally trim costs while maintaining high-quality services. This paper gives a short introduction to the difficulties of organising an OR. Some suggestions are made to overcome common shortcomings in daily practice. A proposal for an "OR statute" is presented as a basis for discussion within the OR team; it must be modified according to the individual needs and prerequisites of every hospital. The single best opportunity for dramatic improvement in effective resource use in surgical services lies in the perioperative process. The management strategy must focus on process measurement using information technology and feedback, implementing modern quality-management tools. However, no short-term effects can be expected from these changes: improvements take about a year, and continuous feedback on all measures must accompany the reorganisation process.

  12. Earthquakes: A Teacher's Package for K-6.

    ERIC Educational Resources Information Center

    National Science Teachers Association, Washington, DC.

    Like rain, an earthquake is a natural occurrence which may be mild or catastrophic. Although an earthquake may last only a few seconds, the processes that cause it have operated within the earth for millions of years. Until recently, the cause of earthquakes was a mystery and the subject of fanciful folklore to people all around the world. This…

  13. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  14. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  15. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  16. Proposed plan/Statement of basis for the Grace Road Site (631-22G) operable unit: Final action

    SciTech Connect

    Palmer, E.

    1997-08-19

    This Statement of Basis/Proposed Plan is being issued by the U. S. Department of Energy (DOE), which functions as the lead agency for the Savannah River Site (SRS) remedial activities, with concurrence by the U. S. Environmental Protection Agency (EPA), and the South Carolina Department of Health and Environmental Control (SCDHEC). The purpose of this Statement of Basis/Proposed Plan is to describe the preferred alternative for addressing the Grace Road site (GRS) located at the Savannah River Site (SRS), in Aiken, South Carolina and to provide an opportunity for public input into the remedial action selection process.

  17. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable year which... made upon the final determination of the rate of absorption applicable to the taxable year....

  18. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  19. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  20. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  1. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  2. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways in... 29 Labor 3 2010-07-01 2010-07-01 false System must be nonprofit or operated on a share-crop...

  3. Modeling of oil spills in ice conditions in the Gulf of finland on the basis of an operative forecasting system

    NASA Astrophysics Data System (ADS)

    Stanovoy, V. V.; Eremina, T. R.; Isaev, A. V.; Neelov, I. A.; Vankevich, R. E.; Ryabchenko, V. A.

    2012-11-01

    A brief description of the GULFOOS operational oceanographic forecasting system for the Gulf of Finland and the OilMARS operational oil-spill forecasting model is presented. Special attention is focused on oil-spill simulation in ice conditions. All the assumptions and parameterizations used are described. Modeling results of training simulations for the ice conditions of January 2011 are presented.

  4. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response held in San Francisco, called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, and representatives of the news media.

  5. Rapid estimation of the economic consequences of global earthquakes

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    to reduce this time gap to more rapidly and effectively mobilize response. We present here a procedure to rapidly and approximately ascertain the economic impact immediately following a large earthquake anywhere in the world. In principle, the approach presented is similar to the empirical fatality estimation methodology proposed and implemented by Jaiswal and others (2009). In order to estimate economic losses, we need an assessment of the economic exposure at various levels of shaking intensity. The economic value of all the physical assets exposed at different locations in a given area is generally not known and extremely difficult to compile at a global scale. In the absence of such a dataset, we first estimate the total Gross Domestic Product (GDP) exposed at each shaking intensity by multiplying the per-capita GDP of the country by the total population exposed at that shaking intensity level. We then scale the total GDP estimated at each intensity by an exposure correction factor, which is a multiplying factor to account for the disparity between wealth and/or economic assets to the annual GDP. The economic exposure obtained using this procedure is thus a proxy estimate for the economic value of the actual inventory that is exposed to the earthquake. The economic loss ratio, defined in terms of a country-specific lognormal cumulative distribution function of shaking intensity, is derived and calibrated against the losses from past earthquakes. This report describes the development of a country or region-specific economic loss ratio model using economic loss data available for global earthquakes from 1980 to 2007. The proposed model is a potential candidate for directly estimating economic losses within the currently-operating PAGER system. 
PAGER's other loss models use indirect methods that require substantially more data (such as building/asset inventories, vulnerabilities, and the asset values exposed at the time of earthquake) to implement on a global basis
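The exposure-and-loss-ratio aggregation described above can be sketched as follows. This is a minimal illustration: the function name and input values are hypothetical, and in the actual PAGER methodology the loss ratio comes from a calibrated, country-specific lognormal cumulative distribution function of shaking intensity.

```python
def economic_loss(pop_by_intensity, per_capita_gdp, correction_factor,
                  loss_ratio_by_intensity):
    """Proxy economic-loss estimate.

    For each shaking-intensity level: exposure = exposed population
    x per-capita GDP x exposure correction factor (accounting for the
    disparity between total assets and annual GDP); the loss is the
    exposure scaled by the intensity-dependent loss ratio, summed
    over all intensity levels.
    """
    total = 0.0
    for intensity, population in pop_by_intensity.items():
        exposure = population * per_capita_gdp * correction_factor
        total += exposure * loss_ratio_by_intensity.get(intensity, 0.0)
    return total
```

For example, with 100,000 people exposed at intensity VII and 10,000 at intensity VIII, a per-capita GDP of $5,000, a correction factor of 2, and (hypothetical) loss ratios of 1% and 5%, the proxy loss is $15M.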

  6. Earthquake Scaling, Simulation and Forecasting

    NASA Astrophysics Data System (ADS)

    Sachs, Michael Karl

    Earthquakes are among the most devastating natural events faced by society. In 2011, just two events, the magnitude 6.3 earthquake in Christchurch, New Zealand, on February 22 and the magnitude 9.0 Tohoku earthquake off the coast of Japan on March 11, caused a combined total of $226 billion in economic losses. Over the last decade, 791,721 deaths were caused by earthquakes. Yet, despite their impact, our ability to accurately predict when earthquakes will occur is limited. This is due, in large part, to the fact that the fault systems that produce earthquakes are non-linear: very small differences in the system now produce very big differences in the future, making forecasting difficult. In spite of this, there are patterns in earthquake data, often in the form of frequency-magnitude scaling relations that relate the number of smaller events observed to the number of larger events observed. In many cases these scaling relations show consistent behavior over a wide range of scales, and this consistency forms the basis of most forecasting techniques. However, the utility of these scaling relations is limited by the size of earthquake catalogs, which, especially in the case of large events, are fairly small and limited to a few hundred years of events. In this dissertation I discuss three areas of earthquake science. The first is an overview of scaling behavior in a variety of complex systems, both models and natural systems; the focus here is to understand how this scaling behavior breaks down. The second is a description of the development and testing of an earthquake simulator called Virtual California, designed to extend the observed catalog of earthquakes in California; this simulator uses novel techniques borrowed from statistical physics to enable the modeling of large fault systems over long periods of time. The third is an evaluation of existing earthquake forecasts, which focuses on the Regional

  7. Earthquakes for Kids

    MedlinePlus


  8. Generation Of Manufacturing Routing And Operations Using Structured Knowledge As Basis To Application Of Computer Aided In Process Planning

    NASA Astrophysics Data System (ADS)

    Oswaldo, Luiz Agostinho

    2011-01-01

    The development of computer-aided resources for automating the generation of manufacturing routings and operations has mainly been accomplished by searching for similarities between existing ones, yielding standard process routings grouped by analysis of similarities between parts or routings. This article proposes developing manufacturing routings and detailed operations using a methodology whose steps define the initial, intermediate, and final operations, starting from the rough piece and proceeding to the final specifications, which must have a one-to-one relationship with the part design specifications. Each step uses so-called rules of precedence to link and chain the routing operations. The rules of precedence order and prioritize knowledge of various manufacturing processes, taking into account the theories of machining, forging, assembly, and heat treatment, as well as the theories of tolerance accumulation and process capability, among others. The availability of manufacturing databases covering process tolerances, deviations of the machine tool, cutting tool, fixturing devices, and workpiece, and process capabilities is also assumed. Stating and applying rules of precedence, linking and joining manufacturing concepts in a logical and structured way, makes it viable to use structured knowledge, instead of the tacit knowledge currently held in manufacturing engineering departments, in generating manufacturing routings and operations. Consequently, the development of computer-aided process planning is facilitated by the structured knowledge applied with this methodology.
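    Rules of precedence as described here impose a partial order on operations; one way to sketch the linking-and-chaining step is a plain topological sort. The operation names and rules below are hypothetical examples, not taken from the article:

```python
from collections import deque

def order_operations(ops, precedence):
    """Order manufacturing operations so every rule of precedence
    (an 'earlier must come before later' pair) is respected --
    a standard topological sort (Kahn's algorithm).
    Raises ValueError if the rule set is cyclic."""
    succ = {op: [] for op in ops}
    indeg = {op: 0 for op in ops}
    for earlier, later in precedence:
        succ[earlier].append(later)
        indeg[later] += 1
    queue = deque(op for op in ops if indeg[op] == 0)
    routing = []
    while queue:
        op = queue.popleft()
        routing.append(op)
        for nxt in succ[op]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                queue.append(nxt)
    if len(routing) != len(ops):
        raise ValueError("precedence rules contain a cycle")
    return routing

# Hypothetical routing fragment: rough machining must precede heat
# treatment, which must precede finish grinding, then inspection.
ops = ["rough_turn", "heat_treat", "finish_grind", "inspect"]
rules = [("rough_turn", "heat_treat"),
         ("heat_treat", "finish_grind"),
         ("finish_grind", "inspect")]
routing = order_operations(ops, rules)
```

A real system would attach tolerance and process-capability data to each operation; this sketch only shows the ordering constraint.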

  9. A computational method for solving stochastic Itô–Volterra integral equations based on stochastic operational matrix for generalized hat basis functions

    SciTech Connect

    Heydari, M.H.; Hooshmandasl, M.R.; Maalek Ghaini, F.M.; Cattani, C.

    2014-08-01

    In this paper, a new computational method based on the generalized hat basis functions is proposed for solving stochastic Itô–Volterra integral equations. In this way, a new stochastic operational matrix for generalized hat functions on the finite interval [0,T] is obtained. By using these basis functions and their stochastic operational matrix, such problems can be transformed into linear lower triangular systems of algebraic equations which can be directly solved by forward substitution. The rate of convergence of the proposed method is also considered, and it is shown to be O(1/n²). Further, in order to show the accuracy and reliability of the proposed method, the new approach is compared with the block pulse functions method on several examples. The obtained results reveal that the proposed method is more accurate and efficient than the block pulse functions method.
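    The forward-substitution step that the abstract reduces the problem to can be sketched as follows; the triangular system here is a made-up illustration, not an actual operational-matrix system from the paper:

```python
def forward_substitution(L, b):
    """Solve L x = b for a lower triangular matrix L (list of rows)
    by forward substitution: each unknown depends only on the
    previously computed ones, so the system is solved in one pass."""
    n = len(b)
    x = [0.0] * n
    for i in range(n):
        s = sum(L[i][j] * x[j] for j in range(i))
        x[i] = (b[i] - s) / L[i][i]
    return x

# Small illustrative lower triangular system with solution x = (1, 1, 1).
L = [[2.0, 0.0, 0.0],
     [1.0, 3.0, 0.0],
     [0.0, 4.0, 5.0]]
b = [2.0, 4.0, 9.0]
x = forward_substitution(L, b)
```

The appeal of such methods is exactly this: no general linear solver is needed, only an O(n²) pass over the triangular system.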

  10. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 8 2011-04-01 2011-04-01 false Policyholders of mutual fire or flood insurance...) Other Insurance Companies § 1.832-6 Policyholders of mutual fire or flood insurance companies operating..., a taxpayer insured by a mutual fire or flood insurance company under a policy for which the...

  11. The Effects of Degraded Digital Instrumentation and Control Systems on Human-system Interfaces and Operator Performance: HFE Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.; W. Gunther, G. Martinez-Guridi

    2010-02-26

    New and advanced reactors will use integrated digital instrumentation and control (I&C) systems to support operators in their monitoring and control functions. Even though digital systems are typically highly reliable, their potential for degradation or failure could significantly affect operator performance and, consequently, impact plant safety. The U.S. Nuclear Regulatory Commission (NRC) supported this research project to investigate the effects of degraded I&C systems on human performance and plant operations. The objective was to develop human factors engineering (HFE) review guidance addressing the detection and management of degraded digital I&C conditions by plant operators. We reviewed pertinent standards and guidelines, empirical studies, and plant operating experience. In addition, we conducted an evaluation of the potential effects of selected failure modes of the digital feedwater system on human-system interfaces (HSIs) and operator performance. The results indicated that I&C degradations are prevalent in plants employing digital systems and the overall effects on plant behavior can be significant, such as causing a reactor trip or causing equipment to operate unexpectedly. I&C degradations can impact the HSIs used by operators to monitor and control the plant. For example, sensor degradations can make displays difficult to interpret and can sometimes mislead operators by making it appear that a process disturbance has occurred. We used the information obtained as the technical basis upon which to develop HFE review guidance. The guidance addresses the treatment of degraded I&C conditions as part of the design process and the HSI features and functions that support operators to monitor I&C performance and manage I&C degradations when they occur. In addition, we identified topics for future research.

  12. Postseismic Transient after the 2002 Denali Fault Earthquake from VLBI Measurements at Fairbanks

    NASA Technical Reports Server (NTRS)

    MacMillan, Daniel; Cohen, Steven

    2004-01-01

    The VLBI antenna (GILCREEK) at Fairbanks, Alaska, routinely observes twice a week in operational networks and on additional days with other networks on a more irregular basis. The Fairbanks antenna is about 150 km north of the Denali fault and about 150 km from the earthquake epicenter. We examine the transient behavior of the estimated VLBI position during the year following the earthquake to determine how the rate of postseismic deformation has changed. This is compared with what is seen in the GPS site position series.

  13. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  14. America's faulty earthquake plans

    SciTech Connect

    Rosen, J

    1989-10-01

    In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States, as well as each region's level of preparedness for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.

  15. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  16. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at the following three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  17. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  18. Connecting slow earthquakes to huge earthquakes

    NASA Astrophysics Data System (ADS)

    Obara, Kazushige; Kato, Aitaro

    2016-07-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  20. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded with a total of 140 portable seismic stations to deploy in order to record aftershocks. Combined with the Chilean permanent seismic network in the area, this brings 180 stations into operation, recording continuously at 100 samples per second. The seismic equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone, which will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit its data as soon as possible in standard (miniSEED) format with accompanying metadata to the IRIS DMC, where the data will be merged into a combined data set available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake, as well as a model for future aftershock deployments around the world.

  1. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced the policy of countermeasures to earthquake disasters, including earthquake hazard evaluations, to be changed in Japan. Before March 11, Japanese earthquake hazard evaluation was based on the history of earthquakes that repeatedly occur and the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using the renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of damages by a hypothetical Mw9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with the current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using the present techniques, based on the diversity of earthquake phenomena. These reports created sensations throughout the country, and local governments are struggling to prepare countermeasures. These reports commented on large uncertainty in their evaluation near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including authors, are involved in

  2. Earthquake ground motion: Chapter 3

    USGS Publications Warehouse

    Luco, Nicolas; Valley, Michael; Crouse, C.B.

    2012-01-01

    Most of the effort in seismic design of buildings and other structures is focused on structural design. This chapter addresses another key aspect of the design process—characterization of earthquake ground motion. Section 3.1 describes the basis of the earthquake ground motion maps in the Provisions and in ASCE 7. Section 3.2 has examples for the determination of ground motion parameters and spectra for use in design. Section 3.3 discusses and provides an example for the selection and scaling of ground motion records for use in response history analysis.

  3. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake.
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  4. Earthquake Archaeology: a logical approach?

    NASA Astrophysics Data System (ADS)

    Stewart, I. S.; Buck, V. A.

    2001-12-01

    Ancient earthquakes can leave their mark in the mythical and literary accounts of ancient peoples, the stratigraphy of their site histories, and the structural integrity of their constructions. Within this broad cross-disciplinary tramping ground, earthquake geologists have tended to focus on those aspects of the cultural record that are most familiar to them: the physical effects of seismic deformation on ancient constructions. One of the core difficulties with this 'earthquake archaeology' approach is that recent attempts to isolate structural criteria that are diagnostic or strongly suggestive of a seismic origin are undermined by the recognition that signs of ancient seismicity are generally indistinguishable from non-seismic mechanisms (poor construction, adverse geotechnical conditions). We illustrate the difficulties and inconsistencies in currently proposed 'earthquake diagnostic' schemes by reference to two case studies of archaeoseismic damage in central Greece. The first concerns fallen columns at various Classical temple localities in mainland Greece (Nemea, Sounio, Olympia, Bassai) which, on the basis of observed structural criteria, are earthquake-induced but which are alternatively explained by archaeologists as the action of human disturbance. The second re-examines the Kyparissi site in the Atalanti region, almost the type example of a Classical stoa offset across a seismic surface fault, arguing instead for its deformation by ground instability. Finally, in highlighting the inherent ambiguity of archaeoseismic data, we consider the value of a logic-tree approach for qualifying and quantifying our uncertainties for seismic-hazard analysis.

  5. Automated Microwave Complex on the Basis of a Continuous-Wave Gyrotron with an Operating Frequency of 263 GHz and an Output Power of 1 kW

    NASA Astrophysics Data System (ADS)

    Glyavin, M. Yu.; Morozkin, M. V.; Tsvetkov, A. I.; Lubyako, L. V.; Golubiatnikov, G. Yu.; Kuftin, A. N.; Zapevalov, V. E.; V. Kholoptsev, V.; Eremeev, A. G.; Sedov, A. S.; Malygin, V. I.; Chirkov, A. V.; Fokin, A. P.; Sokolov, E. V.; Denisov, G. G.

    2016-02-01

    We study experimentally the automated microwave complex for microwave spectroscopy and diagnostics of various media, which was developed at the Institute of Applied Physics of the Russian Academy of Sciences in cooperation with GYCOM Ltd. on the basis of a gyrotron with a frequency of 263 GHz operated at the first gyrofrequency harmonic. In the experiments, a controllable output power of 0.1-1 kW was achieved with an efficiency of up to 17% in the continuous-wave generation regime. The measured radiation spectrum, with a relative width of about 10⁻⁶, and the frequency values measured at various parameters of the device are presented. The results of measuring the parameters of the wave beam formed by a built-in quasioptical converter, as well as the data obtained by measuring the heat loss in the cavity and the vacuum output window, are analyzed.

  6. Chern-Simons gravity with (curvature)² and (torsion)² terms and a basis of degree-of-freedom projection operators

    SciTech Connect

    Helayeel-Neto, J. A.; Hernaski, C. A.; Pereira-Dias, B.; Vargas-Paredes, A. A.; Vasquez-Otoya, V. J.

    2010-09-15

    The effects of (curvature)²- and (torsion)²-terms in the Einstein-Hilbert-Chern-Simons Lagrangian are investigated. The purposes are two-fold: (i) to show the efficacy of a recently proposed orthogonal basis of degree-of-freedom projection operators and to ascertain its adequacy for obtaining propagators of general parity-breaking gravity models in three dimensions; (ii) to analyze the role of the topological Chern-Simons term for the unitarity and the particle spectrum of the model with squared-curvature terms in connection with dynamical torsion. Our conclusion is that the Chern-Simons term does not influence the unitarity conditions imposed on the parameters of the Lagrangian but significantly modifies the particle spectrum.

  7. EARTHQUAKE HAZARDS IN THE OFFSHORE ENVIRONMENT.

    USGS Publications Warehouse

    Page, Robert A.; Basham, Peter W.

    1985-01-01

    This report discusses earthquake effects and potential hazards in the marine environment, describes and illustrates methods for the evaluation of earthquake hazards, and briefly reviews strategies for mitigating hazards. The report is broadly directed toward engineers, scientists, and others engaged in developing offshore resources. The continental shelves have become a major frontier in the search for new petroleum resources. Much of the current exploration is in areas of moderate to high earthquake activity. If the resources in these areas are to be developed economically and safely, potential earthquake hazards must be identified and mitigated both in planning and regulating activities and in designing, constructing, and operating facilities. Geologic earthquake effects that can be hazardous to marine facilities and operations include surface faulting, tectonic uplift and subsidence, seismic shaking, sea-floor failures, turbidity currents, and tsunamis.

  8. Earthquake Planning for Government Continuity

    PubMed

    PERRY; LINDELL

    1997-01-01

    The problem of assuring government operational continuity following earthquakes has been given little research attention. Recent earthquake experience has documented that government organizations without a public safety mission do incur damaged facilities and routinely see increases in public demands following an earthquake. Impediments to service delivery associated with such damages can be minimized if agencies address earthquake plan elements likely to enhance postimpact functioning, including: the potential to relocate operations, protection for the workplace, possession of an organizational inventory, emergency instructions for employees, the ability to use volunteers, and communication capacity. Factors associated with the adoption of these plan elements were studied in one county government and its municipal county seat in the southwestern United States. A census of departments within these jurisdictions was asked to complete a questionnaire reporting the level of planning activity relative to each of these plan elements. It was found that the overall level of preparedness was low, but statistically significantly related to agency size, perceived risk, and information seeking. The implications of these findings underscore the potential for disruption to government service delivery and permit the identification of potential avenues for increasing levels of preparedness. KEY WORDS: Emergency planning; Earthquakes; Government preparedness. PMID:8939788

  9. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some having been downloaded more than 1 million times! The advantages are obvious: wherever they are, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected

  10. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions. PMID:2347628

  11. Evaluation of near-field earthquake effects

    SciTech Connect

    Shrivastava, H.P.

    1994-11-01

    Structures and equipment, which are qualified for the design basis earthquake (DBE) and have anchorage designed for the DBE loading, do not require an evaluation of the near-field earthquake (NFE) effects. However, safety class 1 acceleration sensitive equipment such as electrical relays must be evaluated for both NFE and DBE since they are known to malfunction when excited by high frequency seismic motions.

  12. Stress Drops for Potentially Induced Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

    Stress drop, the difference between shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motions. Hough [2014 and 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes, and interpreted them to be a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects. Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). Both the effects of path and linear site response should be cancelled out through the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas. The earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to tectonic earthquakes at Parkfield, California (the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
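    For context, stress drop is commonly estimated from seismic moment and corner frequency via the Brune source model, with source radius r = 2.34β/(2πfc) and Δσ = (7/16)·M0/r³. This is the standard textbook model, not the specific spectral-ratio pipeline of the study, and the numbers below are purely illustrative:

```python
import math

def brune_stress_drop(moment_nm, corner_freq_hz, beta_m_s=3500.0):
    """Stress drop (Pa) from seismic moment M0 (N*m) and corner
    frequency fc (Hz) under the Brune source model:
        r = 2.34 * beta / (2 * pi * fc)
        delta_sigma = (7/16) * M0 / r**3
    beta is the shear-wave velocity near the source (assumed here)."""
    radius = 2.34 * beta_m_s / (2.0 * math.pi * corner_freq_hz)
    return (7.0 / 16.0) * moment_nm / radius ** 3

# Illustrative event: roughly M 3 (M0 ~ 4e13 N*m) with a 10 Hz
# corner frequency; a few MPa is a typical tectonic stress drop.
dsigma_pa = brune_stress_drop(4.0e13, 10.0)
dsigma_mpa = dsigma_pa / 1e6
```

In a spectral-ratio study, fc is first estimated from the ratio of the target event's spectrum to that of a co-located smaller event (the eGf), which cancels path and site effects before a formula like this is applied.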

  13. A Simplified Approach to the Basis Functions of Symmetry Operations and Terms of Metal Complexes in an Octahedral Field with d[superscript 1] to d[superscript 9] Configurations

    ERIC Educational Resources Information Center

    Lee, Liangshiu

    2010-01-01

    The basis sets for symmetry operations of d[superscript 1] to d[superscript 9] complexes in an octahedral field and the resulting terms are derived for the ground states and spin-allowed excited states. The basis sets are of fundamental importance in group theory. This work addresses such a fundamental issue, and the results are pedagogically…

  14. Earthquake Shaking and Damage to Buildings: Recent evidence for severe ground shaking raises questions about the earthquake resistance of structures.

    PubMed

    Page, R A; Joyner, W B; Blume, J A

    1975-08-22

    Ground shaking close to the causative fault of an earthquake is more intense than it was previously believed to be. This raises the possibility that large numbers of buildings and other structures are not sufficiently resistant for the intense levels of shaking that can occur close to the fault. Many structures were built before earthquake codes were adopted; others were built according to codes formulated when less was known about the intensity of near-fault shaking. Although many building types are more resistant than conventional design analyses imply, the margin of safety is difficult to quantify. Many modern structures, such as freeways, have not been subjected to and tested by near-fault shaking in major earthquakes (magnitude 7 or greater). Damage patterns in recent moderate-sized earthquakes occurring in or adjacent to urbanized areas (17), however, indicate that many structures, including some modern ones designed to meet earthquake code requirements, cannot withstand the severe shaking that can occur close to a fault. It is necessary to review the ground motion assumed and the methods utilized in the design of important existing structures and, if necessary, to strengthen or modify the use of structures that are found to be weak. New structures situated close to active faults should be designed on the basis of ground motion estimates greater than those used in the past. The ultimate balance between risk of earthquake losses and cost for both remedial strengthening and improved earthquake-resistant construction must be decided by the public. Scientists and engineers must inform the public about earthquake shaking and its effect on structures. The exposure to damage from seismic shaking is steadily increasing because of continuing urbanization and the increasing complexity of lifeline systems, such as power, water, transportation, and communication systems. 
In the near future we should expect additional painful examples of the damage potential of moderate-sized earthquakes.

  15. Material contrast does not predict earthquake rupture propagation direction

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    2005-01-01

    Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

  16. Geodetic measurement of deformation in the Loma Prieta, California earthquake with Very Long Baseline Interferometry (VLBI)

    SciTech Connect

    Clark, T.A.; Ma, C.; Sauber, J.M.; Ryan, J.W.; Gordon, D.; Caprette, D.S.; Shaffer, D.B.; Vandenberg, N.R.

    1990-07-01

    Following the Loma Prieta earthquake, two mobile Very Long Baseline Interferometry (VLBI) systems operated by the NASA Crustal Dynamics Project and the NOAA National Geodetic Survey were deployed at three previously established VLBI sites in the earthquake area: Fort Ord (near Monterey), the Presidio (in San Francisco), and Point Reyes. From repeated VLBI occupations of these sites since 1983, the pre-earthquake rates of deformation have been determined with respect to a North American reference frame with 1σ formal standard errors of ~1 mm/yr. The VLBI measurements immediately following the earthquake showed that the Fort Ord site was displaced 49 ± 4 mm at an azimuth of 11 ± 4° and that the Presidio site was displaced 12 ± 5 mm at an azimuth of 148 ± 13°. No anomalous change was detected at Point Reyes, with a 1σ uncertainty of 4 mm. The estimated displacements at Fort Ord and the Presidio are consistent with the static displacements predicted on the basis of a coseismic slip model in which slip on the southern segment is shallower than slip on the more northern segment of the fault rupture. The authors also give the Cartesian positions at epoch 1990.0 of a set of VLBI fiducial stations and the three mobile sites in the vicinity of the earthquake.

  17. Guidelines for earthquake ground motion definition for the eastern United States

    SciTech Connect

    Gwaltney, R.C.; Aramayo, G.A.; Williams, R.T.

    1985-01-01

    Guidelines for the determination of earthquake ground-motion definition for the eastern United States are established in this paper. Both far-field and near-field guidelines are given. The guidelines are based on an extensive review of current procedures for specifying ground motion in the United States. Both empirical and theoretical procedures were used in establishing the guidelines because of the low seismicity in the eastern United States: only a few large to great (M > 7.5) earthquakes have occurred in this region, no evidence of tectonic surface ruptures related to historic or Holocene earthquakes has been found, and no currently active plate boundaries of any kind are known in this region. Very little instrumental data have been gathered in the East. Theoretical procedures are proposed so that a reasonable level of seismic ground motion can be assumed in regions with almost no data. The guidelines are to be used to develop the Safe Shutdown Earthquake (SSE). A new procedure for establishing the Operating Basis Earthquake (OBE) is proposed, in particular for the eastern United States. The OBE would be developed using a probabilistic assessment of the geological conditions and the recurrence of seismic events at a site. These guidelines should be useful in the development of seismic design requirements for future reactors. 17 refs., 2 figs., 1 tab.

  18. Research on Swedish earthquakes 1980 - 1981

    NASA Astrophysics Data System (ADS)

    Slunga, R.

    1982-11-01

    Research on Swedish earthquakes recorded from December 1979 through 1981 by the digital seismic network in southern Sweden, operated by the National Defence Research Institute (FOA), is reported. The high-quality earthquake data produced by this network allow source inversion of all recorded earthquakes. A method based on both first-motion polarities and spectral amplitudes is presented. In addition to the fault-plane solution and the seismic moment, corner frequencies, fault dimensions, stress drops, and peak slip displacements are determined for 53 Swedish earthquakes. Epicentral ground motion is studied, and a relation for a two-parameter scaling (seismic moment and stress drop) of the earthquake is proposed and applied to the bedrock peak accelerations. An NW-SE horizontal compression is indicated by the source mechanisms. Surface topographic lineaments are often consistent with the fault-plane solutions. The frequency, epicentral and depth distributions, and peak accelerations are in agreement with previous studies of seismic risk.

  19. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.

  20. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2016-09-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
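The receiver operating characteristic comparison described here can be computed by ranking grid cells by forecast rate and tracing the hit rate against the false-alarm rate as cells are progressively "alarmed." A toy sketch of that bookkeeping (cell values are invented for illustration, not taken from the study):

```python
def roc_curve(rates, observed):
    """Receiver operating characteristic for a gridded rate forecast.
    rates: forecast rate per cell; observed: 1 if a target earthquake
    occurred in that cell, else 0. Cells are alarmed in order of
    decreasing forecast rate; returns (false-alarm rate, hit rate) pairs."""
    order = sorted(range(len(rates)), key=lambda i: -rates[i])
    pos = sum(observed)
    neg = len(observed) - pos
    hits = false_alarms = 0
    curve = [(0.0, 0.0)]
    for i in order:
        if observed[i]:
            hits += 1
        else:
            false_alarms += 1
        curve.append((false_alarms / neg, hits / pos))
    return curve

# Four hypothetical cells: high forecast rates coincide with observed events
curve = roc_curve([0.9, 0.1, 0.5, 0.05], [1, 0, 1, 0])
print(curve)  # → [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0), (0.5, 1.0), (1.0, 1.0)]
```

A curve hugging the upper-left corner (as in this toy case) indicates a skillful rate map; a diagonal curve indicates no skill.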

  1. The use of volunteer interpreters during the 2010 Haiti earthquake: lessons learned from the USNS COMFORT Operation Unified Response Haiti.

    PubMed

    Powell, Clydette; Pagliara-Miller, Claire

    2012-01-01

    On January 12, 2010, a magnitude 7.0 earthquake devastated Haiti, leading to the world's largest humanitarian effort in 60 years. The catastrophe led to massive destruction of homes and buildings, the loss of more than 200,000 lives, and overwhelmed the host nation's response and its public health infrastructure. Among the many responders, the United States Government acted immediately by sending assistance to Haiti, including a naval hospital ship, the USNS COMFORT, as a tertiary care medical center. To adequately respond to the acute needs of patients, healthcare professionals on the USNS COMFORT relied on Haitian Creole-speaking volunteers who were recruited by the American Red Cross (ARC). These volunteers complemented full-time Creole-speaking military staff on board. The ARC provided 78 volunteers, each able to serve up to 4 weeks on board. The volunteers varied in demographics, such as age and gender, as well as in linguistic skills, work background, and prior humanitarian assistance experience. Volunteer efforts were critical in assisting with informed consent for surgery, family reunification processes, explanation of diagnosis and treatment, comfort to patients and families in various stages of grieving and death, and helping healthcare professionals understand the cultural context and sensitivities unique to Haiti. This article explores key lessons learned in the use of volunteer interpreters in earthquake disaster relief in Haiti and highlights approaches that optimize volunteer services in such a setting and may be applicable in similar future events.

  2. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  3. EARTHQUAKE CAUSED RELEASES FROM A NUCLEAR FUEL CYCLE FACILITY

    SciTech Connect

    Charles W. Solbrig; Chad Pope; Jason Andrus

    2014-08-01

    The fuel cycle facility (FCF) at the Idaho National Laboratory is a nuclear facility which must be licensed in order to operate. A safety analysis is required for a license. This paper describes the analysis of the Design Basis Accident for this facility. This analysis involves a model of the transient behavior of the FCF inert atmosphere hot cell following an earthquake initiated breach of pipes passing through the cell boundary. The hot cell is used to process spent metallic nuclear fuel. Such breaches allow the introduction of air and subsequent burning of pyrophoric metals. The model predicts the pressure, temperature, volumetric releases, cell heat transfer, metal fuel combustion, heat generation rates, radiological releases and other quantities. The results show that releases from the cell are minimal and satisfactory for safety. This analysis method should be useful in other facilities that have potential for damage from an earthquake and could eliminate the need to back fit facilities with earthquake proof boundaries or lessen the cost of new facilities.

  4. Post-Earthquake Debris Management - An Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, which takes into account the different criteria related to operation execution, is proposed by highlighting the key issues concerning the handling of construction debris.

  5. Generally Contracted Valence-Core/Valence Basis Sets for Use with Relativistic Effective Core Potentials and Spin-Orbit Coupling Operators

    SciTech Connect

    Ermler, Walter V.; Tilson, Jeffrey L.

    2012-12-15

    A procedure for structuring generally contracted valence-core/valence basis sets of Gaussian-type functions for use with relativistic effective core potentials (gcv-c/v-RECP basis sets) is presented. Large valence basis sets are enhanced using a compact basis set derived for outer-core electrons in the presence of small-core relativistic effective core potentials (RECPs). When core electrons are represented by RECPs and appropriate levels of theory are used, these basis sets are shown to provide accurate representations of atomic and molecular valence and outer-core electrons. Core/valence polarization and correlation effects can be calculated with these basis sets through standard methods for treating electron correlation. Calculations of energies and spectra for Ru, Os, Ir, In, and Cs are reported. Spectroscopic constants for RuO2+, OsO2+, Cs2, and InH are calculated and compared with experiment.

  6. The uncertainty in earthquake conditional probabilities

    USGS Publications Warehouse

    Savage, J.C.

    1992-01-01

    The Working Group on California Earthquake Probabilities (WGCEP) questioned the relevance of uncertainty intervals assigned to earthquake conditional probabilities on the basis that the uncertainty in the probability estimate seemed to be greater the smaller the intrinsic breadth of the recurrence-interval distribution. It is shown here that this paradox depends upon a faulty measure of uncertainty in the conditional probability and that with a proper measure of uncertainty no paradox exists. The assertion that the WGCEP probability assessment in 1988 correctly forecast the 1989 Loma Prieta earthquake is also challenged by showing that posterior probability of rupture inferred after the occurrence of the earthquake from the prior WGCEP probability distribution reverts to a nearly informationless distribution. -Author
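For context, a conditional rupture probability of the kind the WGCEP assessed is derived from a recurrence-interval distribution F(t) as P = [F(t + Δt) − F(t)] / [1 − F(t)]. A sketch under an assumed lognormal recurrence model (the median, sigma, and times below are illustrative placeholders, not WGCEP values):

```python
import math

def lognormal_cdf(t, median, sigma):
    """CDF of a lognormal recurrence-time distribution with the given
    median recurrence interval and log-space standard deviation sigma."""
    return 0.5 * (1.0 + math.erf(math.log(t / median) / (sigma * math.sqrt(2.0))))

def conditional_probability(elapsed, window, median, sigma):
    """P(rupture within `window` years | no rupture for `elapsed` years),
    computed as [F(t + dt) - F(t)] / [1 - F(t)]."""
    f_t = lognormal_cdf(elapsed, median, sigma)
    f_tw = lognormal_cdf(elapsed + window, median, sigma)
    return (f_tw - f_t) / (1.0 - f_t)

# Hypothetical fault segment: 150-yr median recurrence, sigma 0.5,
# 100 years elapsed since the last rupture, 30-year forecast window
p = conditional_probability(100.0, 30.0, 150.0, 0.5)
print(round(p, 3))
```

The uncertainty debate in the abstract concerns how error bars on such a probability should be constructed when the parameters (median, sigma, elapsed time) are themselves uncertain.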

  7. The size of earthquakes

    USGS Publications Warehouse

    Kanamori, H.

    1980-01-01

    How we should measure the size of an earthquake has historically been a very important, as well as a very difficult, seismological problem. For example, figure 1 shows the loss of life caused by earthquakes in recent times and clearly demonstrates that 1976 was the worst year for earthquake casualties in the 20th century. However, the damage caused by an earthquake is due not only to its physical size but also to other factors, such as where and when it occurs; thus, figure 1 is not necessarily an accurate measure of the "size" of earthquakes in 1976. The point is that the physical process underlying an earthquake is highly complex; we therefore cannot express every detail of an earthquake by a single straightforward parameter. Indeed, it would be very convenient if we could find a single number that represents the overall physical size of an earthquake. This was in fact the concept behind the Richter magnitude scale introduced in 1935.
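A single number of the kind sought here is the moment magnitude Mw, which maps an earthquake's seismic moment M0 onto a Richter-like scale. A sketch of the standard Kanamori (1977) / Hanks-Kanamori conversion (the example moment is illustrative):

```python
import math

def moment_magnitude(m0_nm):
    """Moment magnitude from seismic moment M0 in N·m:
    Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# An earthquake with M0 = 1e20 N·m corresponds to roughly Mw 7.3
print(round(moment_magnitude(1.0e20), 1))
```

Because the scale is logarithmic, each unit of magnitude corresponds to a factor of about 32 in seismic moment, which is why casualty counts like those in figure 1 track magnitude only loosely.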

  8. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are likewise fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: A moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" in not only polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  9. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  10. Estimating earthquake potential

    USGS Publications Warehouse

    Page, R.A.

    1980-01-01

    The hazards to life and property from earthquakes can be minimized in three ways. First, structures can be designed and built to resist the effects of earthquakes. Second, the location of structures and human activities can be chosen to avoid or to limit the use of areas known to be subject to serious earthquake hazards. Third, preparations for an earthquake in response to a prediction or warning can reduce the loss of life and damage to property as well as promote a rapid recovery from the disaster. The success of the first two strategies, earthquake engineering and land use planning, depends on being able to reliably estimate the earthquake potential. The key considerations in defining the potential of a region are the location, size, and character of future earthquakes and frequency of their occurrence. Both historic seismicity of the region and the geologic record are considered in evaluating earthquake potential. 

  11. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of earthquake loss assessment following strong earthquakes, with worldwide systems applied in emergency mode. Timely and correct action just after an event can yield significant benefits in saving lives; in this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: inside buildings or not); knowledge about the source of the shaking; and so on. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); while a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage to the buildings is, by far, not precisely known.
    The paper analyzes the influence of uncertainties in strong-event parameters determined by alert seismological surveys and of the simulation models used at all stages, from estimating shaking intensity

  12. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  13. Earthquakes, October 1975

    USGS Publications Warehouse

    Person, W.J.

    1976-01-01

    October was an active month seismically, although there were no damaging earthquakes in the United States. Several States experienced earthquakes that were felt sharply. There were four major earthquakes in other parts of the world, including a magnitude 7.4 in the Philippine Islands that killed one person.

  14. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  15. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  16. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquake, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  17. School Safety and Earthquakes.

    ERIC Educational Resources Information Center

    Dwelley, Laura; Tucker, Brian; Fernandez, Jeanette

    1997-01-01

    A recent assessment of earthquake risk to Quito, Ecuador, concluded that many of its public schools are vulnerable to collapse during major earthquakes. A subsequent examination of 60 buildings identified 15 high-risk buildings. These schools were retrofitted to meet standards that would prevent injury even during Quito's largest earthquakes. US…

  18. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter.
    For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of the second.
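A likelihood test of this kind typically treats the count in each space-magnitude bin as Poisson-distributed with the forecast rate as its mean, and compares competing forecasts through their joint log-likelihoods over the shared bins. A minimal sketch (the rates and counts are invented for illustration, not RELM data):

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint Poisson log-likelihood of observed earthquake counts given
    forecast rates, summed over space-magnitude bins:
    sum over bins of [-lambda + n*ln(lambda) - ln(n!)]."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

# Two hypothetical forecasts scored against the same observed counts
obs = [0, 1, 2, 0]
model_a = [0.1, 0.8, 1.5, 0.2]   # concentrates rate where events occurred
model_b = [0.5, 0.5, 0.5, 0.5]   # spreads rate uniformly
print(poisson_log_likelihood(model_a, obs) > poisson_log_likelihood(model_b, obs))
```

The pairwise comparison described in the abstract then asks how often such a likelihood difference would arise by chance if the lower-scoring model were actually correct.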

  19. Mapping the earthquake hazards of the Los Angeles region

    USGS Publications Warehouse

    Ziony, J.I.; Tinsley, J.C.

    1983-01-01

    These studies are providing an improved basis for delineating geographic variations in the earthquake hazards of the region. Several examples of recent results, which represent the contributions of many researchers, are discussed below.

  20. Safety Basis Report

    SciTech Connect

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  1. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  2. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  4. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  5. Earthquake source inversion for tsunami runup prediction

    NASA Astrophysics Data System (ADS)

    Sekar, Anusha

    Our goal is to study two inverse problems: using seismic data to invert for earthquake parameters, and using tide gauge data to invert for earthquake parameters. We focus on the feasibility of using a combination of these inverse problems to improve tsunami runup prediction. A considerable part of the thesis is devoted to studying the seismic forward operator and its modeling using immersed interface methods. We develop an immersed interface method for solving the variable-coefficient advection equation in one dimension with a propagating singularity and prove a convergence result for this method. We also prove a convergence result for the one-dimensional acoustic system of partial differential equations solved using immersed interface methods with internal boundary conditions. Such systems form the building blocks of the numerical model for the earthquake. For a simple earthquake-tsunami model, we observe a variety of possibilities in the recovery of the earthquake parameters and tsunami runup prediction. In some cases the data are insufficient either to invert for the earthquake parameters or to predict the runup. When more data are added, we are able to resolve the earthquake parameters with enough accuracy to predict the runup. We expect that this variety will hold in a real-world, three-dimensional geometry as well.

  6. Effects of the 2011 Tohoku Earthquake on VLBI Geodetic Measurements

    NASA Astrophysics Data System (ADS)

    MacMillan, D.; Behrend, D.; Kurihara, S.

    2012-12-01

    The VLBI antenna TSUKUB32 at Tsukuba, Japan observes in 24-hour observing sessions once per week with the R1 operational network and on additional days with other networks on a more irregular basis. Further, the antenna is an endpoint of the single-baseline, 1-hr Intensive Int2 sessions observed on the weekends for the determination of UT1. TSUKUB32 returned to normal operational observing one month after the earthquake. The antenna is 160 km west and 240 km south of the epicenter of the Tohoku earthquake. We looked at the transient behavior of the TSUKUB32 position time series following the earthquake and found that significant deformation is continuing. The eastward rate relative to the long-term rate prior to the earthquake was about 20 cm/yr four months after the earthquake and 9 cm/yr after one year. The VLBI series agrees closely with the corresponding JPL (Jet Propulsion Laboratory) GPS series measured by the co-located GPS antenna TSUK. The co-seismic UEN displacement at Tsukuba as determined by VLBI was (-90 mm, 640 mm, 44 mm). We examined the effect of the variation of the TSUKUB32 position on EOP estimates and then used the GPS data to correct its position for the estimation of UT1 in the Tsukuba-Wettzell Int2 Intensive experiments. For this purpose and to provide operational UT1, the IVS scheduled a series of weekend Intensive sessions observing on the Kokee-Wettzell baseline immediately before each of the two Tsukuba-Wettzell Intensive sessions. Comparisons between the UT1 estimates from these weekend sessions and the USNO (United States Naval Observatory) combination series were used to validate the GPS correction to the TSUKUB32 position.
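
    The decaying eastward rates quoted above (about 20 cm/yr at four months, 9 cm/yr at one year) happen to be consistent with a simple logarithmic postseismic relaxation model, u(t) = A·log(1 + t/τ), whose rate is A/(t + τ). The sketch below solves for A and τ from those two rates; the model choice is a common one for afterslip, but it is an illustrative assumption, not the paper's analysis.

    ```python
    # Fit a logarithmic postseismic relaxation model u(t) = A*log(1 + t/tau)
    # to the two rate observations quoted in the abstract:
    #   rate(1/3 yr) ~ 20 cm/yr, rate(1 yr) ~ 9 cm/yr.
    # The rate of this model is du/dt = A / (t + tau), so two observations
    # determine A and tau in closed form.
    t1, r1 = 1.0 / 3.0, 20.0   # years, cm/yr (four months after the event)
    t2, r2 = 1.0, 9.0          # years, cm/yr (one year after the event)

    # r1*(t1 + tau) = r2*(t2 + tau) = A  =>  tau = (r2*t2 - r1*t1) / (r1 - r2)
    tau = (r2 * t2 - r1 * t1) / (r1 - r2)
    A = r1 * (t1 + tau)

    def rate(t):
        """Predicted postseismic rate (cm/yr) t years after the earthquake."""
        return A / (t + tau)

    print(f"tau = {tau:.3f} yr, A = {A:.2f} cm, rate(2 yr) = {rate(2.0):.1f} cm/yr")
    ```

    With these two data points the model predicts the transient rate continuing to decay toward the long-term pre-earthquake rate, which could be checked against later epochs of the VLBI and GPS series.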

  7. Comment on "Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set" [J. Chem. Phys. 139, 114104 (2013)]

    NASA Astrophysics Data System (ADS)

    Brandbyge, Mads

    2014-05-01

    In a recent paper, Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an "implicit decoupling assumption," leading to wrong results for the current, different from what would be obtained by using an orthogonal basis and dividing surfaces defined in real space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green's function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.
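
    The basis-independence claim can be made concrete with the standard Green's-function transmission formula; the expression below is the textbook (Caroli-type) form, quoted here for context rather than taken from the comment itself:

    ```latex
    T(E) = \operatorname{Tr}\!\left[\Gamma_L(E)\, G^{r}(E)\, \Gamma_R(E)\, G^{a}(E)\right],
    \qquad
    G^{r}(E) = \left[(E + i0^{+})\, S - H - \Sigma_L(E) - \Sigma_R(E)\right]^{-1},
    ```

    where S is the overlap matrix of the (possibly nonorthogonal) basis. Under a change of basis with transformation matrix C, one has H → C†HC, S → C†SC, Σ → C†ΣC, hence G^r → C⁻¹ G^r C⁻†, Γ → C†ΓC, and the trace, and therefore T(E), is unchanged.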

  8. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed Central

    Grant, Rachel A.; Conlan, Hilary

    2013-01-01

    juvenile animals migrating away from their breeding pond after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre) and were probably coincidence. Statistical analysis of the data indicated that frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised. PMID:26479746

  9. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false-positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
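
    The 1.5 to 2.5 seconds of warning reported above is essentially the S-minus-P travel-time difference at ~16 km, less detection and processing delay. A back-of-the-envelope sketch, assuming typical crustal wave speeds (the 6.0 and 3.5 km/s values are generic assumptions, not measurements from the study):

    ```python
    # Back-of-the-envelope warning time for P-wave-triggered alerting:
    #   warning ~ (S travel time) - (P travel time) - (detection/processing delay).
    # Velocities are typical crustal values, not site-specific measurements.
    VP_KM_S = 6.0   # assumed P-wave speed
    VS_KM_S = 3.5   # assumed S-wave speed

    def sp_warning_time(distance_km, processing_delay_s=0.0):
        """Seconds of warning before S-wave arrival at the given hypocentral distance."""
        return distance_km / VS_KM_S - distance_km / VP_KM_S - processing_delay_s

    print(f"{sp_warning_time(16.0):.1f} s")  # ~1.9 s at 16 km, within the reported 1.5-2.5 s
    ```

    The shrinking S-P gap at short distances is why on-site devices like these emphasize fast, low-false-positive triggering over network-based source characterization.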

  10. Evidence for remotely triggered micro-earthquakes during salt cavern collapse

    NASA Astrophysics Data System (ADS)

    Jousset, P.; Rohmer, J.

    2012-04-01

    Micro-seismicity is a good indicator of the spatio-temporal evolution of physical properties of rocks prior to catastrophic events like volcanic eruptions or landslides, and it may be triggered by a number of causes, including the dynamic characteristics of the processes in play and/or external forces. Micro-earthquake triggering has in recent years been the subject of intense research, and our work contributes further evidence of the possible triggering of micro-earthquakes by remote large earthquakes. We show evidence of triggered micro-seismicity in the vicinity of an underground salt cavern prone to collapse by a remote M~7.2 earthquake, which occurred ~12,000 kilometres away. We demonstrate the near-critical state of the cavern before the collapse by means of 2D axisymmetric elastic finite-element simulations. Pressure was lowered in the cavern by operations pumping brine out of the cavern. We demonstrate that a very small stress increase would have been sufficient to break the overburden. High-dynamic broadband records reveal a remarkable time correlation between a dramatic increase of the local high-frequency micro-seismicity rate, associated with the break of the stiffest layer stabilizing the overburden, and the passage of low-frequency remote seismic waves, including body, Love, and Rayleigh surface waves. Stress oscillations due to the seismic waves exceeded the strength required for the rupture of the complex medium made of brine and rock, triggering micro-earthquakes and leading to damage of the overburden and eventually collapse of the salt cavern. The increment of stress necessary for the failure of a dolomite layer is of the same order of magnitude as the maximum dynamic stress observed during the passage of the earthquake waves. On this basis, we discuss the possible contribution of the low-frequency Love and Rayleigh surface waves.

  11. Anthropogenic seismicity rates and operational parameters at the Salton Sea Geothermal Field.

    PubMed

    Brodsky, Emily E; Lajoie, Lia J

    2013-08-01

    Geothermal power is a growing energy source; however, efforts to increase production are tempered by concern over induced earthquakes. Although increased seismicity commonly accompanies geothermal production, induced earthquake rate cannot currently be forecast on the basis of fluid injection volumes or any other operational parameters. We show that at the Salton Sea Geothermal Field, the total volume of fluid extracted or injected tracks the long-term evolution of seismicity. After correcting for the aftershock rate, the net fluid volume (extracted-injected) provides the best correlation with seismicity in recent years. We model the background earthquake rate with a linear combination of injection and net production rates that allows us to track the secular development of the field as the number of earthquakes per fluid volume injected decreases over time.
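
    The linear-combination model described above can be fit by ordinary least squares; here is a minimal sketch on synthetic monthly series (the coefficients and data are invented for illustration and are not the Salton Sea values):

    ```python
    import numpy as np

    # Synthetic illustration of fitting rate(t) = a*injection(t) + b*net_production(t),
    # the linear-combination model described in the abstract. The time series and
    # coefficients below are invented for demonstration, not field data.
    rng = np.random.default_rng(0)
    n = 120  # months
    injection = rng.uniform(1.0, 3.0, n)        # fluid injected per month (arbitrary units)
    net_prod = rng.uniform(0.0, 2.0, n)         # extracted minus injected
    true_a, true_b = 4.0, 7.0
    rate = true_a * injection + true_b * net_prod + rng.normal(0.0, 0.1, n)

    # Ordinary least squares for the two coefficients.
    X = np.column_stack([injection, net_prod])
    coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
    print(f"a ~ {coef[0]:.2f}, b ~ {coef[1]:.2f}")
    ```

    In the paper's setting the observed rate would first be corrected for aftershocks (e.g. by declustering) so that the regression tracks the background rate; a time-varying ratio of earthquakes per injected volume would then show up as a drift in the fitted coefficients.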

  13. Building Loss Estimation for Earthquake Insurance Pricing

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Erdik, M.; Sesetyan, K.; Demircioglu, M. B.; Fahjan, Y.; Siyahi, B.

    2005-12-01

    After the 1999 earthquakes in Turkey, several changes in the insurance sector took place. A compulsory earthquake insurance scheme was introduced by the government. The reinsurance companies increased their rates, and some even suspended operations in the market. Most important, the insurance companies realized the importance of portfolio analysis in shaping their future market strategies. The paper describes an earthquake loss assessment methodology that can be used for insurance pricing and portfolio loss estimation, based on our work experience in the insurance market. The basic ingredients are probabilistic and deterministic regional site-dependent earthquake hazard, regional building inventory (and/or portfolio), building vulnerabilities associated with typical construction systems in Turkey, and estimations of building replacement costs for different damage levels. Probable maximum and average annualized losses are estimated as the result of the analysis. There is a two-level earthquake insurance system in Turkey, the effect of which is incorporated in the algorithm: the national compulsory earthquake insurance scheme and the private earthquake insurance system. To buy private insurance one has to be covered by the national system, which has limited coverage. As a demonstration of the methodology we look at the case of Istanbul and use its building inventory data instead of a portfolio. A state-of-the-art time-dependent earthquake hazard model that portrays the increased earthquake expectancies in Istanbul is used. Intensity- and spectral-displacement-based vulnerability relationships are incorporated in the analysis. In particular we look at the uncertainty in the loss estimations that arises from the vulnerability relationships, and at the effect of the implemented repair-cost ratios.

  14. Earthquakes and the office-based surgeon.

    PubMed Central

    Conover, W A

    1992-01-01

    A major earthquake may strike while a surgeon is performing an operation in an office surgical facility. A sudden major fault disruption will lead to thousands of casualties and widespread destruction. Surgeons who operate in offices can help lessen havoc by careful preparation. These plans should coordinate with other disaster plans for effective triage, evacuation, and the treatment of casualties. PMID:1413756

  15. Maximum magnitude earthquakes induced by fluid injection

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2014-02-01

    Analysis of numerous case histories of earthquake sequences induced by fluid injection at depth reveals that the maximum magnitude appears to be limited according to the total volume of fluid injected. Similarly, the maximum seismic moment seems to have an upper bound proportional to the total volume of injected fluid. Activities involving fluid injection include (1) hydraulic fracturing of shale formations or coal seams to extract gas and oil, (2) disposal of wastewater from these gas and oil activities by injection into deep aquifers, and (3) the development of enhanced geothermal systems by injecting water into hot, low-permeability rock. Of these three operations, wastewater disposal is observed to be associated with the largest earthquakes, with maximum magnitudes sometimes exceeding 5. To estimate the maximum earthquake that could be induced by a given fluid injection project, the rock mass is assumed to be fully saturated and brittle, to respond to injection with a sequence of earthquakes localized to the region weakened by the pore-pressure increase of the injection operation, and to have a Gutenberg-Richter magnitude distribution with a b value of 1. If these assumptions correctly describe the circumstances of the largest earthquake, then the maximum seismic moment is limited to the volume of injected liquid times the modulus of rigidity. Observations from the available case histories of earthquakes induced by fluid injection are consistent with this bound on seismic moment. In view of the uncertainties in this analysis, however, this should not be regarded as an absolute physical limit.
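
    The stated bound, maximum seismic moment ≤ injected volume × modulus of rigidity, converts directly into a maximum moment magnitude via the standard Mw-M0 relation. A minimal sketch, assuming a typical crustal rigidity of 3 × 10^10 Pa (the rigidity value is an assumption for illustration):

    ```python
    import math

    # McGarr-style upper bound: max seismic moment M0 (N*m) <= G * dV, where G is
    # the modulus of rigidity of the rock and dV the total injected fluid volume (m^3).
    # G below is a typical crustal value, assumed for illustration.
    G_PA = 3.0e10

    def max_magnitude(injected_volume_m3):
        """Upper-bound moment magnitude implied by the injected volume."""
        m0 = G_PA * injected_volume_m3                 # bound on seismic moment, N*m
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)    # standard Mw-M0 relation

    # One million cubic metres of injected fluid caps Mw near 5,
    # consistent with the wastewater-disposal cases cited in the abstract.
    print(f"{max_magnitude(1.0e6):.2f}")
    ```

    Because the bound scales with the logarithm of volume, each additional magnitude unit requires roughly a 30-fold increase in injected volume.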

  16. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  17. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Under the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 Census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 Census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low risk. Moreover, the building damage estimates indicate that Al-Montazah is the most vulnerable district, with 73 % of the expected damage concentrated there. The undertaken analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damage) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  18. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset date of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted with a very good accuracy window (±1 day). In this contribution we present a modification to the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a trigger planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. To test the improvement of the method we used all >8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e. ±1 day of the target date) and for >6.5R events. The successes are counted for each one of the 86 earthquake seeds, and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. 
We observe no improvement only when a planetary trigger coincided with
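
    The date bookkeeping described above, generating candidate dates from a seed date using Fibonacci and Lucas numbers and scoring hits within a ±1-day window, can be sketched as follows. Treating the numbers as day offsets, and the scoring rule shown, are illustrative guesses at the published procedure, not a reproduction of it:

    ```python
    from datetime import date, timedelta

    def fib_lucas_offsets(limit):
        """Fibonacci and Lucas numbers up to `limit` (used here as day offsets)."""
        out = set()
        for a, b in ((1, 1), (2, 1)):  # Fibonacci seed pair and Lucas seed pair
            while a <= limit:
                out.add(a)
                a, b = b, a + b
        return sorted(out)

    def candidate_dates(seed, horizon_days=1000):
        """Candidate future dates generated from a seed (trigger) date."""
        return [seed + timedelta(days=k) for k in fib_lucas_offsets(horizon_days)]

    def hit(seed, event, window_days=1):
        """True if the event date lands within +-window_days of any candidate date."""
        return any(abs((event - c).days) <= window_days
                   for c in candidate_dates(seed))
    ```

    A hit rate would then be the fraction of catalogue events scored True for a given set of seeds, compared against the rate expected by chance for windows of the same width.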

  19. Lightning Activities and Earthquakes

    NASA Astrophysics Data System (ADS)

    Liu, Jann-Yenq

    2016-04-01

    The lightning activity is one of the key parameters for understanding the atmospheric electric fields and/or currents near the Earth's surface, as well as the lithosphere-atmosphere coupling during the earthquake preparation period. In this study, to see whether or not lightning activities are related to earthquakes, we statistically examine lightning activities 30 days before and after 78 land and 230 sea M > 5.0 earthquakes in Taiwan during the 12-year period 1993-2004. Lightning activities versus the location, depth, and magnitude of earthquakes are investigated. Results show that lightning activities tend to appear around the forthcoming epicenter and are significantly enhanced a few days, especially 17-19 days, before M > 6.0 shallow (depth D < 20 km) land earthquakes. Moreover, the size of the area around the epicenter with statistically significant enhancement of lightning activity is proportional to the earthquake magnitude.
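
    The windowed counting behind such a statistic, lightning occurrence per day-lag relative to each earthquake over a ±30-day window, can be sketched as a superposed-epoch count; the input day numbers below are toy values, not the Taiwan data:

    ```python
    from collections import Counter

    def superposed_epoch(quake_days, lightning_days, window=30):
        """Count lightning events per day-lag relative to each earthquake origin day.

        Both inputs are integer day numbers; returns {lag: count} for |lag| <= window,
        with negative lags meaning days *before* an earthquake.
        """
        counts = Counter()
        for q in quake_days:
            for l in lightning_days:
                lag = l - q
                if -window <= lag <= window:
                    counts[lag] += 1  # e.g. lag -18 falls in the "18 days before" bin
        return counts

    # Toy example: one earthquake on day 100, lightning on days 82 and 103.
    c = superposed_epoch([100], [82, 103])
    print(c[-18], c[3])
    ```

    A significance test would compare the counts in each lag bin against the distribution obtained from randomized earthquake origin days.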

  20. Compiling the 'Global Earthquake History' (1000-1903)

    NASA Astrophysics Data System (ADS)

    Albini, P.; Musson, R.; Locati, M.; Rovida, A.

    2013-12-01

    The study of historical earthquakes from historical sources, or historical seismology, is of wider interest than just the seismic hazard and risk community. In the scope of the two-year project (October 2010-March 2013) "Global Earthquake History", developed in the framework of GEM, a reassessment of world historical seismicity was made from available published studies. The scope of the project is the time window 1000-1903, with magnitudes 7.0 and above. Events with lower magnitudes are included on a case-by-case, or region-by-region, basis. The Global Historical Earthquake Archive (GHEA) provides a complete account of the global situation in historical seismology. From GHEA, the Global Historical Earthquake Catalogue (GHEC, v1, available at http://www.emidius.eu/GEH/, under Creative Commons licence) was derived, i.e. a world catalogue of earthquakes for the period 1000-1903, with magnitude 7 and over, using publicly available materials, as for the Archive. This is intended to be the best global historical catalogue of large earthquakes presently available, with the best parameters selected, duplications and fakes removed, and, in some cases, new earthquakes discovered. GHEA and GHEC are conceived as providing a basis for coordinating future research into historical seismology in any part of the world and, hopefully, encouraging new historical earthquake research initiatives that will continue to improve the information available.

  1. Earthquake at 40 feet

    USGS Publications Warehouse

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck. 

  2. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER.

  3. Investigations on Real-time GPS for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.; Aranha, M. A.; Melgar, D.; Allen, R. M.

    2015-12-01

    The Geodetic Alarm System (G-larmS) is a software system developed in a collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech (NMT) primarily for real-time Earthquake Early Warning (EEW). It currently uses high-rate (1 Hz), low-latency (< ~5 seconds), accurate (cm-level) positioning time series data from a regional GPS network and P-wave event triggers from existing EEW algorithms, e.g. ElarmS, to compute static offsets upon S-wave arrival. G-larmS performs a least squares inversion on these offsets to determine slip on a finite fault, which we use to estimate moment magnitude. These computations are repeated every second for the duration of the event. G-larmS has been in continuous operation at the BSL for over a year using event triggers from the California Integrated Seismic Network (CISN) ShakeAlert system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California. Pairs of stations are processed as baselines using trackRT (MIT software package). G-larmS produced good results in real-time during the South Napa (M 6.0, August 2014) earthquake as well as on several replayed and simulated test cases. We evaluate the performance of G-larmS for EEW by analysing the results using a set of well-defined test cases to investigate the following: (1) using multiple fault regimes and concurrent processing with the ultimate goal of achieving model generation (slip and magnitude computations) within each 1-second GPS epoch on very large magnitude earthquakes (up to M 9.0), (2) the use of Precise Point Positioning (PPP) real-time data streams of various operators, accuracies, latencies and formats along with baseline data streams, (3) collaboratively expanding EEW coverage along the U.S. West Coast on a regional network basis for Northern California, Southern California and Cascadia.
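
    The inversion step described above, static offsets to finite-fault slip to moment magnitude, can be sketched as a least-squares problem. The Green's-function matrix, rigidity, and subfault geometry below are invented placeholders; a real system like G-larmS derives them from an elastic dislocation model (e.g. Okada's half-space solution):

    ```python
    import numpy as np

    # Minimal sketch of the static-offset inversion step described in the abstract:
    # d = G s, solved for subfault slip s by least squares, then converted to Mw.
    # The Green's-function matrix, offsets, and subfault areas are invented here.
    rng = np.random.default_rng(1)
    n_obs, n_subfaults = 12, 4
    G = rng.normal(0.0, 1.0, (n_obs, n_subfaults))     # m of offset per m of slip (fake)
    true_slip = np.array([0.5, 1.0, 0.8, 0.2])         # metres
    d = G @ true_slip + rng.normal(0.0, 0.005, n_obs)  # observed static offsets, metres

    slip, *_ = np.linalg.lstsq(G, d, rcond=None)       # least-squares slip estimate

    MU = 3.0e10                  # rigidity, Pa (assumed)
    subfault_area = 10e3 * 10e3  # 10 km x 10 km patches, m^2 (assumed)
    m0 = MU * subfault_area * np.sum(np.abs(slip))     # scalar seismic moment, N*m
    mw = (2.0 / 3.0) * (np.log10(m0) - 9.1)            # standard Mw-M0 relation
    print(f"Mw ~ {mw:.2f}")
    ```

    Repeating this solve once per 1-second GPS epoch, as the offsets grow, is what lets the magnitude estimate track a still-unfolding rupture.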

  4. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

    Lee, W.H.K.; Steward, S. W.

    1979-01-01

    A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network may consist of several stations or as many as a few hundred. They are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the location of sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be deployed very quickly. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.  

  5. Earthquakes, November-December 1973

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria. 

  6. Earthquake history of Oregon

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Although situated between two States (California and Washington) that have had many violent earthquakes, Oregon is noticeably less active seismically. The greatest damage experienced resulted from a major shock near Olympia, Wash., in 1949. During the short historical record available (since 1841), 34 earthquakes of intensity V, Modified Mercalli Scale, or greater have centered within Oregon or near its borders. Only 13 of the earthquakes had an intensity above V, and many of the shocks were local. However, a 1936 earthquake in the eastern Oregon-Washington region caused extensive damage and was felt over an area of 272,000 square kilometers.

  7. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.

  8. Earthquakes and Plate Boundaries

    ERIC Educational Resources Information Center

    Lowman, Paul; And Others

    1978-01-01

    Contains the contents of the Student Investigation booklet of a Crustal Evolution Education Project (CEEP) instructional modules on earthquakes. Includes objectives, procedures, illustrations, worksheets, and summary questions. (MA)

  9. Investigating landslides caused by earthquakes - A historical review

    USGS Publications Warehouse

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  10. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
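
    The fingerprint-and-group idea behind FAST can be illustrated with a toy stand-in. Here each window is fingerprinted by its four strongest spectral bins, rather than FAST's actual binary fingerprints and similarity hashing, and windows sharing a fingerprint bucket become candidate repeating events. The data and parameters are synthetic assumptions.

    ```python
    import numpy as np
    from collections import defaultdict

    rng = np.random.default_rng(0)
    win, n_win = 128, 40                      # samples per window, number of windows
    data = rng.normal(scale=0.05, size=win * n_win)

    # Inject the same synthetic "earthquake" waveform at two window-aligned times
    n = np.arange(win)
    event = (np.sin(2 * np.pi * 5 * n / win) + np.sin(2 * np.pi * 9 * n / win)
             + np.sin(2 * np.pi * 14 * n / win) + np.sin(2 * np.pi * 21 * n / win))
    for w in (5, 17):
        data[w * win:(w + 1) * win] += event

    # Fingerprint each window by its 4 strongest spectral bins (a toy stand-in
    # for FAST's compact fingerprints), then bucket identical fingerprints so
    # similar windows can be found without all-pairs correlation
    buckets = defaultdict(list)
    for w in range(n_win):
        spec = np.abs(np.fft.rfft(data[w * win:(w + 1) * win]))
        fp = tuple(sorted(np.argsort(spec)[-4:]))
        buckets[fp].append(w)

    pairs = [tuple(v) for v in buckets.values() if len(v) > 1]
    ```

    The two windows containing the injected event end up in the same bucket, so the repeated signal is found without comparing every window against every other, which is the source of FAST's speed advantage over autocorrelation.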

  11. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  13. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
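
    The rate-spike detection described above can be sketched as a simple thresholding rule: flag any minute whose keyword-tweet count far exceeds a slowly updated background rate. The multiplier, minimum count, and sample data below are illustrative assumptions, not parameters of any USGS system.

    ```python
    def detect_spike(counts_per_minute, background=1 / 60.0, factor=50.0, min_count=10):
        """Return indices of minutes whose keyword-tweet count spikes above background."""
        alerts = []
        for i, c in enumerate(counts_per_minute):
            if c > max(factor * background, min_count):
                alerts.append(i)                            # spike: likely felt event
            else:
                background = 0.95 * background + 0.05 * c   # track the quiet-time rate
        return alerts

    # Per-minute counts of "earthquake" tweets around a hypothetical event
    counts = [0, 0, 1, 0, 0, 150, 90, 40, 3, 0]
    print(detect_spike(counts))   # minutes flagged as a possible event
    ```

    The background estimate is only updated during quiet minutes, so the spike itself does not inflate the threshold used to detect it.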

  14. Preliminary results on earthquake triggered landslides for the Haiti earthquake (January 2010)

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Gorum, Tolga

    2010-05-01

    This study presents the first results of an analysis of the landslides triggered by the Ms 7.0 Haiti earthquake that occurred on January 12, 2010 in the boundary region of the Caribbean Plate and the North American Plate. The fault is a left-lateral strike-slip fault with a clear surface expression. According to the USGS earthquake information, the Enriquillo-Plantain Garden fault system has not produced any major earthquake in the last 100 years, and historical earthquakes are known from 1860, 1770, 1761, 1751, 1684, 1673, and 1618, though none of these has been confirmed in the field as associated with this fault. We used high-resolution satellite imagery available for the pre- and post-earthquake situations, which was made freely available for the response and rescue operations. We made an interpretation of all co-seismic landslides in the epicentral area. We conclude that the earthquake mainly triggered landslides on the northern slope of the fault-related valley and in a number of isolated areas. The earthquake apparently did not trigger many visible landslides within the slum areas on the slopes in the southern part of Port-au-Prince and Carrefour. We also used ASTER DEM information to relate the landslide occurrences with DEM derivatives.

  15. The USGS Earthquake Scenario Project

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Petersen, M. D.; Wald, L. A.; Frankel, A. D.; Quitoriano, V. R.; Lin, K.; Luco, N.; Mathias, S.; Bausch, D.

    2009-12-01

    The U.S. Geological Survey’s (USGS) Earthquake Hazards Program (EHP) is producing a comprehensive suite of earthquake scenarios for planning, mitigation, loss estimation, and scientific investigations. The Earthquake Scenario Project (ESP), though lacking clairvoyance, is a forward-looking project, estimating earthquake hazard and loss outcomes as they may occur one day. For each scenario event, fundamental input includes i) the magnitude and specified fault mechanism and dimensions, ii) regional Vs30 shear velocity values for site amplification, and iii) event metadata. A grid of standard ShakeMap ground motion parameters (PGA, PGV, and three spectral response periods) is then produced using the well-defined, regionally-specific approach developed by the USGS National Seismic Hazard Mapping Project (NSHMP), including recent advances in empirical ground motion predictions (e.g., the NGA relations). The framework also allows for numerical (3D) ground motion computations for specific, detailed scenario analyses. Unlike NSHMP ground motions, for ESP scenarios, local rock and soil site conditions and commensurate shaking amplifications are applied based on detailed Vs30 maps where available or based on topographic slope as a proxy. The scenario event set is drawn primarily from the NSHMP events, though custom events are also allowed based on coordination of the ESP team with regional coordinators, seismic hazard experts, seismic network operators, and response coordinators. The event set will be harmonized with existing and future scenario earthquake events produced regionally or by other researchers. The event list includes approximately 200 earthquakes in CA, 100 in NV, dozens in each of NM, UT, WY, and a smaller number in other regions. Systematic output will include all standard ShakeMap products, including HAZUS input, GIS, KML, and XML files used for visualization, loss estimation, ShakeCast, PAGER, and for other systems. All products will be

  16. Earthquake research in China

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    The prediction of the Haicheng earthquake was an extraordinary achievement by the geophysical workers of the People's Republic of China, whose national program in earthquake research was less than 10 years old at the time. To study the background to this prediction, a delegation of 10 U.S. scientists, which I led, visited China in June 1976.

  17. Can we control earthquakes?

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    In 1966, it was discovered that high pressure injection of industrial waste fluids into the subsurface near Denver, Colo., was triggering earthquakes. While this was disturbing at the time, it was also exciting because there was immediate speculation that here at last was a mechanism to control earthquakes.  

  18. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered, through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where parameters are functions of the magnitude of the previous earthquake. We use those two models, alternately, to generate the dynamics of earthquake occurrence, and to estimate the probability of occurrence of several earthquakes within a year or a decade.
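
    The alternating simulation scheme described above can be sketched as follows, with illustrative functional forms and coefficients standing in for the paper's fitted parameters (a Gamma is used for the waiting times; a Weibull would slot in the same way).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def next_magnitude(prev_wait, m_min=3.0):
        """Pareto magnitude whose tail index depends on the previous waiting
        time (assumed functional form, not the paper's fit)."""
        alpha = 1.5 + 0.1 * np.log1p(prev_wait)
        return m_min * (1.0 - rng.uniform()) ** (-1.0 / alpha)

    def next_wait(prev_mag, shape=1.5):
        """Gamma waiting time in days whose mean shrinks after larger shocks
        (assumed functional form, floored to avoid underflow)."""
        mean = max(10.0 * np.exp(-0.5 * (prev_mag - 3.0)), 0.01)
        return rng.gamma(shape, mean / shape)

    # Alternate the two conditional models to generate one year of activity
    t, mag, events = 0.0, 4.0, []
    while True:
        wait = next_wait(mag)
        t += wait
        if t >= 365.0:
            break
        mag = next_magnitude(wait)
        events.append((t, mag))
    ```

    Repeating many such simulated years and counting how often a magnitude threshold is exceeded gives Monte Carlo estimates of the occurrence probabilities discussed in the abstract.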

  19. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  20. Earthquake history of Mississippi

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Since its admission into the Union in 1817, Mississippi has had only four earthquakes of intensity V or greater within its borders. Although the number of earthquakes known to have been centered within Mississippi's boundaries is small, the State has been affected by numerous shocks located in neighboring States. In 1811 and 1812, a series of great earthquakes near the New Madrid, Missouri, area was felt in Mississippi as far south as the gulf coast. The New Madrid series caused the banks of the Mississippi River to cave in as far as Vicksburg, more than 300 miles from the epicentral region. As a result of this great earthquake series, the northwest corner of Mississippi is in seismic risk zone 3, the highest risk zone. Except for the New Madrid series, effects in Mississippi from earthquakes located outside of the State have been less than intensity V.

  1. Earthquake history of Pennsylvania

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Records of early earthquakes in the Northeastern United States provide limited information on effects in Pennsylvania until 1737, 55 years after the first permanent settlement was established. A very severe earthquake that centered in the St. Lawrence River region in 1663 may have been felt in Pennsylvania, but historical accounts are not definite. Likewise, a damaging shock at Newbury, Mass., in 1727 probably affected towns in Pennsylvania. A strong earthquake on December 18, 1737, toppled chimneys at New York City and was reported felt at Boston, Mass., Philadelphia, Pa., and New Castle, Del. Other shocks with origins outside the State were felt in 1758, 1783, and 1791. Since 1800, when two earthquakes (March 17 and November 29) were reported as "severe" at Philadelphia, 16 tremors of intensity V or greater (Modified Mercalli Scale) have originated within the State. On November 11 and 14, 1840, severe earthquakes at Philadelphia were accompanied by a great and unusual swell on the Delaware River.

  2. Research on geo-electrical resistivity observation system specially used for earthquake monitoring in China

    NASA Astrophysics Data System (ADS)

    Zhao, Jialiu; Wang, Lanwei; Qian, Jiadong

    2011-12-01

    This paper deals with the design and development of a geo-electrical resistivity observation system based on the demands of exploring temporal variations in the electrical properties of Earth media at the fixed points of the networks, variations which may be associated with earthquake preparation. The observation system is characterized by high measurement accuracy, long-term operational stability, and a high level of rejection of environmental interference. It consists of three main parts: the configuration system, the measurement system, and the calibration and inspection system.

  3. Earthquake Hazard and Risk Assessment for Turkey

    NASA Astrophysics Data System (ADS)

    Betul Demircioglu, Mine; Sesetyan, Karin; Erdik, Mustafa

    2010-05-01

    Using a GIS environment to present the results, seismic risk analysis is considered a helpful tool to support decision making for planning and prioritizing seismic retrofit intervention programs at large scale. The main ingredients of seismic risk analysis are seismic hazard, a regional inventory of buildings, and vulnerability analysis. In this study, the national earthquake hazard has been assessed using the NGA ground motion prediction models and the results compared with those of previous models. An evaluation of seismic risk based on probabilistic intensity ground motion prediction for Turkey has been carried out. Following the macroseismic approach of Giovinazzi and Lagomarsino (2005), two alternative vulnerability models have been used to estimate building damage. The vulnerability and ductility indices for Turkey have been taken from the study of Giovinazzi (2005). These two vulnerability models have been compared with the observed earthquake damage database, and a good agreement between the curves has been observed. In addition to building damage, casualty estimates based on three different methods, for each return period and for each vulnerability model, are presented to evaluate earthquake loss. Using three different models of building replacement costs, the average annual loss (AAL) and probable maximum loss ratio (PMLR) due to regional earthquake hazard have been computed to form a basis for the improvement of the parametric insurance model and the determination of premium rates for the compulsory earthquake insurance in Turkey.
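
    As a hedged illustration of how an average annual loss figure of this kind is assembled, the sketch below integrates an intensity hazard curve against a mean-damage-ratio vulnerability curve. Every number is made up for illustration; none comes from the study.

    ```python
    # Annual rate of exceeding each macroseismic intensity level (assumed)
    hazard = {6: 0.05, 7: 0.02, 8: 0.005, 9: 0.001}
    # Mean damage ratio (repair cost / replacement cost) per intensity (assumed)
    mdr = {6: 0.02, 7: 0.08, 8: 0.25, 9: 0.55}

    # Convert exceedance rates to per-bin occurrence rates, then weight each
    # bin's damage ratio: AAL as a fraction of replacement cost
    levels = sorted(hazard)
    aal = 0.0
    for i, lvl in enumerate(levels):
        rate_exceed = hazard[lvl]
        rate_next = hazard[levels[i + 1]] if i + 1 < len(levels) else 0.0
        aal += (rate_exceed - rate_next) * mdr[lvl]

    replacement_cost = 200_000.0   # hypothetical building replacement cost
    expected_loss = aal * replacement_cost   # expected annual loss, same currency
    ```

    Summing this quantity over a building inventory, with one of several replacement-cost models, yields the portfolio AAL used to set premium rates.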

  4. The mass balance of earthquakes and earthquake sequences

    NASA Astrophysics Data System (ADS)

    Marc, O.; Hovius, N.; Meunier, P.

    2016-04-01

    Large, compressional earthquakes cause surface uplift as well as widespread mass wasting. Knowledge of their trade-off is fragmentary. Combining a seismologically consistent model of earthquake-triggered landsliding and an analytical solution of coseismic surface displacement, we assess how the mass balance of single earthquakes and earthquake sequences depends on fault size and other geophysical parameters. We find that intermediate-size earthquakes (Mw 6-7.3) may cause more erosion than uplift, controlled primarily by seismic source depth and landscape steepness, and less so by fault dip and rake. Such earthquakes can limit topographic growth, but our model indicates that both smaller and larger earthquakes (Mw < 6, Mw > 7.3) systematically cause mountain building. Earthquake sequences with a Gutenberg-Richter distribution have a greater tendency to lead to predominant erosion than repeating earthquakes of the same magnitude, unless a fault can produce earthquakes with Mw > 8.

  5. Strategies for rapid global earthquake impact estimation: the Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, D.J.

    2013-01-01

    This chapter summarizes the state-of-the-art for rapid earthquake impact estimation. It details the needs and challenges associated with quick estimation of earthquake losses following global earthquakes, and provides a brief literature review of various approaches that have been used in the past. With this background, the chapter introduces the operational earthquake loss estimation system developed by the U.S. Geological Survey (USGS) known as PAGER (for Prompt Assessment of Global Earthquakes for Response). It also details some of the ongoing developments of PAGER’s loss estimation models to better supplement the operational empirical models, and to produce value-added web content for a variety of PAGER users.

  6. Monitoring the Earthquake source process in North America

    USGS Publications Warehouse

    Herrmann, Robert B.; Benz, H.; Ammon, C.J.

    2011-01-01

    With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of the direction of maximum compressive stress and the predominant style of faulting is spatially coherent across large regions of the continent.
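
    The moment magnitudes quoted in such solutions follow from the scalar seismic moment M0 of the moment tensor via the standard relation Mw = (2/3)(log10 M0 - 9.1), with M0 in newton-metres:

    ```python
    import math

    def moment_magnitude(m0):
        """Moment magnitude from scalar seismic moment m0 in newton-metres."""
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    # A scalar moment near 5e15 N*m sits at the Mw ~4.4 level that the regional
    # processing described above can routinely resolve
    print(round(moment_magnitude(5.0e15), 1))
    ```

    Because the relation is logarithmic, the one-unit gap between the Mw 3.7 detection floor and the routine Mw 4 level corresponds to roughly a factor of 30 in seismic moment.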

  7. Earthquakes; January-February 1982

    USGS Publications Warehouse

    Person, W.J.

    1982-01-01

    In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine. 

  8. Earthquakes, November-December 1975

    USGS Publications Warehouse

    Person, W.J.

    1976-01-01

    Hawaii experienced its strongest earthquake in more than a century. The magnitude 7.2 earthquake on November 29 killed at least 2 and injured about 35. These were the first deaths from an earthquake in the United States since the San Fernando earthquake of February 1971.

  9. Earthquakes, September-October 1986

    USGS Publications Warehouse

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  10. Earthquakes; July-August, 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California. 

  11. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions if there is not a scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should be also developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance in a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent ions and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal.
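
    The dimensionless-number modeling described above can be sketched as a log-linear least-squares fit of a power-law correlation. The choice of Reynolds and Schmidt numbers, the exponents, and the synthetic data below are assumptions for illustration, not the fitted models of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 40

    # Synthetic "experiments": ion removal assumed to follow a power law in
    # Reynolds (Re) and Schmidt (Sc) numbers, with small multiplicative noise
    Re = rng.uniform(50, 500, size=n)
    Sc = rng.uniform(500, 2000, size=n)
    removal = 0.8 * Re**0.45 * Sc**-0.33 * np.exp(rng.normal(0.0, 0.02, n))

    # Taking logs turns the power law into a linear model solvable by least squares
    X = np.column_stack([np.ones(n), np.log(Re), np.log(Sc)])
    coef, *_ = np.linalg.lstsq(X, np.log(removal), rcond=None)
    a, b, c = np.exp(coef[0]), coef[1], coef[2]   # prefactor and fitted exponents
    ```

    Because the fit is dimensionless, a correlation of this form can be applied at any system scale, which is the scale-up property the abstract emphasizes.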

  13. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has provided an opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. Studying the seismic performance of buildings that still stand today, having survived centennial earthquakes, represents a challenge for better understanding the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985 (Ms = 7.8) earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. The study identified only three centennial buildings of 3 stories that survived both earthquakes almost undamaged. Since accelerograms of the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is based on instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  14. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information, and warnings. The app can distinguish earthquake shaking from everyday human activity based on the different patterns of movement. It can also be triggered by a traditional earthquake early warning (EEW) system to record for a set amount of time to collect earthquake data. When the app is triggered by earthquake-like movement, it sends the trigger information, containing the time and location of the trigger, back to our server; at the same time, it stores the waveform data locally on the phone and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals that confirm earthquakes. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
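
    The abstract does not detail MyShake's classifier. As an illustrative stand-in only, the classic STA/LTA (short-term over long-term average) trigger widely used in seismology shows how impulsive shaking can be separated from a quiet background on a single accelerometer channel; window lengths and the threshold below are arbitrary choices, not the app's:

```python
import numpy as np

def sta_lta(x, fs, sta_win=1.0, lta_win=10.0):
    """Short-term/long-term average ratio of signal energy.
    A classic seismological trigger shown for illustration only --
    not the classifier actually used by the MyShake app."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    e = np.asarray(x, dtype=float) ** 2
    sta = np.convolve(e, np.ones(sta_n) / sta_n, mode="valid")
    lta = np.convolve(e, np.ones(lta_n) / lta_n, mode="valid")
    # Align both running means so each ratio compares windows
    # ending on the same sample.
    return sta[-len(lta):] / np.maximum(lta, 1e-20)

# 60 s of quiet sensor noise with a strong 2 s burst starting at t = 40 s
fs = 50
rng = np.random.default_rng(1)
x = 0.01 * rng.standard_normal(60 * fs)
x[40 * fs:42 * fs] += rng.standard_normal(2 * fs)

ratio = sta_lta(x, fs)
triggered = ratio.max() > 5.0   # the burst stands out clearly
```

    Steady activity such as walking raises both averages together and keeps the ratio near one, which is why an abrupt energy onset is a useful (if crude) earthquake cue.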

  15. Phase Transformations and Earthquakes

    NASA Astrophysics Data System (ADS)

    Green, H. W.

    2011-12-01

    Phase transformations have been cited as responsible for, or at least involved in, "deep" earthquakes for many decades (although the concept of "deep" has varied). In 1945, PW Bridgman laid out in detail the string of events/conditions that would have to be achieved for a solid/solid transformation to lead to a faulting instability, although he expressed pessimism that the full set of requirements would be simultaneously achieved in nature. Raleigh and Paterson (1965) demonstrated faulting during dehydration of serpentine under stress and suggested dehydration embrittlement as the cause of intermediate depth earthquakes. Griggs and Baker (1969) produced a thermal runaway model of a shear zone under constant stress, culminating in melting, and proposed such a runaway as the origin of deep earthquakes. The discovery of Plate Tectonics in the late 1960s established the conditions (subduction) under which Bridgman's requirements for earthquake runaway in a polymorphic transformation could be possible in nature and Green and Burnley (1989) found that instability during the transformation of metastable olivine to spinel. Recent seismic correlation of intermediate-depth-earthquake hypocenters with predicted conditions of dehydration of antigorite serpentine and discovery of metastable olivine in 4 subduction zones, suggests strongly that dehydration embrittlement and transformation-induced faulting are the underlying mechanisms of intermediate and deep earthquakes, respectively. The results of recent high-speed friction experiments and analysis of natural fault zones suggest that it is likely that similar processes occur commonly during many shallow earthquakes after initiation by frictional failure.

  16. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes in small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of the earthquake location minus the longitude of the Moon's footprint on Earth] and SEM [the Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes in these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00 to 24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure plots (EMD+SEM) vs. GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.
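
    The claimed relation is simple arithmetic: universal time maps onto 0-360 degrees at 15 degrees per hour, and events are said to fall where (EMD+SEM) equals that angle. A sketch of how the deviation from the claimed 45-degree line could be computed (function and variable names are ours, not the paper's):

```python
def gmt_to_degrees(gmt_hours):
    """Map universal time (0-24 h) onto 0-360 degrees at 15 deg/hour."""
    return (gmt_hours % 24.0) * 15.0

def line_residual_deg(emd_plus_sem, gmt_hours):
    """Smallest angular distance between (EMD+SEM) and the GMT-derived
    angle; events on the paper's claimed 45-degree line give a residual
    near zero. Illustrative helper, not the authors' code."""
    d = (emd_plus_sem - gmt_to_degrees(gmt_hours)) % 360.0
    return min(d, 360.0 - d)
```

    Wrapping the difference modulo 360 degrees matters: an event at (EMD+SEM) = 355 degrees and GMT = 00:20 (5 degrees) lies only 10 degrees off the line, not 350.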

  17. Landslides caused by earthquakes.

    USGS Publications Warehouse

    Keefer, D.K.

    1984-01-01

    Data from 40 historical world-wide earthquakes were studied to determine the characteristics, geologic environments, and hazards of landslides caused by seismic events. This sample was supplemented with intensity data from several hundred US earthquakes to study relations between landslide distribution and seismic parameters. Correlations between magnitude (M) and landslide distribution show that the maximum area likely to be affected by landslides in a seismic event increases from approximately 0 at M = 4.0 to 500,000 km² at M = 9.2. Each type of earthquake-induced landslide occurs in a particular suite of geologic environments. -from Author

  18. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated other efforts in Peru in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  19. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  20. Earthquake history of Tennessee

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    The western part of the State was shaken strongly by the New Madrid, Mo., earthquakes of 1811-12 and by earthquakes in 1843 and 1895. The area has also experienced minor shocks. Additional activity has occurred in the eastern part of the State, near the North Carolina border. Forty shocks of intensity V (Modified Mercalli scale) or greater have been cataloged as occurring within the State. Many other earthquakes centered in bordering States have affected points in Tennessee. The following summary covers only those shocks of intensity VI or greater.

  1. On a Riesz basis of exponentials related to the eigenvalues of an analytic operator and application to a non-selfadjoint problem deduced from a perturbation method for sound radiation

    SciTech Connect

    Ellouz, Hanen; Feki, Ines; Jeribi, Aref

    2013-11-15

    In the present paper, we prove that the family of exponentials associated to the eigenvalues of the perturbed operator T(ε) := T₀ + εT₁ + ε²T₂ + … + εᵏTₖ + … forms a Riesz basis in L²(0, T), T > 0, where ε ∈ C, T₀ is a closed densely defined linear operator on a separable Hilbert space H with domain D(T₀) having isolated eigenvalues with multiplicity one, while T₁, T₂, … are linear operators on H having the same domain D ⊃ D(T₀) and satisfying a specific growth inequality. After that, we generalize this result using an H-Lipschitz function. As an application, we consider a non-selfadjoint problem deduced from a perturbation method for sound radiation.

  2. The earthquake vulnerability of a utility system

    SciTech Connect

    Burhenn, T.A.; Hawkins, H.G.; Ostrom, D.K.; Richau, E.M. )

    1992-01-01

    This paper describes a method to assess the earthquake vulnerability of a utility system and presents an example application. First, the seismic hazard of the system is modeled. Next, the damage and operational disruption to the facilities are estimated. The approach described herein formulates the problem so that the best documented and judgmental information on the earthquake performance of a utility's components can be utilized. Finally, the activities and estimates of the time necessary to restore the system to different levels of service are developed. This method of analysis provides a realistic picture of the resiliency of utility service, not just vulnerabilities of various types of equipment.

  3. On near-source earthquake triggering

    USGS Publications Warehouse

    Parsons, T.; Velasco, A.A.

    2009-01-01

    When one earthquake triggers others nearby, what connects them? Two processes are observed: static stress change from fault offset and dynamic stress changes from passing seismic waves. In the near-source region (r ~ 50 km for M ~ 5 sources) both processes may be operating, and since both mechanisms are expected to raise earthquake rates, it is difficult to isolate them. We thus compare explosions with earthquakes because only earthquakes cause significant static stress changes. We find that large explosions at the Nevada Test Site do not trigger earthquakes at rates comparable to similar magnitude earthquakes. Surface waves are associated with regional and long-range dynamic triggering, but we note that surface waves with low enough frequency to penetrate to depths where most aftershocks of the 1992 M = 5.7 Little Skull Mountain main shock occurred (~12 km) would not have developed significant amplitude within a 50-km radius. We therefore focus on the best candidate phases to cause local dynamic triggering, direct waves that pass through observed near-source aftershock clusters. We examine these phases, which arrived at the nearest (200-270 km) broadband station before the surface wave train and could thus be isolated for study. Direct comparison of spectral amplitudes of presurface wave arrivals shows that M ~ 5 explosions and earthquakes deliver the same peak dynamic stresses into the near-source crust. We conclude that a static stress change model can readily explain observed aftershock patterns, whereas it is difficult to attribute near-source triggering to a dynamic process because of the dearth of aftershocks near large explosions.

  4. The SCEC-USGS Dynamic Earthquake Rupture Code Comparison Exercise - Simulations of Large Earthquakes and Strong Ground Motions

    NASA Astrophysics Data System (ADS)

    Harris, R.

    2015-12-01

    I summarize the progress by the Southern California Earthquake Center (SCEC) and U.S. Geological Survey (USGS) Dynamic Rupture Code Comparison Group, that examines if the results produced by multiple researchers' earthquake simulation codes agree with each other when computing benchmark scenarios of dynamically propagating earthquake ruptures. These types of computer simulations have no analytical solutions with which to compare, so we use qualitative and quantitative inter-code comparisons to check if they are operating satisfactorily. To date we have tested the codes against benchmark exercises that incorporate a range of features, including single and multiple planar faults, single rough faults, slip-weakening, rate-state, and thermal pressurization friction, elastic and visco-plastic off-fault behavior, complete stress drops that lead to extreme ground motion, heterogeneous initial stresses, and heterogeneous material (rock) structure. Our goal is reproducibility, and we focus on the types of earthquake-simulation assumptions that have been or will be used in basic studies of earthquake physics, or in direct applications to specific earthquake hazard problems. Our group's goals are to make sure that when our earthquake-simulation codes simulate these types of earthquake scenarios along with the resulting simulated strong ground shaking, that the codes are operating as expected. For more introductory information about our group and our work, please see our group's overview papers, Harris et al., Seismological Research Letters, 2009, and Harris et al., Seismological Research Letters, 2011, along with our website, scecdata.usc.edu/cvws.

  5. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses at our website follow the propagation of the generated seismic waves and, therefore, that eyewitnesses can be considered as ground motion sensors. Flashsourcing discriminates felt

  6. Building losses assessment for Lushan earthquake utilization multisource remote sensing data and GIS

    NASA Astrophysics Data System (ADS)

    Nie, Juan; Yang, Siquan; Fan, Yida; Wen, Qi; Xu, Feng; Li, Lingling

    2015-12-01

    On 20 April 2013, a catastrophic earthquake of magnitude 7.0 struck Lushan County, northwestern Sichuan Province, China; in China it is known as the Lushan earthquake. The earthquake damaged many buildings, and the extent of building loss is one basis for emergency relief and reconstruction, so the building losses of the Lushan earthquake must be assessed. Remote sensing data and geographic information systems (GIS) can be employed for this assessment. This paper reports building-loss assessment results for the Lushan earthquake disaster obtained using multisource remote sensing data and GIS. The assessment results indicate that 3.2% of buildings in the affected areas completely collapsed, while 12% and 12.5% of buildings were heavily and slightly damaged, respectively. The completely collapsed, heavily damaged, and slightly damaged buildings were mainly located in Danling, Hongya, Lushan, Mingshan, Qionglai, Tianquan, and Yingjing Counties.

  7. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  8. Earthquake resistant design

    SciTech Connect

    Dowrick, D.J.

    1988-01-01

    The author discusses recent advances in earthquake-resistant design. This book covers the entire design process, from aspects of loading to details of construction. Early chapters offer a broad theoretical background; later chapters provide rigorous coverage of practical aspects.

  9. Seafloor earthquake measurement system, SEMS IV

    SciTech Connect

    Platzbecker, M.R.; Ehasz, J.P.; Franco, R.J.

    1997-07-01

    Staff of the Telemetry Technology Development Department (2664) have, in support of the U.S. Interior Department Minerals Management Service (MMS), developed and deployed the Seafloor Earthquake Measurement System IV (SEMS IV). The result of this development project is a series of three fully operational seafloor seismic monitoring systems located at the offshore platforms Eureka, Grace, and Irene. The instrument probes are embedded from three to seven feet into the seafloor and hardwired to seismic data recorders installed topside at the offshore platforms. The probes and underwater cables were designed to survive the seafloor environment with an operational life of five years. The units have been operational for two years and have produced recordings of several minor earthquakes in that time. Sandia Labs will transfer operation of SEMS IV to MMS contractors in the coming months. 29 figs., 25 tabs.

  10. Dynamic stress changes during earthquake rupture

    USGS Publications Warehouse

    Day, S.M.; Yu, G.; Wald, D.J.

    1998-01-01

    We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.

  11. Metrics for comparing dynamic earthquake rupture simulations

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near-fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes' results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code's results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
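
    The paper's specific metrics are not reproduced in the abstract, but the simplest quantitative comparison between two codes' outputs — a normalized RMS misfit between equally sampled time series, such as slip-rate histories at a common fault location — can be sketched as follows (an illustrative metric, not the authors'):

```python
import numpy as np

def rms_misfit(ref, test):
    """Normalized RMS difference between two equally sampled time series
    (e.g. slip-rate histories from two rupture codes at the same fault
    point). Illustrative only -- not the paper's specific metrics."""
    ref = np.asarray(ref, dtype=float)
    test = np.asarray(test, dtype=float)
    return float(np.sqrt(np.mean((ref - test) ** 2) / np.mean(ref ** 2)))

# Identical outputs give 0; a uniform 10% amplitude bias gives 0.1.
t = np.linspace(0.0, 1.0, 200)
ref = np.sin(2.0 * np.pi * t)
identical = rms_misfit(ref, ref)
biased = rms_misfit(ref, 1.1 * ref)
```

    Normalizing by the reference's own RMS makes the number comparable across benchmarks with very different absolute slip rates, which is one reason dimensionless misfits are favored in code-comparison exercises.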

  12. The New Madrid earthquakes

    SciTech Connect

    Obermeier, S.F.

    1989-01-01

    Two interpreted 1811-12 epicenters generally agree well with zones of seismicity defined by modern, small earthquakes. Bounds on accelerations are placed at the limits of sand blows, generated by the 1811-12 earthquakes in the St. Francis Basin. Conclusions show how the topstratum thickness, sand size of the substratum, and thickness of alluvium affected the distribution of sand blows in the St. Francis Basin.

  13. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  14. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  15. Source processes of strong earthquakes in the North Tien-Shan region

    NASA Astrophysics Data System (ADS)

    Kulikova, G.; Krueger, F.

    2013-12-01

    The Tien-Shan region attracts the attention of scientists worldwide due to its complexity and tectonic uniqueness. A series of very strong, destructive earthquakes occurred in Tien-Shan at the turn of the XIX and XX centuries. Such large intraplate earthquakes are rare, which increases the interest in the Tien-Shan region. The presented study focuses on the source processes of large earthquakes in Tien-Shan. The amount of seismic data is limited for those early times. In 1889, when a major earthquake occurred in Tien-Shan, seismic instruments were installed in very few locations in the world, and those analog records did not survive to the present day. Although around a hundred seismic stations were operating worldwide at the beginning of the XX century, it is not always possible to obtain high-quality analog seismograms. Digitizing seismograms is a very important step in working with analog seismic records, and one has to take into account all the uncertainties of manual digitizing and the lack of accurate timing and instrument characteristics. In this study, we develop an easy-to-handle and fast digitization program on the basis of existing software, which speeds up the digitizing process and accounts for all the recording-system uncertainties. Owing to the lack of absolute timing for the historical earthquakes (due to the absence of a universal clock at that time), we used the time differences between P and S phases to relocate the earthquakes in North Tien-Shan and the body-wave amplitudes to estimate their magnitudes. Combining our results with geological data, five earthquakes in North Tien-Shan were precisely relocated. The digitizing of records can introduce steps into the seismograms, which makes restitution (removal of the instrument response) undesirable. To avoid restitution, we simulated historic seismograph recordings with given values for the damping and free period of the respective instrument and
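
    The S-P relocation idea rests on a standard relation: P and S waves leave the source together and separate in time in proportion to the distance travelled. With hypothetical uniform crustal velocities (not the values used by the authors), the distance estimate is:

```python
def sp_distance_km(ts_minus_tp_s, vp_km_s=6.0, vs_km_s=3.5):
    """Source-to-station distance from the S-P time difference.
    For a wave travelling distance d at speeds vp and vs:
        tS - tP = d/vs - d/vp  =>  d = (tS - tP) * vp * vs / (vp - vs)
    The velocities here are illustrative crustal values, not those
    used in the Kulikova & Krueger study."""
    return ts_minus_tp_s * vp_km_s * vs_km_s / (vp_km_s - vs_km_s)
```

    With S-P times measured at several stations, each distance defines a circle around a station, and the intersection of the circles locates the epicenter without any absolute clock — which is exactly why the method suits historical records lacking accurate timing.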

  16. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As 2009 was the 200th anniversary of Darwin's birth, it also marked 170 years since the publication of his book Journal of Researches. During the voyage, Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake that demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of '…the rending of strata, at a point not very deep below the surface of the earth…' and '…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics, ideas that set the tone for the plate tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: '...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  17. Extending earthquakes' reach through cascading.

    PubMed

    Marsan, David; Lengliné, Olivier

    2008-02-22

    Earthquakes, whatever their size, can trigger other earthquakes. Mainshocks cause aftershocks to occur, which in turn activate their own local aftershock sequences, resulting in a cascade of triggering that extends the reach of the initial mainshock. A long-lasting difficulty is to determine which earthquakes are connected, either directly or indirectly. Here we show that this causal structure can be found probabilistically, with no a priori model or parameterization. Large regional earthquakes are found to have a short direct influence in comparison to the overall aftershock sequence duration. Relative to these large mainshocks, small earthquakes collectively have a greater effect on triggering. Hence, cascade triggering is a key component in earthquake interactions.

  18. An efficient repeating signal detector to investigate earthquake swarms

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2016-08-01

    Repetitive earthquake swarms have been recognized as key signatures in fluid injection induced seismicity, precursors to volcanic eruptions, and slow slip events preceding megathrust earthquakes. We investigate earthquake swarms by developing a Repeating Signal Detector (RSD), a computationally efficient algorithm utilizing agglomerative clustering to identify similar waveforms buried in years of seismic recordings using a single seismometer. Instead of relying on existing earthquake catalogs of larger earthquakes, RSD identifies characteristic repetitive waveforms by rapidly identifying signals of interest above a low signal-to-noise ratio and then grouping based on spectral and time domain characteristics, resulting in dramatically shorter processing time than more exhaustive autocorrelation approaches. We investigate seismicity in four regions using RSD: (1) volcanic seismicity at Mammoth Mountain, California, (2) subduction-related seismicity in Oaxaca, Mexico, (3) induced seismicity in Central Alberta, Canada, and (4) induced seismicity in Harrison County, Ohio. In each case, RSD detects a similar or larger number of earthquakes than existing catalogs created using more time-intensive methods. In Harrison County, RSD identifies 18 seismic sequences that correlate temporally and spatially to separate hydraulic fracturing operations, 15 of which were previously unreported. RSD utilizes a single seismometer for earthquake detection, which enables seismicity to be quickly identified in poorly instrumented regions at the expense of relying on another method to locate the new detections. Due to the smaller computational overhead and success at distances up to ~50 km, RSD is well suited for real-time detection of low-magnitude earthquake swarms with permanent regional networks.
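    The grouping step behind a detector like RSD can be illustrated with a toy sketch: waveforms whose normalized cross-correlation exceeds a threshold are linked, and linked detections merge into clusters (single-linkage agglomeration). The function names, the 0.8 threshold, and the synthetic signals below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normalized_xcorr(a, b):
    """Peak normalized cross-correlation between two equal-length waveforms."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    # At zero lag with full overlap this equals the Pearson correlation.
    return np.correlate(a, b, mode="full").max()

def link_clusters(waveforms, threshold=0.8):
    """Single-linkage grouping: union-find over strongly correlated pairs."""
    n = len(waveforms)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if normalized_xcorr(waveforms[i], waveforms[j]) >= threshold:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

    A repeating signal hidden in noise lands in the same group as its clean template, while an unrelated noise trace stays in its own group; real detectors additionally exploit spectral features and avoid the O(n^2) pairwise loop.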

  19. Nisqually, Washington Intraplate Earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Hofmeister, R.

    2001-05-01

    On February 28, 2001, the M6.8 Nisqually earthquake shook the Pacific Northwest. This intraplate event occurred within the subducting Juan de Fuca plate along the Cascadia margin. Although the damage was less than that observed in most large urban earthquakes, serious damage was found in Olympia, Seattle, and Tacoma. To better serve Oregon public safety needs, DOGAMI and others surveyed the Puget Sound damage to expand our technical understanding of seismic ground response, building and lifeline behavior, and secondary hazards (landslides and liquefaction). Damage was observed in structures and areas that, for the most part, would be predicted to be vulnerable. These included: old buildings (URMs), old lifelines (4th Ave bridge in Olympia), areas with poor soil conditions (Harbor Island, Seattle; Sunset Lake, Tumwater), and steep slopes (Salmon Beach; Burien). Damage types included: structural, nonstructural, contents, lifelines, landslides, liquefaction, lateral spreading, sand boils, and settlement. In several notable places, seismic-induced ground failures significantly increased the damage. Estimated costs developed from HAZUS evaluations ranged from $2 billion to $3.9 billion. Historic intraplate earthquakes in the Puget Sound region, including the 1949 M7.1, 1965 M6.5, and 1999 M5.9, were not accompanied by significant aftershock events or associated with earthquake sequences. However, a recent El Salvador earthquake sequence suggests there may be particular cases of increased seismicity following large intraplate events, with implications for post-earthquake response and mitigation. The January 13, 2001 M7.6 El Salvador intraplate earthquake was followed by a M6.6 crustal event February 13, 2001 and a M5.4 intraplate event February 28, 2001.

  20. Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment

    USGS Publications Warehouse

    Lin, K.-W.; Wald, D.J.

    2012-01-01

    When an earthquake occurs, the U. S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely-available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements of the ShakeMap and the ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially-varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.

  1. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  2. The earthquake potential of the New Madrid seismic zone

    USGS Publications Warehouse

    Tuttle, M.P.; Schweig, E.S.; Sims, J.D.; Lafferty, R.H.; Wolf, L.W.; Haynes, M.L.

    2002-01-01

    The fault system responsible for New Madrid seismicity has generated temporally clustered very large earthquakes in A.D. 900 ± 100 years and A.D. 1450 ± 150 years as well as in 1811-1812. Given the uncertainties in dating liquefaction features, the time between the past three New Madrid events may be as short as 200 years and as long as 800 years, with an average of 500 years. This advance in understanding the Late Holocene history of the New Madrid seismic zone and thus, the contemporary tectonic behavior of the associated fault system was made through studies of hundreds of earthquake-induced liquefaction features at more than 250 sites across the New Madrid region. We have found evidence that prehistoric sand blows, like those that formed during the 1811-1812 earthquakes, are probably compound structures resulting from multiple earthquakes closely clustered in time or earthquake sequences. From the spatial distribution and size of sand blows and their sedimentary units, we infer the source zones and estimate the magnitudes of earthquakes within each sequence and thereby characterize the detailed behavior of the fault system. It appears that fault rupture was complex and that the central branch of the seismic zone produced very large earthquakes during the A.D. 900 and A.D. 1450 events as well as in 1811-1812. On the basis of a minimum recurrence rate of 200 years, we are now entering the period during which the next 1811-1812-type event could occur.

  3. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.

  4. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards in different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.

  5. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is (1) the study of earthquakes; (2) the study of the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  6. Estimating surface faulting impacts from the shakeout scenario earthquake

    USGS Publications Warehouse

    Treiman, J.A.; Ponti, D.J.

    2011-01-01

    An earthquake scenario, based on a kinematic rupture model, has been prepared for a Mw 7.8 earthquake on the southern San Andreas Fault. The rupture distribution, in the context of other historic large earthquakes, is judged reasonable for the purposes of this scenario. This model is used as the basis for generating a surface rupture map and for assessing potential direct impacts on lifelines and other infrastructure. Modeling the surface rupture involves identifying fault traces on which to place the rupture, assigning slip values to the fault traces, and characterizing the specific displacements that would occur to each lifeline impacted by the rupture. Different approaches were required to address variable slip distribution in response to a variety of fault patterns. Our results, involving judgment and experience, represent one plausible outcome and are not predictive because of the variable nature of surface rupture. © 2011, Earthquake Engineering Research Institute.

  7. Earthquakes, September-October 1984

    USGS Publications Warehouse

    Person, W.J.

    1985-01-01

    In the United States, Wyoming experienced a couple of moderate earthquakes, and off the coast of northern California, a strong earthquake shook much of the northern coast of California and parts of the Oregon coast. 

  8. Earthquakes: Megathrusts and mountain building

    NASA Astrophysics Data System (ADS)

    Briggs, Rich

    2016-05-01

    Coastlines above subduction zones slowly emerge from the sea despite repeated drowning by great, shallow earthquakes. Analysis of the Chilean coast suggests that moderate-to-large, deeper earthquakes may be responsible for the net uplift.

  9. Earthquakes, July-August 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There was one major earthquake during this reporting period: a magnitude 7.1 shock off the coast of Northern California on August 17. Earthquake-related deaths were reported from Indonesia, Romania, Peru, and Iraq.

  10. Distribution of similar earthquakes in aftershocks of inland earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, M.; Hiramatsu, Y.; Group for the Aftershock Observations of the 2007 Noto Hanto Earthquake

    2010-12-01

    Frictional properties control the slip behavior on a fault surface, such as seismic slip and aseismic slip. An asperity, as a seismic slip area, is characterized by strong coupling in the interseismic period and large coseismic slip. On the other hand, steady slip or afterslip occurs in an aseismic slip area around the asperity. If an afterslip area includes small asperities, repeated rupture of a single asperity can generate similar earthquakes due to the stress accumulation caused by the afterslip. Here we investigate the detailed distribution of similar earthquakes in the aftershocks of the 2007 Noto Hanto earthquake (Mjma 6.9) and the 2000 Western Tottori earthquake (Mjma 7.3), large inland earthquakes in Japan. We use the data obtained by the group for the aftershock observations of the 2007 Noto Hanto Earthquake and by the group for the aftershock observations of the 2000 Western Tottori earthquake. First, we select pairs of aftershocks whose cross correlation coefficients in a 10 s time window of band-pass filtered waveforms of 1~4 Hz are greater than 0.95 at more than 5 stations and divide those into groups by a link of the cross correlation coefficients. Second, we reexamine the arrival times of P and S waves and the maximum amplitude for earthquakes of each group and apply the double-difference method (Waldhauser and Ellsworth, 2000) to relocate them. As a result of the analysis, we find 24 groups of similar earthquakes in the aftershocks on the source fault of the 2007 Noto Hanto Earthquake and 86 groups of similar earthquakes in the aftershocks on the source fault of the 2000 Western Tottori Earthquake. Most of them are distributed around or outside the asperity of the main shock. Geodetic studies reported that postseismic deformation was detected for both earthquakes (Sagiya et al., 2002; Hashimoto et al., 2008). The source area of similar earthquakes seems to correspond to the afterslip area. These features suggest that the similar earthquakes observed

  11. Seismic survey probes urban earthquake hazards in Pacific Northwest

    USGS Publications Warehouse

    Fisher, M.A.; Brocher, T.M.; Hyndman, R.D.; Trehu, A.M.; Weaver, C.S.; Creager, K.C.; Crosson, R.S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B.C.; Hammer, P.T.; Childs, J. R.; Cochrane, G.R.; Chopra, S.; Walia, R.

    1999-01-01

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  12. Reducing the Risks of Nonstructural Earthquake Damage: A Practical Guide. Earthquake Hazards Reduction Series 1.

    ERIC Educational Resources Information Center

    Reitherman, Robert

    The purpose of this booklet is to provide practical information to owners, operators, and occupants of office and commercial buildings on the vulnerabilities posed by earthquake damage to nonstructural items and the means available to deal with these potential problems. Examples of dangerous nonstructural damages that have occurred in past…

  13. Earthquakes; January-February, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The first major earthquake (magnitude 7.0 to 7.9) of the year struck in southeastern Alaska in a sparsely populated area on February 28. On January 16, Iran experienced the first destructive earthquake of the year, causing a number of casualties and considerable damage. Peru was hit by a destructive earthquake on February 16 that left casualties and damage. A number of earthquakes were experienced in parts of the United States, but only minor damage was reported.

  14. Earthquakes, May-June 1984

    USGS Publications Warehouse

    Person, W.J.

    1984-01-01

    No major earthquakes (7.0-7.9) occurred during this reporting period. Earthquake-related deaths were reported from Italy, the Dominican Republic, and Yugoslavia. A number of earthquakes occurred in the United States, but none caused casualties or any significant damage.

  15. Earthquakes, March-April, 1993

    USGS Publications Warehouse

    Person, Waverly J.

    1993-01-01

    Worldwide, only one major earthquake (7.0≤M<8.0) occurred during this reporting period: a magnitude 7.2 shock that struck the Santa Cruz Islands region in the South Pacific on March 6. Earthquake-related deaths occurred in the Fiji Islands, China, and Peru.

  16. Earthquakes, March-April 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    Two major earthquakes (7.0-7.9) occurred during this reporting period: a magnitude 7.6 in Costa Rica on April 22 and a magnitude 7.0 in the USSR on April 29. Destructive earthquakes hit northern Peru on April 4 and 5. There were no destructive earthquakes in the United States during this period.

  17. Earthquakes, May-June 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage. 

  18. Earthquakes, September-October 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The months of September and October were somewhat quiet, seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

  19. Earthquakes, March-April 1978

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    Earthquakes caused fatalities in Mexico and Sicily; injuries and damage were sustained in eastern Kazakh SSR and Yugoslavia. There were four major earthquakes; one south of Honshu, Japan, two in the Kuril Islands region, and one in the Soviet Union. The United States experienced a number of earthquakes, but only very minor damage was reported. 

  20. Earthquakes, September-October 1993

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    The fatalities in the United States were caused by two earthquakes in southern Oregon on September 21. These earthquakes, both with magnitude 6.0 and separated in time by about 2 hrs, led to the deaths of two people. One of these deaths was apparently due to a heart attack induced by the earthquake.

  1. Organizational changes at Earthquakes & Volcanoes

    USGS Publications Warehouse

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  2. Turkish Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, as natural disasters, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training regarding earthquakes, received in primary schools, is considered…

  3. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
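    The exposure calculation at the heart of a system like PAGER can be sketched in miniature: overlay a population grid on a shaking-intensity grid and total the population at each rounded MMI level. The function below is a simplified stand-in under assumed inputs, not the USGS code; loss models would then combine such exposure counts with country-specific vulnerability to estimate fatalities and economic losses.

```python
import numpy as np

def exposure_table(population, mmi, levels=range(4, 11)):
    """Total population exposed at each (rounded) Modified Mercalli level.

    population and mmi are assumed to be co-registered grids of the same
    shape; this is an illustrative sketch of the exposure step only.
    """
    pop = np.asarray(population, dtype=float).ravel()
    shaking = np.rint(np.asarray(mmi, dtype=float)).ravel()
    # Sum the population in every cell whose shaking rounds to each level.
    return {level: float(pop[shaking == level].sum()) for level in levels}
```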

  4. InSAR observations of the 2009 Racha earthquake, Georgia

    NASA Astrophysics Data System (ADS)

    Nikolaeva, Elena; Walter, Thomas R.

    2016-09-01

    Central Georgia is an area strongly affected by earthquake and landslide hazards. On 29 April 1991 a major earthquake (Mw = 7.0) struck the Racha region in Georgia, followed by aftershocks and significant afterslip. The same region was hit by another major event (Mw = 6.0) on 7 September 2009. The aim of the study reported here was to utilize interferometric synthetic aperture radar (InSAR) data to improve knowledge about the spatial pattern of deformation due to the 2009 earthquake. There had been no previous InSAR observations of earthquakes in Georgia. We considered all available SAR images from different space agencies; however, due to the long wavelength and the frequent acquisitions, only the multi-temporal ALOS L-band SAR data allowed us to produce interferograms spanning the 2009 earthquake. We detected a local uplift of around 10 cm (along the line-of-sight propagation) in the interferogram near the earthquake's epicenter, whereas evidence of surface ruptures could not be found in the field along the active thrust fault. We simulated the deformation signal that could be created by the 2009 Racha earthquake on the basis of local seismic records and by using an elastic dislocation model. We compared our modeled fault surface of the September 2009 event with the April 1991 Racha earthquake fault surfaces and identified the same fault, or a sub-parallel fault of the same system, as the origin. The patch that was active in 2009 is just adjacent to the 1991 patch, indicating a possible mainly westward propagation direction, with important implications for future earthquake hazards.

  5. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    NASA Astrophysics Data System (ADS)

    Di Giacomo, Domenico; Engdhal, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product to use for seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require that earthquake catalogues be homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue, both for earthquakes that occurred after 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of those earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.

  6. Forecasters of earthquakes

    NASA Astrophysics Data System (ADS)

    Maximova, Lyudmila

    1987-07-01

    For the first time, Soviet scientists have set up a bioseismological proving ground which will stage a systematic, extensive experiment using birds, ants, and mountain rodents, including marmots, which can dig holes in the Earth's interior to a depth of 50 meters, for the purpose of earthquake forecasting. Biologists have accumulated extensive experimental data on the impact of various electromagnetic fields, including fields of weak intensity, on living organisms. As far as mammals are concerned, electromagnetic waves with frequencies close to the brain's biorhythms have the strongest effect. How these observations can be used to forecast earthquakes is discussed.

  7. Earthquakes in New England

    USGS Publications Warehouse

    Fratto, E. S.; Ebel, J.E.; Kadinsky-Cade, K.

    1990-01-01

    New England has a long history of earthquakes. Some of the first explorers were startled when they experienced strong shaking and rumbling of the earth below their feet. They soon learned from the Indians that this was not an uncommon occurrence in the New World. The Plymouth Pilgrims felt their first earthquake in 1638. That first shock rattled dishes, doors, and buildings. The shaking so frightened those working in the fields that they threw down their tools and ran panic-stricken through the countryside.

  8. The 1976 Tangshan earthquake

    USGS Publications Warehouse

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had magnitude 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  9. Headaches prior to earthquakes

    NASA Astrophysics Data System (ADS)

    Morton, L. L.

    1988-06-01

    In two surveys of headaches it was noted that their incidence had increased significantly within 48 h prior to earthquakes from an incidence of 17% to 58% in the first survey using correlated samples and from 20.4% to 44% in the second survey using independent samples. It is suggested that an increase in positive air ions from rock compression may trigger head pain via a decrease in brain levels of the neurotransmitter serotonin. The findings are presented as preliminary, with the hope of generating further research efforts in areas more prone to earthquakes.

  10. WEST Physics Basis

    NASA Astrophysics Data System (ADS)

    Bourdelle, C.; Artaud, J. F.; Basiuk, V.; Bécoulet, M.; Brémond, S.; Bucalossi, J.; Bufferand, H.; Ciraolo, G.; Colas, L.; Corre, Y.; Courtois, X.; Decker, J.; Delpech, L.; Devynck, P.; Dif-Pradalier, G.; Doerner, R. P.; Douai, D.; Dumont, R.; Ekedahl, A.; Fedorczak, N.; Fenzi, C.; Firdaouss, M.; Garcia, J.; Ghendrih, P.; Gil, C.; Giruzzi, G.; Goniche, M.; Grisolia, C.; Grosman, A.; Guilhem, D.; Guirlet, R.; Gunn, J.; Hennequin, P.; Hillairet, J.; Hoang, T.; Imbeaux, F.; Ivanova-Stanik, I.; Joffrin, E.; Kallenbach, A.; Linke, J.; Loarer, T.; Lotte, P.; Maget, P.; Marandet, Y.; Mayoral, M. L.; Meyer, O.; Missirlian, M.; Mollard, P.; Monier-Garbet, P.; Moreau, P.; Nardon, E.; Pégourié, B.; Peysson, Y.; Sabot, R.; Saint-Laurent, F.; Schneider, M.; Travère, J. M.; Tsitrone, E.; Vartanian, S.; Vermare, L.; Yoshida, M.; Zagorski, R.; Contributors, JET

    2015-06-01

    With WEST (Tungsten Environment in Steady State Tokamak) (Bucalossi et al 2014 Fusion Eng. Des. 89 907-12), the Tore Supra facility and team expertise (Dumont et al 2014 Plasma Phys. Control. Fusion 56 075020) is used to pave the way towards ITER divertor procurement and operation. It consists in implementing a divertor configuration and installing ITER-like actively cooled tungsten monoblocks in the Tore Supra tokamak, taking full benefit of its unique long-pulse capability. WEST is a user facility platform, open to all ITER partners. This paper describes the physics basis of WEST: the estimated heat flux on the divertor target, the planned heating schemes, the expected behaviour of the L-H threshold and of the pedestal and the potential W sources. A series of operating scenarios has been modelled, showing that ITER-relevant heat fluxes on the divertor can be achieved in WEST long pulse H-mode plasmas.

  11. Earthquakes, November-December 1992

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.

  12. Earthquakes; March-April 1975

    USGS Publications Warehouse

    Person, W.J.

    1975-01-01

    There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela, and approximately 35 earthquake-related injuries were reported around the world. In the United States, a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971.

  13. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
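The interevent-time comparison described in this lab can be sketched in a few lines of Python; the rate constant and event count below are hypothetical, chosen only to illustrate how a Poisson process yields exponentially distributed interevent times:

```python
import random

def interevent_times(event_times):
    """Interevent times from a sorted list of event occurrence times."""
    return [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]

# Simulate a Poisson process: exponential waiting times with rate lam.
random.seed(42)
lam = 0.5  # events per unit time (hypothetical rate)
times, t = [], 0.0
for _ in range(10000):
    t += random.expovariate(lam)
    times.append(t)

dts = interevent_times(times)
mean_dt = sum(dts) / len(dts)
# For a Poisson process the mean interevent time should approach 1/lam.
print(round(mean_dt, 2))
```

A histogram of `dts` (earthquake or blockquake interevent times) can then be compared against the exponential density `lam * exp(-lam * t)`, which is the comparison the lab builds toward.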

  14. Earthquake hazard hunt

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    The Earthquake Hazard Hunt should begin at home, with all family members participating. Foresight, imagination, and common sense are all that are needed as you go from room to room and imagine what would happen if the Earth and house started to shake.

  15. ALMA measures Calama earthquake

    NASA Astrophysics Data System (ADS)

    Brito, R.; Shillue, B.

    2010-04-01

    On 4 March 2010, the ALMA system response to an extraordinarily large disturbance was measured when a magnitude 6.3 earthquake struck near Calama, Chile, relatively close to the ALMA site. Figures 1 through 4 demonstrate the remarkable performance of the ALMA system to a huge disturbance that was more than 100 times the specification for correction accuracy.

  16. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  17. Earthquake damage to schools

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

    These unusual slides show earthquake damage to school and university buildings around the world. They graphically illustrate the potential danger to our schools, and to the welfare of our children, that results from major earthquakes. The slides range from Algeria, where a collapsed school roof is held up only by students' desks; to Anchorage, Alaska, where an elementary school structure has split in half; to California and other areas, where school buildings have sustained damage to walls, roofs, and chimneys. Interestingly, all the United States earthquakes depicted in this set of slides occurred either on a holiday or before or after school hours, except the 1935 tremor in Helena, Montana, which occurred at 11:35 am. It undoubtedly would have caused casualties had the schools not been closed days earlier by Helena city officials because of a damaging foreshock. Students in Algeria, the People's Republic of China, Armenia, and other stricken countries were not so fortunate. This set of slides represents 17 destructive earthquakes that occurred in 9 countries, and covers more than a century--from 1886 to 1988. Two of the tremors, both of which occurred in the United States, were magnitude 8+ on the Richter Scale, and four were magnitude 7-7.9. The events represented by the slides (see table below) claimed more than a quarter of a million lives.

  18. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  19. Fractal dynamics of earthquakes

    SciTech Connect

    Bak, P.; Chen, K.

    1995-05-01

    Many objects in nature, from mountain landscapes to electrical breakdown and turbulence, have a self-similar fractal spatial structure. It seems obvious that to understand the origin of self-similar structures, one must understand the nature of the dynamical processes that created them: temporal and spatial properties must necessarily be completely interwoven. This is particularly true for earthquakes, which have a variety of fractal aspects. The distribution of energy released during earthquakes is given by the Gutenberg-Richter power law. The distribution of epicenters appears to be fractal with dimension D ≈ 1-1.3. The number of aftershocks decays as a function of time according to the Omori power law. There have been several attempts to explain the Gutenberg-Richter law by starting from a fractal distribution of faults or stresses. But this is a chicken-and-egg approach: to explain the Gutenberg-Richter law, one assumes the existence of another power law, the fractal distribution. The authors present results of a simple stick-slip model of earthquakes, which evolves to a self-organized critical state. Emphasis is on demonstrating that empirical power laws for earthquakes indicate that the Earth's crust is at the critical state, with no typical time, space, or energy scale. Of course the model is tremendously oversimplified; however, in analogy with equilibrium phenomena, they do not expect criticality to depend on details of the model (universality).
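As a rough illustration of the Gutenberg-Richter law mentioned above (not of the authors' stick-slip model), the standard maximum-likelihood b-value estimator of Aki can be sketched; the catalog here is synthetic, with an assumed true b-value of 1.0:

```python
import math
import random

def aki_b_value(mags, m_c):
    """Maximum-likelihood b-value (Aki, 1965) for magnitudes at or above completeness m_c."""
    above = [m for m in mags if m >= m_c]
    mean_excess = sum(m - m_c for m in above) / len(above)
    return math.log10(math.e) / mean_excess

# Synthetic Gutenberg-Richter catalog: exceedance N(>=M) ~ 10**(-b*M)
# means M - m_c is exponentially distributed with rate b*ln(10).
random.seed(1)
b_true, m_c = 1.0, 2.0
mags = [m_c + random.expovariate(b_true * math.log(10)) for _ in range(50000)]

b_est = aki_b_value(mags, m_c)
print(round(b_est, 2))
```

The recovered `b_est` should be close to the assumed `b_true`; the same estimator applied to a real catalog gives the slope of the frequency-magnitude power law.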

  20. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    NASA Astrophysics Data System (ADS)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in the tectonic regions of the world, especially in Japan. Earthquakes often cause damage to crucial life services such as water, gas, and electricity supply systems, and even sewage systems, in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for drinking, cooking, and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a mega city like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books, and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of a cooperative during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing a continuous monitoring system of groundwater conditions, for both quantity and quality, during non-emergency periods.

  1. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing of all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Some pre-1932 earthquakes occurred before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932, are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1906) have been used to help determine magnitudes.
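The catalog's stated magnitude/time/region cutoffs can be expressed as a simple filter. The `Event` class and the sample events below are illustrative placeholders, not entries from the catalog itself:

```python
from dataclasses import dataclass

@dataclass
class Event:
    year: int
    mag: float
    lat: float
    lon: float

def in_wgcep_window(e):
    """Apply the appendix's stated magnitude/time/region cutoffs (illustrative)."""
    in_region = 31.0 <= e.lat <= 43.0 and -126.0 <= e.lon <= -114.0
    if not in_region:
        return False
    if 1850 <= e.year < 1932:
        return e.mag > 5.5
    if 1932 <= e.year <= 2006:
        return e.mag > 4.0
    return False

events = [
    Event(1906, 7.9, 37.75, -122.55),  # kept: pre-1932 and M > 5.5
    Event(1920, 5.0, 36.0, -120.0),    # dropped: pre-1932 but below cutoff
    Event(1994, 4.2, 34.2, -118.5),    # kept: post-1932 and M > 4.0
]
print([e.year for e in events if in_wgcep_window(e)])
```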

  2. Episodic tremor triggers small earthquakes

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2011-08-01

    It has been suggested that episodic tremor and slip (ETS), the weak shaking not associated with measurable earthquakes, could trigger nearby earthquakes. However, this had not been confirmed until recently. Vidale et al. monitored seismicity in the 4-month period around a 16-day episode of episodic tremor and slip in March 2010 in the Cascadia region. They observed five small earthquakes within the subducting slab during the ETS episode. They found that the timing and locations of earthquakes near the tremor suggest that the tremor and earthquakes are related. Furthermore, they observed that the rate of earthquakes across the area was several times higher within 2 days of tremor activity than at other times, adding to evidence of a connection between tremor and earthquakes. (Geochemistry, Geophysics, Geosystems, doi:10.1029/2011GC003559, 2011)

  3. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method for earthquakes and, in turn, warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides in what exactly is forecastable and in the direction of EM investigation. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and the laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February 2013 detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  4. Force and pressure characteristics for a series of nose inlets at Mach numbers from 1.59 to 1.99 V : analysis and comparison on basis of ram-jet aircraft range and operational characteristics

    NASA Technical Reports Server (NTRS)

    Howard, E; Luidens, R W; Allen, J L

    1951-01-01

    Performance of four experimentally investigated axially symmetric spike-type nose inlets is compared on the basis of ram-jet-engine aircraft range and operational problems. At design conditions, calculated peak engine efficiencies varied 25 percent from the highest value, which indicates the importance of inlet design. Calculations for a typical supersonic aircraft indicate a possible increase in range if the engine is flown at a moderate angle of attack and the resulting engine lift is utilized. For engines with a fixed exhaust nozzle, propulsive thrust increases with increasing heat addition in the subcritical flow region in spite of increasing additive drag. For the perforated inlet there is a range of increasing total-temperature ratios in the subcritical flow region that does not yield an increase in propulsive thrust. Effects of inlet characteristics on the speed stability of a typical aircraft for three types of fuel control are discussed.

  5. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Mayeda, K.; Walter, W. R.

    2003-04-01

    There is currently a disagreement within the geophysical community on the way earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g. Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set of recent papers finds that the apparent stress increases with magnitude (e.g. Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by applying the same methodology to a series of datasets that spans roughly 10 orders of magnitude in seismic moment, M0. We will summarize recent results using the coda envelope methodology of Mayeda et al. (2003), which provides the most stable source spectral estimates to date. This methodology eliminates the complicating effects of lateral path heterogeneity, source radiation pattern, directivity, and site response (e.g., amplification, f-max and kappa). We find that in tectonically active continental crustal areas the total radiated energy scales as M0^0.25, whereas in regions of relatively younger oceanic crust, the stress drop is generally lower and exhibits a 1-to-1 scaling with moment. In addition to answering a fundamental question in earthquake source dynamics, this study addresses how one would scale small earthquakes in a particular region up to a future, more damaging earthquake. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.
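The quantity at issue, apparent stress, is conventionally defined as sigma_a = mu * E_R / M0 (rigidity times radiated energy over seismic moment). A minimal sketch, using a standard crustal rigidity of 3e10 Pa and hypothetical energy/moment values:

```python
def apparent_stress(radiated_energy_j, seismic_moment_nm, rigidity_pa=3.0e10):
    """Apparent stress (Pa): sigma_a = mu * E_R / M0, with crustal rigidity mu."""
    return rigidity_pa * radiated_energy_j / seismic_moment_nm

# Constant apparent stress corresponds to energy scaling 1-to-1 with moment;
# apparent stress that grows with M0 implies energy growing faster than moment.
# The energy/moment pairs below are hypothetical, for illustration only.
small = apparent_stress(radiated_energy_j=1.0e11, seismic_moment_nm=1.0e16)
large = apparent_stress(radiated_energy_j=1.0e15, seismic_moment_nm=1.0e19)
print(small, large)  # the larger event here has 10x the apparent stress
```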

  6. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services. 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  7. Detailed source process of the 2007 Tocopilla earthquake.

    NASA Astrophysics Data System (ADS)

    Peyrat, S.; Madariaga, R.; Campos, J.; Asch, G.; Favreau, P.; Bernard, P.; Vilotte, J.

    2008-05-01

    We investigated the detailed rupture process of the Tocopilla earthquake (Mw 7.7) of 14 November 2007 and of the main aftershocks that occurred in the southern part of the North Chile seismic gap using strong motion data. The earthquake happened in the middle of the permanent broadband and strong motion network IPOC, newly installed by GFZ and IPGP, and of a digital strong-motion network operated by the University of Chile. The Tocopilla earthquake is the latest large thrust subduction earthquake to occur in this gap since the major Iquique 1877 earthquake, which produced a destructive tsunami. The Arequipa (2001) and Antofagasta (1995) earthquakes already ruptured the northern and southern parts of the gap, and the intraplate intermediate-depth Tarapaca earthquake (2005) may have changed the tectonic loading of this part of the Peru-Chile subduction zone. For large earthquakes, the depth of the seismic rupture is bounded by the depth of the seismogenic zone. What controls the horizontal extent of the rupture for large earthquakes is less clear. Factors that influence the extent of the rupture include fault geometry, variations of material properties, and stress heterogeneities inherited from the previous rupture history. For subduction zones where structures are not well known, what may have stopped the rupture is not obvious. One crucial problem raised by the Tocopilla earthquake is to understand why this earthquake didn't extend further north and, to the south, what role the Mejillones Peninsula plays, as it seems to act as a barrier. The focal mechanism was determined using teleseismic waveform inversion and a geodetic analysis (cf. Campos et al.; Bejarpi et al., in the same session). We studied the detailed source process using the available strong motion data. This earthquake ruptured the interplate seismic zone over more than 150 km and generated several large aftershocks, mainly located south of the rupture area. The strong-motion data show clearly two S

  8. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    NASA Astrophysics Data System (ADS)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system in Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. An unbroken part of the North Anatolian Fault Zone crosses the north of the Armutlu Peninsula in an east-west direction. This branch is also located quite close to Istanbul, a megacity with a high population and great economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events; then on August 3, 2014 an event of local magnitude 4.1 occurred, and more than 1000 events followed until August 31, 2014. Thus we tentatively call this swarm-like activity. Therefore, investigation of the micro-earthquake activity of the Armutlu Peninsula has become important for understanding the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005 and currently equipped with 27 active seismic stations operated by Kocaeli University Earth and Space Sciences Research Center (ESSRC) and Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network able to record even micro-earthquakes in this region. In the 30-day period of August 02 to 31, 2014, Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, but ARNET provided more than 1000 earthquakes for analysis in the same period. In this study, earthquakes of the swarm area and neighboring regions determined by ARNET were investigated. The focal mechanism of the August 03, 2014 22:22:42 (GMT) earthquake of local magnitude (Ml) 4.0 was obtained by moment tensor solution. The solution indicates normal faulting with a dextral component. The obtained focal mechanism solution is

  9. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and a growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are, therefore, asked to come up with new solutions that are no longer exclusively aiming at the best possible quantification of present risks but also keep an eye on their changes with time and allow these to be projected into the future. This does not apply to the vulnerability component of earthquake risk alone, but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  10. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis will consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, the Equivalent Linearization Method, and the Reduction Factor Method of analysis) can be employed. Casualties in Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be applied if necessary. ELER Level 2 analysis will include calculation of direct monetary losses as a result of building damage, allowing repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons using different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte-Carlo type simulations, and earthquake insurance applications.
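The Level 2 casualty calculation described above (buildings per damage state times a casualty rate per state) can be sketched as follows; the damage-state names, counts, and rates are hypothetical placeholders, not ELER's actual coefficients:

```python
def expected_casualties(buildings_by_state, casualty_rate):
    """Sum over damage states of (buildings in that state) x (casualties per building)."""
    return sum(n * casualty_rate[state] for state, n in buildings_by_state.items())

# Hypothetical geo-cell inventory for one building typology:
buildings = {"slight": 500, "moderate": 200, "extensive": 50, "complete": 10}
rates = {"slight": 0.0, "moderate": 0.01, "extensive": 0.1, "complete": 1.0}

print(expected_casualties(buildings, rates))
```

In a full Level 2 run this sum would be evaluated per geo-cell and per building typology, then aggregated; monetary loss follows the same pattern with repair-cost coefficients in place of casualty rates.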

  11. The health effects of earthquakes in the mid-1990s.

    PubMed

    Alexander, D

    1996-09-01

    This paper gives an overview of the global pattern of casualties in earthquakes which occurred during the 30-month period from 1 September 1993 to 29 February 1996. It also describes some of the behavioural and logistical regularities associated with mortality and morbidity in these events. Of 83 earthquakes studied, there were casualties in 49. Lethal earthquakes occurred in rapid succession in Indonesia, China, Colombia and Iran. In the events studied, a disproportionate number of deaths and injuries occurred during the first six hours of the day and in earthquakes with magnitudes between 6.5 and 7.4. Ratios of death to injury varied markedly (though with some averages close to 1:3), as did the nature and causes of mortality and morbidity and the proportion of serious to slight injuries. As expected on the basis of previous knowledge, few problems were caused by post-earthquake illness and disease. Also, as expected, building collapse was the principal source of casualties: tsunamis, landslides, debris flows and bridge collapses were the main secondary causes. In addition, new findings are presented on the temporal sequence of casualty estimates after seismic disaster. In synthesis, though mortality in earthquakes may have been low in relation to long-term averages, the interval of time studied was probably typical of other periods in which seismic catastrophes were relatively limited in scope.

  12. Microbiological study of pathogenic bacteria isolated from paediatric wound infections following the 2008 Wenchuan earthquake.

    PubMed

    Ran, Ying-Chun; Ao, Xiao-Xiao; Liu, Lan; Fu, Yi-Long; Tuo, Hui; Xu, Feng

    2010-05-01

    On 12 May 2008, the Wenchuan earthquake struck in Sichuan, China. Within 1 month after the earthquake, 98 injured children were admitted to the Children's Hospital of Chongqing Medical University. According to clinical manifestations, 50 children were diagnosed with wound infections. Wound secretions were cultured for bacteria. Pathogen distribution and drug resistance were analyzed. A total of 99 pathogens were isolated; 16 (16%) were Gram-positive bacteria and 81 (82%) were Gram-negative bacteria. The distribution of pathogens isolated within 1 month after the earthquake was different to the distribution of pathogens in 546 general hospitalized cases in the year before the earthquake. The pathogens most frequently isolated 1 month after the earthquake were Acinetobacter baumannii (27%), Enterobacter cloacae (18%) and Pseudomonas aeruginosa (13%). The pathogens most frequently isolated in the year prior to the earthquake were Escherichia coli (27%), Staphylococcus aureus (23%) and coagulase-negative staphylococci (9%). The rate of isolated drug-resistant bacteria was higher in the earthquake cases than in the general hospitalized cases. In the cases injured in the earthquake, the rates of isolation of methicillin-resistant Staphylococcus aureus and extended-spectrum beta-lactamase-producing E. cloacae, E. coli and Klebsiella pneumoniae were higher than in the cases from before the earthquake. Multidrug-resistant and pandrug-resistant A. baumannii were isolated at a higher rate in cases after the earthquake than in those before the earthquake. These changes in the spectrum of pathogens and in their drug resistance following an earthquake will provide a basis for emergency treatment after earthquakes. PMID:20095936

  13. Earthquake Hazard Mitigation and Real-Time Warnings of Tsunamis and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2015-09-01

    With better understanding of earthquake physics and the advent of broadband seismology and GPS, seismologists can forecast the future activity of large earthquakes on a sound scientific basis. Such forecasts are critically important for long-term hazard mitigation, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainties, and unexpected events will inevitably occur. Recent developments in real-time seismology help seismologists cope with and prepare for such unexpected events, including tsunamis and earthquakes. For a tsunami warning, the required warning time is fairly long (usually 5 min or longer), which enables the use of a rigorous method, and significant advances have already been made. In contrast, early warning of earthquakes is far more challenging because the required warning time is very short (as short as three seconds). Despite this difficulty, the methods used for regional warnings have advanced substantially, and several systems have already been developed and implemented. A future strategy for the more challenging, rapid (a few second) warnings, which are critically important for saving property and lives, is discussed.

  14. Scientific aspects of the Tohoku earthquake and Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Koketsu, Kazuki

    2016-04-01

    We investigated the 2011 Tohoku earthquake, the accident at the Fukushima Daiichi nuclear power plant, and the assessments of earthquake and tsunami potential conducted beforehand for the Pacific offshore region of the Tohoku District. The results of our investigation show that all the assessments failed to foresee the earthquake and its related tsunami, which was the main cause of the accident. Therefore, the disaster caused by the earthquake and the accident were scientifically unforeseeable at the time. However, for a zone neighboring the reactors, a 2008 assessment showed tsunamis higher than the plant height. As a lesson learned from the accident, companies operating nuclear power plants should prepare using even such assessment results for neighboring zones.

  15. Reevaluated macroseismic map of the strongest Vrancea (Romania) earthquake occurred in 20th century

    NASA Astrophysics Data System (ADS)

    Pantea, Aurelian; Constantin, Angela

    2010-05-01

    In order to set the basis for rigorous standards and norms of antiseismic design, capable of assuring maximum security to buildings, and in line with the goal of developing a national system compatible with European standardization, we initiated a large research effort to reevaluate and harmonize the macroseismic maps of the significant earthquakes that have occurred on Romanian territory. In this paper, the macroseismic effects of the strongest Vrancea earthquake, which occurred on 10 November 1940, are reevaluated. The reevaluation of the macroseismic data consisted of reinterpreting over 4,500 macroseismic questionnaires, together with a critical review of expertise reports, monographs, photographs, and scientific papers published both inside and outside the country regarding the severity of the macroseismic effects observed "in situ" in the damaged areas. The increase of macroseismic intensity towards the NNE in the 10 November 1940 earthquake was determined by the constructive interference of the waves produced by successive shocks. Only by admitting that the 10 November 1940 earthquake was of multishock type can we explain the major macroseismic effects produced in Focsani, Odobesti, Marasesti, Panciu, and Barlad, as well as in areas near the source where the seismic intensity exceeded degree X (MCS) (Lopatari, Targu Bujor, Neculele). In all these places about 70% of the houses were completely destroyed, burying a large part of the inhabitants. Taking into consideration the geological and tectonic complexity, as well as the distribution of seismically active areas on Romanian territory and in the transborder areas that influence its seismicity, we considered it necessary, for a better graphic representation of the distribution of the macroseismic field generated by the

  16. VLF/LF EM emissions as main precursor of earthquakes and their searching possibilities for Georgian s/a region

    NASA Astrophysics Data System (ADS)

    Kachakhidze, Manana; Kachakhidze, Nino

    2016-04-01

    The authors have developed a model, based on electrodynamics, of the generation of the Earth's electromagnetic emissions detected during earthquake preparation. The model gives a qualitative explanation of the mechanism generating the electromagnetic waves emitted in the earthquake preparation period. In addition, a methodological scheme for earthquake forecasting is constructed from an avalanche-like unstable model of fault formation and an analogous model of an electromagnetic circuit, whose synthesis is rather harmonious. According to the authors, electromagnetic emission in the radio band is a more universal and reliable precursor than other anomalous variations of geophysical phenomena in the earthquake preparation period. VLF/LF electromagnetic emission might therefore be declared the main precursor of earthquakes, because it may prove very useful for predicting large (M ≥ 5) inland earthquakes and for tracking processes in the lithosphere-atmosphere-ionosphere coupling (LAIC) system. Since the other geophysical phenomena that may accompany the earthquake preparation process, appearing several months, weeks, or days prior to earthquakes, are less informative for forecasting, it is reasonable to consider them earthquake indicators. The physical mechanisms of these phenomena are explained on the basis of the model of pre-earthquake electromagnetic emission, in which the preparation and realization of an earthquake are considered taking into account the properties of distributed and conservative systems. Until recently, no electromagnetic emission detection network existed in Georgia. European colleagues (Prof. Dr. PF Biagi, Prof. Dr. Aydın BÜYÜKSARAÇ) made possible the installation of a receiver. We are going to develop the network and contribute to the earthquake-forecasting problem. Participation in the conference is supported by financial

  17. The Earthquake That Tweeted

    NASA Astrophysics Data System (ADS)

    Petersen, D.

    2011-12-01

    Advances in mobile technology and social networking are enabling new behaviors that were not possible even a few short years ago. When people experience a tiny earthquake, it's more likely they're going to reach for their phones and tell their friends about it than actually take cover under a desk. With 175 million Twitter accounts, 750 million Facebook users and more than five billion mobile phones in the world today, people are generating terrific amounts of data simply by going about their everyday lives. Given the right tools and guidance these connected individuals can act as the world's largest sensor network, doing everything from reporting on earthquakes to anticipating global crises. Drawing on the author's experience as a user researcher and experience designer, this presentation will discuss these trends in crowdsourcing the collection and analysis of data, and consider their implications for how the public encounters the earth sciences in their everyday lives.

  18. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method for earthquakes and, in turn, to warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could support the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement, either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate for creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013.
The antennae have mobility and observations were noted for

  19. Earthquake triggering at alaskan volcanoes following the 3 November 2002 denali fault earthquake

    USGS Publications Warehouse

    Moran, S.C.; Power, J.A.; Stihler, S.D.; Sanchez, J.J.; Caplan-Auerbach, J.

    2004-01-01

    The 3 November 2002 Mw 7.9 Denali fault earthquake provided an excellent opportunity to investigate triggered earthquakes at Alaskan volcanoes. The Alaska Volcano Observatory operates short-period seismic networks on 24 historically active volcanoes in Alaska, 247-2159 km distant from the mainshock epicenter. We searched for evidence of triggered seismicity by examining the unfiltered waveforms for all stations in each volcano network for ~1 hr after the Mw 7.9 arrival time at each network, and for significant increases in located earthquakes in the hours after the mainshock. We found compelling evidence for triggering only at the Katmai volcanic cluster (KVC, 720-755 km southwest of the epicenter), where small earthquakes with distinct P and S arrivals appeared within the mainshock coda at one station and a small increase in located earthquakes occurred for several hours after the mainshock. Peak dynamic stresses of ~0.1 MPa at Augustine Volcano (560 km southwest of the epicenter) are significantly lower than those recorded in Yellowstone and Utah (>3000 km southeast of the epicenter), suggesting that strong directivity effects were at least partly responsible for the lack of triggering at Alaskan volcanoes. We describe other incidents of earthquake-induced triggering in the KVC, and outline a qualitative magnitude/distance-dependent triggering threshold. We argue that triggering results from the perturbation of magmatic-hydrothermal systems in the KVC and suggest that the comparative lack of triggering at other Alaskan volcanoes could be a result of differences in the nature of their magmatic-hydrothermal systems.
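    Peak dynamic stresses of the kind quoted above are commonly estimated from peak ground velocity using the plane-wave approximation σ ≈ G·v/Vs. A minimal sketch under assumed generic crustal values (a shear modulus of ~30 GPa and a shear-wave speed of ~3.5 km/s are illustrative assumptions, not values taken from this study):

    ```python
    def peak_dynamic_stress(pgv_m_per_s, shear_modulus_pa=3.0e10, vs_m_per_s=3500.0):
        """Plane-wave estimate of peak dynamic stress: sigma ~ G * PGV / Vs.

        Assumed values: crustal shear modulus ~30 GPa, shear-wave speed ~3.5 km/s.
        """
        return shear_modulus_pa * pgv_m_per_s / vs_m_per_s

    # With these assumptions, a peak ground velocity of ~1.2 cm/s yields a
    # dynamic stress on the order of 0.1 MPa, the level cited at Augustine.
    sigma = peak_dynamic_stress(0.012)
    ```

    The estimate scales linearly with the assumed modulus and inversely with the assumed shear-wave speed, so it is best read as an order-of-magnitude figure.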

  20. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in 2001. The 13 January 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1,300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10^5 m^3), produced major damage to buildings and infrastructure and 500 fatalities; a neighborhood in Santa Tecla, west of San Salvador, was destroyed. The 13 February 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km^2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10^5 m^3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km^3 and 12 km^3 produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes, most of them in pyroclastic deposits and with volumes of less than 1×10^3 m^3. The present work aims to define the relationship between earthquake intensity and the size and areal distribution of induced landslides, and to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provided useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  1. Pain after earthquake

    PubMed Central

    2012-01-01

    Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results Pain was reported by about a third of patients, with a prevalence of 34.6%. More than half of pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth weeks, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations. PMID:22747796

  2. Do Earthquakes Shake Stock Markets?

    PubMed

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  3. Do Earthquakes Shake Stock Markets?

    PubMed Central

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

  4. Tien Shan Geohazards Database: Earthquakes and landslides

    NASA Astrophysics Data System (ADS)

    Havenith, H. B.; Strom, A.; Torgoev, I.; Torgoev, A.; Lamair, L.; Ischuk, A.; Abdrakhmatov, K.

    2015-11-01

    In this paper we present new landslide and earthquake data for a large part of the Tien Shan, Central Asia, and review existing data. For the same area, only partial databases for sub-regions had been presented previously; these were compiled, and new data were added to fill the gaps between them. Major new inputs are products of the Central Asia Seismic Risk Initiative (CASRI): a tentative digital map of active faults (with indication of characteristic or possible maximum magnitude) and the earthquake catalogue of Central Asia through 2009, now updated with USGS data (to May 2014). The newly compiled landslide inventory contains existing records of 1,600 previously mapped mass movements and more than 1,800 new landslide records. Considering presently available seismo-tectonic and landslide data, a target region of 1,200 km (E-W) by 600 km (N-S) was defined for the production of more or less continuous geohazards information. This target region includes the entire Kyrgyz Tien Shan, the south-western Tien Shan in Tajikistan, the Fergana Basin (Kyrgyzstan, Tajikistan and Uzbekistan), the western part in Uzbekistan, the north-easternmost part in Kazakhstan, and a small part of the eastern Chinese Tien Shan (for the zones outside Kyrgyzstan and Tajikistan, only limited information was available and compiled). On the basis of the new landslide inventory and the updated earthquake catalogue, the link between landslide and earthquake activity is analysed. First, size-frequency relationships are studied for both types of geohazards, in terms of the Gutenberg-Richter law for the earthquakes and in terms of a probability density function for the landslides. For several regions and major earthquake events, case histories are presented to further outline the close connection between earthquake and landslide hazards in the Tien Shan.
From this study, we concluded first that a major hazard component is still insufficiently known for both types of geohazards
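    The Gutenberg-Richter size-frequency analysis of an earthquake catalogue can be sketched with the standard Aki/Utsu maximum-likelihood b-value estimator. This is a generic textbook formula, not the authors' specific fitting procedure, and the inputs below are synthetic:

    ```python
    import math

    def gr_b_value(magnitudes, completeness_mag, bin_width=0.1):
        """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter b-value.

        b = log10(e) / (mean(M) - (Mc - dM/2)), using only events at or above
        the magnitude of completeness Mc; dM/2 corrects for magnitude binning.
        """
        above = [m for m in magnitudes if m >= completeness_mag]
        mean_mag = sum(above) / len(above)
        return math.log10(math.e) / (mean_mag - (completeness_mag - bin_width / 2.0))
    ```

    With unbinned magnitudes (bin_width=0), a catalogue whose mean magnitude exceeds the completeness magnitude by log10(e) ≈ 0.434 yields b = 1, the typical worldwide value.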

  5. Foreshocks of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Guglielmi, A. V.; Sobisevich, L. E.; Sobisevich, A. L.; Lavrov, I. P.

    2014-07-01

    The specific enhancement of ultra-low-frequency (ULF) electromagnetic oscillations a few hours prior to strong earthquakes, previously noted in the literature, motivated us to search for distinctive features of the mechanical (foreshock) activity of the Earth's crust in the epicentral zones of future earthquakes. An activation of foreshocks three hours before the main shock is revealed, roughly similar to the enhancement of the specific electromagnetic ULF emission. It is hypothesized that the round-the-world seismic echo signals from the earthquakes, which form a peak of energy release 2 h 50 min before the main events, act as triggers of the main shocks through the cumulative action of the surface waves converging on the epicenter. It is established that the frequency of fluctuations in foreshock activity decreases at the final stages of preparation of the main shocks, which probably testifies to so-called mode softening as the failure point is approached, in the sense of catastrophe theory.

  6. Sand Volcano Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

  7. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  8. Distant, delayed and ancient earthquake-induced landslides

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Torgoev, Almaz; Braun, Anika; Schlögel, Romy; Micu, Mihai

    2016-04-01

    On the basis of a new classification of seismically induced landslides, we outline particular effects related to the delayed and distant triggering of landslides, which cannot be predicted by state-of-the-art methods. First, for about a dozen events the 'predicted' extension of the affected area is clearly underestimated. The most problematic cases are those for which far-distant triggering of landslides has been reported, such as for the 1988 Saguenay earthquake. In Central Asia, reports of such cases are known for areas marked by a thick cover of loess. One possible contributing effect could be low-frequency resonance of the thick soils induced by distant earthquakes, especially those in the Pamir - Hindu Kush seismic region. Such deep-focal, high-magnitude (>>7) earthquakes are also found in Europe, notably in the Vrancea region (Romania). For this area and others in Central Asia we computed landslide event sizes related to scenario earthquakes with M>7.5. The second particular and challenging type of triggering is that delayed with respect to the main earthquake event: case histories have been reported for the 1991 Racha earthquake, when several larger landslides started moving only 2 or 3 days after the main shock. Similar observations were made after other earthquakes in the U.S., such as the 1906 San Francisco, the 1949 Tacoma, the 1959 Hebgen Lake, and the 1983 Borah Peak earthquakes. Here, we present a series of detailed examples of (partly monitored) mass movements in Central Asia that mainly developed after earthquakes, some even several weeks after the main shock: e.g. the Tektonik and Kainama landslides, triggered in 1992 and 2004, respectively. We believe that the development of these massive failures is a consequence of the opening of tension cracks during the seismic shaking and their filling with water during the precipitation that followed the earthquakes.
The third particular aspect analysed here is the use of large

  9. Earthquakes, September-October 1980

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    There were two major (magnitudes 7.0-7.9) earthquakes during this reporting period; a magnitude (M) 7.3 in Algeria where many people were killed or injured and extensive damage occurred, and an M=7.2 in the Loyalty Islands region of the South Pacific. Japan was struck by a damaging earthquake on September 24, killing two people and causing injuries. There were no damaging earthquakes in the United States. 

  10. Earthquakes; July-August 1977

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    July and August were somewhat active seismically, compared to previous months of this year. There were seven earthquakes having magnitudes of 6.5 or greater. The largest was a magnitude 8.0 earthquake south of Sumbawa Island on August 19 that killed at least 111 people. The United States experienced a number of earthquakes during this period, but only one, in California, caused some minor damage. 

  11. Earthquakes, November-December 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were three major earthquakes (7.0-7.9) during the last two months of the year: a magnitude 7.0 on November 19 in Colombia, a magnitude 7.4 in the Kuril Islands on December 22, and a magnitude 7.1 in the South Sandwich Islands on December 27. Earthquake-related deaths were reported in Colombia, Yemen, and Iran. There were no significant earthquakes in the United States during this reporting period. 

  12. Proceedings of lifeline earthquake engineering

    SciTech Connect

    Cassaro, M.A.

    1991-01-01

    This book contains the proceedings of the Lifeline Earthquake Engineering Conference. Topics covered include: Overview of Lifeline Earthquake Engineering; Transportation Lifelines; Seismic Retrofit and Strengthening of Transportation Lifelines; Electric Power Lifelines; Communications Lifelines; Water Delivery and Sewer Lifelines; Seismic Hazards Evaluation; Risk and Reliability Analysis of Lifelines; Lifeline Experience During Earthquakes and System Behavior; Seismic Analysis and Design of Lifelines; Vulnerability of Lifelines; and Vulnerability Reduction, Mitigation Planning, and Emergency Response.

  13. Human casualties in earthquakes: modelling and mitigation

    USGS Publications Warehouse

    Spence, R.J.S.; So, E.K.M.

    2011-01-01

    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  14. Evaluation of earthquake and tsunami on JSFR

    SciTech Connect

    Chikazawa, Y.; Enuma, Y.; Kisohara, N.; Yamano, H.; Kubo, S.; Hayafune, H.; Sagawa, H.; Okamura, S.; Shimakawa, Y.

    2012-07-01

    The effects of earthquake and tsunami on JSFR have been evaluated. For seismic design, safety components are confirmed to maintain their functions even against recent strong earthquakes. As for tsunami, some parts of the reactor building might be submerged, including the component cooling water system (CCWS), whose final heat sink is sea water. However, in the JSFR design, safety-grade components are independent of the CCWS. The JSFR emergency power supply adopts an air-cooled gas turbine system, since JSFR basically does not require quick start-up of the emergency power supply thanks to the natural-convection DHRS. Even in case of a long station blackout, the DHRS could be activated by emergency batteries or manually, and operated continuously by natural convection. (authors)

  15. Earthquake prediction: Simple methods for complex phenomena

    NASA Astrophysics Data System (ADS)

    Luen, Bradley

    2010-09-01

    Earthquake predictions are often either based on stochastic models, or tested using stochastic models. Tests of predictions often tacitly assume predictions do not depend on past seismicity, which is false. We construct a naive predictor that, following each large earthquake, predicts another large earthquake will occur nearby soon. Because this "automatic alarm" strategy exploits clustering, it succeeds beyond "chance" according to a test that holds the predictions fixed. Some researchers try to remove clustering from earthquake catalogs and model the remaining events. There have been claims that the declustered catalogs are Poisson on the basis of statistical tests we show to be weak. Better tests show that declustered catalogs are not Poisson. In fact, there is evidence that events in declustered catalogs do not have exchangeable times given the locations, a necessary condition for the Poisson. If seismicity followed a stochastic process, an optimal predictor would turn on an alarm when the conditional intensity is high. The Epidemic-Type Aftershock Sequence (ETAS) model is a popular point-process model that includes clustering. It has many parameters, but is still a simplification of seismicity. Estimating the model is difficult, and estimated parameters often give a non-stationary model. Even if the model is ETAS, temporal predictions based on the ETAS conditional intensity are not much better than those of magnitude-dependent automatic (MDA) alarms, a much simpler strategy with only one parameter instead of five. For a catalog of Southern Californian seismicity, ETAS predictions again offer only slight improvement over MDA alarms.
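    The "automatic alarm" idea can be illustrated with a toy version: after every event at or above a trigger magnitude, an alarm stays on for a fixed window, and any later event that falls inside an active alarm counts as a successful prediction. This is a sketch of the general strategy only, not the authors' exact MDA rule (in which the alarm duration depends on magnitude) or their testing procedure; the event list below is invented:

    ```python
    def automatic_alarm_hits(events, trigger_mag, window):
        """Toy automatic-alarm predictor.

        events: iterable of (time, magnitude) pairs.
        After each event with magnitude >= trigger_mag, an alarm stays on
        for `window` time units. Returns (hits, total): how many events
        occurred while an alarm was active, out of all events.
        """
        alarm_until = float("-inf")
        hits = total = 0
        for t, m in sorted(events):  # process in time order
            total += 1
            if t <= alarm_until:
                hits += 1
            if m >= trigger_mag:
                alarm_until = max(alarm_until, t + window)
        return hits, total

    # A magnitude-7 event at t=0 opens a 2-unit alarm; only the t=1 event
    # falls inside it.
    result = automatic_alarm_hits(
        [(0.0, 7.0), (1.0, 5.0), (3.0, 5.0), (10.0, 5.0)],
        trigger_mag=6.5, window=2.0)
    ```

    Evaluating such a predictor fairly requires comparing its hit rate against the fraction of time alarms are on, which is the trade-off the paper's tests formalize.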

  16. Earthquakes, July-August 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  17. Earthquakes, May-June 1981

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was one major earthquake (7.0-7.9) off the west coast of South Island, New Zealand. The most destructive earthquake during this reporting period was in southern Iran on June 11 which caused fatalities and extensive damage. Peru also experienced a destructive earthquake on June 22 which caused fatalities and damage. In the United States, a number of earthquakes were experienced, but none caused significant damage. 

  18. Earthquakes in the United States

    USGS Publications Warehouse

    Stover, C.

    1977-01-01

    To supplement data in the report Preliminary Determination of Epicenters (PDE), the National Earthquake Information Service (NEIS) also publishes a quarterly circular, Earthquakes in the United States. This circular provides information on the felt area of U.S. earthquakes and their intensity. The main purpose is to describe the larger effects of these earthquakes so that they can be used in seismic risk studies, site evaluations for nuclear power plants, and in answering inquiries by the general public.

  19. Synthetic earthquake catalogs simulating seismic activity in the Corinth Gulf, Greece, fault system

    NASA Astrophysics Data System (ADS)

    Console, R.; Carluccio, R.; Papadimitriou, E. E.; Karakostas, V. G.

    2014-12-01

    The characteristic earthquake hypothesis is the basis of time-dependent modeling of earthquake recurrence on major faults, using the renewal process methodology. However, the characteristic earthquake hypothesis is not strongly supported by observational data. Few fault segments have long historical or paleoseismic records of individually dated ruptures, and when data and parameter uncertainties are allowed for, the form of the recurrence distribution is difficult to establish. This is the case, for instance, for the Corinth Gulf fault system, for which documents about strong earthquakes exist for at least two thousand years, but which can be considered complete for magnitudes > 6.0 only for the latest 300 years, during which only a few characteristic earthquakes are reported for single fault segments. The use of a physics-based earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 500,000 events of magnitude > 4.0. The main features of our simulation algorithm are (1) the imposition of an average slip rate released by earthquakes on every single segment recognized in the investigated fault system, (2) the interaction between earthquake sources, (3) a self-organized earthquake magnitude distribution, and (4) the effect of minor earthquakes in redistributing stress. The application of our simulation algorithm to the Corinth Gulf fault system has shown realistic features in the time, space and magnitude behavior of the seismicity. These features include long-term periodicity of strong earthquakes, short-term clustering of both strong and smaller events, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the higher magnitude range.
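    The Gutenberg-Richter baseline that the simulated catalog departs from at high magnitudes can be sketched by inverse-transform sampling from a truncated exponential in magnitude. The b-value, magnitude bounds, and catalog size here are illustrative, not the simulator's.

```python
import math
import random

# Draw magnitudes from a truncated Gutenberg-Richter law: the number of
# events with magnitude >= m falls off as 10^(-b*m). For b = 1, each
# half-magnitude bin holds about 10^0.5 fewer events than the bin one
# magnitude unit below it.

def sample_gr_magnitude(b=1.0, m_min=4.0, m_max=7.5):
    """Inverse-transform sample from a truncated Gutenberg-Richter law."""
    u = random.random()
    beta = b * math.log(10.0)
    return m_min - math.log(1 - u * (1 - math.exp(-beta * (m_max - m_min)))) / beta

random.seed(1)
mags = [sample_gr_magnitude() for _ in range(100000)]
n45 = sum(1 for m in mags if 4.0 <= m < 4.5)
n55 = sum(1 for m in mags if 5.0 <= m < 5.5)
print(n45 / n55)  # close to 10 for b = 1
```

    Comparing empirical bin counts of a simulated catalog against this baseline is one simple way to quantify the departure in the higher magnitude range.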

  20. Catalog of earthquakes along the San Andreas fault system in Central California: January-March, 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Meagher, K.L.

    1973-01-01

    Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period January - March, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b,c,d). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1,718 earthquakes in Central California. Of particular interest is a sequence of earthquakes in the Bear Valley area which contained single shocks with local magnitudes of 5.0 and 4.6. Earthquakes from this sequence make up roughly 66% of the total and are currently the subject of an interpretative study. Arrival times at 118 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 94 are telemetered stations operated by NCER. Readings from the remaining 24 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events.
The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the

  1. Catalog of earthquakes along the San Andreas fault system in Central California, April-June 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period April - June, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). A catalog for the first quarter of 1972 has been prepared by Wesson and others (1972). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 910 earthquakes in Central California. A substantial portion of the earthquakes reported in this catalog represents a continuation of the sequence of earthquakes in the Bear Valley area which began in February, 1972 (Wesson and others, 1972). Arrival times at 126 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 101 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. 
The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement

  2. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    NASA Astrophysics Data System (ADS)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet, the same inter-connected infrastructure also allows us to rapidly detect earthquakes as they begin, and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts, but could also be used to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the U.S. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.
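    The "seconds to tens of seconds" figure follows from simple wave-speed arithmetic: the P wave detected near the source outruns the damaging S wave. The S-wave speed and the 4 s detection-plus-processing delay below are typical illustrative values, not ShakeAlert's actual parameters.

```python
# Back-of-envelope warning time: assume the alert goes out processing_s
# seconds after the earthquake origin, and damaging shaking arrives with
# the S wave travelling at vs km/s.

def warning_time(epicentral_km, vs=3.5, processing_s=4.0):
    """Seconds of warning before S-wave arrival at a site."""
    return epicentral_km / vs - processing_s

for d_km in (20, 60, 120):
    print(d_km, "km:", round(warning_time(d_km), 1), "s")
```

    Sites near the epicenter get little or no warning (the "blind zone"), while sites 100 km or more away can get tens of seconds, which is the window exploited by automated responses such as stopping trains.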

  3. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan—the western, eastern, and northeastern Taiwan regions—using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, a high probability value is usually yielded by cluster events, such as events with foreshocks and events that all occur in a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but they also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake and the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.
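    The ROC evaluation used above can be sketched as follows. This is a minimal illustration of the technique, not the paper's code: the forecast values, outcomes, and threshold are invented, with each grid point scored by whether a large earthquake actually occurred near it.

```python
# Sweep a threshold over forecast probabilities and record, at each
# threshold, the hit rate (fraction of occurrence points alarmed) versus
# the false-alarm rate (fraction of non-occurrence points alarmed).
# A curve above the diagonal means the map outperforms a random forecast.

def roc_points(probs, outcomes, thresholds):
    """probs, outcomes: parallel lists of (forecast value, 0/1 occurrence)."""
    pos = sum(outcomes)
    neg = len(outcomes) - pos
    points = []
    for th in thresholds:
        tp = sum(1 for p, o in zip(probs, outcomes) if p >= th and o == 1)
        fp = sum(1 for p, o in zip(probs, outcomes) if p >= th and o == 0)
        points.append((fp / neg, tp / pos))  # (false-alarm rate, hit rate)
    return points

probs = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1, 1, 0, 1, 0, 0]
print(roc_points(probs, outcomes, [0.5]))  # [(0.333..., 0.666...)]
```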

  4. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    USGS Publications Warehouse

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
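    The central-server step of correlating incoming triggers can be sketched as a coincidence test. This is a hedged illustration of the general idea, not QCN's actual algorithm: the minimum station count, coincidence window, and trigger list are invented.

```python
# Declare an event when triggers from enough distinct stations arrive
# within a short coincidence window; suppress re-detections of the same
# event within that window.

def detect_events(triggers, min_stations=4, window_s=5.0):
    """triggers: time-sorted list of (time_s, station_id); returns event times."""
    events = []
    for t0, _ in triggers:
        stations = {sid for t, sid in triggers if t0 <= t <= t0 + window_s}
        if len(stations) >= min_stations and (not events or t0 > events[-1] + window_s):
            events.append(t0)
    return events

trig = [(10.0, "a"), (10.5, "b"), (11.0, "c"), (12.0, "d"), (50.0, "a")]
print(detect_events(trig))  # [10.0]: four stations agree; the lone trigger at 50 s does not
```

    Requiring multiple independent stations is what lets a network of noisy, volunteer-hosted MEMS sensors reject spurious single-sensor triggers (a bumped desk, a slammed door) while still detecting real shaking within seconds.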

  5. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    NASA Astrophysics Data System (ADS)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high-quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award-winning ASIGN protocol, initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging System (MMS), RICHTER allows access to high-definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation), or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are typically compressed to 20-30 KB for fast transfer and to avoid network overload. Full-size images can be requested by the EMSC either fully automatically or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations notably for the damage assessment of the January 12, 2010 Haiti earthquake, where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes.
The EMSC is the second

  6. Self-Organized Earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.; Klein, W.

    2011-12-01

    Self-Organized Criticality (SOC) was proposed by Per Bak et al. [1] as a means of explaining scaling laws observed in driven natural systems, usually in (slowly) driven threshold systems. The example used by Bak was a simple cellular automaton model of a sandpile, in which grains of sand were slowly dropped (randomly) onto a flat plate. After a period of time, during which the 'critical state' was approached, a series of self-similar avalanches would begin. Scaling exponents for the frequency-area statistics of the sandpile avalanches were found to be approximately 1, a value that characterizes 'flicker noise' in natural systems. SOC is associated with a critical point in the phase diagram of the system, and it was found that the usual 2-scaling field theory applies. A model related to SOC is the Self-Organized Spinodal (SOS), or intermittent criticality model. Here a slow but persistent driving force leads to quasi-periodic approach to, and retreat from, the classical limit of stability, or spinodal. Scaling exponents for this model can be related to Gutenberg-Richter and Omori exponents observed in earthquake systems. In contrast to SOC models, nucleation, both classical and non-classical types, is possible in SOS systems. Tunneling or nucleation rates can be computed from Langer-Klein-Landau-Ginzburg theories for comparison to observations. Nucleating droplets play a role similar to characteristic earthquake events. Simulations of these systems reveal much of the phenomenology associated with earthquakes and other types of "burst" dynamics. Whereas SOC is characterized by the full scaling spectrum of avalanches, SOS is characterized by both system-size events above the nominal frequency-size scaling curve, and scaling of small events. Applications to other systems including integrate-and-fire neural networks and financial crashes will be discussed. [1] P. Bak, C. Tang and K. Wiesenfeld, Self-Organized Criticality, Phys. Rev. Lett., 59, 381 (1987).
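    The sandpile automaton Bak used can be written in a few lines. This is a minimal Bak-Tang-Wiesenfeld sketch with an illustrative grid size and drop count: cells topple at height 4, shedding one grain to each of their four neighbors, and grains falling off the edge leave the system.

```python
import random

def drop_grain(grid, n):
    """Drop one grain at a random site; return the avalanche size (# topplings)."""
    i, j = random.randrange(n), random.randrange(n)
    grid[i][j] += 1
    topples = 0
    unstable = [(i, j)] if grid[i][j] >= 4 else []
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4  # topple: shed one grain to each neighbor
        topples += 1
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < n and 0 <= ny < n:  # grains off the edge are lost
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    return topples

random.seed(0)
n = 20
grid = [[0] * n for _ in range(n)]
sizes = [drop_grain(grid, n) for _ in range(20000)]
print("largest avalanche:", max(sizes))
```

    Early drops cause no topplings at all; once the pile reaches the critical state, avalanches of all sizes occur, and their frequency-size statistics approach the power law that motivated the SOC picture.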

  7. Earthquakes Threaten Many American Schools

    ERIC Educational Resources Information Center

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  8. Earthquakes, July-August, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

    In the United States, on August 6, central California experienced a moderately strong earthquake, which injured several people and caused some damage. A number of earthquakes occurred in other parts of the United States but caused very little damage. 

  9. Heavy tails and earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.

    2012-01-01

    The 21st century has already seen its share of devastating earthquakes, some of which have been labeled as “unexpected,” at least in the eyes of some seismologists and more than a few journalists. A list of seismological surprises could include the 2004 Sumatra-Andaman Islands; 2008 Wenchuan, China; 2009 Haiti; 2011 Christchurch, New Zealand; and 2011 Tohoku, Japan, earthquakes.

  10. Earthquakes; May-June 1977

    USGS Publications Warehouse

    Person, W.J.

    1977-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was only one significant earthquake, a magnitude 7.2 on June 22 in the Tonga Islands. In the United States, the two largest earthquakes occurred in California and on Hawaii.

  11. Make an Earthquake: Ground Shaking!

    ERIC Educational Resources Information Center

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  12. Earthquakes March-April 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of March and April were quite active, seismically speaking. There was one major earthquake (7.0≤M<8.0). Earthquake-related deaths were reported in Iran, Costa Rica, Turkey, and Germany.

  13. Earthquake Preparedness Checklist for Schools.

    ERIC Educational Resources Information Center

    1999

    A brochure provides a checklist highlighting the important questions and activities that should be addressed and undertaken as part of a school safety and preparedness program for earthquakes. It reminds administrators and other interested parties on what not to forget in preparing schools for earthquakes, such as staff knowledge needs, evacuation…

  14. Who cares about Mid-Ocean Ridge Earthquakes? And Why?

    NASA Astrophysics Data System (ADS)

    Tolstoy, M.

    2004-12-01

    Every day the surface of our planet is being slowly ripped apart by the forces of plate tectonics. Much of this activity occurs underwater and goes unnoticed except by a few marine seismologists who avidly follow the creaks and groans of the ocean floor in an attempt to understand the spreading and formation of oceanic crust. Are marine seismologists really the only ones that care? As it turns out, deep beneath the ocean surface, earthquakes play a fundamental role in a myriad of activity centered on mid-ocean ridges, where new crust forms and breaks on a regular basis. This activity takes the form of exotic geological structures hosting roasting hot fluids and bizarre chemosynthetic life forms. One of the fundamental drivers for this other world on the seafloor is earthquakes. Earthquakes provide cracks that allow seawater to penetrate the rocks, heat up, and resurface as hydrothermal vent fluids, thus providing chemicals to feed a thriving biological community. Earthquakes can cause pressure changes along cracks that can fundamentally alter fluid flow rates and paths. Thus earthquakes can both cut off existing communities from their nutrient source and provide new oases on the seafloor around which life can thrive. This poster will present some of the fundamental physical principles of how earthquakes can impact fluid flow, and hence life on the seafloor. Using these other-worldly landscapes and alien-like life forms to woo the unsuspecting passerby, we will sneak geophysics into the picture and tell the story of why earthquakes are so fundamental to life on the seafloor, and perhaps life elsewhere in the universe.

  15. Earthquake Scaling Relations

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Boettcher, M.; Richardson, E.

    2002-12-01

    Using scaling relations to understand nonlinear geosystems has been an enduring theme of Don Turcotte's research. In particular, his studies of scaling in active fault systems have led to a series of insights about the underlying physics of earthquakes. This presentation will review some recent progress in developing scaling relations for several key aspects of earthquake behavior, including the inner and outer scales of dynamic fault rupture and the energetics of the rupture process. The proximate observations of mining-induced, friction-controlled events obtained from in-mine seismic networks have revealed a lower seismicity cutoff at a seismic moment Mmin near 10^9 N m and a corresponding upper frequency cutoff near 200 Hz, which we interpret in terms of a critical slip distance for frictional drop of about 10^-4 m. Above this cutoff, the apparent stress scales as M^(1/6) up to magnitudes of 4-5, consistent with other near-source studies in this magnitude range (see special session S07, this meeting). Such a relationship suggests a damage model in which apparent fracture energy scales with the stress intensity factor at the crack tip. Under the assumption of constant stress drop, this model implies an increase in rupture velocity with seismic moment, which successfully predicts the observed variation in corner frequency and maximum particle velocity. Global observations of oceanic transform faults (OTFs) allow us to investigate a situation where the outer scale of earthquake size may be controlled by dynamics (as opposed to geologic heterogeneity). The seismicity data imply that the effective area for OTF moment release, AE, depends on the thermal state of the fault but is otherwise independent of the fault's average slip rate; i.e., AE ~ AT, where AT is the area above a reference isotherm. The data are consistent with β = 1/2 below an upper cutoff moment Mmax that increases with AT and yield the interesting scaling relation Amax ~ AT^(1/2). Taken together, the OTF

  16. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  17. Are Earthquakes a Critical Phenomenon?

    NASA Astrophysics Data System (ADS)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena as critical, which implies that they are also unpredictable. This work revises these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the pdf of the distribution, and comparing it with the one obtained at a critical point, a condition of criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, one single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
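    The OFC model named above can be sketched compactly. In this hedged illustration (grid size, alpha, and event count are invented), each site carries stress, the grid is driven uniformly until the most loaded site reaches the failure threshold, and a failing site resets to zero while passing a fraction alpha of its stress to its four neighbours; alpha < 0.25 makes the model non-conservative.

```python
import random

def ofc_avalanche(stress, n, alpha=0.2, fc=1.0):
    """Drive the grid to its next failure, relax it, and return avalanche size."""
    gap = fc - max(max(row) for row in stress) + 1e-9  # epsilon guards rounding
    for row in stress:
        for j in range(n):
            row[j] += gap  # uniform slow drive to the next failure
    failing = [(i, j) for i in range(n) for j in range(n) if stress[i][j] >= fc]
    size = 0
    while failing:
        x, y = failing.pop()
        if stress[x][y] < fc:
            continue
        released, stress[x][y] = stress[x][y], 0.0  # site fails and resets
        size += 1
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < n and 0 <= ny < n:
                stress[nx][ny] += alpha * released  # load transfer to neighbours
                if stress[nx][ny] >= fc:
                    failing.append((nx, ny))
    return size

random.seed(2)
n = 16
stress = [[0.9 * random.random() for _ in range(n)] for _ in range(n)]
sizes = [ofc_avalanche(stress, n) for _ in range(5000)]
print("events:", len(sizes), "largest avalanche:", max(sizes))
```

    Collecting the avalanche sizes from a long run like this is exactly the kind of synthetic catalog on which the paper's criticality condition can be tested.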

  18. Earthquake Simulator Finds Tremor Triggers

    SciTech Connect

    Johnson, Paul

    2015-03-27

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves, the sounds radiated from earthquakes, can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials, like the type found along certain fault lines across the globe, and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  19. Testing earthquake source inversion methodologies

    USGS Publications Warehouse

    Page, M.; Mai, P.M.; Schorlemmer, D.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays, earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  20. Exaggerated Claims About Earthquake Predictions

    NASA Astrophysics Data System (ADS)

    Kafka, Alan L.; Ebel, John E.

    2007-01-01

    The perennial promise of successful earthquake prediction captures the imagination of a public hungry for certainty in an uncertain world. Yet, given the lack of any reliable method of predicting earthquakes [e.g., Geller, 1997; Kagan and Jackson, 1996; Evans, 1997], seismologists regularly have to explain news stories of a supposedly successful earthquake prediction when it is far from clear just how successful that prediction actually was. When journalists and public relations offices report the latest `great discovery' regarding the prediction of earthquakes, seismologists are left with the much less glamorous task of explaining to the public the gap between the claimed success and the sober reality that there is no scientifically proven method of predicting earthquakes.

  1. Testing an earthquake prediction algorithm

    USGS Publications Warehouse

    Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

    1997-01-01

    A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate-term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991, the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, a success ratio that would have been achieved in 53% of random trials under the null hypothesis.
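    The null-hypothesis arithmetic has the shape of a binomial tail: if each target earthquake independently fell inside a randomly assigned alarm with probability p, the chance of eight or more hits out of ten is P(X ≥ 8) for X ~ Binomial(10, p). The value p = 0.45 below is an illustrative coverage fraction chosen for the example, not a figure from the paper, whose null is based on the actual space-time coverage of the M8 alarms.

```python
from math import comb

def tail_prob(p, k, n):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(round(tail_prob(0.45, 8, 10), 4))  # 0.0274, the same order as the reported 2.87%
```

    The same tail with n = 9, k = 5 rises to tens of percent, which is why five-of-nine in the forward test is far less surprising than eight-of-ten was retroactively.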

  2. Early Earthquakes of the Americas

    NASA Astrophysics Data System (ADS)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks: did indigenous cultures (Indians of the Pacific Northwest, Aztecs, Mayas, and Incas) document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  3. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  4. Towards Modelling slow Earthquakes with Geodynamics

    NASA Astrophysics Data System (ADS)

    Regenauer-Lieb, K.; Yuen, D. A.

    2006-12-01

We explore a new, properly scaled, thermal-mechanical geodynamic model¹ that can generate timescales very close to those of earthquakes and of the same order as slow earthquakes. In our simulations we encounter two basically different bifurcation phenomena: one in which the shear zone nucleates in the ductile field, and a second which is fully associated with elasto-plastic (brittle, pressure-dependent) displacements. A quartz/feldspar composite slab has both modes operating simultaneously at three different depth levels. The bottom of the crust is predominantly controlled by the elasto-visco-plastic mode, while the top is controlled by the elasto-plastic mode. The exchange between the two modes appears to communicate along a sub-horizontal layer in a flip-flop fashion, which may yield a fractal-like signature in time and collapses onto a critical temperature, which for crustal rocks is around 500-580 K, in the middle of the brittle-ductile transition zone. Near the critical temperature, stresses close to the ideal strength can be reached locally, at the meter scale. Investigations of the thermal-mechanical properties under such extreme conditions are pivotal for understanding the physics of earthquakes. 1. Regenauer-Lieb, K., Weinberg, R. & Rosenbaum, G. The effect of energy feedbacks on continental strength. Nature 442, 67-70 (2006).

  5. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction, and fault rupture) database was developed to validate, remove discrepancies from, and greatly expand upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17,000 sources of information have been utilised, primarily in the last few years, to present data from over 12,200 damaging earthquakes historically, with over 7,000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, the slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparing the 1923 Great Kanto earthquake (214 billion USD damage in 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan, and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to allow comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  6. [Recommendations for earthquake preparedness in Israel].

    PubMed

    Adler, J; Eldar, R

    2001-09-01

Earthquakes have occurred in the past in our region, along the Afro-Syrian fault line and along the eastern border of Israel. Several earthquakes had a magnitude between 6.25 and 6.5 on the Richter scale and caused severe damage to the populated areas in the Galilee, the Judean Hills, and along the coastline. The last major earthquake occurred in 1927 in the Jordan Valley and caused more than 300 fatalities and extensive property damage. Given the present densely populated regions in the areas at risk, the occurrence of an earthquake with a magnitude of >6.25 would constitute a major disaster, causing thousands of casualties and extensive property and economic damage. Israel is presently planning a comprehensive response to mitigate the damage by enforcing existing anti-seismic building codes, retrofitting public buildings, including hospitals, and utilizing all available manpower and material resources in case of such an event. The health sector is a vital part of the overall preparedness and response. Hospitals have to plan alternative sites for continued activity and increase the number of beds. Army medical teams will have to operate in the disaster area in conjunction and coordination with the Home Front Command rescue teams and the EMS. Public and primary health services will have to be reinforced to deal with acute and chronic health problems in the wake of the disaster. The burial of the dead and their identification will become a major logistic and emotional problem and must be planned in advance. Preparedness includes establishing contact with NGOs and agencies in other countries that may render medical assistance in such an event.

  7. The Lusi mud eruption was not triggered by an earthquake

    NASA Astrophysics Data System (ADS)

    Manga, M.; Rudolph, M. L.; Tingay, M. R.; Davies, R.; Wang, C.; Shirzaei, M.; Fukushima, Y.

    2013-12-01

The Lusi mud eruption in East Java, Indonesia has displaced tens of thousands of people, with economic costs that exceed $4 billion USD to date. Consequently, understanding the cause and future of the eruption is important. There has been considerable debate as to whether the eruption was triggered by the MW 6.3 Yogyakarta earthquake, which struck two days prior to the eruption, or by drilling operations at a gas exploration well (BJP-1) 200 m from the 700 m lineament along which mud first erupted. A recent letter by Lupi et al. (Nature Geoscience, 2013) argues for an earthquake trigger, invoking the presence of a seismically fast structure that amplifies seismic shaking in the mud source region. The absence of an eruption during larger and closer earthquakes reveals that an earthquake trigger is unlikely. Furthermore, the high seismic velocities central to the model of Lupi et al. are impossibly high and are primarily artifacts associated with steel casing installed in the well where the velocities were measured. Finally, the stress changes caused by drilling operations greatly exceeded those produced by the earthquake. Assuming no major changes in plumbing, we conclude by using satellite InSAR to reveal the evolution of surface deformation caused by the eruption and predict a 10-fold decrease in discharge in the next 5 years.

  8. Performance Basis for Airborne Separation

    NASA Technical Reports Server (NTRS)

    Wing, David J.

    2008-01-01

    Emerging applications of Airborne Separation Assistance System (ASAS) technologies make possible new and powerful methods in Air Traffic Management (ATM) that may significantly improve the system-level performance of operations in the future ATM system. These applications typically involve the aircraft managing certain components of its Four Dimensional (4D) trajectory within the degrees of freedom defined by a set of operational constraints negotiated with the Air Navigation Service Provider. It is hypothesized that reliable individual performance by many aircraft will translate into higher total system-level performance. To actually realize this improvement, the new capabilities must be attracted to high demand and complexity regions where high ATM performance is critical. Operational approval for use in such environments will require participating aircraft to be certified to rigorous and appropriate performance standards. Currently, no formal basis exists for defining these standards. This paper provides a context for defining the performance basis for 4D-ASAS operations. The trajectory constraints to be met by the aircraft are defined, categorized, and assessed for performance requirements. A proposed extension of the existing Required Navigation Performance (RNP) construct into a dynamic standard (Dynamic RNP) is outlined. Sample data is presented from an ongoing high-fidelity batch simulation series that is characterizing the performance of an advanced 4D-ASAS application. Data of this type will contribute to the evaluation and validation of the proposed performance basis.

  9. Induced Earthquakes Are Not All Alike: Examples from Texas Since 2008 (Invited)

    NASA Astrophysics Data System (ADS)

    Frohlich, C.

    2013-12-01

The EarthScope Transportable Array passed through Texas between 2008 and 2011, providing an opportunity to identify and accurately locate earthquakes near and/or within oil/gas fields and injection waste disposal operations. In five widely separated geographical locations, the results suggest seismic activity may be induced/triggered. However, the different regions exhibit different relationships between injection/production operations and seismic activity. In the Barnett Shale of northeast Texas, small earthquakes occurred only near higher-volume (volume rate >150,000 BWPM) injection disposal wells; these included widely reported earthquakes occurring near Dallas-Fort Worth and Cleburne in 2008 and 2009. Near Alice in south Texas, M3.9 earthquakes occurred in 1997 and 2010 on the boundary of the Stratton Field, which had been highly productive for both oil and gas since the 1950s. Both earthquakes occurred during an era of net declining production, but their focal depths and location at the field boundary suggest an association with production activity. In the Eagle Ford of south central Texas, earthquakes occurred near wells following significant increases in extraction (water + produced oil) volumes as well as injection. The largest earthquake, the M4.8 Fashing earthquake of 20 October 2011, occurred after significant increases in extraction. In the Cogdell Field near Snyder (west Texas), a sequence of earthquakes beginning in 2006 followed significant increases in the injection of CO2 at nearby wells. The largest, with M4.4, occurred on 11 September 2011; this is the largest known earthquake possibly attributable to CO2 injection. Near Timpson in east Texas, a sequence of earthquakes beginning in 2008, including an M4.8 earthquake on 17 May 2012, occurred within three km of two high-volume injection disposal wells that had begun operation in 2007. These were the first known earthquakes at this location. In summary, the observations find possible induced

  10. MyShake: A smartphone seismic network for earthquake early warning and beyond

    PubMed Central

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis; Kwon, Young-Woo

    2016-01-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics. PMID:26933682
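The on-phone detection step described above must separate sustained earthquake shaking from the brief jolts of everyday handling. MyShake's actual classifier is based on trained features; the following is only a much simpler, hypothetical sketch of the underlying idea (the sampling rate, amplitude threshold, and duration values are illustrative):

```python
def phone_trigger(accel, fs=25, amp_thresh=0.02, min_dur_s=2.0):
    """Hypothetical on-phone screen: flag shaking only if acceleration (in g)
    stays above amp_thresh for at least min_dur_s seconds, filtering the
    short jolts of everyday handling from sustained earthquake-like motion."""
    needed = int(min_dur_s * fs)  # samples of sustained shaking required
    run = 0
    for a in accel:
        run = run + 1 if abs(a) > amp_thresh else 0
        if run >= needed:
            return True
    return False
```

In the real system, candidate triggers from many phones would then be confirmed by the network-level detection algorithm before any alert is issued.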

  13. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comments, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through the probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.
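One common way to extract a controlling earthquake from a PSHA is deaggregation: bin the hazard by magnitude and distance and take the modal bin. A minimal sketch with hypothetical bins (an illustration of the general technique, not the procedure specified in the regulation):

```python
def controlling_event(bins):
    """Deaggregation sketch: given (magnitude, distance_km, contribution)
    bins from a PSHA, return the (M, R) pair contributing most to the
    hazard -- one common definition of the 'controlling earthquake'."""
    return max(bins, key=lambda b: b[2])[:2]

# Hypothetical deaggregation bins: (magnitude, distance_km, contribution)
bins = [(5.5, 10, 0.20), (6.5, 25, 0.45), (7.5, 80, 0.35)]
print(controlling_event(bins))  # -> (6.5, 25)
```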

  14. Probability based earthquake load and resistance factor design criteria for offshore platforms

    SciTech Connect

    Bea, R.G.

    1996-12-31

This paper describes a probability-based reliability formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional, steel, pile-supported, tubular-membered platforms, proposed as a basis for earthquake design criteria and guidelines for offshore platforms that are intended to have worldwide applicability. The formulation is illustrated with application to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.

  15. Fracking, wastewater disposal, and earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, Arthur

    2016-03-01

    In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities also can induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly-controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories for which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks in the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains where standards of building construction are generally not designed to resist shaking from large earthquakes.

  16. The threat of silent earthquakes

    USGS Publications Warehouse

    Cervelli, Peter

    2004-01-01

Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  17. Earthquake history of South Carolina

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

An estimated $23 million in damage was caused by one of the great earthquakes in United States history in 1886. Charleston, S.C., and nearby cities suffered most of the damage, although points as far as 160 km away were strongly shaken. Many of the 20 earthquakes of intensity V or greater (Modified Mercalli scale) that centered within South Carolina occurred near Charleston. A 1924 shock in the western part of the State was felt over 145,000 km². Several earthquakes outside the State borders were felt strongly in South Carolina.

  18. Seismology: dynamic triggering of earthquakes.

    PubMed

    Gomberg, Joan; Johnson, Paul

    2005-10-01

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar.
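The microstrain threshold discussed above can be compared against a standard back-of-the-envelope estimate of dynamic strain from peak ground velocity, strain ≈ PGV/c, where c is the seismic phase velocity. A sketch (the phase velocity below is a typical assumed value, not a figure from the paper):

```python
def dynamic_strain(pgv_m_s, phase_velocity_m_s=3500.0):
    """Rough dynamic strain carried by a seismic wave: strain ~ PGV / c,
    with c the phase velocity (a standard plane-wave approximation)."""
    return pgv_m_s / phase_velocity_m_s

# Does 1 cm/s of shaking exceed a few microstrain?
print(dynamic_strain(0.01) > 1e-6)
```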

  19. Earthquake damage to transportation systems

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

    Earthquakes represent one of the most destructive natural hazards known to man. A large magnitude earthquake near a populated area can affect residents over thousands of square kilometers and cause billions of dollars in property damage. Such an event can kill or injure thousands of residents and disrupt the socioeconomic environment for months, sometimes years. A serious result of a large-magnitude earthquake is the disruption of transportation systems, which limits post-disaster emergency response. Movement of emergency vehicles, such as police cars, fire trucks and ambulances, is often severely restricted. Damage to transportation systems is categorized below by cause including: ground failure, faulting, vibration damage, and tsunamis.

  20. Earthquakes, November-December 1977

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

In the United States, the largest earthquake during this reporting period was a magnitude 6.6 in the Andreanof Islands, part of the Aleutian Islands chain, on November 4 that caused some minor damage. Northern California was struck by a magnitude 4.8 earthquake on November 22, causing moderate damage in the Willits area. This was the most damaging quake in the United States during the year. Two major earthquakes of magnitude 7.0 or above brought the total of such events to 14 for the year.

  1. The key role of eyewitnesses in rapid earthquake impact assessment

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

Uncertainties in rapid earthquake impact models are intrinsically large, even when excluding potential indirect losses (fires, landslides, tsunami…). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building type inventory, and vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty share comparable dimensions of about 10-15 km. When such an earthquake strikes close to an urban area, as in Athens in 1999 (M5.9), earthquake location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in Molise, Italy (M5.7) in 2002, in Bingol, Turkey (M6.4) in 2003, or in Christchurch, New Zealand (M6.3), where respectively 23 out of 30, 84 out of 176, and 115 out of 185 of the casualties perished in a single building failure. Contrastingly, for major earthquakes (M>7), the point source approximation is no longer valid, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral or bilateral, etc., and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  2. OSR encapsulation basis -- 100-KW

    SciTech Connect

    Meichle, R.H.

    1995-01-27

    The purpose of this report is to provide the basis for a change in the Operations Safety Requirement (OSR) encapsulated fuel storage requirements in the 105 KW fuel storage basin which will permit the handling and storing of encapsulated fuel in canisters which no longer have a water-free space in the top of the canister. The scope of this report is limited to providing the change from the perspective of the safety envelope (bases) of the Safety Analysis Report (SAR) and Operations Safety Requirements (OSR). It does not change the encapsulation process itself.

  3. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average/long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
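The STA/LTA idea described above can be sketched directly on a tweet-count time series; the window lengths and trigger threshold below are illustrative, not the tuned values from the study:

```python
def sta_lta_detect(counts, sta_len=2, lta_len=20, threshold=5.0):
    """Flag indices where the short-term average tweet frequency exceeds
    `threshold` times the long-term average (a simple STA/LTA trigger).
    `counts` is a list of tweet counts per time bin."""
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len   # recent activity
        lta = sum(counts[i - lta_len:i]) / lta_len   # background level
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers
```

A sudden burst of "earthquake" tweets raises the short-term average well above the slowly varying background, producing a trigger within one or two time bins of the onset.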

  4. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1977-01-01

In a computer simulation study of earthquakes, a seismically active strike-slip fault is represented by coupled mechanical blocks which are driven by a moving plate and which slide on a friction surface. Elastic forces and time-independent friction are used to generate main shock events, while viscoelastic forces and time-dependent friction add aftershock features. The study reveals that the size, length, and time and place of event occurrence are strongly influenced by the magnitude and degree of homogeneity in the elastic, viscous, and friction parameters of the fault region. For example, periodically reoccurring similar events are observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps are a common feature of simulations employing large variations in the fault parameters. The study also reveals correlations between strain energy release and fault length and average displacement, and between main shock and aftershock displacements.
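The block-and-spring idea can be illustrated with a single-block stick-slip loop, a minimal sketch in the spirit of such models rather than the study's coupled multi-block code; all parameter values are arbitrary:

```python
def stick_slip(steps=2000, k=1.0, v_plate=0.001, static_mu=1.0, dynamic_mu=0.5):
    """Minimal single-block stick-slip sketch: a moving plate loads a spring
    until static friction is exceeded, then the block slips and the stress
    drops to the dynamic friction level. Returns (step, slip) events."""
    load, events = 0.0, []
    for t in range(steps):
        load += k * v_plate                   # elastic loading by the plate
        if load >= static_mu:                 # static friction exceeded
            slip = (load - dynamic_mu) / k    # slip until stress relaxes
            events.append((t, slip))
            load = dynamic_mu                 # stress drop after the event
    return events
```

With uniform parameters this produces the periodically recurring, similar-sized events the abstract describes; making the friction levels heterogeneous along a chain of coupled blocks is what introduces gaps and aftershock-like complexity.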

  5. The Kashmir Earthquake Experience.

    PubMed

    Dhar, Shabir A; Halwai, Manzoor A; Mir, Mohammed R; Wani, Zaid A; Butt, M F; Bhat, Masood I; Hamid, Arshiya

    2007-02-01

On October 8, 2005, a major earthquake measuring 7.6 on the Richter scale struck the Himalayan region of Kashmir. Around 90,000 people died in the mass disaster. The Bone and Joint Hospital in Kashmir found itself in the relatively unique situation of having to deal with the orthopedic morbidity generated by this quake. The hospital received 468 patients over a period of 10 days, of which 463 were received over the initial 5 days. Admissions for a single day peaked at 153 patients on the third day. Because of the unprecedented number of admissions, the hospital used outreach methods to streamline admission by sending specialists out to the affected areas. Manpower was judiciously utilized to concentrate specialist advice where required. Besides documenting the pattern of trauma, this paper throws light on some unforeseen problems faced in dealing with a number of patients far exceeding the normal capacity of the hospital.

  6. Nonextensive models for earthquakes.

    PubMed

    Silva, R; França, G S; Vilar, C S; Alcaniz, J S

    2006-02-01

We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and by introducing a scale between the earthquake energy and the size of the fragments, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.

  7. Reconsidering earthquake scaling

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Wech, A.; Creager, K.; Obara, K.; Agnew, D.

    2016-06-01

    The relationship (scaling) between scalar moment, M0, and duration, T, potentially provides key constraints on the physics governing fault slip. The prevailing interpretation of M0-T observations proposes different scaling for fast (earthquakes) and slow (mostly aseismic) slip populations and thus fundamentally different driving mechanisms. We show that a single model of slip events within bounded slip zones may explain nearly all fast and slow slip M0-T observations, and both slip populations have a change in scaling, where the slip area growth changes from 2-D when too small to sense the boundaries to 1-D when large enough to be bounded. We present new fast and slow slip M0-T observations that sample the change in scaling in each population, which are consistent with our interpretation. We suggest that a continuous but bimodal distribution of slip modes exists and M0-T observations alone may not imply a fundamental difference between fast and slow slip.
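The bounded/unbounded transition described above can be illustrated with a toy scaling model. All constants and the specific slip-scaling assumptions below are illustrative sketches, not the authors' actual model: while the slip zone grows in 2-D, moment grows as T³; once rupture is bounded by the zone width and growth becomes 1-D, moment grows as T.

```python
import math

def moment(T, v=1.0, W=10.0, mu=1.0, strain=1e-4):
    """Toy moment-duration model (hypothetical constants).

    A slip zone of width W grows at speed v. While vT < W the rupture
    area grows in 2-D (A ~ (vT)^2) and, with slip ~ strain*L, M0 ~ T^3.
    Once bounded (vT > W), area grows in 1-D (A ~ W*vT) and slip
    saturates at ~strain*W, giving M0 ~ T."""
    L = v * T
    if L < W:
        return mu * (strain * L) * L**2   # slip ~ strain*L, area ~ L^2
    return mu * (strain * W) * (W * L)    # slip ~ strain*W, area ~ W*L

def loglog_slope(T1, T2):
    # local slope of log M0 versus log T between two durations
    return (math.log(moment(T2)) - math.log(moment(T1))) / (math.log(T2) - math.log(T1))

small = loglog_slope(0.1, 1.0)       # unbounded regime: slope 3
large = loglog_slope(100.0, 1000.0)  # bounded regime: slope 1
```

The single crossover in each population, rather than two intrinsically different mechanisms, is the point of the abstract's interpretation.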

  8. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    NASA Astrophysics Data System (ADS)

    Serata, S.

    2006-12-01

basis to disclose an acting earthquake shear stress S at the top of the tectonic plate is established at a depth of 600-800 m (the Window). This concept is supported by the outcome of the Japanese government stress measurement made at the epicenter of the Kobe earthquake of 1995, where S was found to be less than 5 MPa. At the same time, S at the earthquake-active Ashio mining district was found to be 36 MPa (90 percent of maximum S) at the Window. These findings led to the formulation of a quantitative method proposed to monitor earthquake-triggering potential in and around any growing earthquake stress nucleus along shallow active faults. For future earthquake time prediction, the Stressmeter can be applied first to survey the general distribution of earthquake shear stress S along major active faults. A site with shear stress greater than 30 MPa may be identified as the site of a growing stress nucleus. A Stressmeter must then be permanently buried at the site to monitor future stress growth toward possible triggering, by mathematical analysis of the stress-excursion dynamics. This is made possible by the automatic stress-measurement capability of the Stressmeter, at a frequency of up to 100 times per day. The significance of this approach is the possibility of saving lives by time prediction of a forthcoming major earthquake with accuracy in hours and minutes.

  9. Earthquake Shaking and Damage to Buildings: Recent evidence for severe ground shaking raises questions about the earthquake resistance of structures.

    PubMed

    Page, R A; Joyner, W B; Blume, J A

    1975-08-22

    Ground shaking close to the causative fault of an earthquake is more intense than it was previously believed to be. This raises the possibility that large numbers of buildings and other structures are not sufficiently resistant for the intense levels of shaking that can occur close to the fault. Many structures were built before earthquake codes were adopted; others were built according to codes formulated when less was known about the intensity of near-fault shaking. Although many building types are more resistant than conventional design analyses imply, the margin of safety is difficult to quantify. Many modern structures, such as freeways, have not been subjected to and tested by near-fault shaking in major earthquakes (magnitude 7 or greater). Damage patterns in recent moderate-sized earthquakes occurring in or adjacent to urbanized areas (17), however, indicate that many structures, including some modern ones designed to meet earthquake code requirements, cannot withstand the severe shaking that can occur close to a fault. It is necessary to review the ground motion assumed and the methods utilized in the design of important existing structures and, if necessary, to strengthen or modify the use of structures that are found to be weak. New structures situated close to active faults should be designed on the basis of ground motion estimates greater than those used in the past. The ultimate balance between risk of earthquake losses and cost for both remedial strengthening and improved earthquake-resistant construction must be decided by the public. Scientists and engineers must inform the public about earthquake shaking and its effect on structures. The exposure to damage from seismic shaking is steadily increasing because of continuing urbanization and the increasing complexity of lifeline systems, such as power, water, transportation, and communication systems. 
In the near future we should expect additional painful examples of the damage potential of moderate

  10. Sichuan Earthquake in China

    NASA Technical Reports Server (NTRS)

    2008-01-01

The Sichuan earthquake in China occurred on May 12, 2008, along faults within the mountains, but near, and almost parallel to, the mountain front, northwest of the city of Chengdu. This major quake caused immediate and severe damage to many villages and cities in the area. Aftershocks pose a continuing danger, but another continuing hazard is the widespread occurrence of landslides that have formed new natural dams and consequently new lakes. These lakes are submerging roads and flooding previously developed lands. But an even greater concern is the possible rapid release of water as the lakes eventually overflow the new dams. The dams are generally composed of disintegrated rock debris that may easily erode, leading to greater release of water, which may then cause faster erosion and an even greater release of water. This possible 'positive feedback' between increasing erosion and increasing water release could result in catastrophic debris flows and/or flooding. The danger is well known to the Chinese earthquake response teams, which have been building spillways over some of the new natural dams.

This ASTER image, acquired on June 1, 2008, shows two of the new large landslide dams and lakes upstream from the town of Chi-Kua-Kan at 32°12'N latitude and 104°50'E longitude. Vegetation is green, water is blue, and soil is grayish brown in this enhanced color view. New landslides appear bright off-white. The northern (top) lake is upstream from the southern lake. Close inspection shows a series of much smaller lakes in an elongated 'S' pattern along the original stream path. Note especially the large landslides that created the dams. Some other landslides in this area, such as the large one in the northeast corner of the image, occur only on the mountain slopes, and so do not block streams and do not form lakes.

  11. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Mayeda, K.; Ruppert, S.

    2002-12-01

There is currently a disagreement within the geophysical community over how earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g. Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set finds that the apparent stress increases with magnitude (e.g. Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by analyzing aftershock sequences in the Western U.S. and Turkey using two different techniques. First, we examine the observed regional S-wave spectra by fitting a parametric model (Walter and Taylor, 2002) with and without variable stress-drop scaling. Because the aftershock sequences have common stations and paths, we can examine the S-wave spectra of events by size to determine what type of apparent stress scaling, if any, is most consistent with the data. Second, we use regional coda envelope techniques (e.g. Mayeda and Walter, 1996; Mayeda et al., 2002) on the same events to directly measure energy and moment. The coda technique corrects for path and site effects using an empirical Green's function technique and independent calibration with surface-wave-derived moments. Our hope is that by carefully analyzing a very large number of events in a consistent manner using two different techniques we can start to resolve this apparent stress scaling issue. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.
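As a reminder of the quantity under debate, apparent stress is rigidity times the ratio of radiated energy to seismic moment. A minimal sketch, where the rigidity and the energy and moment values are illustrative assumptions, not measurements from this study:

```python
# Apparent stress: sigma_a = mu * E_R / M0, i.e. crustal rigidity
# times radiated seismic energy divided by seismic moment.
MU = 3.0e10  # assumed crustal rigidity, Pa (a commonly used value)

def apparent_stress(radiated_energy_j, seismic_moment_nm):
    """Return apparent stress in Pa."""
    return MU * radiated_energy_j / seismic_moment_nm

# Hypothetical event with E_R / M0 = 1e-5, a frequently quoted ratio:
sigma_a = apparent_stress(1.0e11, 1.0e16)  # -> 3e5 Pa = 0.3 MPa
```

Constant apparent stress means this ratio is independent of event size; the scaling debate described above is whether sigma_a instead grows with moment.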

  12. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    NASA Astrophysics Data System (ADS)

    Stefansson, R.; Bonafede, M.

    2012-04-01

approach of statistics of universal precursors or stress level. The approach is more closely related to failure physics, studying the ongoing failure. But it requires watching and relevant modeling for years, even decades. Useful information on the fault process, and warnings, can be issued along the way, starting when we discover a fault showing signs of preparatory processes and continuing up to the time of the earthquake. Such information and warnings could be issued by government agencies, in cooperation with scientists, to the local Civil Protection committee closest to the fault, with information about how to prepare, including directives about enhanced watching. For such a warning service we need a continuously operating geo-watching system, applying modern computing technology to the multidisciplinary data, and a rule-based schedule for preparing adequate warnings.

  13. Earthquakes in Stable Continental Crust.

    ERIC Educational Resources Information Center

    Johnston, Arch C.; Kanter, Lisa R.

    1990-01-01

    Discussed are some of the reasons for earthquakes which occur in stable crust away from familiar zones at the ends of tectonic plates. Crust stability and the reactivation of old faults are described using examples from India and Australia. (CW)

  14. The next new Madrid earthquake

    SciTech Connect

    Atkinson, W.

    1988-01-01

Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health, to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses the steps that individuals can take to improve their chances for survival both during and after an earthquake.

  15. Seismology: Remote-controlled earthquakes

    NASA Astrophysics Data System (ADS)

    Hayes, Gavin

    2016-04-01

    Large earthquakes cause other quakes near and far. Analyses of quakes in Pakistan and Chile suggest that such triggering can occur almost instantaneously, making triggered events hard to detect, and potentially enhancing the associated hazards.

  16. Geochemical challenge to earthquake prediction.

    PubMed Central

    Wakita, H

    1996-01-01

    The current status of geochemical and groundwater observations for earthquake prediction in Japan is described. The development of the observations is discussed in relation to the progress of the earthquake prediction program in Japan. Three major findings obtained from our recent studies are outlined. (i) Long-term radon observation data over 18 years at the SKE (Suikoen) well indicate that the anomalous radon change before the 1978 Izu-Oshima-kinkai earthquake can with high probability be attributed to precursory changes. (ii) It is proposed that certain sensitive wells exist which have the potential to detect precursory changes. (iii) The appearance and nonappearance of coseismic radon drops at the KSM (Kashima) well reflect changes in the regional stress state of an observation area. In addition, some preliminary results of chemical changes of groundwater prior to the 1995 Kobe (Hyogo-ken nanbu) earthquake are presented. PMID:11607665

  17. Earthquakes, September-October, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

In the United States, California experienced the strongest earthquake in that State since 1971. The quake, M=6.8, occurred on October 15 in Baja California, Mexico, near the California border, and caused injuries and damage.

  18. Earthquakes; March-April, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    In the United States, a number of earthquakes were experienced, the most damaging one in southern California on March 15. The aftershocks continued in southeastern Alaska but caused no additional damage. 

  19. Electrostatics in sandstorms and earthquakes

    NASA Astrophysics Data System (ADS)

    Shinbrot, Troy; Thyagu, Nirmal; Paehtz, Thomas; Herrmann, Hans

    2010-11-01

    We present new data demonstrating (1) that electrostatic charging in sandstorms is a necessary outcome in a class of rapid collisional flows, and (2) that electrostatic precursors to slip events - long reported in earthquakes - can be reproduced in the laboratory.

  20. Earthquakes, May-June, 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

The months of May and June were very active in terms of earthquake occurrence. Six major earthquakes (magnitude 7.0 or greater) occurred during this period. These earthquakes included a magnitude 7.1 in Papua New Guinea on May 15; a magnitude 7.1, followed by a magnitude 7.5, in the Philippine Islands on May 17; a magnitude 7.0 in the Cuba region on May 25; and a magnitude 7.3 in the Santa Cruz Islands of the Pacific on May 27. In the United States, a magnitude 7.6 earthquake struck southern California on June 28, followed by a magnitude 6.7 quake about three hours later.

  1. Sociological aspects of earthquake prediction

    USGS Publications Warehouse

    Spall, H.

    1979-01-01

Henry Spall talked recently with Denis Mileti, who is in the Department of Sociology, Colorado State University, Fort Collins, Colo. Dr. Mileti is a sociologist involved with research programs that study the socioeconomic impact of earthquake prediction.

  2. Modeling of the Coseismic Electromagnetic Field Observed during the 28 September 2004, M 6.0 Parkfield Earthquake

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Harris, J. M.; Wen, J.; Chen, X.; Hu, H.

    2014-12-01

On 28 September 2004, the M6.0 Parkfield earthquake took place on the San Andreas fault, California. A seismic station named PKD, located near the epicenter, recorded both seismic and electromagnetic (EM) signals during this earthquake. The station is operated by the Berkeley Seismological Laboratory and is equipped with a broadband seismometer and EM sensors installed close to each other. Significant seismic signals as well as clear coseismic EM signals were recorded during this earthquake, providing a good opportunity to study the coseismic EM phenomenon. We modeled the coseismic EM signals from the viewpoint of the electrokinetic effect on the basis of Pride's equations. The earthquake source is taken as a finite fault with a length of 40 km along the strike direction and a width of 15 km along the dip direction. The source parameters used for the calculation were inverted by Liu et al. [2006, BSSA] from the seismic data. Whereas in their inversion the Earth's crust is treated as 7 horizontally layered elastic solids, in our calculation these solid layers are regarded as porous media. Each porous layer has the same P-velocity, S-velocity, and density as its counterpart solid layer. The salinity is set to 0.1 mol/L for all layers, so that conductivity is uniformly distributed with the value of 0.036 S/m. To evaluate the electric and magnetic responses during the rupturing of the earthquake, we use the algorithm developed by Hu and Gao [2011, JGR], which calculates both the seismic and EM wavefields simultaneously. Since the inversion of the source parameters was performed in the frequency band 0.16 Hz-1 Hz, we filter both the synthetic seismoelectric wavefields and the real data before comparing them. Our preliminary result shows that in this frequency range the amplitude of the simulated coseismic electric field is of the order of 1 μV/m, the same order as the real electric data.
This supports the electrokinetic effect to be

  3. Lessons learned by the DOE complex from recent earthquakes

    SciTech Connect

    Eli, M.W.

    1993-07-01

Recent earthquake damage investigations at various industrial facilities have provided the DOE complex with reminders of practical lessons for structures, systems, and components (SSCs) involving: confinement of hazardous materials; continuous, safe operations; occupant safety; and protection of DOE investments and mission-dependent items. Recent assessments are summarized, showing examples of damage caused by the 1992 California earthquakes (Cape Mendocino, Landers, and Big Bear) and the 1991 Costa Rica earthquake (Valle de la Estrella). These lessons, if applied along with the new DOE NPH Standards (1020-92 Series), can help assure that DOE facilities will meet the intent of the seismic requirements in the new DOE NPH Order 5480.28.

  4. Elastic energy release in great earthquakes and eruptions

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust

    2014-05-01

The sizes of earthquakes are measured using well-defined, measurable quantities such as seismic moment and released (transformed) elastic energy. No similar measures exist for the sizes of volcanic eruptions, making it difficult to compare the energies released in earthquakes and eruptions. Here I provide a new measure of the elastic energy (the potential mechanical energy) associated with magma-chamber rupture and contraction (shrinkage) during an eruption. For earthquakes and eruptions, elastic energy derives from two sources: (1) the strain energy stored in the volcano/fault zone before rupture, and (2) the external applied load (force, pressure, stress, displacement) on the volcano/fault zone. From thermodynamic considerations it follows that the elastic energy released or transformed (dU) during an eruption is directly proportional to the excess pressure (pe) in the magma chamber at the time of rupture multiplied by the volume decrease (-dVc) of the chamber, so that dU = -pe dVc. This formula can be used as a basis for a new eruption magnitude scale, based on elastic energy released, which can be related to the moment-magnitude scale for earthquakes. For very large eruptions (>100 km3), the volume of the feeder dike is negligible, so that the decrease in chamber volume during an eruption corresponds roughly to the associated volume of erupted materials Ve, so that the elastic energy is dU ≈ pe Ve. Using a typical excess pressure of 5 MPa, it is shown that the largest known eruptions on Earth, such as the explosive La Garita Caldera eruption (27-28 million years ago) and the largest single (effusive) Columbia River basalt lava flows (15-16 million years ago), both of which have estimated volumes of about 5000 km3, released elastic energy of the order of 10 EJ. For comparison, the seismic moment of the largest earthquake ever recorded, the M9.5 1960 Chile earthquake, is estimated at 100 ZJ and the associated elastic energy release at 10 EJ.
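The order-of-magnitude figure for the largest eruptions can be checked directly from dU ≈ pe Ve, using the 5 MPa excess pressure and 5000 km³ volume quoted in the abstract:

```python
# Elastic energy released in a very large eruption: dU ~ pe * Ve
# (excess chamber pressure times erupted volume).
pe = 5.0e6        # excess magma-chamber pressure, Pa (5 MPa)
Ve = 5000 * 1e9   # erupted volume: 5000 km^3 expressed in m^3

dU = pe * Ve      # elastic energy, J
EJ = 1e18         # 1 exajoule

# dU = 2.5e19 J = 25 EJ, i.e. of the order of 10 EJ as stated.
```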

  5. Coseismic ionospheric and geomagnetic disturbances caused by great earthquakes

    NASA Astrophysics Data System (ADS)

    Hao, Yongqiang; Zhang, Donghe; Xiao, Zuo

    2016-04-01

Despite primary energy disturbances from the Sun, oscillations of the Earth's surface due to a large earthquake will couple with the atmosphere and therefore the ionosphere; the so-called coseismic ionospheric disturbances (CIDs) can then be detected in the ionosphere. Using a combination of techniques (total electron content, HF Doppler, and ground magnetometers), a new time sequence of the propagation of such effects was developed on an observational basis, and ideas on its explanation are provided. In the cases of the 2008 Wenchuan and 2011 Tohoku earthquakes, infrasonic waves accompanying the propagation of seismic Rayleigh waves were observed in the ionosphere by all three kinds of techniques. This is the first report to present CIDs recorded by different techniques at co-located sites and profiled with regard to changes of both ionospheric plasma and current (geomagnetic field) simultaneously. Comparison between the oceanic (2011 Tohoku) and inland (2008 Wenchuan) earthquakes revealed that the main directional lobe of the latter case is more distinct and is perpendicular to the direction of the fault rupture. We argue that the different fault slip (inland or submarine) may affect the coupling of the lithosphere with the atmosphere. References: Zhao, B., and Y. Hao (2015), Ionospheric and geomagnetic disturbances caused by the 2008 Wenchuan earthquake: A revisit, J. Geophys. Res. Space Physics, 120, doi:10.1002/2015JA021035. Hao, Y. Q., Z. Xiao, and D. H. Zhang (2013), Teleseismic magnetic effects (TMDs) of the 2011 Tohoku earthquake, J. Geophys. Res. Space Physics, 118, 3914-3923, doi:10.1002/jgra.50326. Hao, Y. Q., Z. Xiao, and D. H. Zhang (2012), Multi-instrument observation on co-seismic ionospheric effects after great Tohoku earthquake, J. Geophys. Res., 117, A02305, doi:10.1029/2011JA017036.

  6. Memory effect in M ≥ 7 earthquakes of Taiwan

    NASA Astrophysics Data System (ADS)

    Wang, Jeen-Hwa

    2014-07-01

The M ≥ 7 earthquakes that occurred in the Taiwan region during 1906-2006 are taken to study the possibility of a memory effect existing in the sequence of those large earthquakes. All of those events are mainshocks. The fluctuation analysis technique is applied to two sequences, earthquake magnitude and inter-event time, represented in the natural time domain. For both magnitude and inter-event time, the calculations are made for three data sets, i.e., the original-order data, the reverse-order data, and the mean values. Calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for the sequences of both magnitude and inter-event time. In addition, phase portraits of two sequent magnitudes and two sequent inter-event times are applied to explore whether large (or small) earthquakes are followed by large (or small) events. The results lead to a negative answer. Together with all the information in this study, we conclude that the earthquake sequence studied is short-term correlated and thus a short-term memory effect would be operative.
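A minimal sketch of the fluctuation-analysis idea used here, with synthetic data and illustrative window choices (not the paper's actual procedure): the fluctuation function F(n) of the cumulative profile scales as n^H, where H ≈ 0.5 indicates uncorrelated data and H < 0.5 the anti-persistent, short-term correlated behavior reported above.

```python
import math
import random

def fluctuation_exponent(x, windows=(4, 8, 16, 32)):
    """Estimate H in F(n) ~ n^H for a series x."""
    mean = sum(x) / len(x)
    # profile: cumulative sum of deviations from the mean
    profile, s = [], 0.0
    for v in x:
        s += v - mean
        profile.append(s)
    pts = []
    for n in windows:
        # root-mean-square profile increment over all spans of length n
        diffs = [(profile[k + n] - profile[k]) ** 2
                 for k in range(len(profile) - n)]
        pts.append((math.log(n), math.log(math.sqrt(sum(diffs) / len(diffs)))))
    # least-squares slope of log F(n) versus log n
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    num = sum((a - mx) * (b - my) for a, b in pts)
    den = sum((a - mx) ** 2 for a, _ in pts)
    return num / den

random.seed(42)
iid = [random.gauss(0.0, 1.0) for _ in range(4096)]
H = fluctuation_exponent(iid)  # close to 0.5 for uncorrelated input
```

An exponent measurably below 0.5 for the real magnitude and inter-event-time sequences is what motivates the paper's conclusion of short-term memory.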

  7. Post-earthquake dilatancy recovery

    NASA Technical Reports Server (NTRS)

    Scholz, C. H.

    1974-01-01

    Geodetic measurements of the 1964 Niigata, Japan earthquake and of three other examples are briefly examined. They show exponentially decaying subsidence for a year after the quakes. The observations confirm the dilatancy-fluid diffusion model of earthquake precursors and clarify the extent and properties of the dilatant zone. An analysis using one-dimensional consolidation theory is included which agrees well with this interpretation.

  8. Mitigating earthquakes; the federal role

    USGS Publications Warehouse

    Press, F.

    1977-01-01

With the rapid approach of a capability to make reliable earthquake forecasts, it is essential that the Federal Government play a strong, positive role in formulating and implementing plans to reduce earthquake hazards. Many steps are being taken in this direction, with the President looking to the Office of Science and Technology Policy (OSTP) in his Executive Office to provide leadership in establishing and coordinating Federal activities.

  9. Coulomb Stress Interactions Among Earthquakes in the Gorda Deformation Zone, the San Andreas, Mendocino Fracture Zone, and Cascadia Megathrust

    NASA Astrophysics Data System (ADS)

    Rollins, C.; Stein, R. S.

    2008-12-01

    is also important. We also infer, on the basis of continuous small earthquakes and three M>6 earthquakes since 1980, that the MFZ is capable of both large earthquakes and creep along the same sections, behavior typical of oceanic and some continental transforms such as the Hayward and Calaveras faults.

  10. Two models for earthquake forerunners

    USGS Publications Warehouse

    Mjachkin, V.I.; Brace, W.F.; Sobolev, G.A.; Dieterich, J.H.

    1975-01-01

Similar precursory phenomena have been observed before earthquakes in the United States, the Soviet Union, Japan, and China. Two quite different physical models are used to explain these phenomena. According to a model developed by US seismologists, the so-called dilatancy diffusion model, the earthquake occurs near maximum stress, following a period of dilatant crack expansion. Diffusion of water into and out of the dilatant volume is required to explain the recovery of seismic velocity before the earthquake. According to a model developed by Soviet scientists, growth of cracks is also involved, but diffusion of water into and out of the focal region is not required. In this model, the earthquake is assumed to occur during a period of falling stress, and the recovery of velocity is due to crack closure as stress relaxes. In general, the dilatancy diffusion model gives a peaked precursor form, whereas the dry model gives a bay form, in which recovery is well under way before the earthquake. A number of field observations should help to distinguish between the two models: study of post-earthquake recovery, time variation of stress and pore pressure in the focal region, the occurrence of pre-existing faults, and any changes in direction of precursory phenomena during the anomalous period. © 1975 Birkhäuser Verlag.

  11. Mapping Tectonic Stress Using Earthquakes

    SciTech Connect

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-11-23

An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available: earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the Earth's crust.

  12. Hydrological signatures of earthquake strain

    SciTech Connect

Muir-Wood, R.; King, G.C.P.

    1993-12-01

The character of the hydrological changes that follow major earthquakes has been investigated and found to be dependent on the style of faulting. The most significant response is found to accompany major normal-fault earthquakes. Increases in spring and river discharges peak a few days after the earthquake, and typically, excess flow is sustained for a period of 6-12 months. In contrast, hydrological changes accompanying pure reverse-fault earthquakes are either undetected or indicate lowering of well levels and spring flows. Strike-slip and oblique-slip fault movements are associated with a mixture of responses but appear to release no more than 10% of the water volume of the same-sized normal-fault event. For two major normal-fault earthquakes in the western United States (those of Hebgen Lake on August 17, 1959, and Borah Peak on October 28, 1983), there is sufficient river flow information to allow the magnitude and extent of the postseismic discharge to be quantified. The discharge has been converted to a rainfall equivalent, which is found to exceed 100 mm close to the fault and to remain above 10 mm at distances greater than 50 km. Results suggest that water-filled cracks are ubiquitous throughout the brittle continental crust and that these cracks open and close throughout the earthquake cycle. The existence of tectonically induced fluid flows on the scale that we demonstrate has major implications for our understanding of the mechanical and chemical behavior of crustal rocks.
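The rainfall-equivalent conversion used above is simple volume-per-area arithmetic: excess discharge volume divided by the drainage area gives a water depth. A sketch with hypothetical numbers, not values from the study:

```python
def rainfall_equivalent_mm(excess_volume_m3, area_km2):
    """Express an excess post-seismic discharge volume as the depth of
    rainfall (mm) that would supply the same volume over the area."""
    area_m2 = area_km2 * 1e6
    return excess_volume_m3 / area_m2 * 1000.0  # depth in m -> mm

# Hypothetical: 0.3 km^3 of excess flow over a 3000 km^2 catchment
depth = rainfall_equivalent_mm(0.3e9, 3000.0)  # -> 100.0 mm
```

Expressing discharge as a depth makes responses from catchments of very different sizes directly comparable, which is what allows the >100 mm near-fault versus >10 mm far-field contrast quoted above.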

  13. Earthquakes - Volcanoes (Causes and Forecast)

    NASA Astrophysics Data System (ADS)

    Tsiapas, E.

    2009-04-01

The earthquakes are caused by large quantities of liquids (e.g., H2O, H2S, SO2, etc.) moving through the lithosphere and pyrosphere (MOHO discontinuity) until they meet projections (mountains, negative projections, or projections coming from sinking lithosphere). The liquids are moved from west to east, carried along by the pyrosphere because of the differential rotation speed between the pyrosphere and the lithosphere. Starting from an earthquake noticed in one area, and from statistical studies, we can estimate when, where, and how large an earthquake caused by the same quantity of liquids may be in the next region to the east. The forecast of an earthquake ceases to be valid if these components meet a crack in the lithosphere (e.g., the limits of lithospheric plates) or a volcano crater. In this case the liquids come out into the atmosphere in the form of gases, carrying small quantities of lava with them (volcanic explosion).

  14. Global earthquake fatalities and population

    USGS Publications Warehouse

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
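The nonstationary Poisson argument can be sketched numerically. The decade-by-decade population values below are rough approximations (with the 10.1 billion endpoint following the abstract's assumption), and the calibration is a crude illustration, not the authors' method: the rate constant is chosen so that the 20th century expects the 4 observed catastrophic events.

```python
# Rough world population at each decade start, in billions
pop_20th = [1.6, 1.75, 1.9, 2.07, 2.3, 2.55, 3.0, 3.7, 4.45, 5.3]   # 1900-1990
pop_21st = [6.1, 6.9, 7.8, 8.5, 9.0, 9.4, 9.7, 9.9, 10.0, 10.1]     # 2000-2090

def expected_count(pops, rate_per_billion_decade):
    """Nonstationary Poisson mean: Lambda = integral of lambda(t) dt,
    with lambda proportional to population, summed decade by decade."""
    return sum(p * rate_per_billion_decade for p in pops)

# Calibrate the proportionality constant so the 20th century expects
# the 4 observed catastrophic (>100,000-fatality) earthquakes.
c = 4.0 / sum(pop_20th)
forecast = expected_count(pop_21st, c)  # 21st-century expected count
```

This crude sketch lands in the same neighborhood as the paper's 8.7±3.3 estimate; the paper's fitted nonstationary Poisson model is of course more careful than this calibration.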

  15. Building with Earthquakes in Mind

    NASA Astrophysics Data System (ADS)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans face on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property, yet the majority of buildings in areas not traditionally considered earthquake-prone are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead; the solution can be found in retrofitting existing buildings with proper reinforcements and designs. In this project, the students were challenged to construct a basic structure, using limited resources, that could withstand a simulated tremor on an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Because most buildings in New York City were not designed to withstand earthquake shaking, the students gained an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  16. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning and imagery differencing are important methods for augmenting seismic sensors. During response to recent earthquakes (the 1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm, and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved essential for rapid earthquake source characterization. Often, GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data; seismic, geologic, and imagery data alone would miss important details of the source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn helps relate shaking to damage patterns. GNSS also constrains the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site and operated for many hours, and the data then retrieved, processed, and modeled in a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing have led to sub-second overall system latency. Within the past few years, the final challenges of

  17. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.

  18. An evaluation of coordination relationships during earthquake emergency rescue using entropy theory.

    PubMed

    Rong, Huang; Xuedong, Liang; Guizhi, Zeng; Yulin, Ye; Da, Wang

    2015-05-01

    Emergency rescue after an earthquake is complex work which requires the participation of relief and social organizations. Studying earthquake emergency coordination efficiency can not only help rescue organizations define their own rescue missions, but also strengthen inter-organizational communication and collaboration, improve the efficiency of emergency rescue, and reduce losses. In this paper, collaborative entropy is introduced to study earthquake emergency rescue operations. To study the emergency rescue coordination relationship, collaborative matrices and collaborative entropy functions are established between emergency relief work and relief organizations, and the collaborative efficiency of the emergency rescue elements is determined from this entropy function. Finally, the Lushan earthquake is used as an example to evaluate earthquake emergency rescue coordination efficiency.
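
    The entropy machinery can be sketched in a few lines. The weights and the order measure below are illustrative assumptions, not the paper's calibrated collaborative-entropy model:

    ```python
    import math

    def collaboration_entropy(weights):
        """Shannon entropy of the normalized coordination weights linking one
        rescue task to the organizations participating in it."""
        total = sum(weights)
        probs = [w / total for w in weights if w > 0]
        return -sum(p * math.log(p) for p in probs)

    def coordination_order(weights):
        """Order degree 1 - H/Hmax: coordination concentrated in a clear lead
        organization scores high; diffuse, unstructured coordination scores
        near zero. This normalization is an illustrative choice."""
        h_max = math.log(len(weights))
        return 1.0 - collaboration_entropy(weights) / h_max

    # Hypothetical weights: share of one relief task handled by four organizations.
    diffuse = [0.25, 0.25, 0.25, 0.25]   # no clear division of labor
    focused = [0.85, 0.05, 0.05, 0.05]   # one lead organization
    ```

    Applying `coordination_order` row by row to a task-by-organization collaborative matrix gives a per-task efficiency score, which is the spirit of the evaluation applied to the Lushan earthquake.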

  20. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To prevent future threats from natural disasters, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned. In that way, society's attitude toward natural disaster can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real time earthquake games competition into the traditional school curricula. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans built around hands-on seismic monitoring at home or school. We will introduce how 9-year-olds pick P- and S-waves and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., the near-real time earthquake games competition) to make earthquake science fun.
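
    A P-wave pick of the kind students make by eye can be automated with the classic short-term/long-term average (STA/LTA) trigger. The window lengths and threshold below are common textbook choices, not the project's settings:

    ```python
    import numpy as np

    def sta_lta_pick(trace, sr, sta_win=0.5, lta_win=5.0, thresh=4.0):
        """Return the time (s) of the first sample where the ratio of the
        short-term to long-term average of the squared trace exceeds the
        threshold, or None if no trigger occurs. Causal windows via cumsum."""
        ns, nl = int(sta_win * sr), int(lta_win * sr)
        env = np.asarray(trace, float) ** 2
        c = np.concatenate([[0.0], np.cumsum(env)])
        sta = (c[ns:] - c[:-ns]) / ns   # short window ending at each sample
        lta = (c[nl:] - c[:-nl]) / nl   # long window ending at each sample
        ratio = sta[nl - ns:] / np.maximum(lta, 1e-12)  # aligned on end sample
        idx = np.flatnonzero(ratio > thresh)
        return (idx[0] + nl - 1) / sr if idx.size else None
    ```

    On a quiet trace the ratio hovers near 1; when a P wave arrives, the short-term energy jumps well before the long-term average catches up, producing a sharp trigger close to the true onset.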

  1. Earthquake Scenario for the City of Gyumri Including Seismic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Babayan, H.; Karakhanyan, A.; Arakelyan, A.; Avanesyan, M.; Durgaryan, R.; Babayan, S.; Gevorgyan, M.; Hovhannisyan, G.

    2012-12-01

    The city of Gyumri, situated in the north of Armenia, falls in the zone of the active Akhouryan Fault and suffered catastrophic earthquakes twice during the 20th century: the Mw=6.2 earthquake in 1926 and the Spitak earthquake with a magnitude of 6.9 in 1988 killed more than 20,000 people in total. Therefore, current seismic hazard and risk assessment for the city is of great importance. It is also very important to establish how well the lessons of the Spitak earthquake have been learned for this largest city in the Spitak earthquake disaster zone, what the real level of seismic risk is now, and what losses the city could expect if a similar or stronger earthquake occurred today. For this purpose, the most probable earthquake scenarios have been developed by means of a comprehensive assessment of seismic hazard and risk. The study reproduced the actual pattern of effects caused by the Spitak earthquake, in terms of losses and damage to diverse types of buildings, and thus enabled correct selection of the parameter values required to estimate the vulnerability of structures and to test the ELER software package (designed for estimation of losses and damage, developed in the framework of the GEM-EMME Project). The work was realized in the following sequence of steps: probabilistic and deterministic assessment of seismic hazard for the Gyumri city region; choice of earthquake scenario (based on disaggregation and the seismotectonic model); risk estimation for each selected earthquake scenario. In the framework of this study, seismic hazard parameters such as peak ground acceleration and spectral acceleration were investigated and mapped, and a soil model for the city was developed. Subsequently, these maps were used as the basic inputs to assess the expected life, building, and lifeline losses. The presented work was realized with the financial support of UNDP. The results of the Project will serve the basis for national and local

  2. PRELIMINARY SELECTION OF MGR DESIGN BASIS EVENTS

    SciTech Connect

    J.A. Kappes

    1999-09-16

    The purpose of this analysis is to identify the preliminary design basis events (DBEs) for consideration in the design of the Monitored Geologic Repository (MGR). For external events and natural phenomena (e.g., earthquake), the objective is to identify those initiating events that the MGR will be designed to withstand. Design criteria will ensure that radiological release scenarios resulting from these initiating events are beyond design basis (i.e., have a scenario frequency less than once per million years). For internal (i.e., human-induced and random equipment failure) events, the objective is to identify credible event sequences that result in bounding radiological releases. These sequences will be used to establish the design basis criteria for MGR structures, systems, and components (SSCs) in order to prevent or mitigate radiological releases. The safety strategy presented in this analysis for preventing or mitigating DBEs is based on the preclosure safety strategy outlined in ''Strategy to Mitigate Preclosure Offsite Exposure'' (CRWMS M&O 1998f). DBE analysis is necessary to provide feedback and requirements to the design process, and also to demonstrate compliance with proposed 10 CFR 63 (Dyer 1999b) requirements. DBE analysis is also required to identify and classify the SSCs that are important to safety (ITS).

  3. Biological Indicators in Studies of Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Sidorin, A. Ya.; Deshcherevskii, A. V.

    2012-04-01

    Time series of data on variations in the electric activity (EA) of four weakly electric fish (Gnathonemus leopoldianus) and the moving activity (MA) of two catfish (Hoplosternum thoracatum) and two groups of Colombian cockroaches (Blaberus craniifer) were analyzed. The observations were carried out in the Garm region of Tajikistan within the framework of experiments aimed at searching for earthquake precursors. An automatic recording system continuously recorded EA and MA over a period of several years, and hourly mean EA and MA values were processed. Approximately 100 different parameters were calculated on the basis of the six initial EA and MA time series, characterizing different aspects of the EA and MA structure: amplitude of the signal and fluctuations of activity, parameters of diurnal rhythms, correlated changes in the activity of various biological indicators, and others. A detailed analysis of the statistical structure of the total array of parametric time series obtained in the experiment showed that the behavior of all the animals has strong temporal variability: all calculated parameters are unstable and subject to frequent changes. A comparison of these data with seismicity allows us to make the following conclusions: (1) The structure of variations in the studied parameters is represented by flicker noise, or an even more complex process with permanently changing characteristics. Significant statistics are required to prove a cause-and-effect relationship between specific features of such time series and seismicity. (2) The calculation of reconstruction statistics in the EA and MA series structure demonstrated an increase in their frequency in the last hours or few days before an earthquake if the hypocenter distance is comparable to the source size. Sufficiently dramatic anomalies in the behavior of the catfish and cockroaches (changes in the amplitude of activity variation, distortions of diurnal rhythms, increase in the

  4. Catalog of earthquakes along the San Andreas fault system in Central California, July-September 1972

    USGS Publications Warehouse

    Wesson, R.L.; Meagher, K.L.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period July - September 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970, and 1971 have been prepared by Lee and others (1972b, c, d). Catalogs for the first and second quarters of 1972 have been prepared by Wesson and others (1972a, b). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1254 earthquakes in Central California. Arrival times at 129 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 104 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the seismicity of a portion of central California in much greater detail.
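
    Locating an event from arrival times at many stations, as done for this catalog, amounts to minimizing travel-time residuals. A minimal grid-search sketch under a uniform-velocity, surface-source assumption (all values illustrative, far simpler than an operational locator):

    ```python
    import numpy as np

    def grid_locate(stations, arrivals, v=6.0, half_width=50.0, step=1.0):
        """Grid search for the epicenter (x, y) and origin time t0 that
        minimize the RMS of P arrival-time residuals. stations is an (n, 2)
        array of km offsets; v is an assumed uniform P velocity in km/s."""
        xs = np.arange(-half_width, half_width + step, step)
        best = (np.inf, 0.0, 0.0, 0.0)
        for x in xs:
            for y in xs:
                t_travel = np.hypot(stations[:, 0] - x, stations[:, 1] - y) / v
                t0 = np.mean(arrivals - t_travel)        # best-fit origin time
                rms = np.sqrt(np.mean((arrivals - (t0 + t_travel)) ** 2))
                if rms < best[0]:
                    best = (rms, x, y, t0)
        return best
    ```

    Real locators refine this with 3-D velocity models, depth as a free parameter, and iterative least squares, but the residual-minimization idea is the same.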

  5. Evidence for Ancient Mesoamerican Earthquakes

    NASA Astrophysics Data System (ADS)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms, and tableros; toppling of columns; and deformation, settling, and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 %g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Maya Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage at the above sites. As a consequence this earthquake may have accelerated the

  6. Seismic gaps and source zones of recent large earthquakes in coastal Peru

    USGS Publications Warehouse

    Dewey, J.W.; Spence, W.

    1979-01-01

    The earthquakes of central coastal Peru occur principally in two distinct zones of shallow earthquake activity that are inland of and parallel to the axis of the Peru Trench. The interface-thrust (IT) zone includes the great thrust-fault earthquakes of 17 October 1966 and 3 October 1974. The coastal-plate interior (CPI) zone includes the great earthquake of 31 May 1970, and is located about 50 km inland of and 30 km deeper than the interface thrust zone. The occurrence of a large earthquake in one zone may not relieve elastic strain in the adjoining zone, thus complicating the application of the seismic gap concept to central coastal Peru. However, recognition of two seismic zones may facilitate detection of seismicity precursory to a large earthquake in a given zone; removal of probable CPI-zone earthquakes from plots of seismicity prior to the 1974 main shock dramatically emphasizes the high seismic activity near the rupture zone of that earthquake in the five years preceding the main shock. Other conclusions on the seismicity of coastal Peru that affect the application of the seismic gap concept to this region are: (1) Aftershocks of the great earthquakes of 1966, 1970, and 1974 occurred in spatially separated clusters. Some clusters may represent distinct small source regions triggered by the main shock rather than delimiting the total extent of main-shock rupture. The uncertainty in the interpretation of aftershock clusters results in corresponding uncertainties in estimates of stress drop and estimates of the dimensions of the seismic gap that has been filled by a major earthquake. (2) Aftershocks of the great thrust-fault earthquakes of 1966 and 1974 generally did not extend seaward as far as the Peru Trench. (3) None of the three great earthquakes produced significant teleseismic activity in the following month in the source regions of the other two earthquakes. The earthquake hypocenters that form the basis of this study were relocated using station

  7. 77 FR 62523 - Scientific Earthquake Studies Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-15

    ... the USGS's participation in the National Earthquake Hazards Reduction Program. The Committee will... direction of the Earthquake Hazards Program. Meetings of the Scientific Earthquake Studies Advisory... Geological Survey Scientific Earthquake Studies Advisory Committee AGENCY: U.S. Geological Survey....

  8. Volunteers in the earthquake hazard reduction program

    USGS Publications Warehouse

    Ward, P.L.

    1978-01-01

    With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.

  9. Earthquakes with non--double-couple mechanisms.

    PubMed

    Frohlich, C

    1994-05-01

    Seismological observations confirm that the pattern of seismic waves from some earthquakes cannot be produced by slip along a planar fault surface. More than one physical mechanism is required to explain the observed varieties of these non-double-couple earthquakes. The simplest explanation is that some earthquakes are complex, with stress released on two or more suitably oriented, nonparallel fault surfaces. However, some shallow earthquakes in volcanic and geothermal areas require other explanations. Current research focuses on whether fault complexity explains most observed non-double-couple earthquakes and to what extent ordinary earthquakes have non-double-couple components.
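
    The departure of a mechanism from a double couple is commonly quantified by the ratio ε of the smallest-magnitude to largest-magnitude eigenvalue of the deviatoric moment tensor (ε = 0 for a pure double couple, |ε| = 0.5 for a pure CLVD). A sketch with synthetic tensors (the specific matrices are illustrative examples, not data from any event):

    ```python
    import numpy as np

    def clvd_epsilon(M):
        """Non-double-couple measure of a symmetric moment tensor M:
        remove the isotropic part, then take the ratio of the smallest-
        to largest-magnitude deviatoric eigenvalue."""
        dev = M - (np.trace(M) / 3.0) * np.eye(3)   # deviatoric part
        lam = sorted(np.linalg.eigvalsh(dev), key=abs)
        return lam[0] / abs(lam[-1])

    # Pure double couple (vertical strike-slip fault): epsilon = 0
    M_dc = np.array([[0.0, 1.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 0.0, 0.0]])

    # Pure compensated linear vector dipole (CLVD): |epsilon| = 0.5
    M_clvd = np.diag([2.0, -1.0, -1.0])
    ```

    Catalog mechanisms with |ε| well away from zero are the non-double-couple earthquakes the abstract discusses; fault complexity, volcanic, and geothermal sources all push ε away from the pure-slip value.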

  10. Earthquake early warning for Romania - most recent improvements

    NASA Astrophysics Data System (ADS)

    Marmureanu, Alexandru; Elia, Luca; Martino, Claudio; Colombelli, Simona; Zollo, Aldo; Cioflan, Carmen; Toader, Victorin; Marmureanu, Gheorghe; Marius Craiu, George; Ionescu, Constantin

    2014-05-01

    The EEWS for Vrancea earthquakes uses the time interval (28-32 s) between the moment when an earthquake is detected by the local seismic network installed in the epicentral area (Vrancea) and the arrival time of the seismic waves in the protected area (Bucharest) to send an earthquake warning to users. In recent years, the National Institute for Earth Physics (NIEP) has upgraded its seismic network to better cover the seismic zones of Romania. NIEP currently operates a real-time seismic network designed to monitor the seismic activity on Romanian territory, dominated by the Vrancea intermediate-depth (60-200 km) earthquakes. The NIEP real-time network consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Ranger, gs21, Mark l22) and acceleration sensors (Episensor). Recent improvements in the seismic network and real-time communication technologies allow implementation of a nation-wide EEWS for Vrancea and other seismic sources in Romania. We present a regional approach to earthquake early warning for Romanian earthquakes, based on the PRESTo (Probabilistic and Evolutionary early warning SysTem) software platform. PRESTo processes three-channel acceleration data streams in real time: once the P-wave arrivals have been detected, it provides earthquake location and magnitude estimates, and peak ground motion predictions at target sites. PRESTo has been running in real time at the National Institute for Earth Physics, Bucharest for several months in parallel with a secondary EEWS; the alert notification is issued only when both systems validate each other. Here we present the results obtained using offline earthquakes originating from the Vrancea area together with several real
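
    The quoted 28-32 s lead time is consistent with simple travel-time arithmetic. The S-wave speed and delay figures below are generic assumptions for illustration, not NIEP's operational parameters:

    ```python
    def warning_time_s(hypo_dist_km, v_s=3.5, detect_s=6.0, comm_s=2.0):
        """Seconds between alert issuance and S-wave arrival at the target:
        S travel time minus assumed detection and communication delays."""
        return hypo_dist_km / v_s - (detect_s + comm_s)

    # Vrancea hypocenters lie very roughly 130-160 km from Bucharest,
    # which yields lead times of a few tens of seconds.
    lead = warning_time_s(140.0)
    ```

    The lead time grows with hypocentral distance and shrinks with every second of detection or telemetry delay, which is why the epicentral network sits directly over the Vrancea source zone.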

  11. Seismic responses of Baozhusi gravity dam upon MS 8.0 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Zhang, C. H.

    2012-04-01

    Baozhusi gravity dam was not destructively damaged during the Ms 8.0 Wenchuan Earthquake even though the earthquake intensity at the dam site (0.2g) exceeded the design level of the dam (0.1g). To analyze the dam's ability to resist the earthquake, we designed a three-dimensional finite element model to simulate the dam's dynamic responses, taking into account the nonlinearities of contraction joint opening and different combination patterns of the three-component seismic input. Then, with a 2D elasto-plastic yielding analysis, we reassess the seismic safety and discuss the possible failure modes of the dam during strong earthquakes under updated seismic fortification levels. The results demonstrate that (1) the cross-stream component of earthquake motion predominates in the dynamic responses of the dam, while the stream component excites the dam relatively weakly, which is probably why the dam escaped serious damage in the Wenchuan Earthquake; (2) the concrete fracture that occurred near the permanent contraction joints at the top of the dam may have resulted from the impact of concrete blocks during joint opening; (3) the dam safety can meet the requirement under the updated design earthquake (0.27g) but will be lower under the maximum credible earthquake (0.32g), which may affect reservoir operation and the dam's resistance to aftershocks.

  12. Update NEMC Database using Arcgis Software and Example of Simav-Kutahya earthquake sequences

    NASA Astrophysics Data System (ADS)

    Altuncu Poyraz, S.; Kalafat, D.; Kekovali, K.

    2011-12-01

    In this study, 144,043 earthquakes with 2.0≤M≤7.9 from the Kandilli Observatory and Earthquake Research Institute & National Earthquake Monitoring Center (KOERI-NEMC) seismic catalog, which occurred in Turkey during 1900-2011, were used. The database includes not only the coordinates, date, magnitude, and depth of these earthquakes but also location and installation information, field studies, geology, and technical properties of 154 seismic stations. Additionally, 1063 historical earthquakes were included in the database. Source parameters of 738 earthquakes with M≥4.0 that occurred between 1938 and 2008 were added to the database, and source parameters of a further 103 earthquakes (M≥4.5) have been calculated since 2008. To test querying, visualization, and analysis of earthquake characteristics, the aftershock sequence of the 19 May 2011 Simav-Kutahya earthquake was selected and added to the database. The Simav earthquake (western Anatolia), with magnitude Ml=5.9, which occurred at 23:15 local time, is investigated in terms of accurate event locations and the source properties of the largest events. The aftershock distribution of the Simav earthquake shows the activation of a 17-km-long zone, which extends in depth between 5 and 10 km. To contribute to a better understanding of the neotectonics of this region, we analysed the earthquakes using the KOERI (Kandilli Observatory and Earthquake Research Institute) seismic stations along with seismic stations operated by other institutions that successfully recorded the 2011 Simav seismic activity. Source mechanisms of 19 earthquakes with magnitudes 3.8≤Ml<6.0 were calculated by means of the Regional Moment Tensor Inversion (RMT) technique. The mechanism solutions show the presence of east-west-trending normal faults in the region; as a result, an extensional regime dominates the study area. The aim of this study is to store and compile earthquake

  13. Earthquake induced Landslides in the Sikkim Himalaya - A Consequences of the 18th September 2011 Earthquake

    NASA Astrophysics Data System (ADS)

    Sharma, Ashok Kumar

    2015-04-01

    On September 18, 2011 an earthquake of magnitude 6.8 on the Richter scale struck Sikkim at 18:11 IST. The epicenter of the quake was at latitude 27.7° N and longitude 88.2° E, about 64 km north-west of Gangtok, at the junction of the Teesta lineament and the Kanchenjunga fault in the North District of Sikkim. The high-intensity tremor triggered natural calamities in the form of landslides, road blocks, falling boulders, lake bursts, flash floods, falling trees, etc., and caused severe damage to life and property in Sikkim. As the earthquake occurred during the monsoon season, heavy rain and landslides rendered rescue operations extremely difficult, and almost all road connectivity and communication networks were disrupted. Sikkim experiences landslides year after year, especially during the monsoons and periods of intense rain, a hazard that affects the economy of the State very badly. Owing to the earthquake, many new and a few reactivated landslides have occurred in the Sikkim Himalaya.

  14. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1976-01-01

    Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate; the blocks slide on a friction surface. The first model employed elastic forces and time-independent friction to simulate main-shock events. The size, length, and time and place of event occurrence were strongly influenced by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically recurring similar events were frequently observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time, and the aftershock region was confined to that which moved in the main event.
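
    The block-and-spring mechanics described above can be sketched as a quasi-static slider-block model (a simplified Burridge-Knopoff-style toy, not the paper's code; all parameter values are illustrative):

    ```python
    import numpy as np

    def spring_block(n=40, k_c=1.0, k_p=0.2, f_s=1.0, steps=3000, dx=0.005, seed=1):
        """Quasi-static slider-block fault model: n blocks coupled to their
        neighbors (stiffness k_c) and to a slowly advancing driver plate
        (stiffness k_p); a block slips to its zero-force position whenever
        the net force on it exceeds the static friction threshold f_s.
        Returns the number of block slips in each event."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(0.0, 0.5, n)            # heterogeneous initial positions
        stiff = k_p + k_c * np.array([1.0] + [2.0] * (n - 2) + [1.0])
        plate, event_sizes = 0.0, []
        for _ in range(steps):
            plate += dx                         # slow tectonic loading
            slips = 0
            for _ in range(100 * n):            # safety cap on cascade length
                f = k_p * (plate - x)           # plate-spring force
                f[1:] += k_c * (x[:-1] - x[1:])     # left-neighbor springs
                f[:-1] += k_c * (x[1:] - x[:-1])    # right-neighbor springs
                over = np.abs(f) > f_s
                if not over.any():
                    break
                x[over] += f[over] / stiff[over]    # slip to local equilibrium
                slips += int(over.sum())
            if slips:
                event_sizes.append(slips)
        return event_sizes
    ```

    Each slip unloads one block and loads its neighbors, so cascades of slips play the role of earthquakes; varying the spread of the initial positions or friction threshold mimics the homogeneity experiments in the abstract.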

  15. BASIS9.4

    SciTech Connect

    Allsman, R.; Barrett, K.; Busby, L.; Chiu, Y.; Crotinger, J.; Dubois, B.; Dubois, P.F.; Langdon, B.; Motteler, Z.C.; Takemoto, J.; Taylor, S.; Willmann, P.; Wilson, S. )

    1993-08-01

    BASIS9.4 is a system for developing interactive computer programs in Fortran, with some support for C and C++ as well. Using BASIS9.4 you can create a program that has a sophisticated programming language as its user interface so that the user can set, calculate with, and plot, all the major variables in the program. The program author writes only the scientific part of the program; BASIS9.4 supplies an environment in which to exercise that scientific programming which includes an interactive language, an interpreter, graphics, terminal logs, error recovery, macros, saving and retrieving variables, formatted I/O, and online documentation.

  16. Earthquake Forecast Science Research with a Small Satellite

    NASA Astrophysics Data System (ADS)

    Jason, Susan; da Silva Curiel, Alex; Pulinets, Sergey; Sweeting, Martin, Sir

    events, as well as an indication of the seismic centre may also be possible. These mission data should also lead to improved knowledge of the physics of earthquakes, improved accuracy for GPS-based navigation models, and could be used to study the reaction of the global ionosphere during magnetic storms and other solar-terrestrial events. The poster presents an overview of the scientific basis, goals, and proposed platform for this research mission.

  17. Is Your Class a Natural Disaster? It can be... The Real Time Earthquake Education (RTEE) System

    NASA Astrophysics Data System (ADS)

    Whitlock, J. S.; Furlong, K.

    2003-12-01

    In cooperation with the U.S. Geological Survey (USGS) and its National Earthquake Information Center (NEIC) in Golden, Colorado, we have implemented an autonomous version of the NEIC's real-time earthquake database management and earthquake alert system (Earthworm). This is the same system used professionally by the USGS in its earthquake response operations. Utilizing this system, Penn State University students participating in natural hazard classes receive real-time alerts of worldwide earthquake events on cell phones distributed to the class. The students are then responsible for reacting to actual earthquake events, in real time, with the same data (or lack thereof) as earthquake professionals. The project was first implemented in Spring 2002, and although it initially had a high intrigue and "coolness" factor, student interest waned with time. Through student feedback, we observed that scientific data presented on its own, without an educational context, does not foster student learning. In order to maximize the impact of real-time data and the accompanying e-media, the students need to become personally involved. Therefore, in collaboration with the Incorporated Research Institutions for Seismology (IRIS), we have begun to develop an online infrastructure that will help teachers and faculty effectively use real-time earthquake information. The Real-Time Earthquake Education (RTEE) website promotes student learning by integrating inquiry-based education modules with real-time earthquake data. The first module guides the students through an exploration of real-time and historic earthquake datasets to model the most important criteria for determining the potential impact of an earthquake. Having provided the students with content knowledge in the first module, the second module presents a more authentic, open-ended educational experience by setting up an earthquake role-play situation. Through the Earthworm system, we have the ability to "set off

  18. Ten Years of Real-Time Earthquake Loss Alerts

    NASA Astrophysics Data System (ADS)

    Wyss, M.

    2013-12-01

    The most important parameters of an earthquake disaster are, in order of priority: number of fatalities, number of injured, mean damage as a function of settlement, and expected intensity of shaking at critical facilities. The requirements for calculating these parameters in real time are: 1) availability of reliable earthquake source parameters within minutes; 2) capability of calculating expected intensities of strong ground shaking; 3) data sets on population distribution and the condition of the building stock as a function of settlement; 4) data on the locations of critical facilities; 5) verified methods of calculating damage and losses; and 6) personnel available on a 24/7 basis to perform and review these calculations. Three services distribute information about the likely consequences of earthquakes within about half an hour of the event. Two of these calculate losses; one gives only general information. Although much progress has been made during the last ten years in improving the data sets and calculation methods, much remains to be done. The data sets are only first-order approximations and the methods bear refinement. Nevertheless, the quantitative loss estimates produced in real time after damaging earthquakes are generally correct in the sense that they allow distinguishing disastrous from inconsequential events.
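    Requirements 2), 3), and 5) above can be combined into a toy loss calculation. The attenuation and vulnerability coefficients and the settlement data below are entirely hypothetical placeholders, not the calibrated models such services actually use:

    ```python
    import math

    def estimated_intensity(magnitude, distance_km):
        """Illustrative intensity-attenuation relation; the coefficients
        are placeholders, not a calibrated model."""
        return 1.5 * magnitude - 3.0 * math.log10(max(distance_km, 1.0))

    def expected_fatalities(intensity, population, collapse_rate=0.001):
        """Toy vulnerability curve: the fatality fraction grows rapidly
        above intensity VII and is zero below it."""
        if intensity < 7.0:
            return 0.0
        return population * collapse_rate * 10 ** (intensity - 7.0)

    # Hypothetical settlements: (name, epicentral distance in km, population)
    settlements = [("Town A", 15, 50_000), ("Town B", 60, 200_000)]
    for name, dist, pop in settlements:
        i = estimated_intensity(7.2, dist)
        print(name, round(i, 1), round(expected_fatalities(i, pop)))
    ```

    Even this sketch shows why requirement 1) dominates: every downstream number scales with the source parameters supplied in the first minutes.
    
    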

  19. Authorization basis for the 209-E Building

    SciTech Connect

    TIFFANY, M.S.

    1999-02-23

    This Authorization Basis document is one of three documents that constitute the Authorization Basis for the 209-E Building. Per the U.S. Department of Energy, Richland Operations Office (RL) letter 98-WSD-074, this document, the 209-E Building Preliminary Hazards Analysis (WHC-SD-WM-TI-789), and the 209-E Building Safety Evaluation Report (97-WSD-074) constitute the Authorization Basis for the 209-E Building. This Authorization Basis and the associated controls and safety programs will remain in place until safety documentation addressing deactivation of the 209-E Building is developed by the contractor and approved by RL.

  20. Detection of change points in underlying earthquake rates, with application to global mega-earthquakes

    NASA Astrophysics Data System (ADS)

    Touati, Sarah; Naylor, Mark; Main, Ian

    2016-02-01

    The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that effectiveness varies widely among these tests. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved relatively ineffective at reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large
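    The Information-Criterion class of tests can be illustrated with a minimal sketch: BIC comparison of a constant-rate Poisson model against a single-change-point model on a synthetic catalogue. The catalogue, rates, and penalty bookkeeping below are illustrative assumptions, not the paper's actual procedure:

    ```python
    import numpy as np

    def poisson_loglik(n_events, duration, rate):
        # log-likelihood of a homogeneous Poisson process
        # (up to terms independent of the rate)
        return n_events * np.log(rate) - rate * duration

    def best_change_point(times, t_end):
        """Compare a constant-rate model against a one-change-point model
        using BIC; returns (delta_bic, best split time). delta_bic > 0
        favours the change-point model."""
        times = np.asarray(times)
        n = len(times)
        ll0 = poisson_loglik(n, t_end, n / t_end)
        bic0 = -2 * ll0 + 1 * np.log(n)            # one parameter: the rate
        best = (-np.inf, None)
        for i in range(2, n - 2):                  # candidate split at each event
            t_c = times[i]
            ll1 = (poisson_loglik(i, t_c, i / t_c)
                   + poisson_loglik(n - i, t_end - t_c, (n - i) / (t_end - t_c)))
            bic1 = -2 * ll1 + 3 * np.log(n)        # two rates + split location
            delta = bic0 - bic1
            if delta > best[0]:
                best = (delta, float(t_c))
        return best

    # synthetic catalogue: the rate doubles halfway through the window
    rng = np.random.default_rng(1)
    t1 = np.cumsum(rng.exponential(1.0, 200)); t1 = t1[t1 < 100]
    t2 = 100 + np.cumsum(rng.exponential(0.5, 400)); t2 = t2[t2 < 200]
    times = np.concatenate([t1, t2])
    delta_bic, t_change = best_change_point(times, 200.0)
    ```

    On catalogues with triggering, the same comparison would be run after declustering, as the abstract describes for the global M ≥ 7 data.
    
    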

  1. 47 CFR 13.1 - Basis and purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Basis and purpose. 13.1 Section 13.1 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMERCIAL RADIO OPERATORS General § 13.1 Basis and purpose. (a) Basis. The basis for the rules contained in this part is the Communications Act of 1934,...

  2. Multi-parameter observation of pre-earthquake signals and their potential for short -term earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Kalenda, Pavel; Ouzounov, Dimitar; Bobrovskiy, Vadim; Neumann, Libor; Boborykina, Olga; Nazarevych, Andrij; Šebela, Stanka; Kvetko, Július; Shen, Wen-Bin

    2013-04-01

    We present methodologies for multi-parameter observation of pre-earthquake phenomena and their retrospective/prospective testing. The hypothesis that the strongest earthquakes depend on the global stress field leads to global observations and a multi-parameter, multi-sensor approach. In 2012 we performed coordinated tests of several geophysical and environmental parameters that are associated with earthquake preparation processes, namely: 1) rock deformation measurements (Kalenda et al. 2012); 2) subterranean non-stationary electric processes (Bobrovskij 2011); 3) superconducting gravimeter (SG) records and broadband seismometer (BS) time series (Shen et al.); and 4) satellite infrared observations (10-13 μm) measured at the top of the atmosphere (Ouzounov et al., 2011). In a retrospective test for the two most recent major events in Asia, the Wenchuan earthquake (2008, China) and the Tohoku earthquake/tsunami (2011, Japan), our combined analysis showed a coordinated appearance of anomalies days in advance that could be explained by a coupling process between the observed physical parameters and the earthquake preparation processes. In 2012 three internal retrospective alerts were issued days in advance, associated with the following events: the M7.7 Sea of Okhotsk earthquake of August 14, the M7.3 Honshu earthquake of December 7, and the M7.1 Banda Sea earthquake of December 10. Not all observations were able to detect anomalies before the M7.4 Guatemala earthquake of November 11. We discuss the reliability of each observation, their time lags, and their ability to localize and estimate the magnitude of the main shock. References: Bobrovskij V. (2011): Kamchatkian Subterranean Electric Operative Forerunners of Catastrophic Earthquake with M9, occurred close to Honshu Island 2011/03/11. IUGG Meeting, Melbourne, 2011, poster. Kalenda P. et al. (2012): Tilts, global tectonics and earthquake prediction. SWB, London, 247pp. Ouzounov D. et al. (2011): Atmosphere-Ionosphere Response to the M9 Tohoku

  3. Authorization basis requirements comparison report

    SciTech Connect

    Brantley, W.M.

    1997-08-18

    The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents, and recommends a disposition for each to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation.

  4. The effects of the Yogyakarta earthquake at LUSI mud volcano, Indonesia

    NASA Astrophysics Data System (ADS)

    Lupi, M.; Saenger, E. H.; Fuchs, F.; Miller, S. A.

    2013-12-01

    The M6.3 Yogyakarta earthquake shook Central Java on May 27th, 2006. Forty-seven hours later, hot mud burst out at the surface near Sidoarjo, approximately 250 km from the earthquake epicentre. The mud eruption continued and gave rise to LUSI, the youngest mud-volcanic system on Earth. Since the beginning of the eruption, approximately 30,000 people have lost their homes and 13 people have died due to the mud flooding. The causes that initiated the eruption are still debated and are based on different geological observations. The earthquake-triggering hypothesis is supported by the evidence that, at the time of the earthquake, ongoing drilling operations experienced a loss of drilling mud downhole. In addition, the eruption of the mud began only 47 hours after the Yogyakarta earthquake, and the mud reached the surface at different locations aligned along the Watukosek fault, a strike-slip fault upon which LUSI resides. Moreover, the Yogyakarta earthquake also affected the volcanic activity of Mt. Semeru, located as far from the epicentre as LUSI. However, the drilling-triggering hypothesis points out that the earthquake was too distant from LUSI to induce relevant stress changes at depth, and highlights that the upwelling fluids that reached the surface first emerged only 200 m from the drilling rig that was operating at the time. Hence, was LUSI triggered by the earthquake or by drilling operations? We conducted a seismic-wave-propagation study on a geological model based on vp, vs, and density values for the different lithologies and on seismic profiles of the crust beneath LUSI. Our analysis shows compelling evidence for the effects produced by the passage of seismic waves through the geological formations and highlights the importance of the overall geological structure, which focused and reflected incoming seismic energy.

  5. The music of earthquakes and Earthquake Quartet #1

    USGS Publications Warehouse

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  6. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred in Indonesia during the last decade. These experiences are important precepts for people worldwide who live in earthquake and tsunami countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and the characteristic physical behaviour of tsunamis near the coast. We researched two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken during the 2004 Indian Ocean tsunami disaster; nevertheless, these covered only a few restricted points, and tsunami behaviour elsewhere remained unknown. In this study, we tried to collect extensive information about tsunami behaviour, not only at many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photographs. To collect detailed information about the evacuation process, we devised an interview method that involves making pictures of the tsunami experience from the scenes of the victims' stories. In the 2004 Aceh case, no survivors knew about tsunami phenomena: because there had been no big earthquakes with tsunamis for a hundred years in the Sumatra region, the public had no knowledge of tsunamis. This situation was greatly improved by the time of the 2010 Mentawai event; TV programmes and NGO or governmental public-education programmes about tsunami evacuation are now widespread in Indonesia, and many people have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on victims' stories and painted impressive scenes of the two events. We used the drill book in a disaster-education event in a school committee of West Java. About 80% of the students and teachers evaluated the contents of the drill book as useful for correct understanding.

  7. Reduction of earthquake risk in the united states: Bridging the gap between research and practice

    USGS Publications Warehouse

    Hays, W.W.

    1998-01-01

    Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho, the New Madrid and Wabash Valley seismic zones in the central United States, the southeastern and northeastern United States, Puerto Rico, Virgin Islands, Guam, and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. ?? 1998 IEEE.

  8. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; P.H. Seward,; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  9. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench

    PubMed Central

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-01-01

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields to study this phenomenon, since various slow earthquakes and tsunamis have occurred; yet the fault structure and seismic activity there are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find LFEs occur at 15–18 km depths along the plate interface and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at any depths and lacks a typical locked zone. The plate interface is overlaid by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating fluids exist at various depths along the plate interface. PMID:27447546

  10. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench

    NASA Astrophysics Data System (ADS)

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-07-01

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields to study this phenomenon, since various slow earthquakes and tsunamis have occurred; yet the fault structure and seismic activity there are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find LFEs occur at 15-18 km depths along the plate interface and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at any depths and lacks a typical locked zone. The plate interface is overlaid by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating fluids exist at various depths along the plate interface.

  13. Reevaluation of the macroseismic effects of the 23 January 1838 Vrancea earthquake

    NASA Astrophysics Data System (ADS)

    Rogozea, M.; Marmureanu, Gh.; Radulian, M.

    2012-04-01

    The aim of this paper is to analyze the great event that occurred on 23 January 1838 (magnitude 7.5 in the Romanian catalogue). Valuable information has been collected from original or compiled historical sources, such as chronicles and manuscripts of that time, and from related books and reports. The historical data are critically analyzed and, on the basis of our investigation, we assess the significance of the earthquake parameters as derived from the distribution of effects. The pattern of the reevaluated intensity data points for this historical earthquake is compared with that of the instrumentally recorded major earthquake of 4 March 1977, the two events being assumed similar in hypocenter location, source parameters, and rupture propagation. We also make a comparative investigation of the attenuation relationship for Vrancea earthquakes, using historical versus instrumental data. Implications for seismic hazard assessment are finally discussed.

  14. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    NASA Astrophysics Data System (ADS)

    Ingebritsen, S. E.; Shelly, D. R.; Hsieh, P. A.; Clor, L. E.; Seward, P. H.; Evans, W. C.

    2015-11-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  15. Earthquakes & Volcanoes, Volume 23, Number 6, 1992

    USGS Publications Warehouse

    ,; Gordon, David W.

    1993-01-01

    Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers.

  16. The October 12, 1992, Dahshur, Egypt, Earthquake

    USGS Publications Warehouse

    Thenhaus, P.C.; Celebi, M.; Sharp, R.V.

    1993-01-01

    We were part of an international reconnaissance team that investigated the Dahshur earthquake. This article summarizes our findings and points out how even a relatively moderate-sized earthquake can cause widespread damage and a large number of casualties.

  17. Earthquakes and the urban environment. Volume II

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 2 contains chapters on earthquake prediction, control, building design and building response.

  18. Earthquakes and the urban environment. Volume I

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 1 contains chapters on earthquake parameters and hazards.

  19. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. 
They should
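    The arithmetic behind this "low-probability environment" can be illustrated under a simple Poisson assumption; the rate and gain values below are hypothetical examples, not figures from the commission's report:

    ```python
    import math

    def short_term_probability(long_term_rate_per_yr, gain, days):
        """Convert a long-term earthquake rate into a short-term occurrence
        probability under a clustering-driven probability gain, using a
        Poisson approximation for the forecast interval."""
        rate_per_day = long_term_rate_per_yr / 365.25
        return 1 - math.exp(-gain * rate_per_day * days)

    # e.g. a fault with a 1-in-200-year event and a 100-fold gain over one week
    p = short_term_probability(1 / 200, gain=100, days=7)
    ```

    Even with a hundredfold gain, the one-week probability here stays below one percent, which is exactly the decision-making difficulty the paper addresses.
    
    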

  20. Characterisation of Liquefaction Effects for Beyond-Design Basis Safety Assessment of Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Bán, Zoltán; Győri, Erzsébet; János Katona, Tamás; Tóth, László

    2015-04-01

    Preparedness of nuclear power plants for beyond-design-basis external effects became highly important after the Great Tohoku Earthquake of 11 March 2011. For some nuclear power plants constructed on soft-soil sites, liquefaction should be considered a beyond-design-basis hazard. The consequences of liquefaction have to be analysed with the aim of defining the post-event plant condition, identifying plant vulnerabilities, and planning the measures necessary for accident management. In this paper, the methodology for analysing liquefaction effects at nuclear power plants is outlined. The nuclear power plant at Paks, Hungary is used as an example to demonstrate the practical importance of the presented results and considerations. Unlike in design, the conservatism of the methodology for evaluating beyond-design-basis liquefaction effects for an operating plant has to be limited to a reasonable level. Consequently, the applicability of all existing methods has to be considered for the best estimate. The adequacy and conclusiveness of the results are limited mainly by the epistemic uncertainty of the methods used for defining the liquefaction hazard and the engineering parameters characterizing the consequences of liquefaction. The methods have to comply with conflicting requirements: they have to be consistent and widely accepted and used in practice; they have to be based on comprehensive databases; and they have to provide a basis for evaluating the dominant engineering parameters that control the post-liquefaction response of the plant structures. Experience from the Kashiwazaki-Kariwa plant, hit by the Niigata-ken Chuetsu-oki earthquake of 16 July 2007, and analysis of the site conditions and plant layout at the Paks plant have shown that differential settlement is the dominant effect in the case considered. The methods have to be based on probabilistic seismic hazard assessment and allow the integration into logic

  1. 340 waste handling facility interim safety basis

    SciTech Connect

    VAIL, T.S.

    1999-04-01

This document presents an interim safety basis for the 340 Waste Handling Facility, classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to people onsite or offsite.

  2. 340 Waste handling facility interim safety basis

    SciTech Connect

    Stordeur, R.T.

    1996-10-04

This document presents an interim safety basis for the 340 Waste Handling Facility, classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to people onsite or offsite.

  3. Detection of crustal deformation from the Landers earthquake sequence using continuous geodetic measurements

    NASA Technical Reports Server (NTRS)

    Bock, Yehuda; Agnew, Duncan C.; Fang, Peng; Genrich, Joachim F.; Hager, Bradford H.; Herring, Thomas A.; Hudnut, Kenneth W.; King, Robert W.; Larsen, Shawn; Minster, J.-B.

    1993-01-01

The first measurements are reported for a major earthquake by a continuously operating GPS network, the Permanent GPS Geodetic Array (PGGA) in southern California. The Landers and Big Bear earthquakes of June 28, 1992, were monitored by daily observations. Ten weeks of measurements indicate significant coseismic motion at all PGGA sites, significant postseismic motion at one site for two weeks after the earthquakes, and no significant preseismic motion. These measurements demonstrate the potential of GPS monitoring for precise detection of precursory and aftershock seismic deformation in the near and far field.

  4. Earthquake prognosis: causes of failure and ways to solve the problem

    NASA Astrophysics Data System (ADS)

    Kondratiev, O.

    2003-04-01

Despite more than 50 years of research on earthquake prediction, the problem remains unsolved. This casts doubt on the soundness of the chosen approach: the retrospective search for diverse earthquake precursors. It is common to speak of long-term, middle-term and short-term earthquake prediction. All of these are probabilistic in character and would more correctly be considered forecasts of seismic hazard. In contrast, this report discusses the problem of operative prediction. Operative prediction should provide a timely seismic alarm specifying the place, time and size of an earthquake, so that the necessary measures can be taken to minimize the catastrophic consequences of the event. This requires predicting the earthquake location to within a few tens of kilometres, its occurrence time to within a few days, and its magnitude to within about one unit. Formulated in this way, the problem cannot, in principle, be solved within the framework of indirect earthquake precursors. It is necessary to pass from the concept of a passive observatory network to the concept of an object-oriented search for potential source zones, obtaining direct information on changes in medium parameters within these zones during earthquake preparation and development. Formulated in this way, the problem becomes an integrated task for planetary and exploration geophysics. Potential source zones can be detected using the method of converted earthquake waves; monitoring can use seismic reflection and common-depth-point methods. Deployment of these and possibly other geophysical methods should be supported by organising a special integrated geophysical expedition for rapid response to strong earthquakes and conducting purposeful investigation

  5. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  6. Earthquake-dammed lakes in New Zealand

    SciTech Connect

    Adams, J.

    1981-05-01

    Eleven small lakes were formed by landslides caused by the 1929 Buller earthquake; four others were formed by other historic earthquakes in New Zealand. At least nine other New Zealand lakes are also dammed by landslides and were probably formed by prehistoric earthquakes. When recognized by morphology, synchronous age, and areal distribution, earthquake-dammed lakes could provide an estimate of paleoseismicity for the past few hundred or thousand years.

  7. EARTHQUAKES - VOLCANOES (Causes - Forecast - Counteraction)

    NASA Astrophysics Data System (ADS)

    Tsiapas, Elias

    2014-05-01

Earthquakes and volcanoes are caused by: 1) various liquid elements (e.g., H2O, H2S, SO2) which emerge from the pyrosphere and are trapped in the space between the solid crust and the pyrosphere (Moho discontinuity); 2) protrusions of the solid crust at the Moho discontinuity (mountain-range roots, sinking of the lithosphere's plates); 3) the differential movement of crust and pyrosphere. The crust misses one full rotation for approximately every 100 pyrosphere rotations, mostly because of the lunar pull. The above-mentioned elements can be found in small quantities all over the Moho discontinuity, and they are constantly causing minor earthquakes and small volcanic eruptions. When large quantities of these elements (H2O, H2S, SO2, etc.) concentrate, they are carried away by the pyrosphere, moving from west to east under the crust. When this movement takes place under flat surfaces of the solid crust, it does not cause earthquakes. But when these elements come along a protrusion (a mountain root) they concentrate on its western side, displacing the pyrosphere until they fill the space created. Due to the differential movement of pyrosphere and solid crust, a vacuum is created on the eastern side of these protrusions, and when the aforementioned liquids overfill this space, they explode, escaping to the east. At the point of their escape, these liquids are vaporized and compressed, their flow accelerates, their temperature rises due to fluid friction, and they are ionized. On the Earth's surface, a powerful rumbling sound and electrical discharges in the atmosphere, caused by the movement of the gases, are noticeable. When these elements escape, the space on the west side of the protrusion is violently taken up by the pyrosphere, which collides with the protrusion, causing a major earthquake, attenuation of the protrusions, cracks on the solid crust and damage to structures on the Earth's surface. It is easy to foresee when an earthquake will occur and how big it is

  10. Incubation of Chile's 1960 Earthquake

    NASA Astrophysics Data System (ADS)

    Atwater, B. F.; Cisternas, M.; Salgado, I.; Machuca, G.; Lagos, M.; Eipert, A.; Shishikura, M.

    2003-12-01

Infrequent occurrence of giant events may help explain how the 1960 Chile earthquake attained M 9.5. Although old documents imply that this earthquake followed great earthquakes of 1575, 1737 and 1837, only three earthquakes of the past 1000 years produced geologic records like those for 1960. These earlier earthquakes include the 1575 event but not 1737 or 1837. Because the 1960 earthquake had nearly twice the seismic slip expected from plate convergence since 1837, much of the strain released in 1960 may have been accumulating since 1575. Geologic evidence for such incubation comes from new paleoseismic findings at the Río Maullín estuary, which indents the Pacific coast at 41.5° S midway along the 1960 rupture. The 1960 earthquake lowered the area by 1.5 m, and the ensuing tsunami spread sand across lowland soils. The subsidence killed forests and changed pastures into sandy tidal flats. Guided by these 1960 analogs, we inferred tsunami and earthquake history from sand sheets, tree rings, and old maps. At Chuyaquen, 10 km upriver from the sea, we studied sand sheets in 31 backhoe pits on a geologic transect 1 km long. Each sheet overlies the buried soil of a former marsh or meadow. The sand sheet from 1960 extends the entire length of the transect. Three earlier sheets can be correlated at least half that far. The oldest one, probably a tsunami deposit, surrounds herbaceous plants that date to AD 990-1160. Next comes a sandy tidal-flat deposit dated by stratigraphic position to about 1000-1500. The penultimate sheet is a tsunami deposit younger than twigs from 1410-1630. It probably represents the 1575 earthquake, whose accounts of shaking, tsunami, and landslides rival those of 1960. In that case, the record excludes the 1737 and 1837 events. The 1737 and 1837 events also appear missing in tree-ring evidence from islands of Misquihue, 30 km upriver from the sea. Here the subsidence in 1960 admitted brackish tidal water that defoliated tens of thousands of

  11. The algorithm of decomposing superimposed 2-D Poisson processes and its application to the extracting earthquake clustering pattern

    NASA Astrophysics Data System (ADS)

    Pei, Tao; Zhou, Cheng-Hu; Yang, Ming; Luo, Jian-Cheng; Li, Quan-Lin

    2004-01-01

Given the complexity of seismic gestation mechanisms and spatial distributions, we hypothesize that seismic data within a certain spatio-temporal scope are composed of background earthquakes and anomalous earthquakes, and that both satisfy 2-D Poisson processes with different parameters. In this paper, the concept of the N-th order distance is introduced in order to transform the superimposed 2-D Poisson processes into a 1-D mixture density function. After choosing the distance, the mixture density function is decomposed via a genetic algorithm to recognize the anomalous earthquakes. Combined with temporal scanning of the C value, the algorithm is applied to recognizing the spatial pattern of foreshock anomalies, using the Songpan and Longling sequences in southwest China as examples.
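The decomposition step can be illustrated with a small sketch. For a homogeneous 2-D Poisson process of intensity λ, the first-order (nearest-neighbour) distance has density f(r) = 2πλr·exp(−πλr²), so the superimposed data yield a two-component mixture of such densities. The abstract's method uses a genetic algorithm; the sketch below substitutes a simpler EM iteration, and all intensities and sample sizes are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def nn_pdf(r, lam):
    # Nearest-neighbour distance density for a homogeneous 2-D Poisson
    # process of intensity lam: f(r) = 2*pi*lam*r*exp(-pi*lam*r^2)
    return 2 * np.pi * lam * r * np.exp(-np.pi * lam * r**2)

def sample_nn(lam, n):
    # Inverse-CDF sampling from F(r) = 1 - exp(-pi*lam*r^2)
    u = rng.uniform(size=n)
    return np.sqrt(-np.log(1.0 - u) / (np.pi * lam))

def decompose(r, n_iter=300):
    """Split distances r into a dense (anomaly) and a sparse (background)
    Poisson component by expectation-maximization."""
    med = np.median(r)
    lo, hi = r[r <= med], r[r > med]
    lam1 = len(lo) / (np.pi * np.sum(lo**2))   # dense component, initial MLE
    lam2 = len(hi) / (np.pi * np.sum(hi**2))   # sparse component, initial MLE
    p = 0.5
    for _ in range(n_iter):
        a = p * nn_pdf(r, lam1)
        b = (1.0 - p) * nn_pdf(r, lam2)
        w = a / (a + b)                        # E-step: responsibilities
        lam1 = np.sum(w) / (np.pi * np.sum(w * r**2))            # M-step
        lam2 = np.sum(1 - w) / (np.pi * np.sum((1 - w) * r**2))  # M-step
        p = w.mean()
    return lam1, lam2, p

# Synthetic mixture: 70% clustered "anomaly" distances, 30% background
r = np.concatenate([sample_nn(50.0, 1400), sample_nn(2.0, 600)])
lam1, lam2, p = decompose(r)
```

With well-separated intensities the EM iteration recovers both components; for nearly equal intensities the likelihood surface flattens, which is presumably why the authors prefer a global search such as a genetic algorithm.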

  12. Geophysical setting of the 2000 ML 5.2 Yountville, California, earthquake: Implications for seismic Hazard in Napa Valley, California

    USGS Publications Warehouse

    Langenheim, V.E.; Graymer, R.W.; Jachens, R.C.

    2006-01-01

The epicenter of the 2000 ML 5.2 Yountville earthquake was located 5 km west of the surface trace of the West Napa fault, as defined by Helley and Herd (1977). On the basis of a re-examination of geologic data and an analysis of potential-field data, the earthquake occurred on a strand of the West Napa fault, the main basin-bounding fault along the west side of Napa Valley. Linear aeromagnetic anomalies and a prominent gravity gradient extend the length of the fault to the latitude of Calistoga, suggesting that this fault may be capable of larger-magnitude earthquakes. Gravity data indicate an approximately 2-km-deep basin centered on the town of Napa, where damage was concentrated during the Yountville earthquake. The basin most likely played a minor role in enhancing shaking during this event but may lead to enhanced shaking caused by wave trapping during a larger-magnitude earthquake.

  13. A Revised Earthquake Catalogue for South Iceland

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Zechar, J. Douglas; Vogfjörd, Kristín S.; Eberhard, David A. J.

    2016-01-01

In 1991, a new seismic monitoring network named SIL was started in Iceland, with a digital seismic system and automatic operation. The system is equipped with software that reports the automatic location and magnitude of earthquakes, usually within 1-2 min of their occurrence. Normally, automatic locations are manually checked and re-estimated with corrected phase picks, but locations are subject to random errors and systematic biases. In this article, we assess the quality of the catalogue and produce a revised catalogue for South Iceland, the area with the highest seismic risk in Iceland. We explore the effects of filtering events using some common recommendations based on network geometry and station spacing and, as an alternative, filtering based on a multivariate analysis that identifies outliers in the hypocentre error distribution. We identify and remove quarry blasts, and we re-estimate the magnitudes of many events. This revised catalogue, which we consider to be filtered, cleaned, and corrected, should be valuable for building future seismicity models and for assessing seismic hazard and risk. We present a comparative seismicity analysis using the original and revised catalogues: we report characteristics of South Iceland seismicity in terms of b value and magnitude of completeness. Our work demonstrates the importance of carefully checking an earthquake catalogue before proceeding with seismicity analysis.
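Multivariate outlier screening of the kind mentioned can be sketched by flagging events whose hypocentre-error attributes have a large Mahalanobis distance. The attribute set (horizontal error, vertical error, RMS residual) and the chi-square cutoff below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_filter(X, alpha=0.975):
    """Keep rows of X (n events x p error attributes) whose squared
    Mahalanobis distance is within the alpha quantile of chi2(p)."""
    mu = X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    d2 = np.einsum('ij,jk,ik->i', d, inv_cov, d)  # squared Mahalanobis distance
    return d2 <= chi2.ppf(alpha, df=X.shape[1])

# Synthetic hypocentre errors (km): horizontal, vertical, rms residual (s).
# 500 well-located events plus 20 badly constrained ones.
rng = np.random.default_rng(1)
good = rng.normal([1.0, 2.0, 0.2], [0.3, 0.5, 0.05], size=(500, 3))
bad = rng.normal([8.0, 15.0, 1.5], [1.0, 2.0, 0.3], size=(20, 3))
X = np.vstack([good, bad])
keep = mahalanobis_filter(X)
```

Because the mean and covariance here are computed from the contaminated sample, heavy contamination can mask outliers; a robust covariance estimate would be the natural refinement.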

  14. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

Advances in understanding the earthquake cycle and in assessing earthquake hazards are topics of great importance. Dynamic earthquake hazard assessments, resolved for a range of spatial and temporal scales, will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  15. Earthquakes in the New Zealand Region.

    ERIC Educational Resources Information Center

    Wallace, Cleland

    1995-01-01

    Presents a thorough overview of earthquakes in New Zealand, discussing plate tectonics, seismic measurement, and historical occurrences. Includes 10 figures illustrating such aspects as earthquake distribution, intensity, and fissures in the continental crust. Tabular data includes a list of most destructive earthquakes and descriptive effects…

  16. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep-drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude, rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of these extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity; allow the study of extreme events, of the influence of fault-network properties on seismic patterns, and of seismic cycles; and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for mitigating earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).

  17. Izmit, Turkey 1999 Earthquake Interferogram

    NASA Technical Reports Server (NTRS)

    2001-01-01

This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.2 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul.
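The fringe values quoted can be reproduced from the radar geometry. The wavelength and incidence angle below are nominal ERS values (assumptions, since the abstract does not state them), and the horizontal conversion assumes motion purely along the radar look direction.

```python
import math

# ERS C-band wavelength: ~5.66 cm. One interferometric fringe corresponds
# to half a wavelength of motion along the radar line of sight (the signal
# travels the path twice).
wavelength_mm = 56.6
fringe_mm = wavelength_mm / 2.0  # ~28 mm toward the satellite

# ERS looks ~23 degrees off vertical, so purely horizontal motion projects
# onto the line of sight by sin(23 deg); one fringe then corresponds to
# roughly 28 / sin(23 deg) of horizontal motion, matching the ~70 mm quoted.
horiz_per_fringe_mm = fringe_mm / math.sin(math.radians(23.0))
```

By the same arithmetic, the >2.5 m of horizontal fault motion would wrap through several tens of fringes near the rupture.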

  18. Statistical earthquake focal mechanism forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2014-04-01

Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each grid point. Simultaneously, we calculate an average rotation angle between the forecast mechanism and all the surrounding mechanisms, using the method proposed by Kagan & Jackson in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
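A distance-weighted combination of mechanisms can be sketched by averaging normalized moment tensors and projecting the sum onto its best double couple. The smoothing kernel below (w = 1/(d² + r₀²), with an arbitrary r₀) is an illustrative assumption, not the kernel used by Kagan & Jackson.

```python
import numpy as np

def best_double_couple(M):
    """Project a symmetric 3x3 tensor onto its best double couple:
    keep the eigenvectors, replace the eigenvalues by (-s, 0, s)."""
    w, V = np.linalg.eigh(M)
    order = np.argsort(w)                      # ascending eigenvalues
    s = 0.5 * (w[order[2]] - w[order[0]])
    lam = np.array([-s, 0.0, s])
    return (V[:, order] * lam) @ V[:, order].T

def forecast_mechanism(tensors, dists_km, r0_km=50.0):
    """Distance-weighted combination of normalized moment tensors,
    using an assumed kernel w = 1/(d^2 + r0^2)."""
    w = 1.0 / (np.asarray(dists_km, float) ** 2 + r0_km ** 2)
    M = sum(wi * (Ti / np.linalg.norm(Ti)) for wi, Ti in zip(w, tensors))
    return best_double_couple(M)

# Sanity check: two identical strike-slip mechanisms at different
# distances should average back to the same mechanism (up to scale).
M = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
out = forecast_mechanism([M, M], [10.0, 100.0])
```

The rotation angle the abstract describes would then be the minimum double-couple rotation between `out` and each contributing mechanism, which requires the full Kagan-angle computation and is omitted here.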

  19. Fault failure with moderate earthquakes

    USGS Publications Warehouse

    Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

    1987-01-01

High-resolution strain and tilt recordings were made in the near field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller-magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  20. A Probabilistic Assessment of Earthquake Hazard Parameters in NW Himalaya and the Adjoining Regions

    NASA Astrophysics Data System (ADS)

    Yadav, R. B. S.; Bayrak, Yusuf; Tripathi, J. N.; Chopra, S.; Singh, A. P.; Bayrak, Erdem

    2012-09-01

The maximum likelihood estimation method is applied to study the geographical distribution of earthquake hazard parameters and seismicity in 28 seismogenic source zones of NW Himalaya and the adjoining regions. For this purpose, we have prepared a reliable, homogeneous and complete earthquake catalogue for the period 1500-2010. The technique used here allows the data to contain the historical era, the instrumental era, or a combination of both. In this study, the earthquake hazard parameters, which include the maximum regional magnitude (Mmax), the mean seismic activity rate (λ), the parameter b (or β = b/log10(e)) of the Gutenberg-Richter (G-R) frequency-magnitude relationship, and the return periods of earthquakes above a threshold magnitude along with their probabilities of occurrence, have been calculated using only instrumental earthquake data from the period 1900-2010. Uncertainties in magnitude have also been taken into consideration during the calculation of the hazard parameters. The earthquake hazard in the whole NW Himalaya region has been calculated in 28 seismogenic source zones delineated on the basis of seismicity level, tectonics and focal mechanism. The annual probability of exceedance (activity rate) of earthquakes of a certain magnitude is also calculated for all seismogenic source zones. The obtained earthquake hazard parameters were geographically distributed across all 28 seismogenic source zones to analyze the spatial variation of localized seismicity parameters. It is observed that the seismic hazard level is high in the Quetta-Kirthar-Sulaiman region in Pakistan, the Hindukush-Pamir Himalaya region and the Uttarkashi-Chamoli region in the Himalayan Frontal Thrust belt. The source zones expected to have a maximum regional magnitude (Mmax) of more than 8.0 are Quetta, the southern Pamir, the Caucasus and Kashmir-Himachal Pradesh, which have experienced earthquakes of such magnitude in the past. It is observed that the seismic hazard level varies spatially from one zone
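The maximum-likelihood estimate of b referred to here is commonly the Aki formula, with Utsu's half-bin correction for binned magnitudes, and the return period follows directly from the G-R relation. The sketch below uses a synthetic catalogue with illustrative numbers, not the paper's data.

```python
import numpy as np

def aki_utsu_b(mags, mc, dm=0.1):
    """Maximum-likelihood b-value: b = log10(e) / (mean(M) - (Mc - dm/2)).
    dm is the magnitude bin width; use dm=0 for continuous magnitudes."""
    m = np.asarray(mags, float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def return_period(b, rate_mc, mc, m):
    """Mean return period (years) of events with magnitude >= m, given the
    annual rate of events >= mc and the G-R b-value."""
    return 1.0 / (rate_mc * 10.0 ** (-b * (m - mc)))

# Synthetic catalogue: G-R with true b = 1.0 above Mc = 4.0 means
# magnitude excesses are exponential with scale log10(e)/b.
rng = np.random.default_rng(0)
mags = 4.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=50000)
b = aki_utsu_b(mags, mc=4.0, dm=0.0)   # continuous magnitudes, no binning

# If M >= 4 events occur 10 times a year, an M >= 6 event recurs about
# every 1 / (10 * 10**(-1.0 * 2)) = 10 years.
T = return_period(1.0, rate_mc=10.0, mc=4.0, m=6.0)  # -> 10.0
```

Estimating Mmax and the occurrence probabilities, as the abstract does, additionally requires a model for the upper magnitude truncation and is not shown here.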

  1. InSAR observations of the 2009 Racha earthquake, the Republic Georgia

    NASA Astrophysics Data System (ADS)

    Nikolaeva, E.; Walter, T. R.

    2015-08-01

Central Georgia is an area strongly affected by earthquake and landslide hazards. On 29 April 1991 a major earthquake (Mw = 7.0) struck the Racha region of the Republic of Georgia, followed by aftershocks and significant afterslip. The same region was hit by another major event (Mw = 6.0) on 7 September 2009. The aim of the study reported here was to utilize geodetic data, such as synthetic aperture radar interferometry (InSAR), to improve knowledge of the spatial pattern of deformation due to earthquakes in seismically active central Georgia. There had been no previous InSAR observations of earthquakes in Georgia. We used multi-temporal ALOS L-band InSAR data to produce interferograms spanning times before and after the 2009 earthquake. We detected local uplift of around 10 cm in the interferogram near the earthquake's epicenter, whereas evidence of surface rupture could not be found in the field along the active thrust fault. We simulated the deformation signal that could be created by the 2009 Racha earthquake on the basis of local seismic records, using an elastic dislocation model. The observed InSAR deformation is in good agreement with our model. We compared the modeled fault surface of the September 2009 event with the fault surfaces of the April 1991 Racha earthquake, and identified the same fault, or a sub-parallel fault of the same system, as the origin. The patch that was active in 2009 is adjacent to the 1991 patch, indicating a possible mainly westward propagation direction, with important implications for future earthquake hazards.

  2. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first aired in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs.
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  3. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high
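    The single-event location step that both JHD and the ray-tracing method build on can be illustrated with a toy Geiger-style inversion: iterative linearized least squares on arrival-time residuals. Everything below is invented for illustration (a uniform half-space velocity and made-up station/source coordinates, not the Aleutian data set); a real island arc requires the 3-D slab structure discussed above.

```python
import numpy as np

# Toy single-event epicenter location by Geiger's method in a uniform
# half-space. Coordinates, velocity, and origin time are all invented.
v = 6.0                                   # assumed P velocity, km/s
stations = np.array([[0.0, 0.0], [100.0, 0.0],
                     [0.0, 100.0], [80.0, 90.0]])   # station x, y (km)
true_src = np.array([40.0, 35.0])         # "unknown" epicenter
t0_true = 10.0                            # origin time (s)
t_obs = t0_true + np.linalg.norm(stations - true_src, axis=1) / v

x, t0 = np.array([10.0, 10.0]), 0.0       # starting guess
for _ in range(20):
    d = np.linalg.norm(stations - x, axis=1)
    resid = t_obs - (t0 + d / v)          # observed minus predicted
    # Jacobian of predicted arrival times w.r.t. (x, y, t0)
    G = np.column_stack([(x - stations) / (d[:, None] * v),
                         np.ones(len(stations))])
    dm, *_ = np.linalg.lstsq(G, resid, rcond=None)
    x, t0 = x + dm[:2], t0 + dm[2]

print("epicenter:", np.round(x, 3), "origin time:", round(float(t0), 3))
```

    In the paper's setting, lateral slab heterogeneity makes the predicted travel times systematically wrong for some station geometries, which is exactly how the hypocenter bias described above arises.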

  4. Earthquakes triggered by fluid extraction

    NASA Astrophysics Data System (ADS)

    Segall, P.

    1989-10-01

    Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics.
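    The poroelastic coupling invoked above can be made concrete with the standard uniaxial-strain end-member, in which the horizontal total stress over a laterally extensive reservoir changes by Δσ_h = α(1−2ν)/(1−ν)·Δp as pore pressure declines. The parameter values below are illustrative assumptions, not numbers from the paper.

```python
# Back-of-the-envelope poroelastic stress change for a laterally
# extensive reservoir under uniaxial-strain conditions (a standard
# end-member case). All parameter values are assumed for illustration.
alpha = 0.7    # Biot-Willis coefficient (assumed)
nu = 0.25      # drained Poisson's ratio (assumed)
dp = -30e6     # pore-pressure decline of 30 MPa, in Pa

dsigma_h = alpha * (1 - 2 * nu) / (1 - nu) * dp
print(f"horizontal total-stress change: {dsigma_h / 1e6:.1f} MPa")
```

    A decline of several tens of megapascals thus perturbs the horizontal stresses by a comparable order of magnitude, enough to favor reverse faulting above and below the reservoir and normal faulting on its flanks, consistent with the observations summarized above.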

  5. Earth science: lasting earthquake legacy

    USGS Publications Warehouse

    Parsons, Thomas E.

    2009-01-01

    On 31 August 1886, a magnitude-7 shock struck Charleston, South Carolina; low-level activity continues there today. One view of seismic hazard is that large earthquakes will return to New Madrid and Charleston at intervals of about 500 years. With expected ground motions that would be stronger than average, that prospect produces estimates of earthquake hazard that rival those at the plate boundaries marked by the San Andreas fault and Cascadia subduction zone. The result is two large 'bull's-eyes' on the US National Seismic Hazard Maps — which, for example, influence regional building codes and perceptions of public safety.

  6. Earthquakes triggered by fluid extraction

    USGS Publications Warehouse

    Segall, P.

    1989-01-01

    Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics.

  7. A comprehensive framework for post-earthquake rehabilitation plan of lifeline networks

    SciTech Connect

    Liang, J.W.

    1996-12-01

    The post-earthquake rehabilitation process for lifeline networks can be divided into three stages: emergency operation, short-term restoration, and long-term restoration, and different rehabilitation measures should be taken at each stage. This paper outlines the post-earthquake rehabilitation plan for lifeline networks that is being developed for Tianjin City. The objective of this plan is to shorten the time needed to restore lifeline networks and to reduce secondary disasters.

  8. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study of the temporal statistics of earthquake interoccurrence times in the seismically active Kachchh peninsula (western India) using thirteen probability distributions: exponential, gamma, lognormal, Weibull, Lévy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Fréchet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull. Statistical inferences on the scale and shape parameters of these distributions are drawn from maximum likelihood estimation and the Fisher information matrices; the latter serve as a surrogate tool to appraise parametric uncertainty in the estimation process. Models were ranked on the basis of two goodness-of-fit criteria: the maximum likelihood criterion with its modification to the Akaike information criterion (AIC), and the Kolmogorov-Smirnov (K-S) minimum distance criterion. The results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Lévy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit the earthquake catalog of Kachchh and its adjacent regions poorly. This study also analyzes present-day seismicity in terms of estimated recurrence intervals and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation
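    The fitting step for the best-ranked candidate, the exponential distribution, can be sketched as follows: maximum-likelihood estimation of the rate, the AIC, and the one-sample Kolmogorov-Smirnov distance. The interevent times here are synthetic, standing in for the Kachchh catalog.

```python
import numpy as np

# Exponential MLE fit plus AIC and K-S distance on synthetic
# interevent times (years); the real study repeats this for each of
# thirteen candidate distributions and compares the criteria.
rng = np.random.default_rng(0)
t = rng.exponential(scale=2.5, size=200)   # synthetic interevent times

lam = 1.0 / t.mean()                       # exponential MLE: rate = 1/mean
loglik = np.sum(np.log(lam) - lam * t)
aic = 2 * 1 - 2 * loglik                   # one free parameter

# One-sample K-S distance between empirical and fitted CDFs
ts = np.sort(t)
F = 1.0 - np.exp(-lam * ts)                # fitted exponential CDF
n = len(ts)
ks = max(np.max(np.arange(1, n + 1) / n - F),
         np.max(F - np.arange(n) / n))
print(f"lambda = {lam:.3f}/yr, AIC = {aic:.1f}, K-S = {ks:.3f}")
```

    Repeating the same fit for each candidate distribution and comparing AIC values and K-S distances reproduces the ranking exercise the abstract describes.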

  9. Earthquake prediction in Japan and natural time analysis of seismicity

    NASA Astrophysics Data System (ADS)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake, with its huge tsunami, devastated East Japan on 11 March 2011, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither in the short term nor in the long term. Seismologists were shocked because such an event was not even considered possible at the East Japan subduction zone. It was not, however, the only unpredicted earthquake: throughout the several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on short-term prediction, the most important kind. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best route to short-term prediction. After the Kobe disaster, in response to mounting criticism of this lack of success, the policy was defiantly changed to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which in practice meant obtaining "more funding for no prediction research". The public was not, and still is not, informed of this change. Earthquake prediction will obviously become possible only when reliable precursory phenomena are captured, and we have insisted that this is most likely to be achieved through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake works against us, although its epicenter lay far offshore, beyond the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, in ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced: the variance of natural time χ weighted by the normalised energy release at each event. In the case that Seismic Electric Signals
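    The order parameter has a simple closed form: with the events ordered k = 1…N, natural time is χ_k = k/N, the weights p_k are the normalised energy releases, and κ1 = Σ p_k χ_k² − (Σ p_k χ_k)². A minimal sketch with synthetic energies (not from any real catalog):

```python
import numpy as np

# Natural-time order parameter kappa_1: the variance of chi_k = k/N
# under the energy-release weights p_k = E_k / sum(E).
def kappa1(energies):
    energies = np.asarray(energies, dtype=float)
    n = len(energies)
    chi = np.arange(1, n + 1) / n          # natural time of each event
    p = energies / energies.sum()          # normalised energy release
    return np.sum(p * chi**2) - np.sum(p * chi)**2

rng = np.random.default_rng(1)
E = rng.pareto(1.5, size=500) + 1.0        # synthetic heavy-tailed energies
print(f"kappa_1 = {kappa1(E):.4f}")
```

    For equal energies κ1 approaches the uniform variance 1/12 ≈ 0.0833; the natural-time literature singles out κ1 ≈ 0.070 as the critical value signalling approach to a mainshock.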

  10. 4 Earthquake: Major offshore earthquakes recall the Aztec myth

    USGS Publications Warehouse

    ,

    1970-01-01

    Long before the sun clears the eastern mountains on April 29, 1970, the savanna highlands of Chiapas tremble from a magnitude 6.7 earthquake centered off the Pacific coast near Mexico’s southern border. Then, for a few hours, the Isthmus of Tehuantepec is quiet.

  11. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    ERIC Educational Resources Information Center

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  12. Utilizing online monitoring of water wells for detecting earthquake precursors

    NASA Astrophysics Data System (ADS)

    Reuveni, Y.; Anker, Y.; Inbar, N.; Yellin-Dror, A.; Guttman, J.; Flexer, A.

    2015-12-01

    Groundwater reaction to earthquakes is well known and documented, mostly as changes in water levels or spring discharge, but also as changes in groundwater chemistry. During 2004, groundwater level undulations preceded a series of moderate (ML~5) earthquakes that occurred along the Dead Sea Rift System (DSRS). In order to validate these preliminary observations, monitoring of several observation wells was initiated. The monitoring and telemetry infrastructure, as well as the wells themselves, were allocated specifically for this research by the Israeli National Water Company (Mekorot Ltd.). After several earthquake events were missed because of insufficient sampling frequency, and data were lost because of insufficient storage capacity, it was decided to establish an independent monitoring system. This stage of the research commenced in 2011 and has only recently become fully operative. At present there are four observation wells located along major faults adjacent to the DSRS. The wells must be inactive and have a confined production layer. Each well is equipped with sensors for groundwater level, water conductivity, and groundwater temperature. The data acquisition and transfer resolution is one minute, and the dataset is transferred through a GPRS network to a central database server. Since the start of the present research stage, most of the earthquakes recorded in the vicinity of the DSRS were smaller than ML 5, with groundwater responding only after the ground movement. Nonetheless, distant earthquakes occurring as far as 300 km away along a fault adjacent to the DSRS (ML~3) were noticed at the observation wells. A recent recurrence of the precursory pattern was followed by a 5.5 ML earthquake with an epicenter near the eastern shore of the Red Sea, about 400 km south of the wells that anticipated the quake (see figure). In both wells, anomalies in water levels and conductivity were found a few hours before the quake, although any single anomaly cannot

  13. ERTS Applications in earthquake research and mineral exploration in California

    NASA Technical Reports Server (NTRS)

    Abdel-Gawad, M.; Silverstein, J.

    1973-01-01

    Examples are presented showing that ERTS imagery can be effectively utilized to identify, locate, and map faults that exhibit geomorphic evidence of geologically recent breakage. Several important faults not previously known have been identified. By plotting epicenters of historic earthquakes in parts of California, Sonora (Mexico), Arizona, and Nevada, we found that areas known for historic seismicity are often characterized by abundant evidence of recent fault and crustal movements. There are also many examples of seismically quiet areas where outstanding evidence of recent fault movement is observed. One application is clear: ERTS-1 imagery could be effectively utilized to delineate areas susceptible to earthquake recurrence which, on the basis of seismic data alone, might misleadingly be considered safe. ERTS data can also be utilized in planning new sites in the geophysical network of fault-movement monitoring and strain and tilt measurements.

  14. [Basis of radiation protection].

    PubMed

    Roth, J; Schweizer, P; Gückel, C

    1996-06-29

    After an introduction, three selected contributions to the 10th Course on Radiation Protection held at the University Hospital of Basel are presented. The principles of radiation protection and new Swiss legislation are discussed as the basis for radiological protection. Ways are proposed of reducing radiation exposure while optimizing the X-ray picture with a minimum dose to patient and personnel. Radiation effects from low doses: from the beginning, life on this planet has been exposed to ionizing radiation from natural sources. For about one century, additional irradiation has reached us from man-made sources as well. In Switzerland the overall annual radiation exposure from ambient and man-made sources amounts to about 4 mSv. Terrestrial and cosmic radiation and natural radionuclides in the body cause about 1.17 mSv (29%). As much as 1.6 mSv (40%) results from exposure to radon and its progeny, primarily inside homes. Medical applications contribute approximately 1 mSv (26%) to the annual radiation exposure, and releases from atomic weapons, nuclear facilities, and miscellaneous industrial operations yield less than 0.12 mSv (<5%) of the annual dose. Observations of detrimental radiation effects at intermediate to high doses are challenged by observations of biopositive adaptive responses and hormesis following low-dose exposure. The important question of whether cellular adaptive response or hormesis could cause beneficial effects in the human organism that would outweigh the detrimental effects attributed to low radiation doses remains to be resolved. Whether radiation exerts a detrimental, inhibitory, modifying or even beneficial effect is likely to result from identical molecular lesions, but to depend upon their quantity, localization, and time scale of initiation, as well as the specific responsiveness of the cellular systems involved. For matters of radiation protection, the bionegative radiation effects are classified as deterministic effects or

  15. Predictability of population displacement after the 2010 Haiti earthquake.

    PubMed

    Lu, Xin; Bengtsson, Linus; Holme, Petter

    2012-07-17

    Most severe disasters cause large population movements. These movements make it difficult for relief organizations to efficiently reach people in need. Understanding and predicting the locations of affected people during disasters is key to effective humanitarian relief operations and to long-term societal reconstruction. We collaborated with the largest mobile phone operator in Haiti (Digicel) and analyzed the movements of 1.9 million mobile phone users during the period from 42 d before to 341 d after the devastating Haiti earthquake of January 12, 2010. Nineteen days after the earthquake, population movements had caused the population of the capital Port-au-Prince to decrease by an estimated 23%. Both the travel distances and the size of people's movement trajectories grew after the earthquake. These findings, in combination with the disorder that was present after the disaster, suggest that people's movements would have become less predictable. Instead, the predictability of people's trajectories remained high and even increased slightly during the three-month period after the earthquake. Moreover, the destinations of people who left the capital during the first three weeks after the earthquake were highly correlated with their mobility patterns during normal times, and specifically with the locations in which people had significant social bonds. For the people who left Port-au-Prince, the duration of their stay outside the city, as well as the time of their return, followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought.

  16. The New Zealand earthquake of March 2, 1987: Effects on electric power and industrial facilities

    SciTech Connect

    Not Available

    1989-11-01

    The Bay of Plenty region of the North Island of New Zealand was struck by a magnitude 6.3 earthquake on March 2, 1987. The region is a center of large industrial operations for the production of paper and agricultural products. A modern high-voltage electric power system serves the area. The local soil conditions and the complicated faulting mechanism of the earthquake combined to create very intense ground shaking in the vicinity of some of the power and industrial facilities. The effects of the earthquake in some locations were the most severe yet encountered in the post-earthquake investigation program sponsored by EPRI. Extensive damage occurred in substation switchyards, in spite of well-engineered provisions for seismic loads. Industrial operations suffered substantial financial losses from the earthquake, again in spite of generally good seismic design provisions in structures. The earthquake provides excellent illustrations of the seismic fragility of certain equipment and structures common to electric power installations. It also reinforces previous experience regarding the seismic durability of most equipment critical to the safe operation of power plants, nuclear plants in particular. 30 refs., 132 figs., 3 tabs.

  17. Systematic Underestimation of Earthquake Magnitudes from Large Intracontinental Reverse Faults: Historical Ruptures Break Across Segment Boundaries

    NASA Technical Reports Server (NTRS)

    Rubin, C. M.

    1996-01-01

    Because most large-magnitude earthquakes along reverse faults have such irregular and complicated rupture patterns, reverse-fault segments defined on the basis of geometry alone may not be very useful for estimating sizes of future seismic sources. Most modern large ruptures of historical earthquakes generated by intracontinental reverse faults have involved geometrically complex rupture patterns. Ruptures across surficial discontinuities and complexities such as stepovers and cross-faults are common. Specifically, segment boundaries defined on the basis of discontinuities in surficial fault traces, pronounced changes in the geomorphology along strike, or the intersection of active faults commonly have not proven to be major impediments to rupture. Assuming that the seismic rupture will initiate and terminate at adjacent major geometric irregularities will commonly lead to underestimation of magnitudes of future large earthquakes.

  18. Systematical Search for remotely triggered earthquakes in Tibet Plateau following the 2004 M9.2 Sumatra and the 2005 M8.6 Nias earthquake

    NASA Astrophysics Data System (ADS)

    Yao, D.; Peng, Z.; Meng, X.

    2013-12-01

    The continuous collision between the Indian Plate and the Eurasian Plate gives rise to the Tibetan Plateau, one of the most complex tectonic environments in the world. In this study, we search for remotely triggered earthquakes in the Tibetan Plateau following the 2004 M9.2 Sumatra and the 2005 M8.6 Nias earthquakes. From 2002 to 2005, the Hi-CLIMB (Himalayan-Tibetan Continental Lithosphere during Mountain Building) experiment (IRIS network code XF) operated in south-central Tibet, extending from the Ganges lowland across the Himalayas and into the central Tibetan Plateau. Over 200 sites were occupied during this experiment, providing unprecedented continuous recordings around the 2004 Sumatra and 2005 Nias earthquakes. On April 7, 2005, nearly 10 days after the 2005 Nias earthquake, Zhongba was shaken by an Mw 6.3 earthquake, raising the question of whether this event was delay-triggered by the Nias earthquake. The Zhongba region is characterized by many NS-trending rift systems that accommodate EW extension, and several moderate-size earthquakes since 2004 suggest that the region is seismically active. Here we use a recently developed matched-filter technique to examine seismicity-rate changes associated with the 2004 Sumatra and 2005 Nias earthquakes. Specifically, we use 32 relocated earthquakes in Zhongba as templates to detect additional local events between 12/2004 and 05/2005. Although we detect more than 45 times as many events during this period, we find no statistically significant change in the seismicity rate following the Nias earthquake and before the 2005 Zhongba mainshock. Hence, it is difficult to establish a triggering relationship between these two events. However, after manual inspection of the continuous data during the surface-wave trains of the 2004 Sumatra and 2005 Nias earthquakes, we identify many triggered microearthquakes 350 km north of Zhongba. These dynamically triggered events were located in the southern part
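    The matched-filter step can be illustrated with a toy single-channel version: slide a template waveform along continuous data and flag windows whose normalized cross-correlation exceeds a threshold. The waveform, noise level, and threshold below are invented; real implementations stack correlation traces over many stations and components and set the threshold from the noise, e.g. a multiple of the median absolute deviation.

```python
import numpy as np

# Toy matched-filter detection: bury one copy of a template in noise,
# then scan with the normalized (Pearson) cross-correlation.
rng = np.random.default_rng(2)
template = np.sin(2 * np.pi * np.arange(50) / 10) * np.hanning(50)
data = rng.normal(0.0, 0.3, 2000)
data[700:750] += template                 # hidden copy of the template

def norm_xcorr(data, tpl):
    m = len(tpl)
    tpl = (tpl - tpl.mean()) / tpl.std()  # zero-mean, unit-std template
    cc = np.empty(len(data) - m + 1)
    for i in range(len(cc)):
        w = data[i:i + m]
        s = w.std()
        cc[i] = tpl @ (w - w.mean()) / (m * s) if s > 0 else 0.0
    return cc

cc = norm_xcorr(data, template)
detections = np.flatnonzero(cc > 0.6)     # candidate event windows
print("best match at sample", int(np.argmax(cc)))
```

    Using relocated local earthquakes as templates, as the study does, amounts to running this scan once per template and declaring a detection wherever the stacked correlation exceeds the noise-based threshold.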

  19. Intraslab Earthquakes: Dehydration of the Cascadia Slab

    USGS Publications Warehouse

    Preston, L.A.; Creager, K.C.; Crosson, R.S.; Brocher, T.M.; Trehu, A.M.

    2003-01-01

    We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation.

  20. Intraslab earthquakes: dehydration of the Cascadia slab.

    PubMed

    Preston, Leiph A; Creager, Kenneth C; Crosson, Robert S; Brocher, Thomas M; Trehu, Anne M

    2003-11-14

    We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation.