Science.gov

Sample records for operating basis earthquake

  1. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce into all regions of the country that are subject to large and moderate earthquakes systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  2. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  3. [Autism after an earthquake: the experience of L'Aquila (Central Italy) as a basis for an operative guideline].

    PubMed

    Valenti, Marco; Di Giovanni, Chiara; Mariano, Melania; Pino, Maria Chiara; Sconci, Vittorio; Mazza, Monica

    2016-01-01

    People with autism, their families, and their specialised caregivers are a social group at high health risk after a disruptive earthquake. They need emergency assistance and immediate structured support according to definite protocols and quality standards. We recommend establishing national guidelines for the care of people with autism after an earthquake. The adaptive behaviour of participants with autism declined dramatically in the first months after the earthquake in all the dimensions examined (i.e., communication, daily living, socialisation, and motor skills). After relatively stable conditions returned, and with immediate and intensive post-disaster intervention, children and adolescents with autism showed a trend towards partial recovery of adaptive functioning. As to the impact on services, this study indicates the need to support exposed caregivers, who are at high risk of burnout over the first two years after the disaster, and to reorganise person-tailored services immediately.

  4. Physiologic Basis of Nasal Operations

    PubMed Central

    Hilding, A. C.

    1950-01-01

    To be successful, intranasal operations must be designed to restore the normal physiologic function of the nose. One cannot with impunity operate upon the interior of the nose as though it were simply an air flue, nor on the sinuses as though they were boxes. PMID:15400563

  5. The potential uses of operational earthquake forecasting

    USGS Publications Warehouse

    Field, Ned; Jordan, Thomas; Jones, Lucille; Michael, Andrew; Blanpied, Michael L.

    2016-01-01

    This article reports on a workshop held to explore the potential uses of operational earthquake forecasting (OEF). We discuss the current status of OEF in the United States and elsewhere, the types of products that could be generated, the various potential users and uses of OEF, and the need for carefully crafted communication protocols. Although operationalization challenges remain, there was clear consensus among the stakeholders at the workshop that OEF could be useful.

  6. Operational Earthquake Forecasting of Aftershocks for New England

    NASA Astrophysics Data System (ADS)

    Ebel, J.; Fadugba, O. I.

    2015-12-01

    Although the forecasting of mainshocks is not possible, recent research demonstrates that probabilistic forecasts of expected aftershock activity following moderate and strong earthquakes are possible. Previous work has shown that aftershock sequences in intraplate regions behave similarly to those in California, and thus the operational aftershock forecasting methods currently employed in California can be adopted for use in areas of the eastern U.S. such as New England. In our application, immediately after a felt earthquake in New England, a forecast of expected aftershock activity for the next 7 days will be generated based on a generic aftershock activity model. Approximately 24 hours after the mainshock, the parameters of the aftershock model will be updated using the aftershock activity observed to that point in time, and a new forecast of expected aftershock activity for the next 7 days will be issued. The forecast will estimate the average number of weak, felt aftershocks and the average expected number of aftershocks based on the aftershock statistics of past New England earthquakes. The forecast also will estimate the probability that an earthquake stronger than the mainshock will take place during the next 7 days. The aftershock forecast will specify the expected aftershock locations as well as the areas over which aftershocks of different magnitudes could be felt. The system will use web pages, email, and text messages to distribute the aftershock forecasts. For protracted aftershock sequences, new forecasts will be issued on a regular basis, such as weekly. Initially, the distribution system of the aftershock forecasts will be limited, but later it will be expanded as experience with and confidence in the system grows.
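
    The arithmetic behind such a forecast can be sketched with the Reasenberg-Jones formulation, in which the rate of aftershocks with magnitude at least m, at time t days after a mainshock of magnitude Mm, is 10^(a + b(Mm - m)) (t + c)^(-p). The sketch below is illustrative only, not the authors' system; it assumes the generic California parameter values, which the abstract proposes to replace with New England aftershock statistics:

    ```python
    import math

    # Generic California Reasenberg-Jones parameters (assumed for illustration);
    # calibrated New England values would be substituted in practice.
    A, B, P, C = -1.67, 0.91, 1.08, 0.05

    def expected_aftershocks(m_main, m_min, t1, t2):
        """Expected number of aftershocks with M >= m_min between t1 and t2 days."""
        daily = 10 ** (A + B * (m_main - m_min))
        if abs(P - 1.0) < 1e-9:
            integral = math.log((t2 + C) / (t1 + C))
        else:
            integral = ((t2 + C) ** (1 - P) - (t1 + C) ** (1 - P)) / (1 - P)
        return daily * integral

    def prob_at_least_one(n_expected):
        """Poisson probability of one or more qualifying events."""
        return 1.0 - math.exp(-n_expected)

    # Hypothetical 7-day forecast issued one day after an M 4.5 mainshock:
    n_felt = expected_aftershocks(4.5, 3.0, 1.0, 8.0)
    p_bigger = prob_at_least_one(expected_aftershocks(4.5, 4.5, 1.0, 8.0))
    print(f"expected M>=3.0 aftershocks: {n_felt:.2f}")
    print(f"P(aftershock at least as large as the mainshock): {p_bigger:.3f}")
    ```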

  7. Linking earthquakes and hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2013-01-01

    Hydraulic fracturing, also known as fracking, to extract oil and gas from rock, has been a controversial but increasingly common practice; some studies have linked it to groundwater contamination and induced earthquakes. Scientists discussed several studies on the connection between fracking and earthquakes at the AGU Fall Meeting in San Francisco in December.

  8. Assessment of precast beam-column using capacity demand response spectrum subject to design basis earthquake and maximum considered earthquake

    NASA Astrophysics Data System (ADS)

    Ghani, Kay Dora Abd.; Tukiar, Mohd Azuan; Hamid, Nor Hayati Abdul

    2017-08-01

    Malaysia is surrounded by the tectonic features of the Sumatera area, which consists of two seismically active inter-plate boundaries, namely the Indo-Australian and the Eurasian Plates on the west and the Philippine Plates on the east. Hence, Malaysia experiences tremors from distant earthquakes occurring in Banda Aceh, Nias Island, Padang, and other parts of Sumatera, Indonesia. In order to assess the safety of precast buildings in Malaysia under near-field ground motion, response spectrum analysis can be used to deal with future earthquakes whose specific nature is unknown. This paper aimed to develop capacity-demand response spectra subject to the Design Basis Earthquake (DBE) and Maximum Considered Earthquake (MCE) in order to assess the performance of precast beam-column joints. From the capacity-demand response spectrum analysis, it can be concluded that the precast beam-column joints would not survive earthquake excitation with a surface-wave magnitude of more than 5.5 (Type 1 spectra). This means that a beam-column joint designed using the current code of practice (BS8110) would be severely damaged under high earthquake excitation. The analysis also shows that the precast beam-column joints in the prototype studied would be severely damaged when subjected to the Maximum Considered Earthquake (MCE) with PGA = 0.22g and a surface-wave magnitude of more than 5.5, or Type 1 spectra.
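
    For context, the "Type 1 spectra" cited above follow the Eurocode 8 elastic response-spectrum shape. The sketch below (not the authors' code) evaluates that shape; the corner periods shown are the Eurocode 8 Type 1 values for ground type A, and the paper's site-specific parameters may differ:

    ```python
    def ec8_type1_elastic(T, ag, S=1.0, TB=0.15, TC=0.40, TD=2.0, eta=1.0):
        """Eurocode 8 Type 1 elastic spectral acceleration Se(T), in the same
        units as ag (here fractions of g); eta = 1 corresponds to 5% damping."""
        if T <= TB:
            return ag * S * (1.0 + (T / TB) * (eta * 2.5 - 1.0))
        if T <= TC:
            return ag * S * eta * 2.5
        if T <= TD:
            return ag * S * eta * 2.5 * (TC / T)
        return ag * S * eta * 2.5 * (TC * TD / T ** 2)

    # Spectral demand at a 0.5 s period for the quoted MCE level, PGA = 0.22g:
    print(f"Se(0.5 s) = {ec8_type1_elastic(0.5, 0.22):.3f} g")
    ```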

  9. The Bender-Dunne basis operators as Hilbert space operators

    SciTech Connect

    Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph

    2014-02-15

    The Bender-Dunne basis operators, T_{-m,n} = 2^{-n} Σ_{k=0}^{n} C(n,k) q^k p^{-m} q^{n-k}, where C(n,k) is the binomial coefficient and q and p are the position and momentum operators, respectively, are formal integral operators in position representation on the entire real line R for positive integers n and m. We show, by explicit construction of a dense domain, that the operators T_{-m,n} are densely defined operators in the Hilbert space L^2(R).

  10. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  11. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.

  12. Retrospective tests of hybrid operational earthquake forecasting models for Canterbury

    NASA Astrophysics Data System (ADS)

    Rhoades, D. A.; Liukis, M.; Christophersen, A.; Gerstenberger, M. C.

    2016-01-01

    The Canterbury, New Zealand, earthquake sequence, which began in September 2010, occurred in a region of low crustal deformation and previously low seismicity. Because the ensuing seismicity in the region is likely to remain above previous levels for many years, a hybrid operational earthquake forecasting model for Canterbury was developed to inform decisions on building standards and urban planning for the rebuilding of Christchurch. The model estimates occurrence probabilities for magnitudes M ≥ 5.0 in the Canterbury region for each of the next 50 yr. It combines two short-term, two medium-term and four long-term forecasting models. The weight accorded to each individual model in the operational hybrid was determined by an expert elicitation process. A retrospective test of the operational hybrid model, and of an earlier informally developed hybrid model, in the whole New Zealand region has been carried out. The individual and hybrid models were installed in the New Zealand Earthquake Forecast Testing Centre and used to make retrospective annual forecasts of earthquakes with magnitude M > 4.95 from 1986 on, for time-lags up to 25 yr. All models underpredict the number of earthquakes because of an abnormally large number of earthquakes in the testing period since 2008 compared with the learning period. However, the operational hybrid model is more informative than any of the individual time-varying models for nearly all time-lags. Its information gain relative to a reference model of least information decreases as the time-lag increases, becoming zero at a time-lag of about 20 yr. An optimal hybrid model with the same mathematical form as the operational hybrid model was computed for each time-lag from the 26-yr test period. The time-varying component of the optimal hybrid is dominated by the medium-term models for time-lags up to 12 yr; they have hardly any impact on the optimal hybrid model for greater time-lags. The optimal hybrid model is considerably more
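
    The information-gain comparison used in these tests can be made concrete with the usual log-likelihood bookkeeping for gridded rate forecasts. This is a schematic under the assumption of Poisson bin counts, not the Testing Centre's implementation:

    ```python
    import numpy as np

    def poisson_log_likelihood(expected, observed):
        """Joint log-likelihood of observed counts per space-time-magnitude bin
        under a forecast of expected counts (constant terms omitted)."""
        lam = np.maximum(expected, 1e-12)  # guard against empty forecast bins
        return float(np.sum(observed * np.log(lam) - lam))

    def information_gain_per_event(model, reference, observed):
        """Mean log-likelihood advantage of a forecast over a reference model,
        normalized by the number of observed earthquakes."""
        n = observed.sum()
        return (poisson_log_likelihood(model, observed)
                - poisson_log_likelihood(reference, observed)) / n

    # Toy example: three bins, a hybrid forecast vs. a least-information
    # (uniform) reference; all numbers are invented.
    obs = np.array([2, 0, 1])
    hybrid = np.array([1.8, 0.3, 0.9])
    uniform = np.full(3, obs.sum() / 3)
    print(f"IG per event: {information_gain_per_event(hybrid, uniform, obs):.3f}")
    ```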

  13. Minimization of Basis Risk in Parametric Earthquake Cat Bonds

    NASA Astrophysics Data System (ADS)

    Franco, G.

    2009-12-01

    A catastrophe (cat) bond is an instrument used by insurance and reinsurance companies, by governments, or by groups of nations to cede catastrophic risk to the financial markets, which are capable of supplying cover for highly destructive events that surpass the typical capacity of traditional reinsurance contracts. Parametric cat bonds, a specific type of cat bond, use trigger mechanisms or indices that depend on physical event parameters published by respected third parties in order to determine whether part or all of the bond principal is to be paid for a certain event. First-generation cat bonds, or cat-in-a-box bonds, use a trigger mechanism consisting of a set of geographic zones in which certain conditions need to be met by an earthquake's magnitude and depth in order to trigger payment of the bond principal. Second-generation cat bonds use an index formulation that typically consists of a sum of products of a set of weights by a polynomial function of the ground-motion variables reported by a geographically distributed seismic network. These instruments are especially appealing to developing countries with incipient insurance industries wishing to cede catastrophic losses to the financial markets, because the payment trigger mechanism is transparent and does not involve the parties ceding or accepting the risk, significantly reducing moral hazard. In order to be successful in the market, however, parametric cat bonds have typically been required to specify relatively simple trigger conditions. The consequence of such simplifications is increased basis risk: the possibility that the trigger mechanism fails to accurately capture the actual losses of a catastrophic event, namely that it does not trigger for a highly destructive event or, vice versa, that payment of the bond principal is caused by an event that produced insignificant losses. The first case disfavors the sponsor who was seeking cover for its losses while the
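
    The second-generation index structure described above can be written down in a few lines. Everything in this sketch is hypothetical (weights, exponent, attachment and exhaustion levels); it is meant only to show the shape of a weighted polynomial payout trigger:

    ```python
    def trigger_index(ground_motions, weights, exponent=1.5):
        """Weighted sum of a polynomial function of the peak ground motion
        reported by each station of a seismic network."""
        return sum(w * gm ** exponent for w, gm in zip(weights, ground_motions))

    def principal_paid(index, attachment, exhaustion):
        """Fraction of bond principal paid: zero below the attachment level,
        linear up to the exhaustion level, 100% beyond it."""
        if index <= attachment:
            return 0.0
        return min(1.0, (index - attachment) / (exhaustion - attachment))

    # Hypothetical event: PGA (g) at four stations and calibrated weights.
    idx = trigger_index([0.31, 0.12, 0.25, 0.08], [1.0, 0.6, 0.8, 0.4])
    print(f"index = {idx:.3f}, payout = {principal_paid(idx, 0.2, 0.5):.0%}")
    ```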

  14. Operational earthquake forecasting in the South Iceland Seismic Zone: improving the earthquake catalogue

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Vogfjörd, Kristin; Zechar, J. Douglas; Eberhard, David

    2014-05-01

    A major earthquake sequence is ongoing in the South Iceland Seismic Zone (SISZ), where experts expect earthquakes of up to MW = 7.1 in the coming years to decades. The historical seismicity in this region is well known, and many major faults here and on the Reykjanes Peninsula (RP) have already been mapped. The faults are predominantly N-S with right-lateral strike-slip motion, while the overall motion in the SISZ is E-W oriented left-lateral motion. The area that we propose for operational earthquake forecasting (OEF) contains both the SISZ and the RP. The earthquake catalogue considered for OEF, called the SIL catalogue, spans the period from 1991 until September 2013 and contains more than 200,000 earthquakes. Some of these events have a large azimuthal gap between stations, and some have large horizontal and vertical uncertainties. We are interested in building seismicity models using high-quality data, so we filter the catalogue using the criteria proposed by Gomberg et al. (1990) and Bondar et al. (2004); a filtering sketch follows below. The resulting filtered catalogue contains around 130,000 earthquakes. Magnitude estimates in the Iceland catalogue also require special attention. The SIL system uses two methods to estimate magnitude. The first is based on an empirical local magnitude (ML) relationship. The other magnitude scale is a so-called "local moment magnitude" (MLW), originally constructed by Slunga et al. (1984) to agree with local magnitude scales in Sweden. There are two main problems with the magnitude estimates in the SIL catalogue, and consequently it is not immediately possible to convert MLW to moment magnitude (MW): (i) immediate aftershocks of large events are assigned magnitudes that are too high; and (ii) the seismic moment of large earthquakes is underestimated. For this reason the magnitude values in the catalogue must be corrected before developing an OEF system. To obtain a reliable MW estimate, we calibrate a magnitude relationship based on
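
    The quality-filtering step amounts to a simple predicate over per-event metadata. The threshold values below are placeholders standing in for the Gomberg et al. (1990) and Bondar et al. (2004) criteria, and the field names are hypothetical:

    ```python
    def passes_quality(event, max_az_gap=180.0, max_h_err_km=1.0, max_v_err_km=2.0):
        """Keep an event only if its station azimuthal gap and its horizontal
        and vertical location uncertainties fall within the thresholds."""
        return (event["az_gap_deg"] <= max_az_gap
                and event["h_err_km"] <= max_h_err_km
                and event["v_err_km"] <= max_v_err_km)

    catalog = [
        {"id": "ev1", "az_gap_deg": 95.0, "h_err_km": 0.4, "v_err_km": 1.1},
        {"id": "ev2", "az_gap_deg": 240.0, "h_err_km": 0.3, "v_err_km": 0.9},
    ]
    filtered = [e for e in catalog if passes_quality(e)]
    print([e["id"] for e in filtered])  # -> ['ev1']; ev2 fails on azimuthal gap
    ```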

  15. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  16. Operational Earthquake Forecasting and Earthquake Early Warning: The Challenges of Introducing Scientific Innovations for Public Safety

    NASA Astrophysics Data System (ADS)

    Goltz, J. D.

    2016-12-01

    Although variants of both earthquake early warning and short-term operational earthquake forecasting systems have been implemented or are now being implemented in some regions and nations, they have been slow to gain acceptance within the disciplines that produced them as well as among those they were intended to assist. Accelerating the development and implementation of these technologies will require the cooperation and collaboration of multiple disciplines, some inside and others outside of academia. Seismologists, social scientists, emergency managers, elected officials and key opinion leaders from the media and public must be participants in this process. Representatives of these groups come from both inside and outside of academia and represent very different organizational cultures, backgrounds and expectations for these systems, sometimes leading to serious disagreements and impediments to further development and implementation. This presentation will focus on examples of the emergence of earthquake early warning and operational earthquake forecasting systems in California, Japan and other regions, and will document the challenges confronted in the ongoing effort to improve seismic safety.

  17. The Establishment of an Operational Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Lombardi, Anna Maria; Casarotti, Emanuele

    2014-05-01

    Just after the Mw 6.2 earthquake that hit L'Aquila on April 6, 2009, the Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) that paved the way for the development of Operational Earthquake Forecasting (OEF), defined as the "procedures for gathering and disseminating authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes". In this paper we introduce the first official OEF system in Italy, developed by the newly founded Centro di Pericolosità Sismica at the Istituto Nazionale di Geofisica e Vulcanologia. The system provides a daily update of the weekly probabilities of ground shaking over the whole Italian territory. In this presentation, we describe in detail the philosophy behind the system, the scientific details, and the output format that has been preliminarily defined in agreement with Civil Protection. To our knowledge, this is the first operational system that fully satisfies the ICEF guidelines. Probably the most sensitive issue is the communication of such a message to the population. Acknowledging this inherent difficulty, and in agreement with Civil Protection, we are planning pilot tests to be carried out in a few selected areas in Italy; the purpose of these tests is to check the effectiveness of the message and to receive feedback.

  18. FB Line Basis for Interim Operation

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The safety analysis of the FB-Line Facility indicates that the operation of FB-Line to support the current mission does not present undue risk to facility and co-located workers, the general public, or the environment.

  19. Design basis for the NRC Operations Center

    SciTech Connect

    Lindell, M.K.; Wise, J.A.; Griffin, B.N.; Desrosiers, A.E.; Meitzler, W.D.

    1983-05-01

    This report documents the development of a design for a new NRC Operations Center (NRCOC). The project was conducted in two phases: organizational analysis and facility design. In order to control the amount of traffic, congestion and noise within the facility, it is recommended that information flow in the new NRCOC be accomplished by means of an electronic Status Information Management System. Functional requirements and a conceptual design for this system are described. An idealized architectural design and a detailed design program are presented that provide the appropriate amount of space for operations, equipment and circulation within team areas. The overall layout provides controlled access to the facility and, through the use of a zoning concept, provides each team within the NRCOC the appropriate balance of ready access and privacy determined from the organizational analyses conducted during the initial phase of the project.

  20. Solid waste retrieval. Phase 1, Operational basis

    SciTech Connect

    Johnson, D.M.

    1994-09-30

    This document describes the operational requirements, procedures, and options for retrieval of the waste containers placed in buried storage in Burial Ground 218W-4C, Trench 04, as TRU waste or suspect TRU waste under the activity-level definitions in effect at the time of placement. Trench 04 in Burial Ground 218W-4C is dedicated entirely to storage of retrievable TRU or suspect TRU waste containers and has not been used for any other purpose.

  21. Lessons of L'Aquila for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  22. Operational real-time GPS-enhanced earthquake early warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.; Johanson, I. A.; Allen, R. M.

    2014-10-01

    Moment magnitudes for large earthquakes (Mw≥7.0) derived in real time from near-field seismic data can be underestimated due to instrument limitations, ground tilting, and saturation of frequency/amplitude-magnitude relationships. Real-time high-rate GPS resolves the buildup of static surface displacements with the S wave arrival (assuming nonsupershear rupture), thus enabling the estimation of slip on a finite fault and the event's geodetic moment. Recently, a range of high-rate GPS strategies have been demonstrated on off-line data. Here we present the first operational system for real-time GPS-enhanced earthquake early warning as implemented at the Berkeley Seismological Laboratory (BSL) and currently analyzing real-time data for Northern California. The BSL generates real-time position estimates operationally using data from 62 GPS stations in Northern California. A fully triangulated network defines 170+ station pairs processed with the software trackRT. The BSL uses G-larmS, the Geodetic Alarm System, to analyze these positioning time series and determine static offsets and preevent quality parameters. G-larmS derives and broadcasts finite fault and magnitude information through least-squares inversion of the static offsets for slip based on a priori fault orientation and location information. This system tightly integrates seismic alarm systems (CISN-ShakeAlert, ElarmS-2) as it uses their P wave detections to trigger its processing; quality control runs continuously. We use a synthetic Hayward Fault earthquake scenario on real-time streams to demonstrate recovery of slip and magnitude. Reanalysis of the Mw7.2 El Mayor-Cucapah earthquake tests the impact of dynamic motions on offset estimation. Using these test cases, we explore sensitivities to disturbances of a priori constraints (origin time, location, and fault strike/dip).
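
    The G-larmS step described above is, at its core, an ordinary least-squares inversion of static offsets for slip on an a priori fault, followed by a geodetic moment sum. A minimal sketch, assuming a known Green's function matrix (random numbers stand in here) and the standard Hanks-Kanamori moment-to-magnitude conversion; this is not the G-larmS code itself:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in Green's function: maps slip (m) on 4 fault patches to offsets (m)
    # at 6 GPS components; in practice it is built from the a priori fault
    # geometry and an elastic half-space model.
    G = rng.normal(size=(6, 4))
    true_slip = np.array([1.2, 0.8, 0.5, 0.1])
    offsets = G @ true_slip + rng.normal(0.0, 0.005, size=6)  # noisy static offsets

    slip, *_ = np.linalg.lstsq(G, offsets, rcond=None)

    # Geodetic moment M0 = mu * sum(A_i * |s_i|); Mw = (2/3) * (log10(M0) - 9.05).
    mu = 3.0e10                  # shear modulus, Pa
    areas = np.full(4, 1.0e8)    # patch areas, m^2 (10 km x 10 km each)
    m0 = mu * np.sum(areas * np.abs(slip))
    mw = (2.0 / 3.0) * (np.log10(m0) - 9.05)
    print(f"estimated Mw = {mw:.2f}")
    ```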

  23. Geological and seismological survey for new design-basis earthquake ground motion of Kashiwazaki-Kariwa NPS

    NASA Astrophysics Data System (ADS)

    Takao, M.; Mizutani, H.

    2009-05-01

    At about 10:13 on July 16, 2007, a strong earthquake named the 'Niigata-ken Chuetsu-oki Earthquake', of Mj 6.8 on the Japan Meteorological Agency's scale, occurred offshore of Niigata prefecture in Japan. However, all of the nuclear reactors at Kashiwazaki-Kariwa Nuclear Power Station (KKNPS) in Niigata prefecture, operated by Tokyo Electric Power Company, shut down safely. In other words, the automatic safety functions of shutdown, cooling and containment worked as designed immediately after the earthquake. During the earthquake, the peak acceleration of the ground motion exceeded the design-basis ground motion (DBGM), but the force applied to safety-significant facilities by the earthquake was about the same as or less than the design basis taken into account as static seismic force. In order to reassess the safety of the nuclear power plants, we have evaluated a new DBGM after conducting geomorphological, geological, geophysical and seismological surveys and analyses. [Geomorphological, Geological and Geophysical survey] In the land area, aerial photograph interpretation was performed within at least a 30 km radius to extract landforms that could possibly be tectonic reliefs. Geological reconnaissance was then conducted to confirm whether or not the extracted landforms are tectonic reliefs. We investigated the Nagaoka Plain Western Boundary Fault Zone (NPWBFZ), which consists of the Kakuda-Yahiko, Kihinomiya and Katakai faults, with particular care, because the NPWBFZ is one of the active faults in Japan with the potential for an Mj 8 class event. In addition to the geological survey, seismic reflection prospecting of approximately 120 km in total length was completed to evaluate the geological structure of the faults and to assess the continuity of the component faults of the NPWBFZ. As a result of the geomorphological, geological and geophysical surveys, we evaluated that the three component faults of the NPWBFZ are independent of each other from the

  24. Scientific and non-scientific challenges for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2015-12-01

    Tracking the time evolution of seismic hazard in time windows shorter than the usual 50 years of long-term hazard models may offer additional opportunities to reduce seismic risk. This is the target of operational earthquake forecasting (OEF). During the development of OEF in Italy we identified several challenges, ranging from pure science to the more practical interface of science with society. From a scientific point of view, although earthquake clustering is the clearest empirical evidence about earthquake occurrence, and OEF clustering models are the most (successfully) tested hazard models in seismology, we note that some seismologists are still reluctant to accept their scientific reliability. After exploring the motivations for these scientific doubts, we also look into an issue that is often overlooked in this discussion: in any kind of hazard analysis, we do not use a model because it is the true one, but because it is better than anything else we can think of. The non-scientific aspects are mostly related to the fact that OEF usually provides weekly probabilities of large earthquakes smaller than 1%. These probabilities are considered by some seismologists too small to be of interest or use. However, in a recent collaboration with engineers we show that such earthquake probabilities may lead to an intolerable individual risk of death. Interestingly, this debate calls for a better definition of the still fuzzy boundaries among the different kinds of expertise required for the whole risk-mitigation process. The last and probably most pressing challenge is the communication to the public. In fact, a wrong message could be useless or even counterproductive. Here we show some progress that we have made in this field working with communication experts in Italy.

  25. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  26. Is there a basis for preferring characteristic earthquakes over a Gutenberg–Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, Thomas E.; Geist, Eric L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
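
    The large-earthquake-rate calculation the authors describe reduces to estimating a Gutenberg-Richter b-value from a catalog and extrapolating the observed rate of small events. A minimal sketch, assuming completeness above a magnitude mc and the standard Aki (1965) maximum-likelihood estimator:

    ```python
    import numpy as np

    def b_value(mags, mc, dm=0.0):
        """Aki maximum-likelihood b-value; dm is the magnitude bin width
        (0 for continuous magnitudes)."""
        m = mags[mags >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    def extrapolated_annual_rate(mags, span_years, mc, m_target):
        """Annual rate of M >= m_target on a fault zone, extrapolated from the
        observed rate of M >= mc via Gutenberg-Richter."""
        m = mags[mags >= mc]
        b = b_value(mags, mc)
        return (len(m) / span_years) * 10 ** (-b * (m_target - mc))

    # Synthetic 40-year catalog with a true b-value of 1.0 above mc = 3.0:
    rng = np.random.default_rng(7)
    mags = 3.0 + rng.exponential(scale=np.log10(np.e), size=800)
    rate_m7 = extrapolated_annual_rate(mags, 40.0, 3.0, 7.0)
    print(f"b = {b_value(mags, 3.0):.2f}; M>=7 every {1.0 / rate_m7:.0f} yr on average")
    ```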

  27. 78 FR 39781 - Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a U.S...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ... COMMISSION Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a U.S... comment, titled Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a... earthquakes present the dominant risk for spent fuel pools, the draft study evaluated how a potential pool...

  28. Circuit breaker operation and potential failure modes during an earthquake

    SciTech Connect

    Lambert, H.E.; Budnitz, R.J.

    1987-01-01

    This study addresses the effect of a strong-motion earthquake on circuit breaker operation. It focuses on the loss of offsite power (LOSP) transient caused by a strong-motion earthquake at the Zion Nuclear Power Plant. Numerous circuit breakers important to plant safety, such as circuit breakers to diesel generators and engineered safety systems (ESS), must open and/or close during this transient while strong motion is occurring. Potential seismically induced circuit-breaker failure modes were uncovered while the study was conducted, including: the circuit breaker fails to close; the circuit breaker trips inadvertently; and the circuit breaker fails to reclose after a trip. The causes of these failure modes include: relay chatter that trips the circuit breaker; relay chatter that causes anti-pumping relays to seal in, which prevents automatic closure of circuit breakers; and load sequencer failures. This paper also describes the operator action necessary to prevent core melt if the above circuit-breaker failure modes occur simultaneously on three 4.16 kV buses. The incorporation of these failure modes, as well as other instrumentation and control failures, into a limited-scope seismic probabilistic risk assessment is also discussed.

  29. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  30. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  31. Ground motion following selection of SRS design basis earthquake and associated deterministic approach. Final report: Revision 1

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  32. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  33. Monitoring and control of lifeline systems to enhance post-earthquake operation

    SciTech Connect

    Ballantyne, D.

    1995-12-31

    This paper summarizes the problem of earthquake damage to lifeline systems, particularly buried pipe, and the high cost of mitigation by replacement. System monitoring and control is presented as an alternative. Earthquake hazard, structural, soils, and system operation parameters are identified as useful for system control; examples are presented. Monitoring and control system implementation issues are discussed including system configuration, local/centralized control, hardware, and appropriate types of systems for earthquake mitigation implementation.

  34. A subleading operator basis and matching for gg → H

    NASA Astrophysics Data System (ADS)

    Moult, Ian; Stewart, Iain W.; Vita, Gherardo

    2017-07-01

    The Soft Collinear Effective Theory (SCET) is a powerful framework for studying factorization of amplitudes and cross sections in QCD. While factorization at leading power has been well studied, much less is known at subleading powers in the λ ≪ 1 expansion. In SCET, subleading soft and collinear corrections to a hard scattering process are described by power-suppressed operators, which must be fixed case by case, and by well-established power-suppressed Lagrangians, which correct the leading-power dynamics of soft and collinear radiation. Here we present a complete basis of power-suppressed operators for gg → H, classifying all operators which contribute to the cross section at O(λ^2), and showing how helicity selection rules significantly simplify the construction of the operator basis. We perform matching calculations to determine the tree-level Wilson coefficients of our operators. These results are useful for studies of power corrections in both resummed and fixed-order perturbation theory, and for understanding the factorization properties of gauge theory amplitudes and cross sections at subleading power. As one example, our basis of operators can be used to analytically compute power corrections for N-jettiness subtractions for gg-induced color-singlet production at the LHC.

  35. Earthquake Response Modeling for a Parked and Operating Megawatt-Scale Wind Turbine

    SciTech Connect

    Prowell, I.; Elgamal, A.; Romanowitz, H.; Duggan, J. E.; Jonkman, J.

    2010-10-01

    Demand parameters for turbines, such as tower moment demand, are primarily driven by wind excitation and dynamics associated with operation. For that purpose, computational simulation platforms have been developed, such as FAST, maintained by the National Renewable Energy Laboratory (NREL). For seismically active regions, building codes also require the consideration of earthquake loading. Historically, it has been common to use simple building-code approaches to estimate the structural demand from earthquake shaking as an independent loading scenario. Currently, International Electrotechnical Commission (IEC) design requirements include the consideration of earthquake shaking while the turbine is operating. Numerical and analytical tools used to consider earthquake loads for buildings and other static civil structures are not well suited for modeling simultaneous wind and earthquake excitation in conjunction with operational dynamics. Through the addition of seismic loading capabilities to FAST, it is possible to simulate earthquake shaking in the time domain, which allows consideration of nonlinear effects such as structural nonlinearities, aerodynamic hysteresis, control system influence, and transients. This paper presents a FAST model of a modern 900-kW wind turbine, which is calibrated based on field vibration measurements. With this calibrated model, both coupled and uncoupled simulations are conducted to examine the structural demand on the turbine tower. Response is compared under the conditions of normal operation and potential emergency shutdown due to earthquake-induced vibrations. The results highlight the availability of a numerical tool for conducting such studies and provide insights into the combined wind-earthquake loading mechanism.

  36. 1/f and the Earthquake Problem: Scaling constraints to facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Rundle, J. B.; Glasscoe, M. T.

    2013-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or '1/f', nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this '1/f problem,' it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture-length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area), in combination with a metric that quantifies rate trends in local seismicity, to the local earthquake magnitude potential - the magnitudes of earthquakes the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning-parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents.

  37. Post Test Analysis of a PCCV Model Dynamically Tested Under Simulated Design-Basis Earthquakes

    SciTech Connect

    Cherry, J.; Chokshi, N.; James, R.J.; Rashid, Y.R.; Tsurumaki, S.; Zhang, L.

    1998-11-09

    In a collaborative program between the United States Nuclear Regulatory Commission (USNRC) and the Nuclear Power Engineering Corporation (NUPEC) of Japan, under sponsorship of the Ministry of International Trade and Industry, the seismic behavior of Prestressed Concrete Containment Vessels (PCCV) is being investigated. A 1:10 scale PCCV model has been constructed by NUPEC and subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory. A primary objective of the testing program is to demonstrate the capability of the PCCV to withstand design basis earthquakes with a significant safety margin against major damage or failure. As part of the collaborative program, Sandia National Laboratories (SNL) is conducting research in state-of-the-art analytical methods for predicting the seismic behavior of PCCV structures, with the eventual goal of understanding, validating, and improving calculations related to containment structure performance under design and severe seismic events. With the increased emphasis on risk-informed regulatory focus, more accurate characterization (less uncertainty) of containment structural and functional integrity is desirable. This paper presents results of post-test calculations conducted at ANATECH to simulate the design-level scale model tests.

  38. Imminent earthquake forecasting on the basis of Japan INTERMAGNET stations, NEIC, NOAA and Tide code data analysis

    NASA Astrophysics Data System (ADS)

    Mavrodiev, S. Cht.

    2016-08-01

    This research presents one possible way to make imminent predictions of earthquake magnitude, depth, and epicenter coordinates by solving an inverse problem, using a data-acquisition network system for monitoring, archiving, and complex analysis of geophysical precursor variables. Among the many possible precursors, the most reliable are the geoelectromagnetic field, borehole water levels, radon surface concentration, local heat flow, ionosphere variables, and low-frequency atmosphere and Earth-core waves. In this study only geomagnetic data are used. Within the framework of the geomagnetic quake approach, it is possible to perform imminent regional seismic-activity forecasting on the basis of a simple analysis of geomagnetic data using a new variable, Schtm, with the dimension of surface energy density. Analysis of data from the Memambetsu, Kakioka, and Kanoya stations (Japan, INTERMAGNET) and NEIC earthquake data, together with the hypothesis that the predicted earthquake is the one with the biggest value of the variable Schtm, permits the formulation of an inverse problem (an overdetermined algebraic system) treating the precursor signals as functions of earthquake magnitude, depth, and distance from a monitoring point. Thus, given a data-acquisition network that monitors more than one reliable precursor variable at four or more points distributed within an area with a radius of up to 700 km, there will be enough algebraic equations to calculate the impending earthquake's magnitude, depth, and distance by solving the overdetermined algebraic system.
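
    The inverse problem described (precursor amplitudes at several stations determining magnitude, depth, and epicenter) can be illustrated with a toy nonlinear least-squares fit. The response function, coefficients, station geometry, and noise level below are all invented; only the structure, more equations than unknowns with at least four monitoring points, mirrors the abstract:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def signal(m, depth, ex, ey, sx, sy):
        """Invented precursor response: grows with magnitude, decays with
        squared hypocentral distance (km)."""
        r2 = (sx - ex) ** 2 + (sy - ey) ** 2 + depth ** 2
        return 10 ** (0.9 * m) / r2

    # Five monitoring points (km coordinates); one more than the minimum four.
    sx = np.array([0.0, 80.0, 40.0, 15.0, 100.0])
    sy = np.array([0.0, 0.0, 60.0, 90.0, 50.0])

    truth = (6.2, 15.0, 50.0, 30.0)  # magnitude, depth, epicenter x, y
    rng = np.random.default_rng(1)
    obs = signal(*truth, sx, sy) * (1.0 + 0.01 * rng.normal(size=sx.size))

    def residuals(params):
        return signal(*params, sx, sy) - obs

    # Overdetermined system: five equations, four unknowns.
    fit = least_squares(residuals, x0=(5.5, 10.0, 40.0, 40.0),
                        bounds=([4.0, 0.0, -100.0, -100.0],
                                [9.0, 50.0, 200.0, 200.0]))
    m, depth, ex, ey = fit.x
    print(f"M = {m:.2f}, depth = {depth:.1f} km, epicenter = ({ex:.0f}, {ey:.0f}) km")
    ```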

  39. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  40. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  41. A spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3-ETAS): Toward an operational earthquake forecast

    USGS Publications Warehouse

    Field, Ned; Milner, Kevin R.; Hardebeck, Jeanne L.; Page, Morgan T.; van der Elst, Nicholas; Jordan, Thomas H.; Michael, Andrew J.; Shaw, Bruce E.; Werner, Maximilian J.

    2017-01-01

    We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic‐type aftershock sequence (ETAS) component to the previously published time‐independent and long‐term time‐dependent forecasts. This combined model, referred to as UCERF3‐ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic‐rebound model for fault‐based ruptures, and a state‐of‐the‐art spatiotemporal clustering component. It also represents an attempt to merge fault‐based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3‐ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long‐term (1000 year) simulations as well as for 10‐year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3‐ETAS has many sources of uncertainty, as
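
    The ETAS ingredient can be illustrated, in drastically simplified temporal-only form, by a branching-process simulation: every event spawns offspring at a magnitude-dependent Poisson rate with Omori-law delays, and magnitudes follow a Gutenberg-Richter distribution. The parameter values below are illustrative, not the UCERF3-ETAS values, and the spatial, fault-based, and elastic-rebound ingredients discussed above are omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Illustrative ETAS parameters (temporal only; not the UCERF3-ETAS values).
    b, m_min, m_max = 1.0, 2.5, 8.0          # Gutenberg-Richter
    k, alpha, c, p = 0.05, 1.0, 0.01, 1.2    # productivity and Omori law
    horizon_days = 365.0

    def gr_magnitude():
        """Truncated Gutenberg-Richter magnitude by inverse-CDF sampling."""
        u = rng.random()
        return m_min - np.log10(1.0 - u * (1.0 - 10 ** (-b * (m_max - m_min)))) / b

    def omori_delay():
        """Waiting time drawn from the normalized Omori kernel (t + c)**(-p), p > 1."""
        u = rng.random()
        return c * ((1.0 - u) ** (1.0 / (1.0 - p)) - 1.0)

    # Seed with a scenario mainshock, then cascade triggered generations.
    catalog = [(0.0, 7.0)]
    frontier = list(catalog)
    while frontier:
        t, m = frontier.pop()
        n_children = rng.poisson(k * 10 ** (alpha * (m - m_min)))
        for _ in range(n_children):
            t_child = t + omori_delay()
            if t_child < horizon_days:
                child = (t_child, gr_magnitude())
                catalog.append(child)
                frontier.append(child)

    catalog.sort()
    mags = [m for _, m in catalog]
    print(f"{len(catalog)} events in {horizon_days:.0f} days; largest M{max(mags):.1f}")
    ```

    UCERF3-ETAS additionally conditions triggering on fault proximity, characteristic magnitude-frequency distributions, and elastic rebound; the sketch shows only the clustering skeleton.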

  42. M6.0 South Napa Earthquake Forecasting on the basis of jet stream precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.

    2014-12-01

    Current earthquake prediction research methods can be divided into crustal change, radon concentration, well water level, animal behavior, very-high-frequency (VHF) signals, GPS/TEC ionospheric variations, and thermal infrared radiation (TIR) anomalies. Before major earthquakes (M > 6) occur, the jet stream over the epicentral area is interrupted or its velocity flow lines cross; that is, the atmospheric pressure at high altitude drops suddenly over a period of 6-12 hours before the earthquake (Wu & Tikhonov, 2014). This technique has been used to predict strong earthquakes in real time, with the predictions pre-registered on a website: for example, the M6.0 Northern California earthquake of 2014/08/24 and the M6.6 Russia earthquake of 2013/10/12. For the 2014/08/24 M6.0 earthquake in CA, USA, the front end of the 60-knot speed line was at San Francisco at 12:00 on 2014/06/16, and the earthquake happened 69 days later. On 2014/07/16 we predicted a magnitude larger than 5.5, although with a window of only 30 days; the deviation of the predicted epicenter was about 70 km. A lithosphere-atmosphere-ionosphere (LAI) coupling model may explain this phenomenon: ionization of the air is produced by increased emanation of radon at the epicenter; the water molecules in the air react with these ions and release heat, which raises the air temperature and is accompanied by a large-scale change in atmospheric pressure and jet-stream morphology. We obtain satisfactory accuracy in estimating the epicenter location, and we can define a short alarm period; these are the positive aspects of our forecast. However, the jet-stream estimates of magnitude contain a large uncertainty. Reference: H.C. Wu, I.N. Tikhonov, 2014, "Jet streams anomalies as possible short-term precursors of earthquakes with M>6.0", Research in Geophysics, DOI: http://dx.doi.org/10.4081/rg.2014.4939, http://www.pagepress.org/journals/index.php/rg/article/view/rg.2014.4939

  7. Earthquakes

    EPA Pesticide Factsheets

    Information on this page will help you understand environmental dangers related to earthquakes and what you can do to prepare for and recover from them. It will also help you recognize possible environmental hazards and learn what you can do to protect yourself and your family.

  8. PBO Southwest Region: Baja Earthquake Response and Network Operations

    NASA Astrophysics Data System (ADS)

    Walls, C. P.; Basset, A.; Mann, D.; Lawrence, S.; Jarvis, C.; Feaux, K.; Jackson, M. E.

    2011-12-01

    The SW region of the Plate Boundary Observatory consists of 455 continuously operating GPS stations located principally along the transform system of the San Andreas fault and the Eastern California Shear Zone. In the past year, network uptime exceeded an average of 97%, with greater than 99% data acquisition. Communications range from CDMA modem (307), radio (92), Vsat (30), and DSL/T1/other (25) to manual downloads (1). Sixty-three stations stream 1 Hz data over the VRS3Net, typically with <0.5 second latency. Over 620 maintenance activities were performed during 316 onsite visits out of approximately 368 engineer field days. Within the past year there have been 7 incidents of minor (attempted theft) to moderate (solar panel stolen) vandalism, with one total loss of receiver and communications gear. Security was enhanced at these sites through fencing and more secure station configurations. In the past 12 months, 4 new stations were installed to replace removed stations or to augment the network at strategic locations. Following the M7.2 El Mayor-Cucapah earthquake, CGPS station P796, a deep-drilled braced monument, was constructed in San Luis, AZ, along the border within 5 weeks of the event. In addition, UNAVCO participated in a successful University of Arizona-led RAPID proposal for the installation of six continuous GPS stations for post-seismic observations. The six stations are installed and telemetered through a UNAM relay at the Sierra San Pedro Martir; four of these stations have Vaisala WXT520 meteorological sensors. An additional site in the Sierra Cucapah (PTAX), built by CICESE, an Associate UNAVCO Member institution in Mexico, and Caltech, has been integrated into PBO dataflow. The stations will be maintained as part of the PBO network in coordination with CICESE. UNAVCO is working with NOAA to upgrade PBO stations with WXT520 meteorological sensors and communications systems capable of streaming real-time GPS and met data. The real-time GPS and

  9. Operational earthquake forecasting in California: A prototype system combining UCERF3 and CyberShake

    NASA Astrophysics Data System (ADS)

    Milner, K. R.; Jordan, T. H.; Field, E. H.

    2014-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about time-dependent earthquake probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To attain this goal, OEF must provide a complete description of the seismic hazard—ground motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic hazard analysis. We have combined the Third Uniform California Earthquake Rupture Forecast (UCERF3) of the Working Group on California Earthquake Probabilities (Field et al., 2014) with the CyberShake ground-motion model of the Southern California Earthquake Center (Graves et al., 2011; Callaghan et al., this meeting) into a prototype OEF system for generating time-dependent hazard maps. UCERF3 represents future earthquake activity in terms of fault-rupture probabilities, incorporating both Reid-type renewal models and Omori-type clustering models. The current CyberShake model comprises approximately 415,000 earthquake rupture variations to represent the conditional probability of future shaking at 285 geographic sites in the Los Angeles region (~236 million horizontal-component seismograms). This combination provides significant probability gains relative to OEF models based on empirical ground-motion prediction equations (GMPEs), primarily because the physics-based CyberShake simulations account for the rupture directivity, basin effects, and directivity-basin coupling that are not represented by the GMPEs.
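
    A sketch of how such a system combines the two ingredients, under hypothetical inputs (this is not the actual UCERF3/CyberShake code): each rupture's forecast probability is multiplied by the precomputed conditional probability that it exceeds a ground-motion level at the site, and the contributions are aggregated assuming independence.

        # Hedged sketch: combine time-dependent rupture probabilities with
        # conditional exceedance probabilities into a site hazard estimate.
        # All names and numbers below are illustrative assumptions.
        def site_exceedance_probability(ruptures):
            """ruptures: iterable of (p_rupture, p_exceed_given_rupture) pairs."""
            p_no_exceed = 1.0
            for p_rup, p_exc in ruptures:
                # The site is spared by this rupture unless it occurs AND exceeds.
                p_no_exceed *= 1.0 - p_rup * p_exc
            return 1.0 - p_no_exceed

        # Three hypothetical ruptures over some forecast window:
        print(site_exceedance_probability([(0.02, 0.6), (0.001, 0.9), (0.05, 0.1)]))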

  10. Medical Response to Haiti Earthquake: Operation Unified Response

    DTIC Science & Technology

    2011-01-24

    Briefing slides; the running text here is calendar and timeline residue from the original graphics. Recoverable facts: a magnitude 7.0 earthquake struck Haiti in the vicinity of Port-au-Prince at 4:53 pm, Tuesday, January 12, 2010, followed by a 6.1 aftershock; approximately 230,000 deaths, 197,000 injured, 1.1M displaced people, 3,000,000 affected people, and 60% of government [text truncated]. A deployment timeline (January-March) notes the arrivals of USNS COMFORT and the BATAAN ARG, AFSOC support, JTF-H conducting a RIP with ARSOUTH, and the final DHHS treatment [text truncated].

  11. Plutonium uranium extraction (PUREX) end state basis for interim operation (BIO) for surveillance and maintenance

    SciTech Connect

    DODD, E.N.

    1999-05-12

    This Basis for Interim Operation (BIO) was developed for the PUREX end state condition following completion of the deactivation project. The deactivation project removed or stabilized the hazardous materials within the facility structure and equipment to reduce the hazards posed by the facility during the surveillance and maintenance (S and M) period, and to reduce the costs associated with the S and M. This document serves as the authorization basis for the PUREX facility, excluding the storage tunnels, railroad cut, and associated tracks, for the deactivated end state condition during the S and M period. The storage tunnels, and associated systems and areas, are addressed in WHC-SD-HS-SAR-001, Rev. 1, PUREX Final Safety Analysis Report. During S and M, the mission of the facility is to maintain the conditions and equipment in a manner that ensures the safety of the workers, environment, and the public. The S and M phase will continue until the final decontamination and decommissioning (D and D) project and activities begin. Based on the methodology of DOE-STD-1027-92, Hazards Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports, the final facility hazards category is identified as hazards category 2. This categorization considers the remaining material inventories, the form and distribution of the material, and the energies present to initiate events of concern. Given the current facility configuration, conditions, and authorized S and M activities, there are no operational events identified that would result in a significant hazard to any of the target receptor groups (e.g., workers, public, environment). The only accident scenarios identified with consequences to the onsite co-located workers were based on external natural phenomena, specifically an earthquake. The dose consequences of these events are within the current risk evaluation guidelines and are consistent with the expectations for a hazards category 2 facility.

  12. The establishment of a standard operation procedure for psychiatric service after an earthquake.

    PubMed

    Su, Chao-Yueh; Chou, Frank Huang-Chih; Tsai, Kuan-Yi; Lin, Wen-Kuo

    2011-07-01

    This study presents information on the design and creation of a standard operation procedure (SOP) for psychiatric service after an earthquake. The strategies employed focused on the detection of survivors who developed persistent psychiatric illness, particularly post-traumatic stress and major depressive disorders. In addition, the study attempted to detect the risk factors for psychiatric illness. A Disaster-Related Psychological Screening Test (DRPST) was designed by five psychiatrists and two public health professionals for rapidly and simply interviewing 4,223 respondents within six months of the September 1999 Chi-Chi earthquake. An SOP was established through a systematic literature review, action research, and two years of data collection. Despite the limited time and resources inherent to a disaster situation, it is necessary to develop an SOP for psychiatric service after an earthquake in order to assist the high number of survivors suffering from subsequent psychiatric impairment. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

  13. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated
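
    A worked example with illustrative numbers (not taken from the ICEF report) shows why the environment stays "low-probability" even under large gains:

        G = \frac{P_{\text{short}}}{P_{\text{long}}}, \qquad
        P_{\text{short}} = G \cdot P_{\text{long}} = 100 \times 10^{-4} = 10^{-2},

    i.e., if the long-term one-week probability of a large earthquake near a site is 10^{-4}, a hundredfold probability gain during a vigorous sequence still yields only a 1% one-week probability.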

  14. Jumping over the hurdles to effectively communicate the Operational Earthquake Forecast

    NASA Astrophysics Data System (ADS)

    McBride, S.; Wein, A. M.; Becker, J.; Potter, S.; Tilley, E. N.; Gerstenberger, M.; Orchiston, C.; Johnston, D. M.

    2016-12-01

    Probabilities, uncertainties, statistics, science, and threats are notoriously difficult topics to communicate to members of the public. The Operational Earthquake Forecast (OEF) is designed to provide an understanding of the potential numbers and sizes of earthquakes, and its communication must address all of those challenges. Furthermore, there are other barriers to effective communication of the OEF. These barriers include the erosion of trust in scientists and experts, oversaturation of messages, fear and threat messages magnified by media sensationalisation, fractured media environments, and online echo chambers. Given the complexities and challenges of the OEF, how can we overcome barriers to effective communication? Crisis and risk communication research can inform the development of communication strategies to increase the public understanding and use of the OEF, when applied to the opportunities and challenges of practice. We explore ongoing research regarding how the OEF can be more effectively communicated, including the channels, tools, and message composition used to engage with a variety of publics. We also draw on past experience and a study of OEF communication during the Canterbury Earthquake Sequence (CES). We demonstrate how research and experience have guided OEF communications during subsequent events in New Zealand, including the M5.7 Valentine's Day earthquake in 2016 (CES), the M6.0 Wilberforce earthquake in 2015, and the Cook Strait/Lake Grassmere earthquakes in 2013. We identify the successes and lessons learned of practical OEF communication. Finally, we present future projects and directions in the communication of OEF, informed by both practice and research.

  15. Development of Site-Specific Soil Design Basis Earthquake (DBE) Parameters for the Integrated Waste Treatment Unit (IWTU)

    SciTech Connect

    Payne, Suzette

    2008-08-01

    Horizontal and vertical PC 3 (2,500 yr) Soil Design Basis Earthquake (DBE) 5% damped spectra, corresponding time histories, and strain-compatible soil properties were developed for the Integrated Waste Treatment Unit (IWTU). The IWTU is located at the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Laboratory (INL). Mean and 84th percentile horizontal DBE spectra derived from site-specific site response analyses were evaluated for the IWTU. The horizontal and vertical PC 3 (2,500 yr) Soil DBE 5% damped spectra at the 84th percentile were selected for Soil Structure Interaction (SSI) analyses at IWTU. The site response analyses were performed consistent with applicable Department of Energy (DOE) Standards, recommended guidance of the Nuclear Regulatory Commission (NRC), American Society of Civil Engineers (ASCE) Standards, and recommendations of the Blue Ribbon Panel (BRP) and Defense Nuclear Facilities Safety Board (DNFSB).
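
    For context, a 5%-damped (pseudo-)acceleration spectrum of the kind cited above is obtained by driving single-degree-of-freedom oscillators of varying period with a ground-motion time history. A minimal sketch using Newmark average-acceleration integration and a synthetic record (the function name, record, and periods are illustrative assumptions, not the site response analysis actually performed for the IWTU):

        import numpy as np

        def pseudo_accel_spectrum(ag, dt, periods, zeta=0.05):
            """5%-damped pseudo-acceleration spectrum Sa(T) of a record ag,
            from unit-mass SDOF oscillators u'' + 2*zeta*w*u' + w^2*u = -ag."""
            sa = []
            for T in periods:
                w = 2.0 * np.pi / T
                k, c = w * w, 2.0 * zeta * w
                keff = k + 2.0 * c / dt + 4.0 / dt ** 2  # Newmark, gamma=1/2, beta=1/4
                u = v = 0.0
                a = -ag[0]
                umax = 0.0
                for agi in ag[1:]:
                    peff = (-agi + (4.0 / dt ** 2) * u + (4.0 / dt) * v + a
                            + c * ((2.0 / dt) * u + v))
                    unew = peff / keff
                    vnew = 2.0 / dt * (unew - u) - v
                    a = 4.0 / dt ** 2 * (unew - u) - 4.0 / dt * v - a
                    u, v = unew, vnew
                    umax = max(umax, abs(u))
                sa.append(w * w * umax)  # pseudo-acceleration = w^2 * |u|max
            return np.array(sa)

        t = np.arange(0.0, 20.0, 0.01)                    # synthetic 2 Hz pulse
        ag = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)
        print(pseudo_accel_spectrum(ag, 0.01, [0.2, 0.5, 1.0, 2.0]))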

  16. The Earthquake Prediction Experiment on the Basis of the Jet Stream's Precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.; Tikhonov, I. N.

    2014-12-01

    Simultaneous analysis of jet stream maps and earthquake (EQ) data for M > 6.0 has been carried out. 58 cases of EQs that occurred in 2006-2010 were studied. It was found that an interruption or crossing of velocity flow lines above an EQ epicenter takes place 1-70 days prior to the event, with a duration of 6-12 hours. The assumption is that the jet stream goes up or down near an epicenter. In 45 cases the distance between the epicenter and the jet stream precursor did not exceed 90 km. The rate of successful forecasts within the 30 days before the EQ was 66.1% (Wu and Tikhonov, 2014). This technique has been used to predict strong EQs, with predictions pre-registered on a website (for example, the 23 October 2011, M 7.2 EQ (Turkey); the 20 May 2012, M 6.1 EQ (Italy); the 16 April 2013, M 7.8 EQ (Iran); the 12 November 2013, M 6.6 EQ (Russia); the 03 March 2014, M 6.7 Ryukyu EQ (Japan); the 21 July 2014, M 6.2 Kuril EQ). We obtain satisfactory accuracy in the epicenter location, and we define a short alarm period; these are the positive aspects of the forecast. However, estimates of magnitude contain a large uncertainty. Reference: Wu, H.C., Tikhonov, I.N., 2014. Jet streams anomalies as possible short-term precursors of earthquakes with M > 6.0. Research in Geophysics, Special Issue on Earthquake Precursors. Vol. 4. No 1. doi:10.4081/rg.2014.4939. The precursor of the M9.0 Japan EQ on 2011/03/11 (fig1). A. M6.1 Italy EQ (2012/05/20, 44.80 N, 11.19 E, H = 5.1 km). Prediction: 2012/03/20~2012/04/20 (45.6 N, 10.5 E), M > 5.5 (fig2) http://ireport.cnn.com/docs/DOC-764800 B. M7.8 Iran EQ (2013/04/16, 28.11 N, 62.05 E, H = 82.0 km). Prediction: 2013/01/14~2013/02/04 (28.0 N, 61.3 E) M > 6.0 (fig3) http://ireport.cnn.com/docs/DOC-910919 C. M6.6 Russia EQ (2013/11/12, 54.68 N, 162.29 E, H = 47.2 km). Prediction: 2013/10/27~2013/11/13 (56.0 N, 162.9 E) M > 5.5 http://ireport.cnn.com/docs/DOC-1053599 D. M6.7 Japan EQ (2014/03/03, 27.41 N, 127.34 E, H = 111.2 km). Prediction: 2013/12/02~2014/01/15 (26.7 N, 128.1 E) M > 6.5 (fig4) http

  17. Planning a Preliminary program for Earthquake Loss Estimation and Emergency Operation by Three-dimensional Structural Model of Active Faults

    NASA Astrophysics Data System (ADS)

    Ke, M. C.

    2015-12-01

    Large-scale earthquakes often cause serious economic losses and many deaths. Because the magnitude, time, and location of earthquakes still cannot be predicted, pre-disaster risk modeling and post-disaster operations are important means of reducing earthquake damage. To understand the disaster risk of earthquakes, earthquake simulation technology is commonly used to build earthquake scenarios, with point sources, fault line sources, and fault plane sources as the seismic source models for these scenarios. The assessments these models provide for risk analysis and emergency operations are useful, but their accuracy could still be improved. This program invites experts and scholars from Taiwan University, National Central University, and National Cheng Kung University, and uses historical earthquake records, geological data, and geophysical data to build three-dimensional underground structural planes of active faults. The purpose is to replace projected fault planes with underground fault planes that are closer to the true geometry; the accuracy of earthquake prevention analyses can be upgraded with this database. These three-dimensional data will then be applied to different stages of disaster prevention. Before a disaster, earthquake risk analysis results obtained with the three-dimensional fault plane data are closer to the real damage. During a disaster, the three-dimensional fault plane data can help estimate the aftershock distribution and the seriously damaged areas. The program used 14 geological profiles to build the three-dimensional data of the Hsinchu and Hsincheng faults in 2015. Other active faults will be completed in 2018 and applied to earthquake disaster prevention.

  18. [How to solve hospital operating problems based on the experience of the Hanshin earthquake?].

    PubMed

    Yoshikawa, J

    1995-06-01

    Immediately after the Hanshin (Kobe-Osaka) earthquake, electricity, water, and gas supplies were cut off at Kobe General Hospital, disabling many important hospital functions, including the water-cooled independent electric power plant, respirators, and sterilizers. A large water storage facility is needed to keep a water-cooled independent power plant operating; alternative plans, including the introduction of an air-cooled independent power plant and a seawater-to-freshwater exchange system, should be considered. Portable compressors are needed to keep respirators functioning in the absence of a water supply. The emergency use of propane gas should also be considered for sterilization and cooking facilities. There were severe communication problems after the earthquake. The only working method was the use of public phones, which have priority over private lines; therefore, each hospital should have phones with similar priority. In addition, the use of personal computers and/or computer networks should be considered to preserve a high level of communication after an earthquake; otherwise, a hospital should be equipped with wireless phones. Immediately after the earthquake, the care of critically ill patients could not be maintained, so 20 cardiac patients requiring intensive care had to be transferred to other heart centers. The best method for such transfers is by helicopter. Kobe City suffered a transport crisis that began immediately after the earthquake and continued to the end of March. A large helicopter or a special bus guided by a police car should be considered for hospital staff transport. (ABSTRACT TRUNCATED AT 250 WORDS)

  19. Planning Matters: Response Operations following the 30 September 2009 Sumatran Earthquake

    NASA Astrophysics Data System (ADS)

    Comfort, L. K.; Cedillos, V.; Rahayu, H.

    2009-12-01

    Response operations following the 9/30/2009 West Sumatra earthquake tested extensive planning that had been done in Indonesia since the 26 December 2004 Sumatran Earthquake and Tsunami. After massive destruction in Aceh Province in 2004, the Indonesian National Government revised its national disaster management plans. A key component was to select six cities in Indonesia exposed to significant risk and make a focused investment of resources, planning activities, and public education to reduce risk of major disasters. Padang City was selected for this national “showcase” for disaster preparedness, planning, and response. The question is whether planning improved governmental performance and coordination in practice. There is substantial evidence that disaster preparedness planning and training initiated over the past four years had a positive effect on Padang in terms of disaster risk reduction. The National Disaster Management Agency (BNPB, 10/28/09) reported the following casualties: Padang City: deaths, 383; severe injuries, 431, minor injuries, 771. Province of West Sumatra: deaths, 1209; severe injuries, 67; minor injuries, 1179. These figures contrasted markedly with the estimated losses following the 2004 Earthquake and Tsunami when no training had been done: Banda Aceh, deaths, 118,000; Aceh Province, dead/missing, 236,169 (ID Health Ministry 2/22/05). The 2004 events were more severe, yet the comparable scale of loss was significantly lower in the 9/30/09 earthquake. Three factors contributed to reducing disaster risk in Padang and West Sumatra. First, annual training exercises for tsunami warning and evacuation had been organized by national agencies since 2004. In 2008, all exercises and training activities were placed under the newly established BNPB. The exercise held in Padang in February, 2009 served as an organizing framework for response operations in the 9/30/09 earthquake. Public officers with key responsibilities for emergency operations

  20. TECHNICAL BASIS FOR VENTILATION REQUIREMENTS IN TANK FARMS OPERATING SPECIFICATIONS DOCUMENTS

    SciTech Connect

    BERGLIN, E J

    2003-06-23

    This report provides the technical basis for the high-efficiency particulate air (HEPA) filter limits for Hanford tank farm ventilation systems (sometimes known as heating, ventilation, and air conditioning [HVAC] systems) defined in Process Engineering Operating Specification Documents (OSDs). This technical basis includes a review of the older technical bases and provides clarifications, as necessary, of technical basis limit revisions and their justification. The document provides an updated technical basis for tank farm ventilation systems related to the OSDs for double-shell tanks (DSTs), single-shell tanks (SSTs), double-contained receiver tanks (DCRTs), catch tanks, and various other miscellaneous facilities.

  1. Theoretical basis for operational ensemble forecasting of coronal mass ejections

    NASA Astrophysics Data System (ADS)

    Pizzo, V. J.; Koning, C.; Cash, M.; Millward, G.; Biesecker, D. A.; Puga, L.; Codrescu, M.; Odstrcil, D.

    2015-10-01

    We lay out the theoretical underpinnings for the application of the Wang-Sheeley-Arge-Enlil modeling system to ensemble forecasting of coronal mass ejections (CMEs) in an operational environment. In such models, there is no magnetic cloud component, so our results pertain only to CME front properties, such as transit time to Earth. Within this framework, we find no evidence that the propagation is chaotic, and therefore, CME forecasting calls for different tactics than employed for terrestrial weather or hurricane forecasting. We explore a broad range of CME cone inputs and ambient states to flesh out differing CME evolutionary behavior in the various dynamical domains (e.g., large, fast CMEs launched into a slow ambient, and the converse; plus numerous permutations in between). CME propagation in both uniform and highly structured ambient flows is considered to assess how much the solar wind background affects the CME front properties at 1 AU. Graphical and analytic tools pertinent to an ensemble approach are developed to enable uncertainties in forecasting CME impact at Earth to be realistically estimated. We discuss how uncertainties in CME pointing relative to the Sun-Earth line affects the reliability of a forecast and how glancing blows become an issue for CME off-points greater than about the half width of the estimated input CME. While the basic results appear consistent with established impressions of CME behavior, the next step is to use existing records of well-observed CMEs at both Sun and Earth to verify that real events appear to follow the systematic tendencies presented in this study.
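
    The ensemble logic itself is simple to illustrate. Below is a toy Monte Carlo sketch that stands in for the full system (purely illustrative: the real system perturbs cone latitude, longitude, half-width, and speed and propagates each member with the Enlil MHD model, not the crude constant-deceleration kinematics assumed here):

        import numpy as np

        AU_KM = 1.496e8  # astronomical unit in km

        def transit_time_hours(v0, decel=-3e-3, floor=300.0, dt=600.0):
            """Hours to 1 AU for initial speed v0 (km/s), decelerating at
            `decel` (km/s^2) down to an ambient floor speed (km/s)."""
            t = r = 0.0
            v = v0
            while r < AU_KM:
                r += v * dt
                v = max(v + decel * dt, floor)
                t += dt
            return t / 3600.0

        rng = np.random.default_rng(1)
        speeds = rng.normal(900.0, 100.0, size=200)  # perturbed cone speeds
        times = np.array([transit_time_hours(v) for v in speeds])
        print(f"arrival {times.mean():.1f} +/- {times.std():.1f} h")  # forecast and spread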

  2. A century of oilfield operations and earthquakes in the greater Los Angeles Basin, southern California

    USGS Publications Warehouse

    Hauksson, Egill; Goebel, Thomas; Ampuero, Jean-Paul; Cochran, Elizabeth S.

    2015-01-01

    Most of the seismicity in the Los Angeles Basin (LA Basin) occurs at depth below the sediments and is caused by transpressional tectonics related to the big bend in the San Andreas fault. However, some of the seismicity could be associated with fluid extraction or injection in oil fields that have been in production for almost a century and cover ∼ 17% of the basin. In a recent study, first the influence of industry operations was evaluated by analyzing seismicity characteristics, including normalized seismicity rates, focal depths, and b-values, but no significant difference was found in seismicity characteristics inside and outside the oil fields. In addition, to identify possible temporal correlations, the seismicity and available monthly fluid extraction and injection volumes since 1977 were analyzed. Second, the production and deformation history of the Wilmington oil field were used to evaluate whether other oil fields are likely to experience similar surface deformation in the future. Third, the maximum earthquake magnitudes of events within the perimeters of the oil fields were analyzed to see whether they correlate with total net injected volumes, as suggested by previous studies. Similarly, maximum magnitudes were examined to see whether they exhibit an increase with net extraction volume. Overall, no obvious previously unidentified induced earthquakes were found, and the management of balanced production and injection of fluids appears to reduce the risk of induced-earthquake activity in the oil fields.
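
    The b-value comparison mentioned above rests on the Gutenberg-Richter relation log10 N = a - bM. A minimal sketch of the standard maximum-likelihood estimator (Aki 1965, with Utsu's binning correction), applied to hypothetical catalogs inside and outside field perimeters (the magnitudes below are invented for illustration):

        import numpy as np

        def b_value_mle(mags, mc, dm=0.1):
            """Maximum-likelihood b-value for events with M >= mc, where dm is
            the binning width: b = log10(e) / (mean(M) - (mc - dm/2))."""
            m = np.asarray(mags, float)
            m = m[m >= mc]
            return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

        inside = [2.1, 2.4, 2.0, 3.2, 2.7, 2.2, 2.9, 2.0, 2.5, 2.3]
        outside = [2.6, 2.0, 2.2, 3.8, 2.1, 2.4, 2.0, 3.0, 2.8, 2.2]
        print(b_value_mle(inside, mc=2.0), b_value_mle(outside, mc=2.0))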

  3. Basis for Interim Operation for the K-Reactor in Cold Standby

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The Basis for Interim Operation (BIO) document for K Reactor in Cold Standby and the L- and P-Reactor Disassembly Basins was prepared in accordance with the draft DOE standard for BIO preparation (dated October 26, 1993).

  4. Ground motions associated with the design basis earthquake at the Savannah River Site, South Carolina, based on a deterministic approach

    SciTech Connect

    Youngs, R.R.; Coppersmith, K.J. ); Stephenson, D.E. ); Silva, W. )

    1991-01-01

    Ground motion assessments are presented for evaluation of the seismic safety of K-Reactor at the Savannah River Site. Two earthquake sources are identified as the most significant to seismic hazard at the site, a M 7.5 earthquake occurring in Charleston, South Carolina, and a M 5 event occurring in the site vicinity. These events control the low frequency and high frequency portions of the spectrum, respectively. Three major issues were identified in the assessment of ground motions for the Savannah River site; specification of the appropriate stress drop for the Charleston source earthquake, specification of the appropriate levels of soil damping at large depths for site response analyses, and the appropriateness of western US recordings for specification of ground motions in the eastern US.

  6. New Operational Phase of KOERI-Regional Earthquake and Tsunami Monitoring Center

    NASA Astrophysics Data System (ADS)

    Necmioǧlu, Öcal; Özer Sözdinler, Ceren; Yılmazer, Mehmet; Köseoǧlu, Ayşegül; Turhan, Fatih; Çomoǧlu, Mustafa; Meral Özel, Nurcan; Pınar, Ali; Kekovalı, Kıvanç

    2017-04-01

    This presentation provides a status update on the tsunami-warning-related activities of the Bogazici University - Kandilli Observatory and Earthquake Research Institute - Regional Earthquake and Tsunami Monitoring Center (KOERI-RETMC). RETMC has been providing services in the Eastern Mediterranean, Aegean, and Black Seas since 1 July 2012 and was accredited as a Tsunami Service Provider of ICG/NEAMTWS at its 13th session, 26-28 September 2016, in Bucharest, Romania. A second radar-type tide gauge has been installed in Bozcaada-Çanakkale within the framework of the "Inexpensive Device for Sea-Level Measurement" (IDSL) initiative, offered as a donation by the EC-JRC, and planning is in progress for the possible installation of two more IDSLs on the Aegean Sea coast of Turkey. The work towards the creation of Tsunami Inundation Maps at 38 Tsunami Forecast Points in Turkey has been finalized. Early-Est (Lomax and Michelini) has been installed at the RETMC and its operational testing has begun. This work is partially funded by the project ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839. We would like to thank the Ministry of Development (of the Republic of Turkey) for its support of the ongoing activities of RETMC. We also thank the EC-JRC and Dr. Alessandro Annunziato for their continuous support of the operational activities of RETMC and the IDSL initiative, and Anthony Lomax for his cooperation and support in the operationalization of Early-Est.

  7. Distinction of Abnormality of Surgical Operation on the Basis of Surface EMG Signals

    NASA Astrophysics Data System (ADS)

    Nakaya, Yusuke; Ishii, Chiharu; Nakakuki, Takashi; Nishitani, Yosuke; Hikita, Mitsutaka

    In this paper, a novel method for automatic identification of a surgical operation and on-line recognition of the singularity of the identified operation is proposed. Suturing is divided into six operations. Features of each operation are extracted from measurements of the movement of the forceps, and, on the basis of threshold criteria for the six operations, a surgical operation is identified as one of the six. Next, features of any singularity of the operation are extracted from the operator's surface electromyogram signals, and the identified surgical operation is classified as either normal or singular using a self-organizing map. Using a purpose-built laparoscopic-surgery simulator with two forceps, identification of each surgical operation and distinction of the singularity of the identified operation were carried out for a specific surgical operation, namely insertion of a needle during suturing. Each surgical operation in suturing could be identified with more than 80% accuracy, and the singularity of the insertion operation could be distinguished with approximately 80% accuracy on average. The experimental results showed the effectiveness of the proposed method.
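
    A minimal sketch of the self-organizing-map stage described above (illustrative only; the authors' feature extraction and network details are not reproduced here). Feature vectors from the sEMG signals are mapped onto a small grid of prototype nodes; after training, each node is labeled normal or singular by majority vote of the training samples it attracts, and a new operation is classified by the label of its best-matching node:

        import numpy as np

        def train_som(features, grid=(6, 6), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
            """Train a small Gaussian-neighborhood SOM on sEMG feature vectors."""
            rng = np.random.default_rng(seed)
            h, w = grid
            nodes = rng.normal(size=(h * w, features.shape[1]))
            coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
            for t in range(epochs):
                lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
                sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighborhood
                for x in features:
                    bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))
                    d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
                    g = np.exp(-d2 / (2 * sigma ** 2))[:, None]
                    nodes += lr * g * (x - nodes)     # pull neighborhood toward x
            return nodes

        feats = np.random.default_rng(1).normal(size=(50, 8))  # fake feature vectors
        som = train_som(feats)  # node labeling / classification done separately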

  8. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at distances of a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay of up to several years. PMID:25156190
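
    In simplified Coulomb form (a standard reduction of the failure condition, not the authors' full Tresca-Von Mises poroelastic derivation), slip nucleates when the shear stress on the fault satisfies

        \tau \;\ge\; c \,+\, \mu \left( \sigma_n - p \right),

    with cohesion c, static friction \mu, fault-normal stress \sigma_n, and pore pressure p. Raising p by an overpressure \Delta p lowers the right-hand side by \mu \Delta p, so a fault already stressed to within roughly \mu \Delta p \approx 0.06-0.08 MPa of failure (taking \mu \approx 0.6-0.8 and \Delta p = 0.1 MPa) can be pushed past the threshold, which is the sense in which the sub-0.1 MPa overpressures quoted above suffice on critically stressed faults.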

  10. Moving towards the operational seismogeodesy component of earthquake and tsunami early warning

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Bock, Y.; Geng, J.; Melgar, D.; Crowell, B. W.; Squibb, M. B.

    2013-12-01

    deviation; 3) simultaneous solution for ground motion biases to mitigate errors due to accelerometer tilt; 4) real-time integration of accelerometer data to velocity and displacement without baseline corrections, providing the fundamental input for rapid finite-fault source inversion; 5) low-frequency estimates of P-wave arrival displacement to support single-station earthquake early warning. The operational real-time GPS analysis was implemented in time to provide waveforms from the August 2012 Brawley, CA, seismic swarm. Now the full real-time seismogeodetic analysis is operational for GPS sites we have upgraded with low-cost MEMS accelerometers, meteorological sensors, and in-house geodetic modules for efficient real-time data transmission. The analysis system does not yet incorporate an alert system but is currently available to serve as a complement to seismic-based early warning systems to increase redundancy and robustness. It is anticipated to be especially useful for large earthquakes (> M7), where rapid determination of the fault parameters is critical for early assessment of the extent of damage in affected areas, or for rapid tsunami modeling.
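
    Item 4 above is subtle because naive double integration of accelerometer data amplifies any small bias. A hedged sketch (illustrative data, not the authors' seismogeodetic combination) shows a tiny constant bias growing linearly in velocity and quadratically in displacement, which is why bias estimation (item 3) or combination with GPS displacements matters:

        import numpy as np

        def integrate(series, dt):
            """Cumulative trapezoidal integral of a uniformly sampled series."""
            out = np.zeros_like(series)
            out[1:] = np.cumsum((series[1:] + series[:-1]) * dt / 2.0)
            return out

        dt = 0.01
        t = np.arange(0.0, 60.0, dt)
        accel = 0.1 * np.sin(2 * np.pi * t) + 1e-4  # shaking plus 1e-4 m/s^2 bias
        disp = integrate(integrate(accel, dt), dt)
        print(disp[-1])  # ~0.5 * 1e-4 * 60**2 = 0.18 m of spurious drift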

  11. Hilbert series and operator basis for NRQED and NRQCD/HQET

    NASA Astrophysics Data System (ADS)

    Kobach, Andrew; Pal, Sridip

    2017-09-01

    We use a Hilbert series to construct an operator basis in the 1 / m expansion of a theory with a nonrelativistic heavy fermion in an electromagnetic (NRQED) or color gauge field (NRQCD/HQET). We present a list of effective operators with mass dimension d ≤ 8. Comparing to the current literature, our results for NRQED agree for d ≤ 8, but there are some discrepancies in NRQCD/HQET at d = 7 and 8.

  12. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2010-07-01 2010-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION...

  13. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2011-07-01 2011-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION...

  14. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2012-07-01 2012-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION...

  15. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2014-07-01 2014-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION...

  16. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2013-07-01 2013-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION...

  17. Nuclear magnetic resonance of J-coupled quadrupolar nuclei: Use of the tensor operator product basis

    NASA Astrophysics Data System (ADS)

    Kemp-Harper, R.; Philp, D. J.; Kuchel, P. W.

    2001-08-01

    In nuclear magnetic resonance (NMR) of I=1/2 nuclei that are scalar coupled to quadrupolar spins, a tensor operator product (TOP) basis set provides a convenient description of the time evolution of the density operator. Expressions for the evolution of equivalent I=1/2 spins, coupled to an arbitrary spin S>1/2, were obtained by explicit algebraic density operator calculations in Mathematica, and specific examples are given for S=1 and S=3/2. Tensor operators are described by the convenient quantum numbers rank and order and this imparts to the TOP basis features that enable an intuitive understanding of NMR behavior of these spin systems. It is shown that evolution as a result of J coupling alone changes the rank of tensors for the coupling partner, generating higher-rank tensors, which allow efficient excitation of S-spin multiple-quantum coherences. Theoretical predictions obtained using the TOP formalism were confirmed using multiple-quantum filtered heteronuclear spin-echo experiments and were further employed to demonstrate polarization transfer directly to multiple-quantum transitions using the insensitive nucleus enhancement by polarization transfer pulse sequence. This latter experiment is the basis of two-dimensional heteronuclear correlation experiments and direct generation of multiple-quantum S-spin coherences can therefore be exploited to yield greater spectral resolution in such experiments. Simulated spectra and experimental results are presented.
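
    In this formalism the density operator is expanded in products of spherical tensor operators; as a standard-notation sketch (the generic TOP expansion, not the paper's specific S = 1 and S = 3/2 expressions):

        \sigma(t) \;=\; \sum_{l, m} \sum_{l', m'} b_{l m, l' m'}(t)\;
        \hat{T}^{\,I}_{l m} \,\hat{T}^{\,S}_{l' m'} ,

    where each tensor operator is labeled by rank l (0 <= l <= 2s) and order m (-l <= m <= l) for the corresponding spin. Evolution under the scalar coupling conserves coherence order while interconverting ranks, which is why J evolution alone can generate the higher-rank S-spin tensors needed to excite multiple-quantum coherences.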

  18. On the Physical Basis of Rate Law Formulations for River Evolution, and their Applicability to the Simulation of Evolution after Earthquakes

    NASA Astrophysics Data System (ADS)

    An, C.; Parker, G.; Fu, X.

    2015-12-01

    River morphology evolves in response to trade-offs among a series of environmental forcing factors, and this evolution will be disturbed if such environmental factors change. One example of response to chronic disturbance is the intensive river evolution after earthquakes in the mountain areas of southwest China. When simulating river morphological response to environmental disturbance, an exponential rate law with a specified characteristic response time is often regarded as a practical tool for quantification. As conceptual models, empirical rate law formulations can be used to describe broad-brush morphological response, but their physical basis is not solid, in that they do not consider the details of morphodynamic processes. Meanwhile, river evolution can also be simulated with physically based morphodynamic models which conserve sediment via the Exner equation. Here we study the links between the rate law formalism and the Exner equation by solving the Exner equation mathematically and numerically. The results show that, when implementing a very simplified form of a relation for bedload transport, the Exner equation can be reduced to the diffusion equation, the solution of which is a Gaussian function. This solution coincides with the solution associated with rate laws, thus providing a physical basis for such formulations. However, when the complexities of a natural river are considered, the solution of the Exner equation will no longer be a simple Gaussian function. Under such circumstances, the rate law becomes invalid, and a full understanding of the response of rivers to earthquakes requires a complete morphodynamic model.
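
    The reduction mentioned above can be written compactly. With bed elevation \eta, bed porosity \lambda_p, and bed-material transport per unit width q_s, the Exner equation plus a simplified slope-proportional transport relation (the coefficient \nu here is a lumped illustrative parameter) give

        (1 - \lambda_p)\,\frac{\partial \eta}{\partial t}
            + \frac{\partial q_s}{\partial x} = 0, \qquad
        q_s = -\,\nu\,\frac{\partial \eta}{\partial x}
        \;\;\Longrightarrow\;\;
        \frac{\partial \eta}{\partial t}
            = \frac{\nu}{1 - \lambda_p}\,\frac{\partial^2 \eta}{\partial x^2},

    a linear diffusion equation whose response to a localized sediment pulse is Gaussian, so the bed at a fixed station relaxes in the exponential-like manner that rate-law formulations assume.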

  19. Comparing operational performance of the Virtual Seismologist and FinDer for earthquake early warning.

    NASA Astrophysics Data System (ADS)

    Massin, Frederick; Boese, Maren; Cauzzi, Carlo V.; Clinton, John F.

    2017-04-01

    An earthquake early warning (EEW) system can provide fast and accurate parameter estimates across wide ranges of source dimensions, event types, and epicentral distances by integrating event or ground-motion parameter estimates from different EEW algorithms, each of them optimized for specific tasks. We have integrated two such independent EEW algorithms, the Virtual Seismologist (VS) and FinDer, in the popular open-source seismic monitoring framework SeisComP3 (SC3). VS(SC3) provides rapid magnitude estimates for network-based point-source origins using conventional triggering and association, while FinDer matches evolving patterns of ground motion to track on-going rupture extent, and hence can provide accurate ground-motion predictions for finite fault ruptures. SC3 is operated by a large number of regional seismic networks across the world, many of which have a long-term interest in developing EEW capabilities. By combining real-time performance with playbacks from significant events, we report on the configuration and performance of VS and FinDer in several different tectonic and monitoring environments: Switzerland, Nicaragua, and Southern California. We discuss how real-time EEW reports from these complementary algorithms can be combined in practice to provide a single EEW from the SC3 system.

  20. Theory of coherent two-photon NMR: Standard-basis operators and coherent averaging

    NASA Astrophysics Data System (ADS)

    Stepišnik, Janez

    1980-05-01

    A theory of two-photon coherent transitions for multilevel spin systems is developed using coherent averaging of the time-evolution operator and a description of the spins by standard-basis operators. The formalism provides a clear picture of the interactions that cause multi-quantum transitions and makes it possible to evaluate not only two-photon but also multiphoton transitions. The theory has been applied to quadrupole-perturbed spin systems with s = 1 and s = 3/2, for which the effective double-quantum rf field has been evaluated.

  1. Lessons Learned from Eight Years' Experience of Actual Operation, and Future Prospects of JMA Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Nishimae, Y.

    2015-12-01

    Since 2007, the Japan Meteorological Agency (JMA) has gained experience in the actual operation of EEW. During this period, we have learned lessons from many M6- and M7-class earthquakes and from the Mw9.0 Tohoku earthquake. During the Mw9.0 Tohoku earthquake, the JMA system functioned well: it issued a warning message more than 15 s before strong ground shaking in the Tohoku district (relatively near the epicenter). However, it was not perfect: in addition to the problem of the large extent of the fault rupture, some false warning messages were issued because the system was confused by simultaneous multiple aftershocks occurring across the wide rupture area. To address these problems, JMA will introduce two new methods into the operational system this year to begin testing them, aiming at practical operation within a couple of years. One is the Integrated Particle Filter (IPF) method, an algorithm that integrates multiple hypocenter determination techniques with Bayesian estimation and also uses amplitude information for hypocenter determination. The other is the Propagation of Local Undamped Motion (PLUM) method, in which a warning message is issued when strong ground shaking is detected at stations near the target site (e.g., within 30 km); no hypocenter or magnitude estimate is required in PLUM. Aiming at application several years from now, we are investigating a new approach in which the current wavefield is estimated in real time and the future wavefield is then predicted by time evolution from the current state using the physics of wave propagation; here, a hypocenter and magnitude are not necessarily required, but real-time observation of ground shaking is. JMA also plans to predict long-period ground motion (up to 8 s) with the EEW system for earthquake damage mitigation in high-rise buildings. Testing will start on the operational system in the near future.
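
    The PLUM decision rule is simple enough to sketch. A hedged toy version (illustrative names, radius, and threshold; not JMA's operational code) predicts the intensity at a target site as the maximum intensity observed at stations within a fixed radius and warns above a threshold:

        from math import asin, cos, radians, sin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance in km."""
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = (sin(dlat / 2) ** 2
                 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
            return 2.0 * 6371.0 * asin(sqrt(a))

        def plum_warning(site, observations, radius_km=30.0, threshold=5.0):
            """observations: list of (lat, lon, intensity) station readings."""
            nearby = [i for (lat, lon, i) in observations
                      if haversine_km(site[0], site[1], lat, lon) <= radius_km]
            predicted = max(nearby, default=0.0)  # no attenuation inside radius
            return predicted, predicted >= threshold

        print(plum_warning((35.68, 139.77), [(35.60, 139.70, 5.5),
                                             (36.20, 140.10, 4.0)]))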

  2. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is necessary to also alert the California Department of Water Resources, California Division of Mines and Geology, U.S. Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  4. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of logic programming is a mapping from ground atoms to false or true. The single step operator of any logic programming is defined as a function (Tp:I→I). Logic programming is well-suited to building the artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in the radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to get to the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
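
    A minimal sketch of the pieces involved (a toy Gaussian RBF network and a fixed-point iteration; the training-set generation and particle-swarm training described above are not reproduced, and all data here are illustrative):

        import numpy as np

        def rbf_eval(x, centers, widths, weights):
            """Evaluate a Gaussian RBF network at input vector x."""
            phi = np.exp(-((centers - x) ** 2).sum(axis=1) / (2.0 * widths ** 2))
            return phi @ weights

        def iterate_to_fixed_point(x0, step, tol=1e-6, max_iter=1000):
            """Recurrently apply the learned single-step operator until the
            valuation stops changing (an approximation of its fixed point)."""
            x = np.asarray(x0, float)
            for _ in range(max_iter):
                x_next = step(x)
                if np.max(np.abs(x_next - x)) < tol:
                    return x_next
                x = x_next
            return x

        # Toy network mapping a 2-atom valuation to the next valuation:
        centers = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
        widths = np.full(4, 0.5)
        weights = np.array([[0.0, 0.9], [0.9, 0.9], [0.0, 0.9], [0.9, 0.9]])
        print(iterate_to_fixed_point([0.0, 0.0],
                                     lambda v: rbf_eval(v, centers, widths, weights)))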

  5. The Mixed Waste Management Facility. Design basis integrated operations plan (Title I design)

    SciTech Connect

    1994-12-01

    The Mixed Waste Management Facility (MWMF) will be a fully integrated, pilot-scale facility for the demonstration of low-level, organic-matrix mixed waste treatment technologies. It will provide the bridge from bench-scale demonstrated technologies to the deployment and operation of full-scale treatment facilities. The MWMF is a key element in reducing the risk in deployment of effective and environmentally acceptable treatment processes for organic mixed-waste streams. The MWMF will provide the engineering test data, formal evaluation, and operating experience that will be required for these demonstration systems to become accepted by EPA and deployable in waste treatment facilities. The deployment will also demonstrate how to approach the permitting process with the regulatory agencies and how to operate and maintain the processes in a safe manner. This document describes, at a high level, how the facility will be designed and operated to achieve this mission. It frequently refers the reader to additional documentation that provides more detail in specific areas. Effective evaluation of a technology consists of a variety of informal and formal demonstrations involving individual technology systems or subsystems, integrated technology system combinations, or complete integrated treatment trains. Informal demonstrations will typically be used to gather general operating information and to establish a basis for development of formal demonstration plans. Formal demonstrations consist of a specific series of tests that are used to rigorously demonstrate the operation or performance of a specific system configuration.

  6. The investigation of the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of the injuries due to occupational accidents.

    PubMed

    Hekimoglu, Yavuz; Dursun, Recep; Karadas, Sevdegul; Asirdizer, Mahmut

    2015-10-01

    The purpose of this study is to identify the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of injuries due to occupational accidents. In this study, we evaluated 245 patients injured in occupational accidents who were admitted to the emergency services of Van city hospitals in the 1-year periods before and after the earthquake. We determined that there was a 63.4% (P < 0.05) increase in work-related accidents in the post-earthquake period compared to the pre-earthquake period. In addition, in the post-earthquake period compared to the pre-earthquake period, injuries due to occupational accidents increased 211% (P < 0.05) in the construction industry, the rate of injuries due to falls from height increased 168% (P < 0.05), and the rates of trauma to the head and upper limbs increased 200% (P < 0.05) and 130% (P < 0.05), respectively. We determined that the neglect of occupational health and safety measures by employers and employees during the rapid construction and post-earthquake restoration works conducted to remove the effects of the earthquake increased the number of work accidents. In this study, the impact of disasters such as earthquakes on accidents at work was evaluated in a way we have not seen in the literature. This study emphasizes that governments should establish regulations and processes for post-disaster work before a disaster occurs, taking into account factors that may increase work-related accidents. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  7. Gas storage project development, operation, and analysis: Basis guidelines for gas storage project development, operation, and operations analysis

    SciTech Connect

    Nowaczewski, S.F.

    1995-09-01

    Reservoir selection matches location, capacity, and deliverability to market demand; gathering, processing, compression, land acquisition, and pipeline connection significantly impact economics. Geologic considerations include field-wide variations in permeability, porosity, and pay thickness. Well deliverability, and the number of wells required to meet targeted field deliverability, can be estimated from kh or φh. Analogous reservoir types can be used to estimate kh and φh ranges for particular fields. Capillary pressure data define pore size distribution and gas-water threshold pressure. Existing well location and log data are essential in mapping subtle stratigraphic relationships. Definitions of field type, trap type, and liquid phases are important to the economics of storage development and operations, since safe high-pressure storage is of greater benefit. Well construction considerations include location, type (vertical/slant/horizontal), and completion type to maximize drainage and deliverability; casing sizing to eliminate frictional pressure loss; and casing cementing for long-term mechanical integrity. Deliverability prediction uses well/gathering-system nodal pressure data. The importance of deliverability maintenance and enhancement is increasing as markets demand ever greater deliverability. By design, a field allows cycling of an expected volume; loss of potential decreases efficiency. Inventory verification relies on well pressure and fluid data, accurate metering, and estimation of losses or leaks and fuel use. Data quality, quantity, and management affect results in all these major areas of storage operations.

  8. Power systems after the Northridge earthquake: Emergency operations and changes in seismic equipment specifications, practice, and system configuration

    SciTech Connect

    Schiff, A.J.; Tognazzini, R.; Ostrom, D.

    1995-12-31

    The Northridge earthquake caused extensive damage to high voltage substation equipment and, for the first time, the failure of transmission towers. Power was lost to much of the earthquake-impacted area, and service was restored to 93% of customers within 24 hours. To restore service, damaged monitoring, communication, and protective equipment, such as current-voltage transformers, wave traps, and lightning arresters, was removed or bypassed and operation restored. To improve performance, some porcelain members are being replaced with composite materials for bushings, current-voltage transformers, and lightning arresters. Interim seismic specifications for equipment have been instituted. Some substations are being re-configured, and rigid bus and conductors are being replaced with flexible conductors. Non-load-carrying conductors, such as those used on lightning arresters, are being reduced in size to reduce potential interaction problems. Better methods of documenting damage and repair costs are being considered.

  9. Ecological Equivalence Assessment Methods: What Trade-Offs between Operationality, Scientific Basis and Comprehensiveness?

    NASA Astrophysics Data System (ADS)

    Bezombes, Lucie; Gaucherand, Stéphanie; Kerbiriou, Christian; Reinert, Marie-Eve; Spiegelberger, Thomas

    2017-08-01

    In many countries, biodiversity compensation is required to counterbalance the negative impacts of development projects on biodiversity by carrying out ecological measures, called offsets when the goal is to reach "no net loss" of biodiversity. One main issue is to ensure that offset gains are equivalent to impact-related losses. Ecological equivalence is assessed with equivalence assessment methods that take into account a range of key considerations that we summarize as ecological, spatial, temporal, and uncertainty. When equivalence assessment methods take all considerations into account, we call them "comprehensive". Equivalence assessment methods should also aim to be science-based and operational, which is challenging. Many equivalence assessment methods have been developed worldwide, but none is fully satisfying. In the present study, we examine 13 equivalence assessment methods in order to identify (i) their general structure and (ii) the synergies and trade-offs between method characteristics related to operationality, scientific basis, and comprehensiveness (called "challenges" in this paper). We evaluate each method on the basis of 12 criteria describing the level of achievement of each challenge. We observe that all the methods share a general structure, with possible improvements in the choice of target biodiversity, the indicators used, the integration of landscape context, and the multipliers reflecting time lags and uncertainties. We show that no method combines all challenges perfectly. There are trade-offs between and within the challenges: operationality tends to be favored, while the scientific basis is integrated heterogeneously in method development. One way of improving the combination of challenges would be the use of offset-dedicated databases providing scientific feedback on previous offset measures.

  10. Representation of discrete Steklov-Poincare operator arising in domain decomposition methods in wavelet basis

    SciTech Connect

    Jemcov, A.; Matovic, M.D.

    1996-12-31

    This paper examines the sparse representation and preconditioning of a discrete Steklov-Poincare operator which arises in domain decomposition methods. A non-overlapping domain decomposition method is applied to a second-order self-adjoint elliptic operator (Poisson equation) with homogeneous boundary conditions as a model problem. It is shown that the discrete Steklov-Poincare operator allows a sparse representation with a bounded condition number in a wavelet basis if the transformation is followed by thresholding and rescaling. These two steps combined enable the effective use of Krylov subspace methods as an iterative solution procedure for the system of linear equations. Finding the solution of an interface problem in domain decomposition methods, known as a Schur complement problem, has been shown to be equivalent to the discrete form of the Steklov-Poincare operator. A common way to obtain the Schur complement matrix is to order the matrix of the discrete differential operator into subdomain node groups and then block-eliminate the subdomain interior nodes. The result is a dense matrix which corresponds to the interface problem. This is equivalent to reducing the original problem to several smaller differential problems and one boundary integral equation problem for the subdomain interface.
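    A minimal sketch of the block elimination just described, using a 1-D Poisson matrix with two three-node subdomains and a single interface node (the matrix, node ordering, and sizes are illustrative assumptions, not the paper's model problem):

        import numpy as np

        # 1-D Poisson matrix as a stand-in for the discrete elliptic operator.
        n = 7
        A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

        # Reorder nodes as [interior of subdomain 1, interior of subdomain 2, interface];
        # node 3 is the interface node separating the two subdomains.
        perm = [0, 1, 2, 4, 5, 6, 3]
        A = A[np.ix_(perm, perm)]

        ni = 6                                   # number of interior nodes
        Aii, Aig = A[:ni, :ni], A[:ni, ni:]
        Agi, Agg = A[ni:, :ni], A[ni:, ni:]

        # Block-eliminate the interior nodes: S is the (dense) interface matrix,
        # i.e., the discrete Steklov-Poincare operator of the text.
        S = Agg - Agi @ np.linalg.solve(Aii, Aig)
        print(S)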

  11. Operant self-administration models for testing the neuropharmacological basis of ethanol consumption in rats.

    PubMed

    June, Harry L; Gilpin, Nicholas W

    2010-04-01

    Operant self-administration procedures are used to assess the neural basis of ethanol-seeking behavior under a wide range of experimental conditions. In general, rats do not spontaneously self-administer ethanol in pharmacologically meaningful amounts. This unit provides a step-by-step guide for training rats to self-administer quantities of ethanol that produce moderate to high blood-alcohol content. Different protocols are used for rats that are genetically heterogeneous versus rats that are selectively bred for high alcohol preference. Also, these protocols have different sets of advantages and disadvantages in terms of the ability to control for caloric intake and taste of solutions in operant testing. Basic self-administration protocols can also be altered to focus on different aspects of the motivational properties of ethanol (for example, those related to dependence). This unit provides multiple protocols that lead to alcohol intake in rats, which can be pharmacologically probed relative to a variety of control conditions.

  12. Real-time earthquake alert system for the greater San Francisco Bay Area: a prototype design to address operational issues

    SciTech Connect

    Harben, P.E.; Jarpe, S.; Hunter, S.

    1996-12-10

    The purpose of the earthquake alert system (EAS) is to outrun the seismic energy released in a large earthquake using a geographically distributed network of strong motion sensors that telemeter data to a rapid CPU-processing station, which then issues an area-wide warning to a region before strong motion occurs. The warning times involved are short, from 0 to 30 seconds or so; consequently, most responses must be automated. The San Francisco Bay Area is particularly well suited for an EAS because (1) large earthquakes have relatively shallow hypocenters (10- to 20-kilometer depth), giving more favorable ray-path geometries and larger warning times than deeper earthquakes, and (2) the active faults are few in number and well characterized, which means far fewer geographically distributed strong motion sensors are required (about 50 in this region). An EAS prototype is being implemented in the San Francisco Bay Area. The system consists of four distinct subsystems: (1) a distributed strong motion seismic network, (2) a central processing station, (3) a warning communications system, and (4) user receiver and response systems. We have designed a simple, reliable, and inexpensive strong motion monitoring station that consists of a three-component Analog Devices ADXL05 accelerometer sensing unit, a vertical-component weak motion sensor for system testing, a 16-bit digitizer with multiplexing, and communication output ports for RS232 modem or radio telemetry. The unit is battery-powered and will be sited in fire stations. The prototype central computer analysis system consists of a PC data-acquisition platform that pipes the incoming strong motion data via Ethernet to Unix-based workstations for data processing. Simple real-time algorithms, particularly for magnitude estimation, are implemented to give estimates of the time since the earthquake's onset, its hypocenter location, its magnitude, and the reliability of the estimate. These parameters are calculated and transmitted
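    The available warning time in such a system is set by the gap between the S-wave arrival at a target and the moment a warning can be issued. A minimal sketch of that arithmetic follows; the velocities, the fixed processing delay, and the simplification that detection happens at the target's own P-wave arrival are assumptions for illustration, not parameters of the prototype.

        # Rough warning-time estimate: the warning window ends when the S wave
        # arrives and begins once the P wave has been detected and processed.
        VP, VS = 6.0, 3.5        # crustal P and S velocities, km/s (assumed)
        PROCESSING_DELAY = 3.0   # detection + telemetry + processing, s (assumed)

        def warning_time(hypocentral_distance_km):
            t_s = hypocentral_distance_km / VS       # S-wave arrival at target
            t_detect = hypocentral_distance_km / VP  # P-wave arrival (detection)
            return max(0.0, t_s - t_detect - PROCESSING_DELAY)

        for d in (20, 50, 100):
            print(f"{d:>4} km: {warning_time(d):5.1f} s of warning")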

  13. An Earthworm Based Earthquake Early Warning System for Southwest Iberian Peninsula: Experience after Two Years of Operation

    NASA Astrophysics Data System (ADS)

    Jara, J. A.; Romeu, N.; Colom, Y.; Goula, X.; Roca, A.

    2016-12-01

    The South Iberian Peninsula is located near a complex plate boundary between Eurasia and Africa. Several very large earthquakes have occurred there, especially offshore Cape San Vicente and in the Gulf of Cadiz. The largest one, the 1755 Mw=8.5 Lisbon earthquake, was associated with a large tsunami and caused more than 60,000 casualties and significant damage in the SW Iberian Peninsula and NW Morocco (Buforn et al. 1998; Baptista et al. 2003; Gutscher et al. 2006; Grandin et al. 2007). This study presents the development and results after two years of continuous operation of a prototype regional Earthquake Early Warning System (EEWS) applied to South Portugal and the Southwest of Spain within the framework of the Alertes-Rim Spanish project. This EEWS, based on Earthworm tools (USGS), was implemented to automatically produce location scenarios, with an optimized location and estimated magnitude that minimize the warning time, as a preliminary step toward producing a complete scenario with lead times and potential damages. The prototype was put into operation after a setup period, during which multiple simulations were carried out to establish the optimal settings, considering the existing real-time seismic stations available in the region. After two years of operation, location and magnitude estimates agree fairly well with the Instituto Geográfico Nacional (IGN) catalog, and the lead times obtained are on the order of tens of seconds for the majority of targets, which is long enough to mitigate damage over a large area of the southern coasts of Portugal and Spain, demonstrating the possibility of a regional, reliable, and effective EEWS in the region.

  14. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters relies on arrival-time stacking algorithms. Examples are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problem of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.
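    To make the kernel-stacking idea concrete, the toy associator below scores trial epicenters on a grid by summing Gaussian kernels of arrival-time residuals; the single-phase velocity model, station layout, and kernel width are illustrative assumptions, not the GLASS 2.0 implementation.

        import numpy as np

        V = 6.0                                   # km/s, assumed uniform velocity
        stations = np.array([[0, 0], [50, 0], [0, 50], [40, 40]], float)
        true_src = np.array([22.0, 31.0])
        t0 = 5.0                                  # origin time of the synthetic event
        arrivals = t0 + np.linalg.norm(stations - true_src, axis=1) / V

        xs = ys = np.linspace(0, 60, 121)
        best = None
        for x in xs:
            for y in ys:
                d = np.linalg.norm(stations - [x, y], axis=1)
                tt = arrivals - d / V             # implied origin times per station
                resid = tt - tt.mean()            # misfit about a common origin time
                score = np.exp(-0.5 * (resid / 0.5) ** 2).sum()  # sigma = 0.5 s kernel
                if best is None or score > best[0]:
                    best = (score, x, y)
        print("best trial epicenter:", best[1], best[2])   # recovers (22, 31)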

  15. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    SciTech Connect

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and their respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., the estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.
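    The fragility-curve step can be sketched as follows: a lognormal fragility gives each top event's failure probability at the DBE ground motion, and branch probabilities multiply along an event-tree sequence. All capacities, uncertainties, equipment names, and the DBE acceleration below are hypothetical placeholders, not the PREPP data.

        from math import log
        from scipy.stats import norm

        def p_fail(a, Am, beta):
            """Lognormal fragility: P(failure | a) = Phi(ln(a / Am) / beta)."""
            return norm.cdf(log(a / Am) / beta)

        dbe_pga = 0.24                       # DBE peak ground acceleration, g (assumed)
        top_events = {                       # hypothetical (median capacity g, beta)
            "hepa_filter_train": (1.10, 0.45),
            "process_offgas_duct": (0.80, 0.40),
            "building_confinement": (1.60, 0.50),
        }
        for name, (Am, beta) in top_events.items():
            print(f"{name:22s} P(fail|DBE) = {p_fail(dbe_pga, Am, beta):.2e}")

        # One accident sequence: confinement holds but filter train and duct
        # both fail; branch probabilities multiply along the event-tree path.
        p_seq = (p_fail(dbe_pga, 1.10, 0.45) * p_fail(dbe_pga, 0.80, 0.40)
                 * (1 - p_fail(dbe_pga, 1.60, 0.50)))
        print(f"sequence probability: {p_seq:.2e}")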

  16. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 3 2011-07-01 2011-07-01 false System must be nonprofit or operated on a share-crop basis... Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways in...

  17. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false System must be nonprofit or operated on a share-crop basis... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on...

  18. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1995-01-01

    Incineration as a method of treating radioactive or mixed waste is attractive because of volume reduction, but may result in high concentrations of some hazardous components. For safety reasons during operation, and because of the environmental impact of the plant, it is important to know how these materials partition between the furnace slag, the fly ash, and the stack emission. The chemistry of about 50 elements is discussed, and through consideration of high-temperature thermodynamic equilibria, an attempt is made to provide a basis for predicting how various radionuclides and heavy metals behave in a typical incinerator. The chemistry of the individual elements is first considered, and a prediction of the most stable chemical species in the typical incinerator atmosphere is made. The treatment emphasizes volatility, and the parameters considered are temperature, acidity, oxygen, sulfur, and halogen content, and the presence of several other key non-radioactive elements. A computer model is used to calculate equilibrium concentrations of many species in several systems at temperatures ranging from 500 to 1600 K. It is suggested that deliberate addition of various feed chemicals can have a major impact on the fate of many radionuclides and heavy metals. Several problems concerning limitations and application of the data are considered.

  19. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides, and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery. In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades.

  20. Rethinking first-principles electron transport theories with projection operators: the problems caused by partitioning the basis set.

    PubMed

    Reuter, Matthew G; Harrison, Robert J

    2013-09-21

    We revisit the derivation of electron transport theories with a focus on the projection operators chosen to partition the system. The prevailing choice of assigning each computational basis function to a region causes two problems. First, this choice generally results in oblique projection operators, which are non-Hermitian and violate implicit assumptions in the derivation. Second, these operators are defined with the physically insignificant basis set and, as such, preclude a well-defined basis set limit. We thus advocate for the selection of physically motivated, orthogonal projection operators (which are Hermitian) and present an operator-based derivation of electron transport theories. Unlike the conventional, matrix-based approaches, this derivation requires no knowledge of the computational basis set. In this process, we also find that common transport formalisms for nonorthogonal basis sets improperly decouple the exterior regions, leading to a short circuit through the system. We finally discuss the implications of these results for first-principles calculations of electron transport.
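    The contrast between the two choices of projector can be demonstrated numerically. In the sketch below (a two-function toy model with an assumed overlap matrix, not taken from the paper), the basis-partitioned projector, represented in an orthonormalized basis, is idempotent but non-Hermitian, i.e., oblique:

        import numpy as np

        # Two nonorthogonal basis functions partitioned into regions L = {0} and
        # R = {1}. Assigning each function to a region gives the projector whose
        # matrix in the original basis is the indicator D; represented in the
        # Lowdin-orthonormalized basis it becomes S^(1/2) D S^(-1/2), which is
        # idempotent but Hermitian only if D commutes with the overlap S.
        S = np.array([[1.0, 0.4],
                      [0.4, 1.0]])              # overlap matrix (assumed)
        D = np.diag([1.0, 0.0])                 # "left region" indicator

        w, U = np.linalg.eigh(S)                # S = U diag(w) U^T, w > 0
        S_half = U @ np.diag(w ** 0.5) @ U.T
        S_mhalf = U @ np.diag(w ** -0.5) @ U.T
        P = S_half @ D @ S_mhalf                # projector in an orthonormal basis

        print(np.allclose(P @ P, P))            # True: idempotent (a projector)
        print(np.allclose(P, P.T))              # False: non-Hermitian, i.e., oblique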

  1. The power of simplification: Operator interface with the AP1000® during design-basis and beyond design-basis events

    SciTech Connect

    Williams, M. G.; Mouser, M. R.; Simon, J. B.

    2012-07-01

    The AP1000® plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without need for operator action, meeting the expectations provided in the European Utility Requirements and the Utility Requirement Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design basis accident and finally, to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs) and Alarm Response Procedures (ARPs) are used to mitigate design basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible using components that have been

  2. Transient Fluid Flow Along Basement Faults and Rupture Mechanics: Can We Expect Injection-Induced Earthquake Behavior to Correspond Directly With Injection Operations?

    NASA Astrophysics Data System (ADS)

    Norbeck, J. H.; Horne, R. N.

    2015-12-01

    We explored injection-induced earthquake behavior in geologic settings where basement faults are connected hydraulically to overlying saline aquifers targeted for wastewater disposal. Understanding how the interaction between natural geology and injection well operations affects the behavior of injection-induced earthquake sequences has important implications for characterizing seismic hazard risk. Numerical experiments were performed to investigate the extent to which seismicity is influenced by the migration of pressure perturbations along fault zones. Two distinct behaviors were observed: a) earthquake ruptures that were confined to the pressurized region of the fault and b) sustained earthquake ruptures that propagated far beyond the pressure front. These two faulting mechanisms have important implications for assessing the manner in which seismicity can be expected to respond to injection well operations. Based upon observations from the numerical experiments, we developed a criterion that can be used to classify the expected faulting behavior near wastewater disposal sites. The faulting criterion depends on the state of stress, the initial fluid pressure, the orientation of the fault, and the dynamic friction coefficient of the fault. If the initial ratio of shear to effective normal stress resolved on the fault (the prestress ratio) is less than the fault's dynamic friction coefficient, then earthquake ruptures will tend to be limited by the distance of the pressure front. In this case, parameters that affect seismic hazard assessment, like the maximum earthquake magnitude or earthquake recurrence interval, could correlate with injection well operational parameters. For example, the maximum earthquake magnitude might be expected to grow over time in a systematic manner as larger patches of the fault are exposed to significant pressure changes. In contrast, if the prestress ratio is greater than dynamic friction, a stress drop can occur outside of the pressurized
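    The classification criterion reduces to a one-line comparison between the prestress ratio and the dynamic friction coefficient. A minimal sketch, with all stress values assumed for illustration:

        def faulting_regime(shear_stress, normal_stress, pore_pressure, mu_dynamic):
            """Classify expected injection-induced rupture behavior (stresses in MPa).

            Compares the prestress ratio (shear / effective normal stress)
            with the fault's dynamic friction coefficient, per the criterion
            described above; the numbers below are assumed, not the study's.
            """
            prestress = shear_stress / (normal_stress - pore_pressure)
            if prestress < mu_dynamic:
                return "pressure-confined ruptures (may track injection operations)"
            return "potential sustained ruptures beyond the pressure front"

        print(faulting_regime(shear_stress=32.0, normal_stress=80.0,
                              pore_pressure=30.0, mu_dynamic=0.7))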

  3. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  4. Earthquake Facts

    MedlinePlus

    ... landslide (usually triggered by an earthquake) displacing the ocean water. The hypocenter of an earthquake is the ... is the zone of earthquakes surrounding the Pacific Ocean — about 90% of the world’s earthquakes occur ...

  5. [Management of the operating room at the time of emergency outbreak--the experience of the 2011 Off the Pacific Coast of Tohoku Earthquake].

    PubMed

    Isosu, Tsuyoshi; Murakawa, Masahiro

    2012-03-01

    This article introduces the operating room disaster manual of our hospital. When the 2011 magnitude 9 Off the Pacific Coast of Tohoku Earthquake occurred, nine operations were in progress in our hospital. Among these, general or regional anesthesia had been induced in eight cases, and in the remaining case the patient was just leaving the operating room. General anesthesia was stopped in six cases. According to our manual, all operations should be stopped and then finished as soon as possible. No patient was injured in our hospital. This was the first time we had experienced such a large-scale earthquake. Close cooperation among anesthesiologists, surgeons, and the other co-medical staff is very important in managing such an unusual situation.

  6. Martin Marietta Energy Systems, Inc. comprehensive earthquake management plan: Emergency Operations Center training manual

    SciTech Connect

    Not Available

    1990-02-28

    The objective of this training is to describe the responsibilities, resources, and goals of the Emergency Operations Center, and to enable trainees to evaluate and interpret this information to best direct and allocate emergency, plant, and other resources to protect life and the Paducah Gaseous Diffusion Plant.

  7. Duration and predictors of emergency surgical operations - basis for medical management of mass casualty incidents

    PubMed Central

    2009-01-01

    Background Hospitals have a critically important role in the management of mass casualty incidents (MCI), yet there is little information to assist emergency planners. A significantly limiting factor of a hospital's capability to treat those affected is its surgical capacity. We therefore intended to provide data about the duration and predictors of life-saving operations. Methods The data of 20,815 predominantly blunt trauma patients recorded in the Trauma Registry of the German Trauma Society were retrospectively analyzed to calculate the duration of life-saving operations as well as their predictors. Inclusion criteria were an ISS ≥ 16 and the performance of relevant ICPM-coded procedures within 6 h of admission. Results From 1,228 patients fulfilling the inclusion criteria, 1,793 operations could be identified as life-saving operations. Acute injuries to the abdomen accounted for 54.1%, followed by head injuries (26.3%), pelvic injuries (11.5%), thoracic injuries (5.0%), and major amputations (3.1%). The mean cut-to-suture time was 130 min (IQR 65-165 min). Logistic regression revealed 8 variables associated with an emergency operation: AIS of abdomen ≥ 3 (OR 4.00), ISS ≥ 35 (OR 2.94), hemoglobin level ≤ 8 g/dL (OR 1.40), pulse rate on hospital admission < 40 or > 120/min (OR 1.39), blood pressure on hospital admission < 90 mmHg (OR 1.35), prehospital infusion volume ≥ 2000 ml (OR 1.34), GCS ≤ 8 (OR 1.32), and anisocoria (OR 1.28) on-scene. Conclusions The mean operation time of 130 min calculated for emergency life-saving surgical operations provides a realistic guideline for the prospective treatment capacity, which can be estimated and projected into an actual incident admission capacity. Knowledge of predictive factors for life-saving emergency operations helps to identify those patients who need the most urgent operative treatment in case of a blunt-trauma MCI. PMID:20149987
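    For planning tools, the reported odds ratios can be combined into a rough urgency multiplier, since independent predictors multiply on the odds scale. The sketch below is illustrative only: it assumes predictor independence and ignores the (unreported) baseline odds of the regression model.

        # Odds ratios as reported in the abstract; keys are shorthand labels.
        ODDS_RATIOS = {
            "ais_abdomen_ge_3": 4.00, "iss_ge_35": 2.94, "hb_le_8": 1.40,
            "pulse_lt40_gt120": 1.39, "sbp_lt_90": 1.35, "volume_ge_2000": 1.34,
            "gcs_le_8": 1.32, "anisocoria": 1.28,
        }

        def odds_multiplier(findings):
            """Multiply the odds ratios of the findings present (assumes independence)."""
            m = 1.0
            for f in findings:
                m *= ODDS_RATIOS[f]
            return m

        # Patient with severe abdominal injury, ISS >= 35, and hypotension:
        print(odds_multiplier(["ais_abdomen_ge_3", "iss_ge_35", "sbp_lt_90"]))  # ~15.9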

  8. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-04-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for concepts of operations (ConOps) in advanced small modular reactor (aSMR) designs. In support of this objective, three important research areas were included: operating principles of multi-modular plants, functional allocation models and strategies that would affect the development of new, non-traditional concepts of operations, and the requirements for human performance, based upon work domain analysis and current regulatory requirements. As part of the approach for this report, we outline potential functions, including the theoretical and operational foundations for the development of a new functional allocation model and the identification of specific regulatory requirements that will influence the development of future concepts of operations. The report also highlights changes in research strategy prompted by confirmation of the importance of applying the work domain analysis methodology to a reference aSMR design. It is described how this methodology will enrich the findings from this phase of the project in the subsequent phases and help in the identification of metrics and focused studies for the determination of human performance criteria that can be used to support the design process.

  9. Experience in Construction and Operation of the Distributed Information Systems on the Basis of the Z39.50 Protocol

    NASA Astrophysics Data System (ADS)

    Zhizhimov, Oleg; Mazov, Nikolay; Skibin, Sergey

    Questions concerning the construction and operation of distributed information systems on the basis of the ANSI/NISO Z39.50 Information Retrieval Protocol are discussed in the paper, which is based on the authors' experience in developing the ZooPARK server. The architecture of distributed information systems and questions of system reliability, minimization of search time, and administration are examined. Problems with the development of distributed information systems are also described.

  10. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Hugo, Jacques; Forester, John; Gertman, David; Joe, Jeffrey; Medema, Heather; Persensky, Julius; Whaley, April

    2013-08-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for Concepts of Operations in advanced small modular reactor (AdvSMR) designs. AdvSMRs are nuclear power plants (NPPs), but unlike conventional large NPPs that are constructed on site, AdvSMR systems and components will be fabricated in a factory and then assembled on site. AdvSMRs will also use advanced digital instrumentation and control systems, and make greater use of automation. Some AdvSMR designs also propose to be operated in a multi-unit configuration with a single central control room as a way to be more cost-competitive with existing NPPs. These differences from conventional NPPs not only pose technical and operational challenges, but they will undoubtedly also have regulatory compliance implications, especially with respect to staffing requirements and safety standards.

  11. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  12. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  13. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.

    2005-12-01

    Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on a method, Pattern Informatics (PI), that locates likely sites for future large earthquakes from large changes in the activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represents a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
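    The verification step reduces to counting the four cells of a contingency table over forecast cells and converting them to a point on the ROC diagram (sweeping the forecast threshold traces out the full curve). A minimal sketch with synthetic grids, not the PI or RI data:

        import numpy as np

        def contingency(forecast, observed):
            """2x2 contingency counts for binary forecast grids (boolean arrays)."""
            a = np.sum(forecast & observed)        # hits
            b = np.sum(forecast & ~observed)       # false alarms
            c = np.sum(~forecast & observed)       # misses
            d = np.sum(~forecast & ~observed)      # correct negatives
            return a, b, c, d

        def roc_point(forecast, observed):
            a, b, c, d = contingency(forecast, observed)
            hit_rate = a / (a + c)                 # true positive rate
            false_alarm_rate = b / (b + d)         # false positive rate
            return false_alarm_rate, hit_rate

        # Hypothetical 10x10 cell grids: True marks a forecast (or observed) cell.
        rng = np.random.default_rng(0)
        observed = rng.random((10, 10)) < 0.1
        forecast = observed | (rng.random((10, 10)) < 0.05)  # a deliberately skillful forecast
        print(roc_point(forecast, observed))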

  14. The research and application of earthquake disaster comprehensive evaluation

    NASA Astrophysics Data System (ADS)

    Guo, Hongmei; Chen, Weifeng

    2017-04-01

    All disaster relief operations of the government after a destructive earthquake depend on earthquake disaster information, including command decisions, rescue force deployment, and dispatch of relief supplies. Earthquake disaster information is the most important requirement during the earthquake emergency response and disposal period. Macro disaster information, including the distribution of the disaster area and the scale of casualties, determines the disaster relief scale and response level, while specific disaster information determines the process and details of specific rescue operations. In view of the importance of earthquake disaster information, experts have devoted themselves to the study of seismic hazard assessment and acquisition, mainly from two directions: improving the accuracy of pre-assessment of the disaster and enriching the means of disaster information acquisition. The problem is that experts have carried out in-depth research on single aspects, usually focusing on optimizing pre-evaluation methods, refining and updating basic data, or establishing new disaster information access channels, while ignoring the comprehensive use of various methods and means. Based on the experience of emergency disposal after several devastating earthquakes in Sichuan Province in recent years, this paper presents a new earthquake disaster comprehensive evaluation technology that takes into account coordination of multiple disaster information sources, coordination of complementary experts from multiple research fields, coordination between the rear and the site, and multi-sectoral, multi-regional coordination. On this basis, an earthquake disaster comprehensive evaluation system incorporating expert experience has been established. Starting from the pre-assessment, the system can combine background information on the disaster area, such as seismic geological background and socioeconomic background, with disaster information from various sources to realize the fusion and mining of multi

  15. Anatomical basis of transgluteal approach for pudendal neuralgia and operative technique.

    PubMed

    Peltier, Johann

    2013-09-01

    Pudendal neuralgia is an entrapment syndrome whose anatomic landmarks and operative technique both remain relatively unfamiliar to neurosurgeons. We provide an outline of the operative steps that are important to the correct application of this approach. Surgical illustrations are included, and the figures detail the important steps of the operation. We perform a transmuscular approach leading to the sacrotuberous ligament, which is opened sagittally. The pudendal nerve and internal pudendal artery are found to be enclosed by a fascial sheath. The pudendal nerve swings around the sacrospinous ligament with tension. Both distal branches of the pudendal nerve can be followed, especially the rectal branch running medially. After the section of the sacrospinous ligament, the pudendal nerve can be transposed frontally to the ischial spine within the ischiorectal fat. During this maneuver, significant venous bleeding may be encountered, as dilatation of perineural satellite veins can accompany or surround the pudendal nerve. It is important to avoid overpacking, to limit compression injury to the pudendal nerve, by using judiciously small pieces of hemostatic device and soft cottonoid with light pressure. Then, the obturator fascia and the membranous falciform process of the sacrotuberous ligament that extend toward the ischioanal fossa must be incised. The transgluteal approach is a safe technique, and we demonstrate that it can be performed while minimizing pain, incision size, the surgical corridor, and trauma to adjacent muscles of the buttock.

  16. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1994-09-01

    For waste containing small amounts of radioactivity, rad waste (RW), or mixed waste (MW) containing both radioactive and chemically hazardous components, incineration is a logical management candidate because of inherent safety, waste volume reduction, and low costs. Successful operation requires that the facility is properly designed and operated to protect workers and to limit releases of hazardous materials. The large decrease in waste volume achieved by incineration also results in a higher concentration of most of the radionuclides and non radioactive heavy metals in the ash products. These concentrations impact subsequent treatment and disposal. The various constituents (chemical elements) are not equal in concentration in the various incinerator feed materials, nor are they equal in their contribution to health risks on subsequent handling, or accidental release. Thus, for management of the wastes it is important to be able to predict how the nuclides partition between the primary combustion residue which may be an ash or a fused slag, the fine particulates or fly ash that is trapped in the burner off-gas by several different techniques, and the airborne fraction that escapes to the atmosphere. The objective of this report is to provide an estimate of how different elements of concern may behave in the chemical environment of the incinerator. The study briefly examines published incinerator operation data, then considers the properties of the elements of concern, and employs thermodynamic calculations, to help predict the fate of these RW and MW constituents. Many types and configurations of incinerators have been designed and tested.

  17. 14 CFR 331.35 - What is the basis upon which operators or providers will be reimbursed through the set-aside...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATIONS PROCEDURES FOR REIMBURSEMENT OF GENERAL AVIATION OPERATORS AND SERVICE PROVIDERS IN THE WASHINGTON, DC AREA Set-Aside for Operators or Providers at Certain Airports § 331.35 What is the basis upon...

  18. Physical basis of short-channel MESFET operation. II - Transient behavior

    NASA Astrophysics Data System (ADS)

    Faricelli, J. V.; Frey, J.; Krusius, J. P.

    1982-03-01

    The large-signal switching behavior of planar short-channel metal-semiconductor field-effect transistors (MESFET's) is simulated numerically. First, the intrinsic response of the MESFET is simulated in two space dimensions and time, using measured electric-field-dependent drift velocities and diffusivities in the conventional semiconductor equations; results of the intrinsic device simulations are then used to study the circuit behavior of Si and GaAs MESFET's in two-input NOR circuits. Although the simulated 1-micron-gate Si and GaAs MESFET's have intrinsic response times of 11 and 9 ps to a gate pulse of -2 V, for fan-in and fan-out = 2, the Si and GaAs NOR gates have average gate delays of 318 and 118 ps, respectively, for 1-micron gate lengths. The power-delay products for these 1-micron-gate Si and GaAs circuits are 1.8 and 1.5 pJ, respectively. These results are compared with measured data and their physical basis is discussed.
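    The quoted power-delay products and gate delays together imply the average switching power of each NOR gate (P = PDP / t_d); a quick check of that arithmetic:

        # Average switching power implied by the quoted figures (P = PDP / t_d).
        for name, pdp_pj, delay_ps in (("Si", 1.8, 318.0), ("GaAs", 1.5, 118.0)):
            power_mw = (pdp_pj * 1e-12) / (delay_ps * 1e-12) * 1e3
            print(f"{name}: {power_mw:.1f} mW average switching power")
        # Si: ~5.7 mW, GaAs: ~12.7 mW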

  19. A matrix representation of the translation operator with respect to a basis set of exponentially declining functions

    NASA Astrophysics Data System (ADS)

    Filter, Eckhard; Steinborn, E. Otto

    1980-12-01

    The matrix elements of the translation operator with respect to a complete orthonormal basis set of the Hilbert space L2(R3) are given in closed form as functions of the displacement vector. The basis functions are composed of an exponential, a Laguerre polynomial, and a regular solid spherical harmonic. With this formalism, a function which is defined with respect to a certain origin can be "shifted", i.e., expressed in terms of given functions which are defined with respect to another origin. In this paper we also demonstrate the feasibility of this method by applying it to problems that are of special interest in the theory of the electronic structure of molecules and solids. We present new one-center expansions for some exponential-type functions (ETF's), and a closed-form expression for a multicenter integral over ETF's is given and numerically tested.
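    The abstract does not reproduce the basis functions themselves; one standard orthonormal L2(R3) set with the stated structure (exponential times Laguerre polynomial times regular solid harmonic), given here only as an assumed illustration of the form, is

        \psi_{nlm}(\alpha,\mathbf{r})
          = N_{nl}(\alpha)\, e^{-\alpha r}\,
            L_{n-l-1}^{2l+2}(2\alpha r)\,
            \mathcal{Y}_{l}^{m}(2\alpha\mathbf{r}),
        \qquad n \ge l+1,

    where \mathcal{Y}_{l}^{m}(\mathbf{r}) = r^{l} Y_{l}^{m}(\theta,\varphi) is the regular solid spherical harmonic and N_{nl}(\alpha) is a normalization constant fixed by orthonormality in L2(R3).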

  20. Waste Encapsulation and Storage Facility (WESF) Basis for Interim Operation (BIO)

    SciTech Connect

    COVEY, L.I.

    2000-11-28

    The Waste Encapsulation and Storage Facility (WESF) is located in the 200 East Area adjacent to B Plant on the Hanford Site north of Richland, Washington. The current WESF mission is to receive and store the cesium and strontium capsules that were manufactured at WESF in a safe manner and in compliance with all applicable rules and regulations. The scope of WESF operations is currently limited to receipt, inspection, decontamination, storage, and surveillance of capsules in addition to facility maintenance activities. The capsules are expected to be stored at WESF until the year 2017, at which time they will have been transferred for ultimate disposition. The WESF facility was designed and constructed to process, encapsulate, and store the extracted long-lived radionuclides, ⁹⁰Sr and ¹³⁷Cs, from wastes generated during the chemical processing of defense fuel on the Hanford Site, thus ensuring isolation of hazardous radioisotopes from the environment. The construction of WESF started in 1971 and was completed in 1973. Some of the ¹³⁷Cs capsules were leased by private irradiators or transferred to other programs. All leased capsules have been returned to WESF. Capsules transferred to other programs will not be returned except for the seven powder and pellet Type W overpacks already stored at WESF.

  1. Modeling of the Reactor Core Isolation Cooling Response to Beyond Design Basis Operations - Interim Report

    SciTech Connect

    Ross, Kyle; Cardoni, Jeffrey N.; Wilson, Chisom Shawn; Morrow, Charles; Osborn, Douglas; Gauntt, Randall O.

    2015-12-01

    Efforts are being pursued to develop and qualify a system-level model of a reactor core isolation cooling (RCIC) steam-turbine-driven pump. The model is being developed with the intent of employing it to inform the design of experimental configurations for full-scale RCIC testing. The model is expected to be especially valuable in sizing equipment needed in the testing. An additional intent is to use the model to understand more fully how RCIC apparently managed to operate far removed from its design envelope in the Fukushima Daiichi Unit 2 accident. RCIC modeling is proceeding along two avenues that are expected to complement each other well. The first avenue is the continued development of the system-level RCIC model that will serve in simulating a full reactor system or full experimental configuration of which a RCIC system is part. The model reasonably represents a RCIC system today, especially given design operating conditions, but lacks specifics that are likely important in representing the off-design conditions a RCIC system might experience in an emergency situation such as a loss of all electrical power. One known gap in the system model, for example, is the efficiency at which a flashing slug of water (as opposed to a concentrated jet of steam) could propel the rotating drive wheel of a RCIC turbine. To address this, the second avenue is being pursued, wherein computational fluid dynamics (CFD) analyses of such a jet are being carried out. The results of the CFD analyses will thus complement and inform the system modeling. The system modeling will, in turn, complement the CFD analysis by providing the system information needed to impose appropriate boundary conditions on the CFD simulations. The system model will be used to inform the selection of configurations and equipment best suited to supporting planned RCIC experimental testing. Preliminary investigations with the RCIC model indicate that liquid water ingestion by the turbine

  2. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, a government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (226 mostly small earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100-square-meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering-related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a housing unit (and consequently its insurance premium) can be assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, in the future the TCIP can contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.
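    The premium and payout arithmetic implied by the quoted figures can be sketched as follows; the coverage cap and rate come from the abstract, while reading the deductible as 2% of the sum insured is an assumption for illustration.

        RATE = 0.0013        # 0.13% average rate for reinforced concrete buildings
        CAP = 90_000.0       # maximum coverage per house, USD

        def annual_premium(sum_insured_usd):
            return RATE * min(sum_insured_usd, CAP)

        def claim_payout(loss_usd, sum_insured_usd):
            covered = min(loss_usd, min(sum_insured_usd, CAP))
            deductible = 0.02 * min(sum_insured_usd, CAP)   # assumed interpretation
            return max(0.0, covered - deductible)

        print(annual_premium(69_000))        # ~USD 90, matching the quoted premium
        print(claim_payout(25_000, 69_000))  # payout after the 2% deductible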

  3. Stress and fluid-pressure changes associated with oil-field operations: A critical assessment of effects in the focal region of the earthquake

    SciTech Connect

    Segall, P.; Yerkes, R.F.

    1990-01-01

    The proximity of the May 2 earthquake to the active oil fields on Anticline Ridge has led to speculation that this earthquake might have been triggered by oil-field operations. Elsewhere, earthquakes have been associated with pore-pressure increases resulting from fluid injection and with subsidence resulting from fluid extraction. Simple calculations show that shale units, which underlie the oil-producing strata, hydraulically isolate the oil field from the earthquake focal region. The large volumes of fluid extracted from the oil fields caused a 50% decline in reservoir pressures from 1938 to 1983. These observations independently rule out substantial increases in pore pressure at focal depths due to fluid injection. The authors use a theoretical method, based on Biot's constitutive theory for fluid-infiltrated elastic media, to evaluate the change in stresses acting in the focal region resulting from fluid extraction in the overlying oil fields. As an independent check on this method, the subsidence of the Earth's surface in response to fluid withdrawal is calculated and compared with measured elevation changes on Anticline Ridge. The producing horizons are taken to be horizontal permeable layers, bounded above and below by impermeable horizons. Strains within the producing layers are related to extraction-induced changes in pore-fluid mass. Contraction of the producing layers causes the free surface to subside and strains the elastic surroundings. The calculated subsidence rate of Anticline Ridge between 1933 and 1972 is 3 mm/yr, in good agreement with the measured subsidence rate of 3.3 ± 0.7 mm/yr.

  4. Response to "Comment on 'Rethinking first-principles electron transport theories with projection operators: the problems caused by partitioning the basis set'" [J. Chem. Phys. 140, 177103 (2014)].

    PubMed

    Reuter, Matthew G; Harrison, Robert J

    2014-05-07

    The thesis of Brandbyge's comment [J. Chem. Phys. 140, 177103 (2014)] is that our operator decoupling condition is immaterial to transport theories, and it appeals to discussions of nonorthogonal basis sets in transport calculations in its arguments. We maintain that the operator condition is to be preferred over the usual matrix conditions and subsequently detail problems in the existing approaches. From this operator perspective, we conclude that nonorthogonal projectors cannot be used and that the projectors must be selected to satisfy the operator decoupling condition. Because these conclusions pertain to operators, the choice of basis set is not germane.

  5. Debriefing of American Red Cross personnel: pilot study on participants' evaluations and case examples from the 1994 Los Angeles earthquake relief operation.

    PubMed

    Armstrong, K; Zatzick, D; Metzler, T; Weiss, D S; Marmar, C R; Garma, S; Ronfeldt, H; Roepke, L

    1998-01-01

    The Multiple Stressor Debriefing (MSD) model was used to debrief 112 American Red Cross workers individually or in groups after their participation in the 1994 Los Angeles earthquake relief effort. Two composite case examples are presented that illustrate individual and group debriefings using the MSD model. A questionnaire evaluating workers' experience of debriefing was completed by 95 workers. Results indicated that workers evaluated the debriefings in which they participated positively. In addition, as the participant-to-facilitator ratio increased, workers shared less of their feelings and reactions about the disaster relief operation. These findings, as well as more specific issues about debriefing, are discussed.

  6. Earthquake Facts

    MedlinePlus

    ... May 22, 1960. The earliest reported earthquake in California was felt in 1769 by the exploring expedition ... by wind or tides. Each year the southern California area has about 10,000 earthquakes . Most of ...

  7. PoroTomo Project - Subatask 6.2: Deploy and Operate DAS and DTS arrays - DAS Earthquake Data

    DOE Data Explorer

    Kurt Feigl

    2016-03-21

    The submitted data correspond to the vibration caused by an earthquake, captured by the DAS horizontal and vertical arrays during the PoroTomo experiment. Earthquake information: M 4.3 - 23 km ESE of Hawthorne, Nevada. Time: 2016-03-21 07:37:10 (UTC). Location: 38.479°N 118.366°W. Depth: 9.9 km. Files for the horizontal DAS array (each file is 30 s long and contains 8700 channels): PoroTomo_iDAS16043_160321073721.sgy, PoroTomo_iDAS16043_160321073751.sgy. Files for the vertical DAS array (each file is 30 s long and contains 380 channels): PoroTomo_iDAS025_160321073717.sgy, PoroTomo_iDAS025_160321073747.sgy
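    A minimal sketch for reading one of the listed SEG-Y files into a channels-by-samples array, assuming the segyio package is available and that the traces carry no inline/crossline geometry:

        import numpy as np
        import segyio

        # Load all DAS channels from one 30-s horizontal-array file.
        with segyio.open("PoroTomo_iDAS16043_160321073721.sgy",
                         ignore_geometry=True) as f:
            data = np.stack([np.asarray(tr) for tr in f.trace])  # (channels, samples)
            dt_us = f.bin[segyio.BinField.Interval]              # sample interval, microseconds

        print(data.shape)   # expected ~(8700, n_samples) for the horizontal array
        print(dt_us)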

  8. Comparative Analysis of Emergency Response Operations: Haiti Earthquake in January 2010 and Pakistan’s Flood in 2010

    DTIC Science & Technology

    2011-09-01

    safer by implementing strict building codes and educating the population on how to respond when the tremor strikes; both of these measures need...separated children, human trafficking, sexual and gender-based violence, and overall violence in the camps continued to be serious issues even seven months..."Lessons from Earthquakes in Haiti and Chile to Reduce Global Risk," Speech given at The National Academies Disasters Roundtable Workshop, 1 March 2011

  9. Hidden Earthquakes.

    ERIC Educational Resources Information Center

    Stein, Ross S.; Yeats, Robert S.

    1989-01-01

    Points out that large earthquakes can take place not only on faults that cut the earth's surface but also on blind faults under folded terrain. Describes four examples of fold earthquakes. Discusses the fold earthquakes using several diagrams and pictures. (YP)

  10. Nowcasting Earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gong, G.

    2016-12-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system, and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), which is defined as the cumulative probability distribution P(n < n(t)) for the current count n(t) of small earthquakes in the region. From the count of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)). EPS is therefore the current level of hazard, and assigns a number between 0% and 100% to every region so defined; physically, it corresponds to an estimate of the level of progress through the earthquake cycle in the defined region at the current time.

  11. Nowcasting earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Turcotte, D. L.; Donnellan, A.; Grant Ludwig, L.; Luginbuhl, M.; Gong, G.

    2016-11-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS) which is defined as the cumulative probability distribution P(n < n(t)) for the current count n(t) for the small earthquakes in the region. From the count of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)). EPS is therefore the current level of hazard and assigns a number between 0% and 100% to every region so defined, thus providing a unique measure. Physically, the EPS corresponds to an estimate of the level of progress through the earthquake cycle in the defined region at the current time.
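
    The EPS computation described in this record reduces to an empirical cumulative distribution of small-earthquake counts per large-earthquake cycle, evaluated at the count since the last large event. A minimal sketch, assuming a chronologically ordered magnitude catalog; a synthetic Gutenberg-Richter catalog stands in for real data.

```python
# Earthquake potential score (EPS) sketch: EPS = P(n < n(t)), the empirical
# CDF of small-event counts per large-earthquake cycle, at the current count.
import numpy as np

def eps(magnitudes, m_large):
    counts, n = [], 0
    for m in magnitudes:            # catalog in chronological order
        if m >= m_large:
            counts.append(n)        # a completed large-earthquake cycle
            n = 0
        else:
            n += 1                  # another small event in the open cycle
    counts = np.asarray(counts)
    return np.mean(counts < n), n, len(counts)

# Toy catalog with Gutenberg-Richter-like magnitudes (b ~ 1), not real data.
rng = np.random.default_rng(0)
catalog = 3.0 + rng.exponential(scale=1.0 / np.log(10), size=20000)
score, n_now, cycles = eps(catalog, m_large=6.0)
print(f"EPS = {score:.0%} (current count {n_now}, {cycles} cycles)")
```

    A caveat in the spirit of the abstract: the score is only meaningful when the region spans enough cycles (the authors suggest at least 20), which the returned `cycles` count lets you check.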

  12. Methods of Multivariable Earthquake Precursor Analysis and a Proposed Prototype Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Rojas, J. I.; Fletcher, L. E.

    2007-12-01

    Significant advances are being made in earthquake prediction theory; however, a reliable method for forecasting the occurrence of earthquakes from space- and/or ground-based technologies remains limited to no more than a few minutes before the event happens. Several claims of earthquake precursors have been put forward, such as ionospheric changes, electromagnetic effects, and ground heating, though the science behind these is far from complete and the successful application of these precursors is highly regionally variable. Existing and planned dedicated space missions for monitoring earthquake precursors are insufficient for resolving the precursor issue. Their performance does not satisfy the requirements of an earthquake early warning system in terms of spatial and temporal coverage (Pulinets and Boyarchuk, 2004). To achieve statistically significant validation of precursors for early warning delivery, precursor data must be obtained from simultaneous repeated monitoring of several precursors in focus regions over a long period of time and then integrated and processed. Data sources include historical data, data from ground-based units, airborne systems, and space-based systems. This paper describes methods for the systematic evaluation of regionally specific, multivariable precursor data needed to identify the expected time, magnitude, and position of the epicentre. This data set forms the basis for a proposed operational early warning system developed at the International Space University and built in partnership with local and national governments as well as international organizations.

  13. Experience from the Great East Japan Earthquake response as the basis for revising the Japanese Disaster Medical Assistance Team (DMAT) training program.

    PubMed

    Anan, Hideaki; Akasaka, Osamu; Kondo, Hisayoshi; Nakayama, Shinichi; Morino, Kazuma; Homma, Masato; Koido, Yuichi; Otomo, Yasuhiro

    2014-12-01

    The objective of this study was to draft a new Japanese Disaster Medical Assistance Team (DMAT) training program based on the responses to the Great East Japan Earthquake. Working group members of the Japan DMAT Investigative Commission, Ministry of Health, Labour and Welfare, reviewed reports and academic papers on DMAT activities after the disaster and identified items in the current Japanese DMAT training program that should be changed. A new program was proposed that incorporates these changes. New topics that were identified to be added to the DMAT training program were hospital evacuation, preparations to receive DMATs at damaged hospitals, coordination when DMAT activities are prolonged, and safety management and communication when on board small helicopters. The use of wide-area transport was reviewed and changes were made to cover selection of various transport means including helicopter ambulances. Content related to confined space medicine was removed. The time spent on emergency medical information system (EMIS) practical training was increased. Redundant or similar content was combined and reorganized, and a revised DMAT training program that did not increase the overall training time was designed. The revised DMAT training program will provide practical training better suited to the present circumstances in Japan.

  14. The Parkfield, California, earthquake prediction experiment.

    PubMed

    Bakun, W H; Lindh, A G

    1985-08-16

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment.

  15. Technical Basis for Safe Operations with Pu-239 in NMS and S Facilities (F and H Areas)

    SciTech Connect

    Bronikowski, M.G.

    1999-03-18

    Plutonium-239 is now being processed in HB-Line and H-Canyon as well as FB-Line and F-Canyon. As part of the effort to upgrade the Authorization Basis for H Area facilities relative to nuclear criticality, a literature review of Pu polymer characteristics was conducted to establish a more quantitative, rather than qualitative, technical basis for safe operations. The results are also applicable to processing in F Area facilities. The chemistry of Pu polymer formation, precipitation, and depolymerization is complex. Establishing limits on acid concentrations of solutions or changing the valence to Pu(III) or Pu(VI) can prevent plutonium polymer formation in tanks in the B lines and canyons. For Pu(IV) solutions of 7 g/L or less, 0.22 M HNO3 prevents polymer formation at ambient temperature. This concentration should remain the minimum acid limit for the canyons and B lines when processing Pu-239 solutions. If the minimum acid concentration is compromised, the solution may need to be sampled and tested for the presence of polymer. If polymer is not detected, processing may proceed. If polymer is detected, adding HNO3 to a final concentration above 4 M is the safest method for handling the solution. The solution could also be heated to speed up the depolymerization process. Heating with > 4 M HNO3 will depolymerize the solution for further processing. Adsorption of Pu(IV) polymer onto the steel walls of canyon and B line tanks is likely to be 11 mg/cm2, a literature value for unpolished steel. This value will be confirmed by experimental work. Tank-to-tank transfers via steam jets are not expected to produce Pu(IV) polymer unless a larger-than-normal dilution occurs (e.g., >3 percent) at acidities below 0.4 M.

  16. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. On the basis of the estimated cost of damage, one is most suitable for domestic events; the other, on the basis of estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lend to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
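
    The fatality and loss thresholds quoted above translate directly into a small decision rule. A minimal sketch; the function name and the choice to report the more severe of the two alert levels are illustrative assumptions, not part of the published scale.

```python
# EIS-style alerting from the thresholds in the abstract:
# fatalities 1 / 100 / 1,000 and losses $1M / $100M / $1B
# for yellow / orange / red.
def eis_alert(est_fatalities, est_loss_usd):
    levels = ["green", "yellow", "orange", "red"]

    def level(value, thresholds):
        return sum(value >= t for t in thresholds)   # 0 (green) .. 3 (red)

    fatality_level = level(est_fatalities, (1, 100, 1_000))
    loss_level = level(est_loss_usd, (1e6, 1e8, 1e9))
    return levels[max(fatality_level, loss_level)]

print(eis_alert(est_fatalities=35, est_loss_usd=5e8))  # -> "orange"
```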

  17. Earthquakes: A Teacher's Package for K-6.

    ERIC Educational Resources Information Center

    National Science Teachers Association, Washington, DC.

    Like rain, an earthquake is a natural occurrence which may be mild or catastrophic. Although an earthquake may last only a few seconds, the processes that cause it have operated within the earth for millions of years. Until recently, the cause of earthquakes was a mystery and the subject of fanciful folklore to people all around the world. This…

  18. [Efficient OP management. Suggestions for optimisation of organisation and administration as a basis for establishing statutes for operating theatres].

    PubMed

    Geldner, G; Eberhart, L H J; Trunk, S; Dahmen, K G; Reissmann, T; Weiler, T; Bach, A

    2002-09-01

    Economic aspects have gained increasing importance in recent years. The operating room (OR) is the most cost-intensive sector and determines the turnover process of a surgical patient within the hospital. Thus, optimisation of workflow processes is of particular interest for health care providers. If the results of surgery are viewed as a product, everything associated with surgery can be evaluated analogously to a manufacturing process. All steps involved in producing the end result can and should be analysed with the goal of producing an efficient, economical, and high-quality product. The leadership that physicians can provide to manage this process is important and leads to the introduction of a specialised "OR manager". This position must have the authority to issue directives to all other members of the OR team. The OR management reports directly to the administration of the hospital. By integrating and improving management of various elements of the surgical process, health care institutions are able to rationally trim costs while maintaining high-quality services. This paper gives a short introduction to the difficulties of organising an OR. Some suggestions are made to overcome common shortcomings in daily practice. A proposal for an "OR statute" is presented that should serve as a basis for discussion within the OR team; it must be modified according to the individual needs and prerequisites of every hospital. The single best opportunity for dramatic improvement in effective resource use in surgical services lies in the perioperative process. The management strategy must focus on process measurement using information technology and feedback, implementing modern quality management tools. However, no short-term effects can be expected from these changes: improvements take about a year, and continuous feedback on all measures must accompany the reorganisation process.

  19. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  1. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  2. Undead earthquakes

    NASA Astrophysics Data System (ADS)

    Musson, R. M. W.

    This short communication deals with the problem of fake earthquakes that keep returning to circulation. The particular events discussed are some very early earthquakes supposed to have occurred in the U.K., which all originate from a single enigmatic 18th-century source.

  3. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  4. The 1988 earthquake in Soviet Armenia: implications for earthquake preparedness.

    PubMed

    Noji, E K

    1989-09-01

    An earthquake registering 6.9 on the Richter scale hit the northern part of the Armenian Republic of the Soviet Union on 7 December 1988, resulting in thousands of deaths and injuries. The majority of these resulted from the collapse of inadequately designed and constructed buildings. Analysis of the effects of the Armenian earthquake on the population, as well as of the rescue and medical response, has strong implications for earthquake preparedness and response in other seismically vulnerable parts of the world. Specifically, this paper will recommend a number of important endeavours deemed necessary to improve medical planning, preparedness and response to earthquakes. Strengthening the self-reliance of the community in disaster preparedness is suggested as the best way to improve the effectiveness of relief operations. In earthquake-prone areas, training and education in basic first aid and methods of rescue should be an integral part of any community preparedness programme.

  5. Proposed plan/Statement of basis for the Grace Road Site (631-22G) operable unit: Final action

    SciTech Connect

    Palmer, E.

    1997-08-19

    This Statement of Basis/Proposed Plan is being issued by the U.S. Department of Energy (DOE), which functions as the lead agency for the Savannah River Site (SRS) remedial activities, with concurrence by the U.S. Environmental Protection Agency (EPA) and the South Carolina Department of Health and Environmental Control (SCDHEC). The purpose of this Statement of Basis/Proposed Plan is to describe the preferred alternative for addressing the Grace Road site (GRS), located at SRS in Aiken, South Carolina, and to provide an opportunity for public input into the remedial action selection process.

  6. Earthquake technology fights crime

    USGS Publications Warehouse

    Lahr, John C.; Ward, Peter L.; Stauffer, Peter H.; Hendley, James W.

    1996-01-01

    Scientists with the U.S. Geological Survey have adapted their methods for quickly finding the exact source of an earthquake to the problem of locating gunshots. On the basis of this work, a private company is now testing an automated gunshot-locating system in a San Francisco Bay area community. This system allows police to rapidly pinpoint and respond to illegal gunfire, helping to reduce crime in our neighborhoods.
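
    The shared idea behind earthquake and gunshot location, finding the source whose predicted arrival times best fit the observations, can be sketched in a few lines. Everything below (sensor layout, speed of sound, grid search) is a toy illustration, not the deployed system.

```python
# Toy 2-D source location from arrival times at four sensors.
import numpy as np

sensors = np.array([[0, 0], [800, 0], [0, 600], [800, 600]], float)  # meters
v = 343.0                                   # speed of sound in air, m/s
true_src = np.array([250.0, 420.0])
t_obs = np.linalg.norm(sensors - true_src, axis=1) / v

best, best_err = None, np.inf
for x in np.arange(0.0, 801.0, 5.0):
    for y in np.arange(0.0, 601.0, 5.0):
        t_pred = np.linalg.norm(sensors - (x, y), axis=1) / v
        # Demean both sides so the unknown origin time cancels.
        err = np.sum(((t_pred - t_pred.mean()) - (t_obs - t_obs.mean())) ** 2)
        if err < best_err:
            best, best_err = (x, y), err

print("estimated source:", best)            # ~ (250.0, 420.0)
```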

  7. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  8. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  9. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable year which... made upon the final determination of the rate of absorption applicable to the taxable year....

  10. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  11. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  12. Sensing the earthquake

    NASA Astrophysics Data System (ADS)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  13. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  14. Deep Earthquakes.

    ERIC Educational Resources Information Center

    Frohlich, Cliff

    1989-01-01

    Summarizes research to find the nature of deep earthquakes occurring hundreds of kilometers down in the earth's mantle. Describes further research problems in this area. Presents several illustrations and four references. (YP)

  15. Rapid estimation of the economic consequences of global earthquakes

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    to reduce this time gap to more rapidly and effectively mobilize response. We present here a procedure to rapidly and approximately ascertain the economic impact immediately following a large earthquake anywhere in the world. In principle, the approach presented is similar to the empirical fatality estimation methodology proposed and implemented by Jaiswal and others (2009). In order to estimate economic losses, we need an assessment of the economic exposure at various levels of shaking intensity. The economic value of all the physical assets exposed at different locations in a given area is generally not known and extremely difficult to compile at a global scale. In the absence of such a dataset, we first estimate the total Gross Domestic Product (GDP) exposed at each shaking intensity by multiplying the per-capita GDP of the country by the total population exposed at that shaking intensity level. We then scale the total GDP estimated at each intensity by an exposure correction factor, which is a multiplying factor to account for the disparity between wealth and/or economic assets to the annual GDP. The economic exposure obtained using this procedure is thus a proxy estimate for the economic value of the actual inventory that is exposed to the earthquake. The economic loss ratio, defined in terms of a country-specific lognormal cumulative distribution function of shaking intensity, is derived and calibrated against the losses from past earthquakes. This report describes the development of a country or region-specific economic loss ratio model using economic loss data available for global earthquakes from 1980 to 2007. The proposed model is a potential candidate for directly estimating economic losses within the currently-operating PAGER system. PAGER's other loss models use indirect methods that require substantially more data (such as building/asset inventories, vulnerabilities, and the asset values exposed at the time of earthquake) to implement on a global basis
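
    The chain described above, population exposed per intensity bin times per-capita GDP, scaled by an exposure correction factor and multiplied by a lognormal loss ratio in shaking intensity, can be written as a short worked example. All numbers below are illustrative placeholders, not calibrated values from the report.

```python
# Sketch of a PAGER-style rapid economic loss estimate.
import numpy as np
from scipy.stats import lognorm

intensities = np.array([6.0, 7.0, 8.0, 9.0])     # shaking-intensity bins
pop_exposed = np.array([2e6, 8e5, 2e5, 3e4])     # people per bin (made up)
gdp_per_capita = 4_000.0                         # USD (made up)
exposure_factor = 3.0                            # wealth vs. annual GDP (made up)

# Country-specific lognormal loss ratio in intensity (illustrative parameters).
beta, median_intensity = 0.3, 9.5
loss_ratio = lognorm.cdf(intensities, s=beta, scale=median_intensity)

exposure = pop_exposed * gdp_per_capita * exposure_factor
total_loss = float(np.sum(exposure * loss_ratio))
print(f"estimated loss: ${total_loss / 1e9:.2f} billion")
```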

  16. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response held in San Francisco, called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of state and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, and representatives of the news media.

  17. Implementation Guidance on Annual Compliance Certification Reporting and Statement of Basis Requirements for Title V Operating Permits

    EPA Pesticide Factsheets

    This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  18. Earthquake Scaling, Simulation and Forecasting

    NASA Astrophysics Data System (ADS)

    Sachs, Michael Karl

    Earthquakes are among the most devastating natural events faced by society. In 2011, just two events, the magnitude 6.3 earthquake in Christchurch, New Zealand, on February 22 and the magnitude 9.0 Tohoku earthquake off the coast of Japan on March 11, caused a combined total of $226 billion in economic losses. Over the last decade, 791,721 deaths were caused by earthquakes. Yet, despite their impact, our ability to accurately predict when earthquakes will occur is limited. This is due, in large part, to the fact that the fault systems that produce earthquakes are non-linear, so that very small differences in the systems now result in very big differences in the future, making forecasting difficult. In spite of this, there are patterns that exist in earthquake data. These patterns are often in the form of frequency-magnitude scaling relations that relate the number of smaller events observed to the number of larger events observed. In many cases these scaling relations show consistent behavior over a wide range of scales. This consistency forms the basis of most forecasting techniques. However, the utility of these scaling relations is limited by the size of the earthquake catalogs, which, especially in the case of large events, are fairly small and limited to a few hundred years of events. In this dissertation I discuss three areas of earthquake science. The first is an overview of scaling behavior in a variety of complex systems, both models and natural systems. The focus of this area is to understand how this scaling behavior breaks down. The second is a description of the development and testing of an earthquake simulator called Virtual California, designed to extend the observed catalog of earthquakes in California. This simulator uses novel techniques borrowed from statistical physics to enable the modeling of large fault systems over long periods of time. The third is an evaluation of existing earthquake forecasts, which focuses on the Regional

  1. Two grave issues concerning the expected Tokai Earthquake

    NASA Astrophysics Data System (ADS)

    Mogi, K.

    2004-08-01

    The possibility of a great shallow earthquake (M 8) in the Tokai region, central Honshu, in the near future was pointed out by Mogi in 1969 and by the Coordinating Committee for Earthquake Prediction (CCEP), Japan (1970). In 1978, the government enacted the Large-Scale Earthquake Countermeasures Law and began to set up intensified observations in this region for short-term prediction of the expected Tokai earthquake. In this paper, two serious issues are pointed out, which may contribute to catastrophic effects in connection with the Tokai earthquake: 1. The danger of black-and-white predictions: According to the scenario based on the Large-Scale Earthquake Countermeasures Law, if abnormal crustal changes are observed, the Earthquake Assessment Committee (EAC) will determine whether or not there is an imminent danger. The findings are reported to the Prime Minister who decides whether to issue an official warning statement. Administrative policy clearly stipulates the measures to be taken in response to such a warning, and because the law presupposes the ability to predict a large earthquake accurately, there are drastic measures appropriate to the situation. The Tokai region is a densely populated region with high social and economic activity, and it is traversed by several vital transportation arteries. When a warning statement is issued, all transportation is to be halted. The Tokyo capital region would be cut off from the Nagoya and Osaka regions, and there would be a great impact on all of Japan. I (the former chairman of EAC) maintained that in view of the variety and complexity of precursory phenomena, it was inadvisable to attempt a black-and-white judgment as the basis for a "warning statement". I urged that the government adopt a "soft warning" system that acknowledges the uncertainty factor and that countermeasures be designed with that uncertainty in mind. 2. The danger of nuclear power plants in the focal region: Although the possibility of the

  2. Earthquakes for Kids

    MedlinePlus

    Page links and media: Earthquake Animations, Earthquake Photos, Earthquake ABC, Latest Earthquakes.

  3. Selection of operating parameters on the basis of hydrodynamics in centrifugal partition chromatography for the purification of nybomycin derivatives.

    PubMed

    Adelmann, S; Baldhoff, T; Koepcke, B; Schembecker, G

    2013-01-25

    The selection of solvent systems in centrifugal partition chromatography (CPC) is the most critical point in setting up a separation, and much research has therefore been devoted to this topic in recent decades. The selection of suitable operating parameters (mobile phase flow rate, rotational speed, and mode of operation) with respect to hydrodynamics and the pressure drop limit in CPC, however, is still mainly driven by the experience of the chromatographer. In this work we used hydrodynamic analysis to predict the most suitable operating parameters. After selecting different solvent systems with respect to partition coefficients for the target compound, the hydrodynamics were visualized. Based on flow pattern and retention, the operating parameters were selected for the purification runs of nybomycin derivatives, which were carried out with a 200 ml FCPC® rotor. The results have proven that selecting optimized operating parameters by analysis of hydrodynamics alone is possible. As the hydrodynamics are predictable from the physical properties of the solvent system, the optimized operating parameters can be estimated, too. Additionally, we found that dispersion and especially retention are improved if the less viscous phase is mobile.

  4. A computational method for solving stochastic Itô–Volterra integral equations based on stochastic operational matrix for generalized hat basis functions

    SciTech Connect

    Heydari, M.H.; Hooshmandasl, M.R.; Maalek Ghaini, F.M.; Cattani, C.

    2014-08-01

    In this paper, a new computational method based on the generalized hat basis functions is proposed for solving stochastic Itô–Volterra integral equations. In this way, a new stochastic operational matrix for generalized hat functions on the finite interval [0, T] is obtained. By using these basis functions and their stochastic operational matrix, such problems can be transformed into linear lower triangular systems of algebraic equations which can be directly solved by forward substitution. Also, the rate of convergence of the proposed method is considered, and it has been shown to be O(1/n^2). Further, in order to show the accuracy and reliability of the proposed method, the new approach is compared with the block pulse functions method on some examples. The obtained results reveal that the proposed method is more accurate and efficient in comparison with the block pulse functions method.
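
    Two of the ingredients named in this abstract, hat basis functions on [0, T] and forward substitution for the lower triangular systems the method produces, are easy to sketch in isolation. This is not the full stochastic operational-matrix construction, only its building blocks, with made-up data.

```python
# Generalized hat basis functions and forward substitution (sketch).
import numpy as np

def hat_basis(i, t, nodes):
    """Piecewise-linear hat centered at nodes[i]: 1 there, 0 at neighbors."""
    t = np.asarray(t, dtype=float)
    y = np.zeros_like(t)
    c = nodes[i]
    if i > 0:                                  # ascending branch
        left = nodes[i - 1]
        m = (t >= left) & (t <= c)
        y[m] = (t[m] - left) / (c - left)
    if i < len(nodes) - 1:                     # descending branch
        right = nodes[i + 1]
        m = (t >= c) & (t <= right)
        y[m] = (right - t[m]) / (right - c)
    return y

def forward_substitution(L, b):
    """Solve L x = b for lower-triangular L, as the method requires."""
    x = np.zeros_like(b, dtype=float)
    for k in range(len(b)):
        x[k] = (b[k] - L[k, :k] @ x[:k]) / L[k, k]
    return x

nodes = np.linspace(0.0, 1.0, 6)               # T = 1, six hat functions
print(hat_basis(2, nodes, nodes))              # cardinal at the nodes: e_2

L = np.tril(np.random.default_rng(1).random((6, 6))) + np.eye(6)
b = np.ones(6)
print(np.allclose(L @ forward_substitution(L, b), b))   # True
```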

  5. Generation Of Manufacturing Routing And Operations Using Structured Knowledge As Basis To Application Of Computer Aided In Process Planning

    NASA Astrophysics Data System (ADS)

    Oswaldo, Luiz Agostinho

    2011-01-01

    The development of computer-aided resources for automating the generation of manufacturing routings and operations has mainly been accomplished by searching for similarities among existing ones, resulting in standard process routings grouped by analysis of similarities between parts or routings. This article proposes developing manufacturing routings and detailed operations using a methodology whose steps define the initial, intermediate, and final operations, starting from the rough piece and proceeding to the final specifications, which must have a biunivocal (one-to-one) relationship with the part design specifications. Each step uses so-called rules of precedence to link and chain the routing operations. The rules of precedence order and prioritize knowledge of the various manufacturing processes, taking into account the theories of machining, forging, assembly, and heat treatment; they also draw on the theories of tolerance accumulation and process capability, among others. The availability of manufacturing databases covering process tolerances; deviations of the machine tool, cutting tool, fixturing devices, and workpiece; and process capabilities is also reinforced. Stating and applying rules of precedence, which link and join manufacturing concepts in a logical and structured way, and applying them in the methodology steps will make it viable to use structured knowledge, instead of the tacit knowledge currently available in manufacturing engineering departments, in generating manufacturing routings and operations. Consequently, the development of computer-aided process planning will be facilitated, because structured knowledge is applied with this methodology.

  6. Using an internal coordinate Gaussian basis and a space-fixed Cartesian coordinate kinetic energy operator to compute a vibrational spectrum with rectangular collocation

    NASA Astrophysics Data System (ADS)

    Manzhos, Sergei; Carrington, Tucker

    2016-12-01

    We demonstrate that it is possible to use basis functions that depend on curvilinear internal coordinates to compute vibrational energy levels without deriving a kinetic energy operator (KEO) and without numerically computing coefficients of a KEO. This is done by using a space-fixed KEO and computing KEO matrix elements numerically. Whenever one has an excellent basis, more accurate solutions to the Schrödinger equation can be obtained by computing the KEO, potential, and overlap matrix elements numerically. Using a Gaussian basis and bond coordinates, we compute vibrational energy levels of formaldehyde. We show, for the first time, that it is possible with a Gaussian basis to solve a six-dimensional vibrational Schrödinger equation. For the zero-point energy (ZPE) and the lowest 50 vibrational transitions of H2CO, we obtain a mean absolute error of less than 1 cm-1; with 200 000 collocation points and 40 000 basis functions, most errors are less than 0.4 cm-1.

  7. Using an internal coordinate Gaussian basis and a space-fixed Cartesian coordinate kinetic energy operator to compute a vibrational spectrum with rectangular collocation.

    PubMed

    Manzhos, Sergei; Carrington, Tucker

    2016-12-14

    We demonstrate that it is possible to use basis functions that depend on curvilinear internal coordinates to compute vibrational energy levels without deriving a kinetic energy operator (KEO) and without numerically computing coefficients of a KEO. This is done by using a space-fixed KEO and computing KEO matrix elements numerically. Whenever one has an excellent basis, more accurate solutions to the Schrödinger equation can be obtained by computing the KEO, potential, and overlap matrix elements numerically. Using a Gaussian basis and bond coordinates, we compute vibrational energy levels of formaldehyde. We show, for the first time, that it is possible with a Gaussian basis to solve a six-dimensional vibrational Schrödinger equation. For the zero-point energy (ZPE) and the lowest 50 vibrational transitions of H2CO, we obtain a mean absolute error of less than 1 cm-1; with 200 000 collocation points and 40 000 basis functions, most errors are less than 0.4 cm-1.
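
    A toy one-dimensional analogue conveys the rectangular-collocation idea in the two records above: evaluate the Hamiltonian acting on each basis Gaussian at more collocation points than there are basis functions, then solve the overdetermined eigenproblem in a least-squares sense via a pseudoinverse. The harmonic-oscillator potential and all sizes and widths below are illustrative choices, not the paper's formaldehyde setup.

```python
# 1-D rectangular collocation with a distributed Gaussian basis.
# Exact harmonic-oscillator levels are n + 1/2 (hbar = m = omega = 1).
import numpy as np

n_basis, n_pts = 20, 100                  # rectangular: n_pts > n_basis
centers = np.linspace(-5.0, 5.0, n_basis)
alpha = 1.0                               # Gaussian width parameter
x = np.linspace(-6.0, 6.0, n_pts)[:, None]

B = np.exp(-alpha * (x - centers) ** 2)   # basis values at the points
# Analytic second derivative of each Gaussian (no tailored KEO needed).
d2B = (4.0 * alpha**2 * (x - centers) ** 2 - 2.0 * alpha) * B
H = -0.5 * d2B + 0.5 * x**2 * B           # (T + V) acting on the basis

# Least-squares solution of the overdetermined H c = E B c.
evals = np.linalg.eigvals(np.linalg.pinv(B) @ H)
print(np.sort(evals.real)[:5])            # expected near 0.5, 1.5, ..., 4.5
```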

  8. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways in...

  9. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways in...

  10. Deep earthquakes

    SciTech Connect

    Frohlich, C.

    1989-01-01

    Earthquakes are often recorded at depths as great as 650 kilometers or more. These deep events mark regions where plates of the earth's surface are consumed in the mantle. But the earthquakes themselves present a conundrum: the high pressures and temperatures at such depths should keep rock from fracturing suddenly and generating a tremor. This paper reviews the research on this problem. Almost all deep earthquakes conform to the pattern described by Wadati, namely, they generally occur at the edge of a deep ocean and define an inclined zone extending from near the surface to a depth of 600 kilometers or more, known as the Wadati-Benioff zone. Several scenarios are described that were proposed to explain the fracturing and slipping of rocks at this depth.

  11. Postseismic Transient after the 2002 Denali Fault Earthquake from VLBI Measurements at Fairbanks

    NASA Technical Reports Server (NTRS)

    MacMillan, Daniel; Cohen, Steven

    2004-01-01

    The VLBI antenna (GILCREEK) at Fairbanks, Alaska observes routinely twice a week in operational networks and on additional days with other networks on a more irregular basis. The Fairbanks antenna position is about 150 km north of the Denali fault and of the earthquake epicenter. We examine the transient behavior of the estimated VLBI position during the year following the earthquake to determine how the rate of postseismic deformation has changed. This is compared with what is seen in the GPS site position series.

  12. Structural basis of operator sites recognition and effector binding in the TetR family transcription regulator FadR.

    PubMed

    Yeo, Hyun Ku; Park, Young Woo; Lee, Jae Young

    2017-04-20

    FadR is a fatty acyl-CoA-dependent transcription factor that regulates genes encoding proteins involved in fatty-acid degradation and synthesis pathways. In this study, the crystal structures of Bacillus halodurans FadR, which belongs to the TetR family, have been determined in three different forms: ligand-bound, ligand-free, and DNA-bound, at resolutions of 1.75, 2.05, and 2.80 Å, respectively. Structural and functional data showed that B. halodurans FadR binds its operator site in the absence of fatty acyl-CoAs. Structural comparisons among the three different forms of B. halodurans FadR revealed that the movement of the DNA-binding domains toward the operator DNA is blocked upon binding of ligand molecules. These findings suggest that the TetR family FadR negatively regulates the genes involved in fatty acid metabolism by binding cooperatively to the operator DNA as a dimer of dimers. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. The Effects of Degraded Digital Instrumentation and Control Systems on Human-system Interfaces and Operator Performance: HFE Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.; W. Gunther, G. Martinez-Guridi

    2010-02-26

    New and advanced reactors will use integrated digital instrumentation and control (I&C) systems to support operators in their monitoring and control functions. Even though digital systems are typically highly reliable, their potential for degradation or failure could significantly affect operator performance and, consequently, impact plant safety. The U.S. Nuclear Regulatory Commission (NRC) supported this research project to investigate the effects of degraded I&C systems on human performance and plant operations. The objective was to develop human factors engineering (HFE) review guidance addressing the detection and management of degraded digital I&C conditions by plant operators. We reviewed pertinent standards and guidelines, empirical studies, and plant operating experience. In addition, we conducted an evaluation of the potential effects of selected failure modes of the digital feedwater system on human-system interfaces (HSIs) and operator performance. The results indicated that I&C degradations are prevalent in plants employing digital systems and the overall effects on plant behavior can be significant, such as causing a reactor trip or causing equipment to operate unexpectedly. I&C degradations can impact the HSIs used by operators to monitor and control the plant. For example, sensor degradations can make displays difficult to interpret and can sometimes mislead operators by making it appear that a process disturbance has occurred. We used the information obtained as the technical basis upon which to develop HFE review guidance. The guidance addresses the treatment of degraded I&C conditions as part of the design process and the HSI features and functions that support operators to monitor I&C performance and manage I&C degradations when they occur. In addition, we identified topics for future research.

  14. Influences of operating conditions on continuous lactulose synthesis in an enzymatic membrane reactor system: A basis prior to long-term operation.

    PubMed

    Sitanggang, Azis Boing; Drews, Anja; Kraume, Matthias

    2015-06-10

    Lactulose synthesis was performed in a continuous stirred enzymatic membrane reactor. Each investigated operating condition (agitation, pH, feed molar ratio of lactose to fructose (mL/mF ratio), hydraulic residence time (HRT)) had an influence on reaction performance in terms of lactulose concentration, productivity, and selectivity. Lactulose concentration was maximal at an mL/mF ratio of 1/2. Above this ratio, synthesis of galactooligosaccharides was promoted rather than lactulose; at mL/mF ratios below 1/2, enzyme inhibition was pronounced, to the detriment of lactulose production. At 7 or 9 h HRT, higher lactulose concentrations were obtained than at shorter HRTs. Applying an mL/mF ratio of 1/2 and an HRT of 9 h in a long-term operation, a nearly constant lactulose concentration was reached after 23 h and lasted up to 32 h, with a mean concentration of 14.51 ± 0.07 g/L and a reaction selectivity of 0.075-0.080 mol lactulose/mol consumed lactose. After 7 d, the lactulose concentration had decreased by 31%. A continuous lab-scale synthesis of lactulose was shown to be amenable using a membrane reactor process. Moreover, for process evaluation, this study can bridge the gap between batch laboratory scale and continuous full-scale operation regarding lactulose synthesis. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Earthquake engineering research: 1982

    NASA Astrophysics Data System (ADS)

    The Committee on Earthquake Engineering Research addressed two questions: what progress has research produced in earthquake engineering, and which elements of the problem should future earthquake engineering research pursue? It examined and reported on these topics in separate chapters of the report: Applications of Past Research, Assessment of Earthquake Hazard, Earthquake Ground Motion, Soil Mechanics and Earth Structures, Analytical and Experimental Structural Dynamics, Earthquake Design of Structures, Seismic Interaction of Structures and Fluids, Social and Economic Aspects, Earthquake Engineering Education, and Research in Japan.

  16. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  17. Earthquake tectonics

    SciTech Connect

    Steward, R.F. )

    1991-02-01

    Earthquakes release a tremendous amount of energy into the subsurface in the form of seismic waves. The seismic wave energy of the San Francisco 1906 (M = 8.2) earthquake was equivalent to over 8 billion tons of TNT (3.3 × 10^19 joules). Four basic wave types are propagated from seismic sources, two non-rotational and two rotational. As opposed to the non-rotational P and SH waves, the rotational compressional (RC) and rotational shear (RS) waves carry the bulk of the energy from a seismic source. RC wavefronts propagate in the subsurface and refract similarly to P waves, but are considerably slower. RC waves are critically refracted beneath the air-surface interface at velocities less than the velocity of sound in air because they refract at the velocity of sound in air minus the retrograde particle velocity at the top of the wave. They propagate as tsunami waves in the open ocean, and produce loud sounds on land that are heard by humans and animals during earthquakes. The energy of the RS wave dwarfs that of the P, SH, and even the RC wave. The RS wave is the same as what is currently called the S wave in earthquake seismology, and produces both folding and strike-slip faulting at considerable distances from the epicenter. RC and RS waves, propagated during earthquakes from the Santa Ynez fault and a right-slip fault on trend with the Red Mountain fault, produced the Santa Ynez Mountains in California beginning in the middle Pliocene and continuing until the present.

  18. Risk assessment of people trapped in earthquake based on km grid: a case study of the 2014 Ludian earthquake

    NASA Astrophysics Data System (ADS)

    Wei, Ben-Yong; Nie, Gao-Zhong; Su, Gui-Wu; Sun, Lei

    2017-04-01

    China is one of the most earthquake-prone countries in the world. The priority during earthquake emergency response is saving lives and minimizing casualties. Rapid judgment of where people are trapped is an important basis for the government to reasonably arrange emergency rescue forces and resources after an earthquake. By analyzing the key factors that result in people being trapped, we constructed an assessment model of people trapped (PTED) in buildings collapsed by earthquake disaster. Then, taking the 2014 Ludian earthquake as a case, this study evaluated the distribution of trapped people during this earthquake using the assessment model based on km-grid data. Results showed that there are two prerequisites for people to be trapped by the collapse of buildings in an earthquake: the earthquake causes buildings to collapse, and there are people in the buildings when they collapse; the PTED model is suitable for assessing the number of people trapped in buildings collapsed by an earthquake. The distribution of people trapped by the collapse of buildings in the Ludian earthquake assessed by the model is basically the same as that obtained by the actual survey. Assessment of people trapped in earthquakes based on a km grid can meet the requirements of search-and-rescue zone identification and rescue-force allocation in the early stage of an earthquake emergency. In the future, as the basic data become more complete, assessment of people trapped in earthquakes based on a km grid should provide more accurate and valid suggestions for earthquake emergency search and rescue.

  19. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  20. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded by deploying a total of 140 portable seismic stations to record aftershocks. Combined with the Chilean permanent seismic network in the area, this results in 180 stations now in operation, recording continuously at 100 cps. The seismic equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone that will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit their data as soon as possible in standard (miniSEED) format, with accompanying metadata, to the IRIS DMC, where the data will be merged into a combined data set and made available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake as well as a model for future aftershock deployments around the world.

  1. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at the following three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  2. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy of countermeasures against earthquake disasters, including earthquake hazard evaluations. Before March 11, Japanese earthquake hazard evaluation was based on the history of earthquakes that repeatedly occur and on the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed; the conditional probability was then estimated using a renewal model. After the 2011 megathrust earthquake, however, the Japanese authorities changed the policy so that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during the following two years. First, the Central Disaster Management Council (CDMC) issued, during 2011 and 2012, a new estimate of the damage from a hypothetical Mw 9 earthquake along the Nankai trough. The model predicts, as a maximum, a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of southwest Japan. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation for the Nankai trough in May 2013; it discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with current knowledge and techniques, it is hard to predict the occurrence of large earthquakes along the Nankai trough, given the diversity of earthquake phenomena. These reports created sensations throughout the country, and local governments are struggling to prepare countermeasures. Near their ends, the reports commented on the large uncertainty in their evaluations, but are these messages transmitted properly to the public? Earthquake scientists, including authors, are involved in
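
    As a rough illustration of the renewal-model calculation mentioned above, the following Python sketch computes the conditional probability of rupture in a coming time window given the time elapsed since the last event; the lognormal recurrence distribution and all numerical values are illustrative assumptions, not figures from the Japanese reports.

        from scipy import stats

        def conditional_probability(dist, elapsed, window):
            """P(event in (elapsed, elapsed + window] | no event by `elapsed`)."""
            survivor = dist.sf(elapsed)          # P(T > elapsed)
            if survivor == 0.0:
                return 1.0
            return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / survivor

        # Hypothetical recurrence distribution: lognormal, median 110 yr,
        # shape parameter 0.25 (illustrative values only).
        recurrence = stats.lognorm(s=0.25, scale=110.0)
        print(conditional_probability(recurrence, elapsed=67.0, window=30.0))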

  3. Connecting slow earthquakes to huge earthquakes

    NASA Astrophysics Data System (ADS)

    Obara, Kazushige; Kato, Aitaro

    2016-07-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  4. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  5. Earthquake ground motion: Chapter 3

    USGS Publications Warehouse

    Luco, Nicolas; Valley, Michael; Crouse, C.B.

    2012-01-01

    Most of the effort in seismic design of buildings and other structures is focused on structural design. This chapter addresses another key aspect of the design process—characterization of earthquake ground motion. Section 3.1 describes the basis of the earthquake ground motion maps in the Provisions and in ASCE 7. Section 3.2 has examples for the determination of ground motion parameters and spectra for use in design. Section 3.3 discusses and provides an example for the selection and scaling of ground motion records for use in response history analysis.

  6. Statistical tests of simple earthquake cycle models

    NASA Astrophysics Data System (ADS)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM < 4.0 × 1019 Pa s and ηM > 4.6 × 1020 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
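
    As an illustration of the statistical machinery, the sketch below applies a two-sample Kolmogorov-Smirnov test to compare hypothetical observed values against a model-predicted distribution, rejecting the model at the paper's significance level of 0.05; the arrays are random placeholders, not the study's 15 observation sets.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)

        # Placeholder stand-ins: observed values on 15 faults vs. values
        # predicted by one candidate viscoelastic earthquake cycle model.
        observed = rng.normal(loc=1.0, scale=0.15, size=15)
        predicted = rng.normal(loc=1.2, scale=0.10, size=1000)

        stat, p_value = ks_2samp(observed, predicted)
        if p_value < 0.05:   # alpha = 0.05, as in the paper
            print(f"reject model (D={stat:.3f}, p={p_value:.3g})")
        else:
            print(f"cannot reject model (D={stat:.3f}, p={p_value:.3g})")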

  7. Predictable earthquakes?

    NASA Astrophysics Data System (ADS)

    Martini, D.

    2002-12-01

    acceleration) and the global number of earthquakes for this period from the published literature, which give a good picture of the dynamical geophysical phenomena. Methodology: Computing linear correlation coefficients allows us to characterise quantitatively the relations among the data series, assuming linear dependence as a first step. The correlation coefficients among the Earth's rotational acceleration, the Z-orbit acceleration (perpendicular to the ecliptic plane), and the global number of earthquakes were compared. The results clearly demonstrate a common feature of the Earth's rotation and the Earth's Z-acceleration around the Sun, and also between the Earth's rotational acceleration and the earthquake number, which may indicate a strong relation among these phenomena. The rather strong correlation (r = 0.75) and the 29-year period (Saturn's synodic period) were clearly visible in the computed cross-correlation function, which gives the dynamical character of the correlation between the Earth's orbital (Z-direction) and rotational acceleration. This basic 29-year period was also obvious in the earthquake number data sets, with clear common features in time. Conclusion: The core, which is responsible for the secular variation of the Earth's magnetic field, is the only sufficiently mobile part of the Earth with sufficient mass to modify the rotation, and this probably affects the global time distribution of earthquakes. This may mean that the secular variation of earthquakes is inseparable from changes in the Earth's magnetic field, i.e. the interior processes of the Earth's core are tied to the dynamical state of the solar system. If this idea is correct, the global distribution of earthquakes in time is predictable.
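
    The correlation analysis described above can be reproduced in outline with NumPy: a zero-lag linear correlation coefficient plus a normalized cross-correlation over lags to expose a shared period. The two series below are synthetic stand-ins with a built-in 29-year cycle; the study itself used published rotation and earthquake-count data.

        import numpy as np

        rng = np.random.default_rng(1)
        years = np.arange(1900, 2000)

        # Synthetic stand-ins for rotational acceleration and earthquake counts.
        rot_accel = np.sin(2 * np.pi * years / 29.0) + 0.1 * rng.normal(size=years.size)
        quakes = np.sin(2 * np.pi * (years - 3) / 29.0) + 0.1 * rng.normal(size=years.size)

        # Zero-lag linear correlation coefficient.
        r = np.corrcoef(rot_accel, quakes)[0, 1]

        # Normalized cross-correlation as a function of lag.
        a = (rot_accel - rot_accel.mean()) / rot_accel.std()
        b = (quakes - quakes.mean()) / quakes.std()
        xcorr = np.correlate(a, b, mode="full") / years.size
        lags = np.arange(-years.size + 1, years.size)
        print(f"r = {r:.2f}, peak at lag {lags[np.argmax(xcorr)]} yr")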

  8. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth, compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model (GEM), launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  9. Response to “Comment on ‘Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set’” [J. Chem. Phys. 140, 177103 (2014)]

    SciTech Connect

    Reuter, Matthew G.; Harrison, Robert J.

    2014-05-07

    The thesis of Brandbyge's comment [J. Chem. Phys. 140, 177103 (2014)] is that our operator decoupling condition is immaterial to transport theories, and it appeals to discussions of nonorthogonal basis sets in transport calculations in its arguments. We maintain that the operator condition is to be preferred over the usual matrix conditions and subsequently detail problems in the existing approaches. From this operator perspective, we conclude that nonorthogonal projectors cannot be used and that the projectors must be selected to satisfy the operator decoupling condition. Because these conclusions pertain to operators, the choice of basis set is not germane.

  10. EARTHQUAKE HAZARDS IN THE OFFSHORE ENVIRONMENT.

    USGS Publications Warehouse

    Page, Robert A.; Basham, Peter W.

    1985-01-01

    This report discusses earthquake effects and potential hazards in the marine environment, describes and illustrates methods for the evaluation of earthquake hazards, and briefly reviews strategies for mitigating hazards. The report is broadly directed toward engineers, scientists, and others engaged in developing offshore resources. The continental shelves have become a major frontier in the search for new petroleum resources. Much of the current exploration is in areas of moderate to high earthquake activity. If the resources in these areas are to be developed economically and safely, potential earthquake hazards must be identified and mitigated both in planning and regulating activities and in designing, constructing, and operating facilities. Geologic earthquake effects that can be hazardous to marine facilities and operations include surface faulting, tectonic uplift and subsidence, seismic shaking, sea-floor failures, turbidity currents, and tsunamis.

  11. On subduction zone earthquakes and the Pacific Northwest seismicity

    SciTech Connect

    Chung, Dae H.

    1991-12-01

    A short review of subduction zone earthquakes and of the seismicity of the Pacific Northwest region of the United States is provided as a basis for assessing issues related to earthquake hazard evaluations for the region. This brief study reviews the seismotectonics of historical subduction zone earthquakes and more recent seismological studies of the rupture processes of subduction zone earthquakes, with specific reference to the Pacific Northwest. Subduction zone earthquakes tend to rupture updip and laterally from the hypocenter. Thus, the rupture surface tends to become more elongated for larger earthquakes (the strongly coupled updip distance is limited, whereas rupture length can be quite large). The great Aleutian-Alaska earthquakes of 1957, 1964, and 1965 had rupture lengths of greater than 650 km. The largest earthquake observed instrumentally, the Mw 9.5 1960 Chile earthquake, had a rupture length over 1000 km. However, earthquakes of this magnitude are very unlikely on Cascadia. The degree of surface shaking depends strongly on the depth and style of rupture. The rupture surface during a great earthquake shows heterogeneous stress drop, displacement, energy release, etc. The high-strength zones are traditionally termed asperities, and these asperities control when and how large an earthquake is generated. Mapping these asperities in specific subduction zones is very difficult before an earthquake; they show up more readily in inversions of dynamic source studies of earthquake ruptures, after an earthquake. Because seismic moment is based on the total radiated energy of an earthquake, the moment-based magnitude Mw is superior to all other magnitude estimates, such as ML, mb, MbLg, MS, etc. If only to have a common language, non-moment magnitudes should be converted to Mw in any discussion of subduction zone earthquakes.

  12. Earthquake Archaeology: a logical approach?

    NASA Astrophysics Data System (ADS)

    Stewart, I. S.; Buck, V. A.

    2001-12-01

    Ancient earthquakes can leave their mark in the mythical and literary accounts of ancient peoples, the stratigraphy of their site histories, and the structural integrity of their constructions. Within this broad cross-disciplinary tramping ground, earthquake geologists have tended to focus on those aspects of the cultural record that are most familiar to them: the physical effects of seismic deformation on ancient constructions. One of the core difficulties with this 'earthquake archaeology' approach is that recent attempts to isolate structural criteria that are diagnostic or strongly suggestive of a seismic origin are undermined by the recognition that signs of ancient seismicity are generally indistinguishable from those of non-seismic mechanisms (poor construction, adverse geotechnical conditions). We illustrate the difficulties and inconsistencies in currently proposed 'earthquake diagnostic' schemes by reference to two case studies of archaeoseismic damage in central Greece. The first concerns fallen columns at various Classical temple localities in mainland Greece (Nemea, Sounio, Olympia, Bassai) which, on the basis of observed structural criteria, appear earthquake-induced but which are alternatively explained by archaeologists as the result of human disturbance. The second re-examines the Kyparissi site in the Atalanti region, almost the type example of a Classical stoa offset across a seismic surface fault, arguing instead for its deformation by ground instability. Finally, in highlighting the inherent ambiguity of archaeoseismic data, we consider the value of a logic-tree approach for qualifying and quantifying our uncertainties in seismic-hazard analysis.

  13. Chern-Simons gravity with (curvature)² and (torsion)² terms and a basis of degree-of-freedom projection operators

    SciTech Connect

    Helayël-Neto, J. A.; Hernaski, C. A.; Pereira-Dias, B.; Vargas-Paredes, A. A.; Vasquez-Otoya, V. J.

    2010-09-15

    The effects of (curvature)²- and (torsion)²-terms in the Einstein-Hilbert-Chern-Simons Lagrangian are investigated. The purposes are two-fold: (i) to show the efficacy of a recently proposed orthogonal basis of degree-of-freedom projection operators and to ascertain its adequacy for obtaining propagators of general parity-breaking gravity models in three dimensions; (ii) to analyze the role of the topological Chern-Simons term for the unitarity and the particle spectrum of the model with squared-curvature terms in connection with dynamical torsion. Our conclusion is that the Chern-Simons term does not influence the unitarity conditions imposed on the parameters of the Lagrangian but significantly modifies the particle spectrum.

  14. Automated Microwave Complex on the Basis of a Continuous-Wave Gyrotron with an Operating Frequency of 263 GHz and an Output Power of 1 kW

    NASA Astrophysics Data System (ADS)

    Glyavin, M. Yu.; Morozkin, M. V.; Tsvetkov, A. I.; Lubyako, L. V.; Golubiatnikov, G. Yu.; Kuftin, A. N.; Zapevalov, V. E.; Kholoptsev, V. V.; Eremeev, A. G.; Sedov, A. S.; Malygin, V. I.; Chirkov, A. V.; Fokin, A. P.; Sokolov, E. V.; Denisov, G. G.

    2016-02-01

    We study experimentally the automated microwave complex for microwave spectroscopy and diagnostics of various media, which was developed at the Institute of Applied Physics of the Russian Academy of Sciences in cooperation with GYCOM Ltd. on the basis of a gyrotron with a frequency of 263 GHz operated at the first gyrofrequency harmonic. In the experiments, a controllable output power of 0.1–1 kW was achieved with an efficiency of up to 17% in the continuous-wave generation regime. The measured radiation spectrum, with a relative width of about 10⁻⁶, and the frequency values measured at various parameters of the device are presented. The results of measuring the parameters of the wave beam formed by a built-in quasioptical converter, as well as the data obtained by measuring the heat loss in the cavity and the vacuum output window, are analyzed.

  15. Effects of the 2011 Tohoku Earthquake on VLBI Geodetic Measurements

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Kurihara, S.; Behrend, D.

    2011-12-01

    The VLBI antenna TSUKUB32 at Tsukuba, Japan regularly observes in 24-hour observing sessions once per week with the R1 operational network and on additional days with other networks on a more irregular basis. Further, the antenna is an endpoint of the single-baseline, 1-hour Intensive sessions observed on the weekends for determination of UT1. TSUKUB32 returned to normal operational observing 25 days after the earthquake. The antenna is 160 km west and 240 km south of the epicenter (about the same distance west of the plate subduction boundary). We looked at the transient behavior of the TSUKUB32 position time series following the earthquake and found that significant deformation is continuing. The eastward rate as of July 2011, 4 months after the earthquake, is 20 cm/yr greater than the long-term rate prior to the earthquake. The VLBI series agrees with the corresponding JPL GPS series (M. B. Heflin, http://sideshow.jpl.nasa.gov/mbh/series.html, 2011) measured by the co-located GPS antenna TSUK. The coseismic UEN displacement at Tsukuba was approximately (-90 mm, 550 mm, 50 mm). We examined the effect of the variation of TSUKUB32 position on EOP estimates and specifically how best to correct its position for estimation of UT1 in the Intensive experiments. For this purpose and to provide operational UT1, the IVS scheduled a series of weekend Intensive sessions observing on the Kokee-Wettzell baseline immediately before each of the two Tsukuba-Wettzell Intensive sessions. Comparisons between UT1 estimates from these pairs of sessions were used in validating a model for the post-seismic displacement of TSUKUB32.

  16. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for the rapid reception of earthquake information by the public, some having been downloaded more than 1 million times! The advantages are obvious: wherever they are, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet; the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue comes from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of notifications relate to unfelt earthquakes, only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected

  17. Can Earthquakes Induced by Deep Fluid Injection Projects Be Controlled or Limited?

    NASA Astrophysics Data System (ADS)

    McGarr, A.; Williams, C. F.; Hickman, S.; Oppenheimer, D. H.

    2011-12-01

    Projects that involve the injection of high-pressure fluids at depth include Enhanced Geothermal Systems (EGS), CO2 sequestration and liquid waste disposal. We consider some case histories to address the question of the extent to which earthquakes induced by fluid injection can be controlled or limited. For instance, can induced earthquakes be controlled in ways that don't compromise the effectiveness of a given injection project? It is difficult to answer this question definitively because, to our knowledge, only one successful experiment in earthquake control has been performed (Raleigh et al., Science, v. 191, pp. 1230-1237, 1976). Moreover, for numerous injection projects, the induced earthquakes of maximum magnitude have been post shut-in, e.g., the Rocky Mountain Arsenal well, a liquid waste disposal project for which the three largest induced earthquakes occurred more than a year after injection had been terminated. For EGS operations requiring the injection of liquid into rock of low permeability, estimations of maximum magnitudes based on the volume of injected fluid have been moderately successful. For a typical magnitude distribution of induced earthquakes, it can be shown that the largest event accounts for about half of the total induced seismic moment, which is given by the volume of injected liquid multiplied by the modulus of rigidity (McGarr, J. Geophys. Res., v. 81, p. 1487, 1976). The Basel Deep Heat Mining project, an EGS injection of 11,500 cubic meters of water into low-permeability rock at a depth of five km, induced earthquakes with magnitudes that exceeded the safety threshold and so injection was discontinued (Deichmann and Giardini, Seismol. Res. Letters, v. 80, p. 784, 2009). Approximately half a day after shut-in, however, an earthquake of magnitude 3.4 occurred, the largest event of the sequence. It is worth noting that the magnitude of this earthquake is quite close to what could have been estimated based on the volume of injected
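
    The moment budget described above implies a simple upper-bound estimate: total induced moment is roughly the rigidity times the injected volume, and the largest event carries about half of it. A minimal sketch, assuming a typical crustal rigidity of 3e10 Pa and the standard Hanks-Kanamori moment-magnitude relation (neither value is quoted in the abstract):

        import math

        def max_magnitude_from_volume(injected_volume_m3, rigidity_pa=3e10):
            """Upper-bound moment magnitude from injected volume (after
            McGarr, 1976). The largest induced event is assumed to carry
            about half of the total moment rigidity * volume."""
            total_moment = rigidity_pa * injected_volume_m3      # N*m
            largest_event_moment = 0.5 * total_moment
            return (2.0 / 3.0) * (math.log10(largest_event_moment) - 9.1)

        # Basel EGS: ~11,500 cubic meters injected.
        print(f"Mw_max ~ {max_magnitude_from_volume(11_500):.1f}")

    For the Basel volume this yields an Mw_max of about 3.4, consistent with the magnitude 3.4 post-shut-in event described above.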

  18. Stress Drops for Potentially Induced Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

    Stress drop, the difference between shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motions. Hough [2014 and 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes, and interpreted them to be a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects. Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). Both the effects of path and linear site response should be cancelled out through the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas. The earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to tectonic earthquakes at Parkfield, California (the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
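
    As a schematic of the spectral ratio technique, the sketch below divides the amplitude spectrum of a target event by that of its empirical Green's function, so that path and linear site terms cancel. The waveforms here are random placeholders; a real analysis would fit a source model to the resulting ratio, as in Imanishi and Ellsworth [2006], to recover corner frequencies and hence stress drops.

        import numpy as np

        def spectral_ratio(target, egf, dt):
            """Amplitude-spectrum ratio of a target event over its empirical
            Green's function (eGf); path and linear site terms divide out."""
            n = max(target.size, egf.size)
            freqs = np.fft.rfftfreq(n, d=dt)
            spec_t = np.abs(np.fft.rfft(target, n=n))
            spec_g = np.abs(np.fft.rfft(egf, n=n))
            eps = 1e-12 * spec_g.max()   # water level to avoid division by ~0
            return freqs, spec_t / (spec_g + eps)

        # Placeholder 100-Hz records of a target event and a smaller,
        # co-located event with a similar waveform (the eGf).
        rng = np.random.default_rng(2)
        freqs, ratio = spectral_ratio(rng.normal(size=2048),
                                      rng.normal(size=2048), dt=0.01)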

  19. A Simplified Approach to the Basis Functions of Symmetry Operations and Terms of Metal Complexes in an Octahedral Field with d¹ to d⁹ Configurations

    ERIC Educational Resources Information Center

    Lee, Liangshiu

    2010-01-01

    The basis sets for symmetry operations of d¹ to d⁹ complexes in an octahedral field and the resulting terms are derived for the ground states and spin-allowed excited states. The basis sets are of fundamental importance in group theory. This work addresses such a fundamental issue, and the results are pedagogically…

  1. Evaluation of near-field earthquake effects

    SciTech Connect

    Shrivastava, H.P.

    1994-11-01

    Structures and equipment that are qualified for the design basis earthquake (DBE) and have anchorage designed for the DBE loading do not require an evaluation of near-field earthquake (NFE) effects. However, safety class 1 acceleration-sensitive equipment, such as electrical relays, must be evaluated for both NFE and DBE, since such equipment is known to malfunction when excited by high-frequency seismic motions.

  2. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions.

  3. Earthquake Shaking and Damage to Buildings: Recent evidence for severe ground shaking raises questions about the earthquake resistance of structures.

    PubMed

    Page, R A; Joyner, W B; Blume, J A

    1975-08-22

    Ground shaking close to the causative fault of an earthquake is more intense than it was previously believed to be. This raises the possibility that large numbers of buildings and other structures are not sufficiently resistant for the intense levels of shaking that can occur close to the fault. Many structures were built before earthquake codes were adopted; others were built according to codes formulated when less was known about the intensity of near-fault shaking. Although many building types are more resistant than conventional design analyses imply, the margin of safety is difficult to quantify. Many modern structures, such as freeways, have not been subjected to and tested by near-fault shaking in major earthquakes (magnitude 7 or greater). Damage patterns in recent moderate-sized earthquakes occurring in or adjacent to urbanized areas (17), however, indicate that many structures, including some modern ones designed to meet earthquake code requirements, cannot withstand the severe shaking that can occur close to a fault. It is necessary to review the ground motion assumed and the methods utilized in the design of important existing structures and, if necessary, to strengthen or modify the use of structures that are found to be weak. New structures situated close to active faults should be designed on the basis of ground motion estimates greater than those used in the past. The ultimate balance between risk of earthquake losses and cost for both remedial strengthening and improved earthquake-resistant construction must be decided by the public. Scientists and engineers must inform the public about earthquake shaking and its effect on structures. The exposure to damage from seismic shaking is steadily increasing because of continuing urbanization and the increasing complexity of lifeline systems, such as power, water, transportation, and communication systems. In the near future we should expect additional painful examples of the damage potential of moderate

  4. Identification of Deep Earthquakes

    DTIC Science & Technology

    2010-09-01

    develop a ground truth dataset of earthquakes at both normal crustal depths and earthquakes from subduction zones, below the overlying crust. Many...deep earthquakes (depths between about 50 and 300 km). These deep earthquakes are known to occur in the Asia-India continental collision zone ...and/or NIL, as these stations are within a few hundred km of the zone where deep earthquakes are known to occur. To date we have selected about 300

  5. Impact of earthquakes on sex ratio at birth: Eastern Marmara earthquakes

    PubMed Central

    Doğer, Emek; Çakıroğlu, Yiğit; Köpük, Şule Yıldırım; Ceylan, Yasin; Şimşek, Hayal Uzelli; Çalışkan, Eray

    2013-01-01

    Objective: Previous reports suggest that maternal exposure to acute stress related to earthquakes affects the sex ratio at birth. Our aim was to examine the change in sex ratio at birth after the Eastern Marmara earthquake disasters. Material and Methods: This study was performed using the official birth statistics from January 1997 to December 2002 – before and after 17 August 1999, the date of the Golcuk Earthquake – supplied by the Turkish Statistical Institute. The secondary sex ratio was expressed as the male proportion at birth, and the ratios in both affected and unaffected areas were calculated and compared by gender on a monthly basis using the Chi-square test. Results: We observed significant decreases in the secondary sex ratio in the 4th and 8th months following an earthquake in the affected region compared to the unaffected region (p= 0.001 and p= 0.024). In the earthquake region, the decrease observed in the secondary sex ratio during the 8th month after an earthquake was specific to the period after the earthquake. Conclusion: Our study indicated a significant reduction in the secondary sex ratio after an earthquake. Given these findings, events that cause sudden intense stress, such as earthquakes, can have an effect on the sex ratio at birth. PMID:24592082
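
    The monthly comparison described above can be illustrated with a chi-square test on a 2x2 table of male and female birth counts in the affected and unaffected regions; the counts below are invented for illustration and are not the study's data.

        from scipy.stats import chi2_contingency

        # Hypothetical counts for one post-earthquake month:
        #              males, females
        table = [[5100, 5000],   # affected region
                 [5350, 5010]]   # unaffected region

        chi2, p, dof, expected = chi2_contingency(table)
        male_affected = table[0][0] / sum(table[0])
        male_unaffected = table[1][0] / sum(table[1])
        print(f"male proportion {male_affected:.3f} vs {male_unaffected:.3f}, p = {p:.3f}")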

  6. Material contrast does not predict earthquake rupture propagation direction

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    2005-01-01

    Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

  7. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
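
    The abstract does not describe the detection algorithm itself; a classic short-term-average/long-term-average (STA/LTA) trigger, sketched below, is one standard way such automatic systems flag earthquake arrivals, and is shown here purely as an illustration.

        import numpy as np

        def sta_lta_trigger(signal, dt, sta_win=1.0, lta_win=30.0, threshold=4.0):
            """Flag samples where the short-term average signal energy exceeds
            `threshold` times the long-term average (a standard technique, not
            necessarily the algorithm of the 1974 USGS system)."""
            nsta, nlta = int(sta_win / dt), int(lta_win / dt)
            energy = np.asarray(signal, dtype=float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            sta = (csum[nsta:] - csum[:-nsta]) / nsta
            lta = (csum[nlta:] - csum[:-nlta]) / nlta
            n = min(sta.size, lta.size)          # align windows ending together
            ratio = sta[-n:] / np.maximum(lta[-n:], 1e-20)
            return np.flatnonzero(ratio > threshold)   # indices into the tail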

  8. High voltage electric substation performance in earthquakes

    SciTech Connect

    Eidinger, J.; Ostrom, D.; Matsuda, E.

    1995-12-31

    This paper examines the performance of several types of high voltage substation equipment in past earthquakes. Damage data are provided in chart form. These data are then developed into a tool for estimating the performance of a substation subjected to an earthquake. First, suggestions are made about the development of equipment-class fragility curves that represent the expected earthquake performance of different voltages and types of equipment. Second, suggestions are made about how damage to individual pieces of equipment at a substation likely affects the post-earthquake performance of the substation as a whole. Finally, estimates are provided as to how quickly a substation, at various levels of damage, can be restored to operational service after the earthquake.

  9. Evidence for co-operativity in coenzyme binding to tetrameric Sulfolobus solfataricus alcohol dehydrogenase and its structural basis: fluorescence, kinetic and structural studies of the wild-type enzyme and non-co-operative N249Y mutant

    PubMed Central

    2005-01-01

    The interaction of coenzyme with thermostable homotetrameric NAD(H)-dependent alcohol dehydrogenase from the thermoacidophilic sulphur-dependent crenarchaeon Sulfolobus solfataricus (SsADH) and its N249Y (Asn-249→Tyr) mutant was studied using the high fluorescence sensitivity of its tryptophan residues Trp-95 and Trp-117 to the binding of coenzyme moieties. Fluorescence quenching studies performed at 25 °C show that SsADH exhibits linearity in the NAD(H) binding [the Hill coefficient (h) ∼1] at pH 9.8 and at moderate ionic strength, in addition to positive co-operativity (h=2.0–2.4) at pH 7.8 and 6.8, and at pH 9.8 in the presence of salt. Furthermore, NADH binding is positively co-operative below 20 °C (h∼3) and negatively co-operative at 40–50 °C (h∼0.7), as determined at moderate ionic strength and pH 9.8. Steady-state kinetic measurements show that SsADH displays standard Michaelis–Menten kinetics between 35 and 45 °C, but exhibits positive and negative co-operativity for NADH oxidation below (h=3.3 at 20 °C) and above (h=0.7 at 70–80 °C) this range of temperatures respectively. However, N249Y SsADH displays non-co-operative behaviour in coenzyme binding under the same experimental conditions used for the wild-type enzyme. In loop 270–275 of the coenzyme domain and segments at the interface of dimer A–B, analyses of the wild-type and mutant SsADH structures identified the structural elements involved in the intersubunit communication and suggested a possible structural basis for co-operativity. This is the first report of co-operativity in a tetrameric ADH and of temperature-induced co-operativity in a thermophilic enzyme. PMID:15651978
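
    The Hill coefficients (h) quoted above can be estimated by fitting the Hill equation to fractional-saturation data; a minimal sketch with invented data points chosen to give h of about 2 (illustrative only, not the paper's measurements):

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, k_half, h):
            """Hill equation: fractional saturation vs. ligand concentration."""
            return conc**h / (k_half**h + conc**h)

        # Invented coenzyme concentrations (uM) and saturation values.
        conc = np.array([0.5, 1, 2, 4, 8, 16, 32, 64.0])
        sat = np.array([0.01, 0.05, 0.15, 0.40, 0.70, 0.88, 0.96, 0.99])

        (k_half, h), _ = curve_fit(hill, conc, sat, p0=(5.0, 1.0))
        print(f"K_0.5 = {k_half:.1f} uM, Hill coefficient h = {h:.1f}")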

  10. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we approach the challenges in spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
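
    A minimal sketch of the ETAS-style smoothing described above, spreading each simulated event's rate over a grid with a power-law distance kernel; the kernel form (r + d)**(-q) and the parameter values are illustrative assumptions rather than the paper's calibration.

        import numpy as np

        def powerlaw_rate_map(event_xy, grid_xy, d=5.0, q=1.5):
            """Distribute each simulated event's unit rate over all grid cells
            with a power-law decay in epicentral distance (illustrative d, q)."""
            rates = np.zeros(len(grid_xy))
            for ex, ey in event_xy:
                r = np.hypot(grid_xy[:, 0] - ex, grid_xy[:, 1] - ey)
                kernel = (r + d) ** (-q)
                rates += kernel / kernel.sum()   # each event contributes unit rate
            return rates

        # Hypothetical on-fault epicenters and a coarse rectangular grid (km).
        events = np.array([[10.0, 20.0], [12.0, 21.0]])
        xs, ys = np.meshgrid(np.linspace(0, 30, 31), np.linspace(0, 30, 31))
        grid = np.column_stack([xs.ravel(), ys.ravel()])
        rate_map = powerlaw_rate_map(events, grid)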

  12. From Earthquake Prediction Research to Time-Variable Seismic Hazard Assessment Applications

    NASA Astrophysics Data System (ADS)

    Bormann, Peter

    2011-01-01

    The first part of the paper defines the terms and classifications common in earthquake prediction research and applications. This is followed by short reviews of major earthquake prediction programs initiated since World War II in several countries, for example the former USSR, China, Japan, the United States, and several European countries. It outlines the underlying expectations, concepts, and hypotheses, introduces the technologies and methodologies applied and some of the results obtained, which include both partial successes and failures. Emphasis is laid on discussing the scientific reasons why earthquake prediction research is so difficult and demanding and why the prospects are still so vague, at least as far as short-term and imminent predictions are concerned. However, classical probabilistic seismic hazard assessments, widely applied during the last few decades, have also clearly revealed their limitations. In their simple form, they are time-independent earthquake rupture forecasts based on the assumption of stable long-term recurrence of earthquakes in the seismotectonic areas under consideration. Therefore, during the last decade, earthquake prediction research and pilot applications have focused mainly on the development and rigorous testing of long and medium-term rupture forecast models in which event probabilities are conditioned by the occurrence of previous earthquakes, and on their integration into neo-deterministic approaches for improved time-variable seismic hazard assessment. The latter uses stress-renewal models that are calibrated for variations in the earthquake cycle as assessed on the basis of historical, paleoseismic, and other data, often complemented by multi-scale seismicity models, the use of pattern-recognition algorithms, and site-dependent strong-motion scenario modeling. International partnerships and a global infrastructure for comparative testing have recently been developed, for example the Collaboratory for the Study of

  13. EARTHQUAKE CAUSED RELEASES FROM A NUCLEAR FUEL CYCLE FACILITY

    SciTech Connect

    Charles W. Solbrig; Chad Pope; Jason Andrus

    2014-08-01

    The fuel cycle facility (FCF) at the Idaho National Laboratory is a nuclear facility that must be licensed in order to operate. A safety analysis is required for a license. This paper describes the analysis of the Design Basis Accident for this facility. This analysis involves a model of the transient behavior of the FCF inert-atmosphere hot cell following an earthquake-initiated breach of pipes passing through the cell boundary. The hot cell is used to process spent metallic nuclear fuel. Such breaches allow the introduction of air and the subsequent burning of pyrophoric metals. The model predicts the pressure, temperature, volumetric releases, cell heat transfer, metal fuel combustion, heat generation rates, radiological releases, and other quantities. The results show that releases from the cell are minimal and satisfactory for safety. This analysis method should be useful for other facilities that have potential for damage from an earthquake, and could eliminate the need to backfit facilities with earthquake-proof boundaries or lessen the cost of new facilities.

  14. The use of volunteer interpreters during the 2010 Haiti earthquake: lessons learned from the USNS COMFORT Operation Unified Response Haiti.

    PubMed

    Powell, Clydette; Pagliara-Miller, Claire

    2012-01-01

    On January 12, 2010, a magnitude 7.0 earthquake devastated Haiti, leading to the world's largest humanitarian effort in 60 years. The catastrophe led to massive destruction of homes and buildings, the loss of more than 200,000 lives, and overwhelmed the host nation's response and its public health infrastructure. Among the many responders, the United States Government acted immediately by sending assistance to Haiti, including a naval hospital ship as a tertiary care medical center, the USNS COMFORT. To adequately respond to the acute needs of patients, healthcare professionals on the USNS COMFORT relied on Haitian Creole-speaking volunteers who were recruited by the American Red Cross (ARC). These volunteers complemented full-time Creole-speaking military staff on board. The ARC provided 78 volunteers who were each able to serve up to 4 weeks on board. The volunteers' demographics, such as age and gender, as well as their linguistic skills, work backgrounds, and prior humanitarian assistance experience, varied. Volunteer efforts were critical in assisting with informed consent for surgery, family reunification processes, explanation of diagnosis and treatment, comfort to patients and families in various stages of grieving and death, and helping healthcare professionals to understand the cultural context and sensitivities unique to Haiti. This article explores key lessons learned in the use of volunteer interpreters in earthquake disaster relief in Haiti and highlights approaches that optimize volunteer services in such a setting, and which may be applicable in similar future events.

  15. Performance of lifelines during the January 17, 1994 Northridge earthquake

    SciTech Connect

    Eguchi, R.T.; Chung, R.M.

    1995-12-31

    The occurrence of the January 17, 1994 Northridge earthquake has provided a unique opportunity to study the earthquake performance of lifeline systems. This particular area has experienced two major earthquakes in the last 25 years, each playing a significant role in changing the way in which lifeline systems are designed and constructed for earthquakes. In 1971, the San Fernando earthquake shook apart many lifeline systems, causing significant damage and service disruption to Los Angeles area residents and businesses. As a result of this earthquake, special investigations were initiated to better understand and design these systems to remain functional after moderate and major earthquakes. Because of these post-1971 efforts, significant damage to lifelines was minimized in the January event. In each new earthquake, however, new lessons are learned, and as a result of these lessons, changes in either design or operational procedures are made to reduce the effects in future events. In the Northridge earthquake, some of the most significant lessons involve the effects on electric power system components and on older steel natural gas transmission pipelines. This paper attempts to identify where lessons from previous southern California earthquakes were useful in preparing for the Northridge earthquake. In addition, areas that deserve further research or analysis as a result of new lessons learned from the Northridge earthquake are identified.

  16. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  17. POST Earthquake Debris Management - AN Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, one that takes into account the different criteria related to the operation execution, is proposed by highlighting the key issues concerning the handling of the construction

  19. Earthquake friction

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2016-12-01

    Laboratory friction slip experiments on rocks provide firm evidence that the static friction coefficient μ has values ∼0.7. This would imply large amounts of heat produced by seismically active faults, but no heat flow anomaly is observed, and mineralogic evidence of frictional heating is virtually absent. This argues for lower μ values ∼0.2, as also required by the observed orientation of faults with respect to the maximum compressive stress. We show that accounting for the thermal and mechanical energy balance of the system removes this inconsistency, implying a multi-stage strain release process. The first stage consists of small, slow aseismic slip at high friction on pre-existing stress concentrators within the fault volume, angled to the main fault as Riedel cracks. This introduces a second stage dominated by frictional temperature increase, inducing local pressurization of pore fluids around the slip patches, which is in turn followed by a third stage in which thermal diffusion extends the frictionally heated zones, making them coalesce into a connected pressurized region oriented as the fault plane. The system then enters a state of equivalent low static friction in which it can undergo the fast elastic radiation slip prescribed by dislocation earthquake models.

  20. The uncertainty in earthquake conditional probabilities

    USGS Publications Warehouse

    Savage, J.C.

    1992-01-01

    The Working Group on California Earthquake Probabilities (WGCEP) questioned the relevance of uncertainty intervals assigned to earthquake conditional probabilities on the basis that the uncertainty in the probability estimate seemed to be greater the smaller the intrinsic breadth of the recurrence-interval distribution. It is shown here that this paradox depends upon a faulty measure of uncertainty in the conditional probability and that with a proper measure of uncertainty no paradox exists. The assertion that the WGCEP probability assessment in 1988 correctly forecast the 1989 Loma Prieta earthquake is also challenged by showing that posterior probability of rupture inferred after the occurrence of the earthquake from the prior WGCEP probability distribution reverts to a nearly informationless distribution. -Author
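
    For reference, the conditional probability at issue is the standard hazard computed from the recurrence-interval distribution F; this is the generic definition, not the specific WGCEP implementation. Given t years elapsed since the last rupture, the probability of rupture within the next \Delta t years is

        P(t < T \le t + \Delta t \mid T > t) \;=\; \frac{F(t + \Delta t) - F(t)}{1 - F(t)}.

    The paradox discussed above concerns how uncertainty in this ratio should be measured when the parameters of F are themselves uncertain.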

  1. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988 but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  2. Estimating earthquake potential

    USGS Publications Warehouse

    Page, R.A.

    1980-01-01

    The hazards to life and property from earthquakes can be minimized in three ways. First, structures can be designed and built to resist the effects of earthquakes. Second, the location of structures and human activities can be chosen to avoid or to limit the use of areas known to be subject to serious earthquake hazards. Third, preparations for an earthquake in response to a prediction or warning can reduce the loss of life and damage to property as well as promote a rapid recovery from the disaster. The success of the first two strategies, earthquake engineering and land use planning, depends on being able to reliably estimate the earthquake potential. The key considerations in defining the potential of a region are the location, size, and character of future earthquakes and frequency of their occurrence. Both historic seismicity of the region and the geologic record are considered in evaluating earthquake potential. 

  3. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  5. The size of earthquakes

    USGS Publications Warehouse

    Kanamori, H.

    1980-01-01

    How we should measure the size of an earthquake has been historically a very important, as well as a very difficult, seismological problem. For example, figure 1 shows the loss of life caused by earthquakes in recent times and clearly demonstrates that 1976 was the worst year for earthquake casualties in the 20th century. However, the damage caused by an earthquake is due not only to its physical size but also to other factors, such as where and when it occurs; thus, figure 1 is not necessarily an accurate measure of the "size" of earthquakes in 1976. The point is that the physical process underlying an earthquake is highly complex; we therefore cannot express every detail of an earthquake by a single straightforward parameter. Indeed, it would be very convenient if we could find a single number that represents the overall physical size of an earthquake. This was in fact the concept behind the Richter magnitude scale introduced in 1935.
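
    The single-number measure sought here was later formalized as the moment magnitude; as a reference point (the standard Hanks-Kanamori definition, not a quotation from this article),

        M_w \;=\; \tfrac{2}{3}\,\log_{10} M_0 \;-\; 10.7, \qquad M_0\ \text{in dyne·cm},

    so a seismic moment of M_0 = 10^{27} dyne·cm corresponds to M_w = \tfrac{2}{3}(27) - 10.7 \approx 7.3.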

  6. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  7. A Century of Induced Earthquakes in Oklahoma

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Page, M. T.

    2015-12-01

    Seismicity rates have increased sharply since 2009 in the central and eastern United States, with especially high rates of activity in the state of Oklahoma. A growing body of evidence indicates that many of these events are induced, primarily by injection of wastewater in deep disposal wells. The upsurge in activity has raised two questions: What is the background rate of tectonic earthquakes in Oklahoma? And how much has the rate varied throughout historical and early instrumental times? We first review the historical catalog, including an assessment of the completeness level of felt earthquakes, and show that seismicity rates since 2009 surpass previously observed rates throughout the 20th century. Furthermore, several lines of evidence suggest that most of the significant (Mw > 3.5) earthquakes in Oklahoma during the 20th century were likely induced by wastewater injection and/or enhanced oil recovery operations. We show that there is a statistically significant temporal and spatial correspondence between earthquakes and disposal wells permitted during the 1950s. The intensity distributions of the 1952 Mw 5.7 El Reno earthquake and the 1956 Mw 3.9 Tulsa County earthquake are similar to those from recent induced earthquakes, with significantly lower shaking than predicted by a regional intensity-prediction equation. The rate of tectonic earthquakes is thus inferred to be significantly lower than previously estimated throughout most of the state, but is difficult to estimate given scant incontrovertible evidence for significant tectonic earthquakes during the 20th century. We do find evidence for a low level of tectonic seismicity in southeastern Oklahoma associated with the Ouachita structural belt, and conclude that the 22 October 1882 Choctaw Nation earthquake, for which we estimate Mw 4.8, occurred in this zone.

  8. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment performed by global systems in emergency mode following strong earthquakes. Timely and correct action just after an event can yield significant benefits in saving lives; in this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of the shaking; and so on. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is, by far, not precisely known. The paper analyzes the influence of uncertainties in the determination of strong event parameters by alert seismological surveys and of the simulation models used at all stages from estimating shaking intensity

  9. Operations

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.

    2013-01-01

    Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this study,…

  10. Statistical tests of simple earthquake cycle models

    USGS Publications Warehouse

    Devries, Phoebe M. R.; Evans, Eileen

    2016-01-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ≲ 4.0 × 10^19 Pa s and ηM ≳ 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
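
    The following minimal sketch illustrates the kind of two-sample Kolmogorov-Smirnov comparison described above; the synthetic arrays stand in for observed and model-predicted quantities (all values hypothetical, not the paper's data):

        # Two-sample KS test: can we reject the hypothesis that the observed
        # values and the model-predicted values share one distribution?
        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)
        observed = rng.normal(1.0, 0.2, size=15)    # e.g., 15 fault observations
        predicted = rng.normal(1.1, 0.2, size=500)  # samples from one candidate model

        stat, p = ks_2samp(observed, predicted)
        verdict = "reject" if p < 0.05 else "cannot reject"
        print(f"{verdict} model (KS D={stat:.2f}, p={p:.3f})")

    Repeating this test across a grid of candidate viscosities is what carves out the rejected region of model space.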

  11. Can We Predict Earthquakes?

    ScienceCinema

    Johnson, Paul

    2016-09-09

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  12. Can We Predict Earthquakes?

    SciTech Connect

    Johnson, Paul

    2016-08-31

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  13. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  14. Earthquakes, October 1975

    USGS Publications Warehouse

    Person, W.J.

    1976-01-01

    October was an active month seismically, although there were no damaging earthquakes in the United States. Several States experienced earthquakes that were felt sharply. There were four major earthquakes in other parts of the world, including a magnitude 7.4 in the Philippine Islands that killed one person.

  15. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to withstand the stresses that earthquakes exert. The cost to the…

  16. School Safety and Earthquakes.

    ERIC Educational Resources Information Center

    Dwelley, Laura; Tucker, Brian; Fernandez, Jeanette

    1997-01-01

    A recent assessment of earthquake risk to Quito, Ecuador, concluded that many of its public schools are vulnerable to collapse during major earthquakes. A subsequent examination of 60 buildings identified 15 high-risk buildings. These schools were retrofitted to meet standards that would prevent injury even during Quito's largest earthquakes. US…

  17. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year-long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  1. Earthquake Prediction and Forecasting

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

    Prospects for earthquake prediction and forecasting, and even their definitions, are actively debated. Here, "forecasting" means estimating the future earthquake rate as a function of location, time, and magnitude. Forecasting becomes "prediction" when we identify special conditions that make the immediate probability much higher than usual and high enough to justify exceptional action. Proposed precursors run from aeronomy to zoology, but no identified phenomenon consistently precedes earthquakes. The reported prediction of the 1975 Haicheng, China earthquake is often proclaimed as the most successful, but the success is questionable. An earthquake predicted to occur near Parkfield, California in 1988±5 years has not happened. Why is prediction so hard? Earthquakes start in a tiny volume deep within an opaque medium; we do not know their boundary conditions, initial conditions, or material properties well; and earthquake precursors, if any, hide amongst unrelated anomalies. Earthquakes cluster in space and time, and following a quake, earthquake probability spikes. Aftershocks illustrate this clustering, and later earthquakes may even surpass earlier ones in size. However, the main shock in a cluster usually comes first and causes the most damage. Specific models help reveal the physics and allow intelligent disaster response. Modeling stresses from past earthquakes may improve forecasts, but this approach has not yet been validated prospectively. Reliable prediction of individual quakes is not realistic in the foreseeable future, but probabilistic forecasting provides valuable information for reducing risk. Recent studies are also leading to exciting discoveries about earthquakes.
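
    As a concrete illustration of "forecasting" in this sense (a generic Poisson example, not taken from the article): if the forecast rate of events above some magnitude in a region is \lambda per year, the probability of at least one event in \Delta t years is

        P(N \ge 1) \;=\; 1 - e^{-\lambda \Delta t},

    so \lambda = 0.01/yr gives roughly a 26% chance over 30 years. A "prediction" then corresponds to identifying special conditions under which \lambda temporarily rises enough to justify exceptional action.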

  2. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  4. Safety Basis Report

    SciTech Connect

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  5. Application of Seismic Array Processing to Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Meng, L.; Allen, R. M.; Ampuero, J. P.

    2013-12-01

    Earthquake early warning (EEW) systems that can issue warnings prior to the arrival of strong ground shaking during an earthquake are essential in mitigating seismic hazard. Many of the currently operating EEW systems work on the basis of empirical magnitude-amplitude/frequency scaling relations for a point source. This approach is of limited effectiveness for large events, such as the 2011 Tohoku-Oki earthquake, for which ignoring finite source effects may result in underestimation of the magnitude. Here, we explore the concept of characterizing rupture dimensions in real time for EEW using clusters of dense low-cost accelerometers located near active faults. Back tracing the waveforms recorded by such arrays allows the estimation of earthquake rupture size, duration, and directivity in real time, which enables EEW for M > 7 earthquakes. The concept is demonstrated with the 2004 Parkfield earthquake, one of the few big events (M > 6) recorded by a local small-scale seismic array (the UPSAR array; Fletcher et al., 2006). We first test the approach against synthetic rupture scenarios constructed by superposition of empirical Green's functions. We find it important to correct for the bias in back azimuth induced by dipping structures beneath the array. We applied the proposed methodology to the mainshock in a simulated real-time environment. After calibrating the dipping-layer effect with data from smaller events, we obtained an estimated rupture length of 9 km, consistent with the distance between the two main high-frequency subevents identified by back-projection using all local stations (Allman and Shearer, 2007). We propose to deploy small-scale arrays every 30 km along the San Andreas Fault. The array processing is performed in local processing centers at each array. The output is compared with finite fault solutions based on a real-time GPS system and then incorporated into the standard ElarmS system. The optimal aperture and array geometry is
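
    A minimal delay-and-sum sketch of the array idea described above: grid-search the horizontal slowness that maximizes stacked beam power, then read off the back azimuth. The station geometry, sign conventions, and waveforms are hypothetical stand-ins, and a real system would also apply the dipping-structure back-azimuth correction noted in the abstract:

        import numpy as np

        def best_back_azimuth(waveforms, dt, station_xy, slowness_grid):
            """waveforms: (n_sta, n_samp); station_xy: (n_sta, 2) offsets in km
            (x east, y north); slowness_grid: iterable of (sx, sy) in s/km."""
            n_sta, n_samp = waveforms.shape
            best_power, best_s = -np.inf, None
            for sx, sy in slowness_grid:
                delays = station_xy @ np.array([sx, sy])   # plane-wave delays (s)
                shifts = np.rint(delays / dt).astype(int)
                beam = np.zeros(n_samp)
                for trace, k in zip(waveforms, shifts):
                    beam += np.roll(trace, -k)             # align and stack
                power = float(np.sum(beam ** 2))           # np.roll wraps; fine for a short demo window
                if power > best_power:
                    best_power, best_s = power, (sx, sy)
            sx, sy = best_s
            return np.degrees(np.arctan2(sx, sy)) % 360.0  # degrees clockwise from north

    Tracking how this back azimuth migrates with time is what yields the rupture length and directivity estimates.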

  6. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  7. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  8. Prospective testing based on joint pre-earthquake signal observations: Case studies for 2013

    NASA Astrophysics Data System (ADS)

    Kalenda, Pavel; Ouzounov, Dimitar

    2014-05-01

    We present results from our prospective testing of rock deformation measurements (Neumann & Kalenda 2010) in combination with atmospheric pre-earthquake signals (thermal radiation data from several polar-orbit satellites) observed during 2013 and related to major seismic events of M7+. We designed the atmospheric pre-earthquake signals approach according to the theoretical concept of LAIC, lithosphere-atmosphere-ionosphere coupling (Pulinets & Ouzounov, 2011), operating between the crust and the atmosphere/ionosphere. The joint analysis of different pre-earthquake signals follows the Dobrovolsky (1997) formula for estimating the earthquake preparation zone and the LAIC physical concept. The non-linear process of preparation of the strongest earthquakes influences the global stress field, which leads to a global response and coupling within the Earth geo-space-lithosphere-atmosphere-ionosphere system and affects multi-parameter observations from the ground and space. In 2013, about 19 major earthquakes (M ≥ 7) occurred at 16 independent localities. Six of them had been jointly alerted and studied in advance. The satellite monitoring and deformometry measurements forecast (prospectively) four of them: M7.7 Jan 5, Alaska; M7.9 Feb 6, Santa Cruz; M7.8 April 16, Iran-Pakistan; and M7.7 Sept 24, Pakistan. The largest event of 2013, the M8.3 in the Okhotsk Sea, was alerted in advance by both methods, but the location estimated from the satellite measurements was outside the real epicenter (an unsuccessful forecast). The M7.7 event in the Scotia Sea was alerted only by the deformometry measurement (and only as a direction towards Chile from Europe), because the area was not part of the satellite monitoring regions. The primary outcome of the 2013 test shows two major results: (1) real-time tests showed the presence of anomalies in the rock deformation measurements and the following atmospheric pre-earthquake signals associated with the tested M

  9. Anthropogenic seismicity rates and operational parameters at the Salton Sea Geothermal Field.

    PubMed

    Brodsky, Emily E; Lajoie, Lia J

    2013-08-02

    Geothermal power is a growing energy source; however, efforts to increase production are tempered by concern over induced earthquakes. Although increased seismicity commonly accompanies geothermal production, induced earthquake rate cannot currently be forecast on the basis of fluid injection volumes or any other operational parameters. We show that at the Salton Sea Geothermal Field, the total volume of fluid extracted or injected tracks the long-term evolution of seismicity. After correcting for the aftershock rate, the net fluid volume (extracted-injected) provides the best correlation with seismicity in recent years. We model the background earthquake rate with a linear combination of injection and net production rates that allows us to track the secular development of the field as the number of earthquakes per fluid volume injected decreases over time.
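
    A sketch of the linear rate model described above: the background seismicity rate is fit as a linear combination of injection and net production rates. The monthly series below are synthetic placeholders, not Salton Sea data:

        import numpy as np

        rng = np.random.default_rng(1)
        months = 120
        injection = 2.0 + rng.normal(0.0, 0.1, months)   # injected volume per month
        net = 0.5 + rng.normal(0.0, 0.1, months)         # extracted minus injected
        rate = 3.0 * injection + 8.0 * net + rng.normal(0.0, 0.5, months)  # events/month

        G = np.column_stack([injection, net])
        coef, *_ = np.linalg.lstsq(G, rate, rcond=None)  # least squares sensitivities
        print("fitted sensitivities (events per unit volume):", coef)

    A declining fitted sensitivity over successive time windows would express the observed decrease in earthquakes per injected volume.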

  10. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States are updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1978 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes = 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  11. [Biomechanical characteristics of spinal cord tissue--basis for the development of modifications of the DREZ (dorsal root entry zone) operation].

    PubMed

    Spaić, M; Mikicić, D; Ilić, S; Milosavljević, I; Ivanović, S; Slavik, E; Antić, B

    2004-01-01

    Mechanical properties of the spinal cord tissue: the biological basis for the development of a modality of the DREZ surgery lesioning technique. Successful treatment of chronic neurogenic pain originating from spinal cord and cauda equina injury remains a significant management problem. The mechanism of this pain phenomenon has been shown to be related to neurochemical changes that lead to a state of hyperreactivity of the second-order dorsal horn neurons. The DREZ surgery (Dorsal Root Entry Zone lesion), designed to destroy the anatomical structures involved in pain generation and thus interrupt the neurogenic pain mechanism, as a causative procedure for treating this chronic pain, has been performed using different technical modalities: the radiofrequency (RF) coagulation technique, laser, ultrasound, and the microsurgical DREZotomy technique. The purpose of the study was to assess the possibility of establishing a lesioning technique based on the natural difference in mechanical properties between the white and gray cord substance. We experimentally determined the mechanical properties of human cadaveric cord white versus gray tissue in order to test the possibility of selective suction of the dorsal horn gray substance as a DREZ lesioning procedure. Based on the difference in tissue elasticity between white and gray cord substance, we established a new and simple DREZ surgical lesioning technique that was tested on cadaver cord. For the purpose of testing and comparing the size and shape of the DREZ lesions achieved, the DREZ surgery was performed on cadaver cord employing selective dorsal horn suction as the lesioning method. After the procedure, the cadaver cord underwent histological fixation and analysis of the DREZ lesions achieved. Our results revealed that the white cord substance, with its longitudinal fiber structure, had a dynamic viscosity four times higher than that of the gray substance, with its local neuronal network structure (150 PaS versus 37.5 PaS), that provided

  12. Retrospective Evaluation of Earthquake Forecasts during the 2010-12 Canterbury, New Zealand, Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Marzocchi, W.; Taroni, M.; Zechar, J. D.; Gerstenberger, M.; Liukis, M.; Rhoades, D. A.; Cattania, C.; Christophersen, A.; Hainzl, S.; Helmstetter, A.; Jimenez, A.; Steacy, S.; Jordan, T. H.

    2014-12-01

    The M7.1 Darfield, New Zealand (NZ), earthquake triggered a complex earthquake cascade that provides a wealth of new scientific data to study earthquake triggering and the predictive skill of statistical and physics-based forecasting models. To this end, the Collaboratory for the Study of Earthquake Predictability (CSEP) is conducting a retrospective evaluation of over a dozen short-term forecasting models that were developed by groups in New Zealand, Europe and the US. The statistical model group includes variants of the Epidemic-Type Aftershock Sequence (ETAS) model, non-parametric kernel smoothing models, and the Short-Term Earthquake Probabilities (STEP) model. The physics-based model group includes variants of the Coulomb stress triggering hypothesis, which are embedded either in Dieterich's (1994) rate-state formulation or in statistical Omori-Utsu clustering formulations (hybrid models). The goals of the CSEP evaluation are to improve our understanding of the physical mechanisms governing earthquake triggering, to improve short-term earthquake forecasting models and time-dependent hazard assessment for the Canterbury area, and to understand the influence of poor-quality, real-time data on the skill of operational (real-time) forecasts. To assess the latter, we use the earthquake catalog data that the NZ CSEP Testing Center archived in near real-time during the earthquake sequence and compare the predictive skill of models using the archived data as input with the skill attained using the best available data today. We present results of the retrospective model comparison and discuss implications for operational earthquake forecasting.

  13. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed Central

    Grant, Rachel A.; Conlan, Hilary

    2013-01-01

    juvenile animals migrating away from their breeding pond after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that the reported mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre) and were probably coincidental. Statistical analysis of the data indicated frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised. PMID:26479746

  14. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false-positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
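
    The reported warning times are consistent with simple S-minus-P arithmetic; the velocities below are assumed typical crustal values, not figures from the paper. For a source at distance R, the time between P-wave detection and S-wave arrival is approximately

        t_{warn} \;\approx\; R\left(\frac{1}{V_S} - \frac{1}{V_P}\right)
        \;=\; 16\ \mathrm{km} \times \left(\frac{1}{3.5} - \frac{1}{6.0}\right) \mathrm{s/km}
        \;\approx\; 1.9\ \mathrm{s},

    before subtracting detection and processing latency, which is why on-site P-wave devices can deliver only a second or two at such short epicentral distances.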

  15. Comment on "Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set" [J. Chem. Phys. 139, 114104 (2013)

    NASA Astrophysics Data System (ADS)

    Brandbyge, Mads

    2014-05-01

    In a recent paper Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an "implicit decoupling assumption," leading to wrong results for the current, different from what would be obtained by using an orthogonal basis and dividing surfaces defined in real space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green's function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.

  16. Comment on “Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set” [J. Chem. Phys. 139, 114104 (2013)

    SciTech Connect

    Brandbyge, Mads

    2014-05-07

    In a recent paper Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an “implicit decoupling assumption,” leading to wrong results for the current, different from what would be obtained by using an orthogonal basis and dividing surfaces defined in real space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green’s function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.

  17. Earthquakes and the office-based surgeon.

    PubMed Central

    Conover, W A

    1992-01-01

    A major earthquake may strike while a surgeon is performing an operation in an office surgical facility. A sudden major fault disruption will lead to thousands of casualties and widespread destruction. Surgeons who operate in offices can help lessen havoc by careful preparation. These plans should coordinate with other disaster plans for effective triage, evacuation, and the treatment of casualties. PMID:1413756

  18. Maximum magnitude earthquakes induced by fluid injection

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2014-02-01

    Analysis of numerous case histories of earthquake sequences induced by fluid injection at depth reveals that the maximum magnitude appears to be limited according to the total volume of fluid injected. Similarly, the maximum seismic moment seems to have an upper bound proportional to the total volume of injected fluid. Activities involving fluid injection include (1) hydraulic fracturing of shale formations or coal seams to extract gas and oil, (2) disposal of wastewater from these gas and oil activities by injection into deep aquifers, and (3) the development of enhanced geothermal systems by injecting water into hot, low-permeability rock. Of these three operations, wastewater disposal is observed to be associated with the largest earthquakes, with maximum magnitudes sometimes exceeding 5. To estimate the maximum earthquake that could be induced by a given fluid injection project, the rock mass is assumed to be fully saturated, brittle, to respond to injection with a sequence of earthquakes localized to the region weakened by the pore pressure increase of the injection operation and to have a Gutenberg-Richter magnitude distribution with a b value of 1. If these assumptions correctly describe the circumstances of the largest earthquake, then the maximum seismic moment is limited to the volume of injected liquid times the modulus of rigidity. Observations from the available case histories of earthquakes induced by fluid injection are consistent with this bound on seismic moment. In view of the uncertainties in this analysis, however, this should not be regarded as an absolute physical limit.
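
    A minimal numeric sketch of the bound stated above (maximum seismic moment = injected volume × modulus of rigidity); the injected volume is hypothetical and the rigidity is a standard crustal value:

        import math

        def max_induced_mw(injected_volume_m3, rigidity_pa=3.0e10):
            m0 = rigidity_pa * injected_volume_m3          # seismic moment, N*m
            return (2.0 / 3.0) * (math.log10(m0) - 9.05)   # Hanks-Kanamori, SI units

        print(f"{max_induced_mw(1.0e6):.2f}")  # ~Mw 5.0 for 10^6 m^3 injected

    Consistent with the case histories cited, volumes typical of large wastewater disposal projects put the bound near magnitude 5.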

  19. Maximum magnitude earthquakes induced by fluid injection

    USGS Publications Warehouse

    McGarr, Arthur F.

    2014-01-01

    Analysis of numerous case histories of earthquake sequences induced by fluid injection at depth reveals that the maximum magnitude appears to be limited according to the total volume of fluid injected. Similarly, the maximum seismic moment seems to have an upper bound proportional to the total volume of injected fluid. Activities involving fluid injection include (1) hydraulic fracturing of shale formations or coal seams to extract gas and oil, (2) disposal of wastewater from these gas and oil activities by injection into deep aquifers, and (3) the development of enhanced geothermal systems by injecting water into hot, low-permeability rock. Of these three operations, wastewater disposal is observed to be associated with the largest earthquakes, with maximum magnitudes sometimes exceeding 5. To estimate the maximum earthquake that could be induced by a given fluid injection project, the rock mass is assumed to be fully saturated, brittle, to respond to injection with a sequence of earthquakes localized to the region weakened by the pore pressure increase of the injection operation and to have a Gutenberg-Richter magnitude distribution with a b value of 1. If these assumptions correctly describe the circumstances of the largest earthquake, then the maximum seismic moment is limited to the volume of injected liquid times the modulus of rigidity. Observations from the available case histories of earthquakes induced by fluid injection are consistent with this bound on seismic moment. In view of the uncertainties in this analysis, however, this should not be regarded as an absolute physical limit.

  20. Building Loss Estimation for Earthquake Insurance Pricing

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Erdik, M.; Sesetyan, K.; Demircioglu, M. B.; Fahjan, Y.; Siyahi, B.

    2005-12-01

    After the 1999 earthquakes in Turkey, several changes in the insurance sector took place. A compulsory earthquake insurance scheme was introduced by the government. The reinsurance companies increased their rates; some even suspended operations in the market. And, most important, the insurance companies realized the importance of portfolio analysis in shaping their future market strategies. The paper describes an earthquake loss assessment methodology that can be used for insurance pricing and portfolio loss estimation, based on our working experience in the insurance market. The basic ingredients are probabilistic and deterministic regional site-dependent earthquake hazard, regional building inventory (and/or portfolio), building vulnerabilities associated with typical construction systems in Turkey, and estimates of building replacement costs for different damage levels. Probable maximum and average annualized losses are estimated as the result of the analysis. There is a two-level earthquake insurance system in Turkey, the effect of which is incorporated in the algorithm: the national compulsory earthquake insurance scheme and the private earthquake insurance system. To buy private insurance one has to be covered by the national system, which has limited coverage. As a demonstration of the methodology we look at the case of Istanbul and use its building inventory data instead of a portfolio. A state-of-the-art time-dependent earthquake hazard model that portrays the increased earthquake expectancies in Istanbul is used. Intensity- and spectral-displacement-based vulnerability relationships are incorporated in the analysis. In particular we look at the uncertainty in the loss estimations that arises from the vulnerability relationships, and at the effect of the implemented repair cost ratios.
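
    A minimal sketch of the "average annualized loss" ingredient mentioned above: scenario losses weighted by annual occurrence rates. The scenario numbers are invented for illustration:

        # AAL = sum over hazard scenarios of (annual rate x portfolio loss)
        rates = [0.01, 0.002, 0.0004]    # annual occurrence rate per scenario
        losses = [1e6, 2e7, 3e8]         # portfolio loss per scenario (USD)

        aal = sum(r * l for r, l in zip(rates, losses))
        print(f"average annualized loss: ${aal:,.0f}")  # $170,000

    In pricing, the AAL (plus loadings) drives the pure premium, while the probable maximum loss drives capital and reinsurance decisions.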

  1. Observing Triggered Earthquakes Across Iran with Calibrated Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Karasozen, E.; Bergman, E.; Ghods, A.; Nissen, E.

    2016-12-01

    We investigate earthquake triggering phenomena in Iran by analyzing patterns of aftershock activity around mapped surface ruptures. Iran has an intense level of seismicity (> 40,000 events listed in the ISC Bulletin since 1960) because it accommodates a significant portion of the continental collision between Arabia and Eurasia. There are nearly thirty mapped surface ruptures associated with earthquakes of M 6-7.5, mostly in eastern and northwestern Iran, offering a rich potential to study the kinematics of earthquake nucleation, rupture propagation, and subsequent triggering. However, catalog earthquake locations are subject to up to 50 km of location bias from the combination of unknown Earth structure and unbalanced station coverage, making it challenging to assess both the rupture directivity of larger events and the spatial patterns of their aftershocks. To overcome this limitation, we developed a new two-tiered multiple-event relocation approach to obtain hypocentral parameters that are minimally biased and have realistic uncertainties. In the first stage, locations of small clusters of well-recorded earthquakes at local spatial scales (100s of events across 100 km length scales) are calibrated either by using near-source arrival times or independent location constraints (e.g., local aftershock studies, InSAR solutions), using an implementation of the Hypocentroidal Decomposition relocation technique called MLOC. Epicentral uncertainties are typically less than 5 km. These events are then used as prior constraints in the code BayesLoc, a Bayesian relocation technique that can handle larger datasets, to yield region-wide calibrated hypocenters (1000s of events over 1000 km length scales). With locations and errors both calibrated, the pattern of aftershock activity can reveal the type of earthquake triggering: dynamic stress changes promote an increase in the seismicity rate in the direction of unilateral propagation, whereas static stress changes should

  2. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and from the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) will be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low seismic risk. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, where 73 % of the expected damage was reported. The undertaken analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damage) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  3. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  4. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted within a very good accuracy window (±1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a trigger planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method we used all 8R+ earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e., ±1 day of the target date) for earthquakes >6.5R. The successes are counted for each of the 86 earthquake seeds, and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with
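
    A sketch of the date-generation step as described: integer Fibonacci and Lucas offsets, in days, from a seed date. The exact sequences and depth used by the authors are not fully specified here, so the choices below (Dual numbers omitted, 15 terms, a hypothetical seed) are assumptions for illustration only:

        from datetime import date, timedelta

        def fibonacci(n):                     # 1, 1, 2, 3, 5, 8, ...
            seq, a, b = [], 1, 1
            for _ in range(n):
                seq.append(a)
                a, b = b, a + b
            return seq

        def lucas(n):                         # 2, 1, 3, 4, 7, 11, ...
            seq, a, b = [], 2, 1
            for _ in range(n):
                seq.append(a)
                a, b = b, a + b
            return seq

        seed = date(1906, 4, 18)              # hypothetical seed: onset of a major quake
        offsets = sorted(set(fibonacci(15) + lucas(15)))
        targets = [seed + timedelta(days=d) for d in offsets]
        print(targets[:5])

    Hit rates are then scored by checking the target dates (±1 day) against an earthquake catalog.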

  5. Investigations on Real-time GPS for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.; Aranha, M. A.; Melgar, D.; Allen, R. M.

    2015-12-01

    The Geodetic Alarm System (G-larmS) is a software system developed in a collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech (NMT) primarily for real-time Earthquake Early Warning (EEW). It currently uses high rate (1Hz), low latency (< ~5 seconds), accurate positioning (cm level) time series data from a regional GPS network and P-wave event triggers from existing EEW algorithms, e.g. ElarmS, to compute static offsets upon S-wave arrival. G-larmS performs a least squares inversion on these offsets to determine slip on a finite fault, which we use to estimate moment magnitude. These computations are repeated every second for the duration of the event. G-larmS has been in continuous operation at the BSL for over a year using event triggers from the California Integrated Seismic Network (CISN) ShakeAlert system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California. Pairs of stations are processed as baselines using trackRT (MIT software package). G-larmS produced good results in real-time during the South Napa (M 6.0, August 2014) earthquake as well as on several replayed and simulated test cases. We evaluate the performance of G-larmS for EEW by analysing the results using a set of well defined test cases to investigate the following: (1) using multiple fault regimes and concurrent processing with the ultimate goal of achieving model generation (slip and magnitude computations) within each 1 second GPS epoch on very large magnitude earthquakes (up to M 9.0), (2) the use of Precise Point Positioning (PPP) real-time data streams of various operators, accuracies, latencies and formats along with baseline data streams, (3) collaboratively expanding EEW coverage along the U.S. West Coast on a regional network basis for Northern California, Southern California and Cascadia.
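
    A minimal sketch of the static-offset inversion step described above: solve the linear system d = G m for slip on fault patches, then convert the summed moment to magnitude. The Green's function matrix below is random placeholder data; G-larmS would compute it from an elastic fault model, so everything here is illustrative:

        import numpy as np

        rng = np.random.default_rng(2)
        n_obs, n_patch = 30, 10                  # GPS offset components, fault patches
        G = rng.normal(size=(n_obs, n_patch))    # placeholder Green's functions
        true_slip = np.full(n_patch, 0.5)        # 0.5 m slip per patch (synthetic)
        d = G @ true_slip + rng.normal(0.0, 0.01, n_obs)   # offsets plus noise

        slip, *_ = np.linalg.lstsq(G, d, rcond=None)       # least squares slip
        patch_area = 10e3 * 10e3                 # 10 km x 10 km patches (assumed)
        m0 = 3.0e10 * patch_area * float(np.sum(np.abs(slip)))  # rigidity*area*slip
        mw = (2.0 / 3.0) * (np.log10(m0) - 9.05)
        print(f"estimated Mw: {mw:.1f}")

    Repeating this inversion every second as new offsets arrive is what lets the magnitude estimate grow with the rupture rather than saturate.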

  6. Compiling the 'Global Earthquake History' (1000-1903)

    NASA Astrophysics Data System (ADS)

    Albini, P.; Musson, R.; Locati, M.; Rovida, A.

    2013-12-01

    The study of historical earthquakes from historical sources, or historical seismology, is of wider interest than just the seismic hazard and risk community. Within the scope of the two-year project (October 2010-March 2013) "Global Earthquake History", developed in the framework of GEM, a reassessment of world historical seismicity was made from available published studies. The scope of the project is the time window 1000-1903, with magnitudes 7.0 and above; events with lower magnitudes are included on a case-by-case, or region-by-region, basis. The Global Historical Earthquake Archive (GHEA) provides a complete account of the global situation in historical seismology. From GHEA, the Global Historical Earthquake Catalogue (GHEC, v1, available at http://www.emidius.eu/GEH/, under a Creative Commons licence) was derived, i.e., a world catalogue of earthquakes for the period 1000-1903, with magnitude 7 and over, using publicly available materials, as for the Archive. This is intended to be the best global historical catalogue of large earthquakes presently available, with the best parameters selected, duplications and fakes removed, and, in some cases, new earthquakes discovered. GHEA and GHEC are conceived as providing a basis for coordinating future research into historical seismology in any part of the world and, hopefully, encouraging new historical earthquake research initiatives that will continue to improve the information available.

  7. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

    Lee, W.H.K.; Steward, S. W.

    1979-01-01

    A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network will consist of several stations to as many as a few hundred. They are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the choice of sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be deployed very quickly. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.

  8. Earthquake fluctuations in wells in New Jersey

    USGS Publications Warehouse

    Austin, Charles R.

    1960-01-01

    New Jersey is fortunate to be situated in a region that is relatively stable, geologically. For this reason scientists believe, on the basis of the best scientific evidence available, that the chances of New Jersey experiencing a major earthquake are very small. The last major earthquake on the east coast occurred at Charleston, S. C., in 1886. Minor shocks have been felt in New Jersey, however, from time to time. Reports of dishes being rattled or even of plaster in buildings being cracked are not uncommon. These minor disturbances are generally restricted to relatively small areas.

  9. Earthquake forecasting during the complex Amatrice-Norcia seismic sequence

    PubMed Central

    Marzocchi, Warner; Taroni, Matteo; Falcone, Giuseppe

    2017-01-01

    Earthquake forecasting is the ultimate challenge for seismologists, because it condenses the scientific knowledge about the earthquake occurrence process, and it is an essential component of any sound risk mitigation planning. It is commonly assumed that, in the short term, trustworthy earthquake forecasts are possible only for typical aftershock sequences, where the largest shock is followed by many smaller earthquakes that decay with time according to the Omori power law. We show that the current Italian operational earthquake forecasting system issued statistically reliable and skillful space-time-magnitude forecasts of the largest earthquakes during the complex 2016–2017 Amatrice-Norcia sequence, which is characterized by several bursts of seismicity and a significant deviation from the Omori law. This capability to deliver statistically reliable forecasts is an essential component of any program to assist public decision-makers and citizens in the challenging risk management of complex seismic sequences. PMID:28924610

  10. Earthquake forecasting during the complex Amatrice-Norcia seismic sequence.

    PubMed

    Marzocchi, Warner; Taroni, Matteo; Falcone, Giuseppe

    2017-09-01

    Earthquake forecasting is the ultimate challenge for seismologists, because it condenses the scientific knowledge about the earthquake occurrence process, and it is an essential component of any sound risk mitigation planning. It is commonly assumed that, in the short term, trustworthy earthquake forecasts are possible only for typical aftershock sequences, where the largest shock is followed by many smaller earthquakes that decay with time according to the Omori power law. We show that the current Italian operational earthquake forecasting system issued statistically reliable and skillful space-time-magnitude forecasts of the largest earthquakes during the complex 2016-2017 Amatrice-Norcia sequence, which is characterized by several bursts of seismicity and a significant deviation from the Omori law. This capability to deliver statistically reliable forecasts is an essential component of any program to assist public decision-makers and citizens in the challenging risk management of complex seismic sequences.
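
    For context, the modified Omori law mentioned above, and the expected aftershock count it implies for a forecast window, can be written in a few lines. The parameter values below are illustrative placeholders, and the sketch is a generic textbook relation, not the operational Italian forecasting model.

```python
import numpy as np

def omori_rate(t_days, K=100.0, c=0.05, p=1.1):
    """Modified Omori aftershock rate n(t) = K / (t + c)^p (illustrative parameters)."""
    return K / (t_days + c) ** p

def expected_count(t1, t2, K=100.0, c=0.05, p=1.1):
    """Closed-form integral of the rate over a forecast window [t1, t2] in days (p != 1)."""
    return K / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))

print(expected_count(1.0, 2.0))  # expected aftershocks between day 1 and day 2
```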

  11. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER.

  12. Earthquakes, November-December 1973

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria. 

  13. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.
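
    A common way to test for the kind of magnitude clustering debated here is to compare consecutive-magnitude differences against a shuffled-catalog null. The sketch below is a generic version of such a test, assuming a plain array of magnitudes; it does not reproduce the catalog-incompleteness correction the authors advocate.

```python
import numpy as np

def magnitude_clustering_test(mags, n_shuffles=1000, seed=0):
    """Compare the mean absolute difference of consecutive magnitudes with a
    shuffled-catalog null. A small p-value (observed value unusually small)
    would suggest clustering, unless, as argued above, it is an artifact of
    catalog incompleteness."""
    rng = np.random.default_rng(seed)
    mags = np.asarray(mags, dtype=float)
    observed = np.mean(np.abs(np.diff(mags)))
    shuffled = np.empty(n_shuffles)
    for i in range(n_shuffles):
        shuffled[i] = np.mean(np.abs(np.diff(rng.permutation(mags))))
    p_value = np.mean(shuffled <= observed)   # fraction of nulls as extreme
    return observed, p_value
```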

  14. Earthquake history of Oregon

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Although situated between two States (California and Washington) that have had many violent earthquakes, Oregon is noticeably less active seismically. The greatest damage experienced resulted from a major shock near Olympia, Wash., in 1949. During the short historical record available (since 1841), 34 earthquakes of intensity V, Modified Mercalli Scale, or greater have centered within Oregon or near its borders. Only 13 of the earthquakes had an intensity above V, and many of the shocks were local. However, a 1936 earthquake in the eastern Oregon-Washington region caused extensive damage and was felt over an area of 272,000 square kilometers.

  15. Earthquakes in Alaska

    USGS Publications Warehouse

    Haeussler, Peter J.; Plafker, George

    1995-01-01

    Earthquake risk is high in much of the southern half of Alaska, but it is not the same everywhere. This map shows the overall geologic setting in Alaska that produces earthquakes. The Pacific plate (darker blue) is sliding northwestward past southeastern Alaska and then dives beneath the North American plate (light blue, green, and brown) in southern Alaska, the Alaska Peninsula, and the Aleutian Islands. Most earthquakes are produced where these two plates come into contact and slide past each other. Major earthquakes also occur throughout much of interior Alaska as a result of collision of a piece of crust with the southern margin.

  16. Earthquakes of the Holocene.

    USGS Publications Warehouse

    Schwartz, D.P.

    1987-01-01

    Areas in which significant new data and insights have been obtained are: 1) fault slip rates; 2) earthquake recurrence models; 3) fault segmentation; 4) dating past earthquakes; 5) paleoseismicity in the eastern and central US; 6) folds and earthquakes; and 7) future earthquake behavior. Summarizes important trends in each of these research areas based on information published between June 1982 and June 1986 and preprints of papers in press. The bibliography for this period contains mainly refereed publications in journals and books.-from Author

  17. Investigating landslides caused by earthquakes - A historical review

    USGS Publications Warehouse

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  18. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes.

  19. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
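
    A heavily simplified sketch of the fingerprint-then-search idea behind FAST follows. Real FAST builds wavelet-based binary fingerprints and uses MinHash/locality-sensitive hashing; here, windows are fingerprinted by thresholding their spectra and bucketed by exact matches on hash bands, purely to illustrate the near-duplicate search pattern. All window sizes are arbitrary.

```python
import numpy as np
from collections import defaultdict

def fingerprints(trace, win=200, step=100):
    """Binary fingerprint per sliding window: FFT magnitude above its median."""
    fps = []
    for start in range(0, len(trace) - win, step):
        spec = np.abs(np.fft.rfft(trace[start:start + win]))
        fps.append((spec > np.median(spec)).astype(np.uint8))
    return np.array(fps)

def candidate_pairs(fps, n_bands=8):
    """Group windows whose fingerprints match exactly in any band (LSH-style),
    yielding candidate similar-waveform pairs without all-pairs comparison."""
    pairs = set()
    band_len = fps.shape[1] // n_bands
    for b in range(n_bands):
        buckets = defaultdict(list)
        for i, fp in enumerate(fps):
            buckets[fp[b * band_len:(b + 1) * band_len].tobytes()].append(i)
        for idxs in buckets.values():
            pairs.update((i, j) for i in idxs for j in idxs if i < j)
    return pairs
```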

  20. Recent Progress and Development on Multi-parameters Remote Sensing Application in Earthquake Monitoring in China

    NASA Astrophysics Data System (ADS)

    Shen, Xuhui; Zhang, Xuemin; Hong, Shunying; Jing, Feng; Zhao, Shufan

    2014-05-01

    Over the last ten years, several national research plans and scientific projects on remote sensing applications in earthquake monitoring have been implemented in China. Focusing on advancing earthquake monitoring capability and searching for a path toward earthquake prediction, satellite electromagnetic, satellite infrared, and D-InSAR technologies were developed systematically, and remarkable progress was achieved through statistical research on historical earthquakes and an initial summary of space-based precursory characteristics, laying the foundation for gradually promoting practical use. On the basis of this work, the argumentation for the first space-based platform of China's earthquake stereoscopic observation system has been completed, and an integrated earthquake remote sensing application system has been designed comprehensively. Developing a space-based earthquake observational system has become a major trend of technological development in earthquake monitoring and prediction. Emphasis will be placed on the construction of the space segment of the China earthquake stereoscopic observation system and on imminent major scientific projects: an earthquake deformation observation system and application research combining InSAR, satellite gravity, and GNSS, aimed at medium- and long-term earthquake monitoring and forecasting; an infrared observation technical system and application research, aimed at medium- and short-term earthquake monitoring and forecasting; and a satellite-based electromagnetic observation technical system and application system, aimed at short-term and imminent earthquake monitoring.

  1. Earthquake activity in Oklahoma

    SciTech Connect

    Luza, K.V.; Lawson, J.E., Jr.

    1989-08-01

    Oklahoma is one of the most seismically active areas in the southern Mid-Continent. From 1897 to 1988, over 700 earthquakes are known to have occurred in Oklahoma. The earliest documented Oklahoma earthquake took place on December 2, 1897, near Jefferson, in Grant County. The largest known Oklahoma earthquake happened near El Reno on April 9, 1952. This magnitude 5.5 (mb) earthquake was felt from Austin, Texas, to Des Moines, Iowa, and covered a felt area of approximately 362,000 km². Prior to 1962, all earthquakes in Oklahoma (59) were known either from historical accounts or from seismograph stations outside the state. Over half of these events were located in Canadian County. In late 1961, the first seismographs were installed in Oklahoma. From 1962 through 1976, 70 additional earthquakes were added to the earthquake database. In 1977, a statewide network of seven semipermanent and three radio-telemetry seismograph stations was installed. The additional stations have improved earthquake detection and location in the state of Oklahoma. From 1977 to 1988, over 570 additional earthquakes were located in Oklahoma, mostly of magnitudes less than 2.5. Most of these events occurred on the eastern margin of the Anadarko basin along a zone 135 km long by 40 km wide that extends from Canadian County to the southern edge of Garvin County. Another general area of earthquake activity lies along and north of the Ouachita Mountains in the Arkoma basin. A few earthquakes have occurred in the shelves that border the Arkoma and Anadarko basins.

  2. A starting earthquake with harmonic effects

    NASA Astrophysics Data System (ADS)

    Babeshko, V. A.; Evdokimova, O. V.; Babeshko, O. M.

    2016-11-01

    The possibility of the occurrence of a starting earthquake with harmonic vibrations (caused by a vertical harmonic effect) of the lithospheric plates and the base on which the plates rest is considered. This case differs from the static one [1], for which the boundary-problem operator is characterized by the presence of multiple eigenvalues. In the dynamic case, the eigenvalues of the operator are simple. It is found that the starting earthquake also occurs in this case and, in addition, earthquake hazard can increase due to the appearance of fatigue-breakdown conditions in the zone of the approach of lithospheric plates. In turn, fatigue breakdown is related to periodic changes in the effective directions of maximal stresses in this zone.

  3. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
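
    The detection idea described above (tweet rates rising far above a sub-hourly background) reduces to a sliding-window rate test. The sketch below is a hypothetical illustration with made-up thresholds, not the USGS implementation.

```python
from collections import deque

def tweet_rate_detector(timestamps, window_s=60.0, threshold_per_min=20.0):
    """Yield a detection time whenever the rate of keyword tweets within the
    trailing window exceeds the threshold. timestamps: ascending, in seconds."""
    recent = deque()
    for t in timestamps:
        recent.append(t)
        while recent and recent[0] < t - window_s:
            recent.popleft()                  # drop tweets older than the window
        if len(recent) * 60.0 / window_s >= threshold_per_min:
            yield t
            recent.clear()                    # reset so one burst yields one detection

# detections = list(tweet_rate_detector(tweet_times_seconds))
```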

  4. Earthquake research in China

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    The prediction of the Haicheng earthquake was an extraordinary achievement by the geophysical workers of the People's Republic of China, whose national program in earthquake research was less than 10 years old at the time. To study the background to this prediction, a delegation of 10 U.S. scientists, which I led, visited China in June 1976.

  5. Earthquake history of Texas

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Seventeen earthquakes, intensity V or greater, have centered in Texas since 1882, when the first shock was reported. The strongest earthquake, a maximum intensity VIII, was in western Texas in 1931 and was felt over 1,165,000 km². Three shocks in the Panhandle region in 1925, 1936, and 1943 were widely felt.

  6. Earthquake history of Oklahoma

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    The strongest and most widely felt earthquake in Oklahoma occurred on April 9, 1952. The intensity VII (Modified Mercalli Scale) tremor was felt over 362,000 square kilometres. A second intensity VII earthquake, felt over a very small area, occurred in October 1956. In addition, 15 other shocks, intensity V or VI, have originated within Oklahoma.

  7. Earthquake history of Mississippi

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Since its admission into the Union in 1817, Mississippi has had only four earthquakes of intensity V or greater within its borders. Although the number of earthquakes known to have been centered within Mississippi's boundaries is small, the State has been affected by numerous shocks located in neighboring States. In 1811 and 1812, a series of great earthquakes near the New Madrid, Missouri, area was felt in Mississippi as far south as the gulf coast. The New Madrid series caused the banks of the Mississippi River to cave in as far as Vicksburg, more than 300 miles from the epicentral region. As a result of this great earthquake series, the northwest corner of Mississippi is in seismic risk zone 3, the highest risk zone. Except for the New Madrid series, effects in Mississippi from earthquakes located outside of the State have been less than intensity V.

  8. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  9. Maximum Earthquake Magnitude Assessments by Japanese Government Committees (Invited)

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2013-12-01

    The 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and such a gigantic earthquake was not foreseen around Japan. After the 2011 disaster, various government committees in Japan have discussed and assessed the maximum credible earthquake size around Japan, but their values vary without definite consensus. I will review them with earthquakes along the Nankai Trough as an example. The Central Disaster Management Council (CDMC), under the Cabinet Office, set up a policy for future tsunami disaster mitigation. Possible future tsunamis are classified into two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with low frequency of occurrence, for which saving people's lives is the first priority, with soft measures such as tsunami hazard maps, evacuation facilities or disaster education. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared. The assessments of L1 and L2 events are left to local governments. The CDMC also assigned M 9.1 as the maximum size of earthquake along the Nankai trough, then computed the ground shaking and tsunami inundation for several scenario earthquakes. The estimated loss is about ten times that of the 2011 disaster, with maximum casualties of 320,000 and economic loss of 2 trillion dollars. The Headquarters for Earthquake Research Promotion (HERP), under MEXT, was set up after the 1995 Kobe earthquake and has made long-term forecasts of large earthquakes and published national seismic hazard maps. The future probability of earthquake occurrence, for example in the next 30 years, was calculated from past data of large earthquakes, on the basis of the characteristic earthquake model. The HERP recently revised the long-term forecast of the Nankai trough earthquake; while the 30-year probability (60-70%) is similar to the previous estimate, they noted the size can be M 8 to 9, considering the variability of past

  10. Design of a Space Based Sensor to Predict the Intensity and Location of Earthquakes from Electromagnetic Radiation.

    DTIC Science & Technology

    1985-12-01

    [Only OCR fragments of this report's table of contents and introduction are available:] I. INTRODUCTION; A. EARTHQUAKE PREDICTION THEORY; B. SPACE ... The desirability of being able to predict when and where an earthquake will occur becomes immediately apparent in light of ... the deep crust displacement imposes new stress on the upper brittle crust. To date, earthquake prediction has been on a long-term basis normally ...

  11. Strategies for rapid global earthquake impact estimation: the Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, D.J.

    2013-01-01

    This chapter summarizes the state-of-the-art for rapid earthquake impact estimation. It details the needs and challenges associated with quick estimation of earthquake losses following global earthquakes, and provides a brief literature review of various approaches that have been used in the past. With this background, the chapter introduces the operational earthquake loss estimation system developed by the U.S. Geological Survey (USGS) known as PAGER (for Prompt Assessment of Global Earthquakes for Response). It also details some of the ongoing developments of PAGER’s loss estimation models to better supplement the operational empirical models, and to produce value-added web content for a variety of PAGER users.
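
    PAGER's empirical fatality model (Jaiswal and Wald) has the general form of a lognormal fatality rate in shaking intensity, summed over the exposed population. The sketch below shows that general form only; theta and beta are illustrative placeholders, not calibrated country coefficients, and the exposure grid is hypothetical.

```python
import math

def fatality_rate(mmi, theta=12.0, beta=0.2):
    """Fraction of the exposed population killed at intensity mmi: a lognormal
    CDF in intensity (theta, beta are illustrative, not calibrated values)."""
    return 0.5 * (1.0 + math.erf(math.log(mmi / theta) / (beta * math.sqrt(2.0))))

def expected_fatalities(exposure):
    """exposure: iterable of (mmi, population) pairs from a ShakeMap-style grid."""
    return sum(pop * fatality_rate(mmi) for mmi, pop in exposure)

# hypothetical exposure grid:
print(round(expected_fatalities([(6.0, 100000), (7.5, 20000), (9.0, 5000)])))
```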

  12. How Small the Number of Test Items Can Be for the Basis of Estimating the Operating Characteristics of the Discrete Responses to Unknown Test Items.

    ERIC Educational Resources Information Center

    Samejima, Fumiko; Changas, Paul S.

    The methods and approaches for estimating the operating characteristics of discrete item responses without assuming any mathematical form have been developed and expanded. They have made it possible for a given test to be used as the Old Test even if its test information function is not constant over the ability interval of interest.…

  13. Monitoring the Earthquake source process in North America

    USGS Publications Warehouse

    Herrmann, Robert B.; Benz, H.; Ammon, C.J.

    2011-01-01

    With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected, the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of the direction of maximum compressive stress and the predominant style of faulting is spatially coherent across large regions of the continent.
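
    The final step of the processing described, going from an inverted moment tensor to Mw, is compact enough to show. The tensor below is hypothetical; the scalar moment uses the Frobenius-norm convention and the standard moment-magnitude relation.

```python
import numpy as np

def moment_magnitude(m_tensor_nm):
    """Mw from a 3x3 moment tensor in N m (scalar moment via the Frobenius norm)."""
    m0 = np.sqrt(np.sum(m_tensor_nm ** 2) / 2.0)     # Silver & Jordan convention
    return (2.0 / 3.0) * (np.log10(m0) - 9.1)

m = np.diag([1.0e16, -0.5e16, -0.5e16])              # hypothetical deviatoric tensor
print(round(moment_magnitude(m), 2))                 # ~Mw 4.6
```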

  14. North Anna Nuclear Power Plant Strong Motion Records of the Mineral, Virginia Earthquake of August 23, 2011

    NASA Astrophysics Data System (ADS)

    Graizer, V.

    2012-12-01

    The MW 5.8 Mineral, Virginia earthquake was recorded at a relatively short epicentral distance of about 18 km at the North Anna Nuclear Power Plant (NPP) by the SMA-3 magnetic-tape digital accelerographs installed inside the plant's containment at the foundation and deck levels. The North Anna NPP is operated by the Virginia Electric and Power Company (VEPCO) and has two pressurized water reactor (PWR) units that began operation in 1978 and 1980, respectively. Following the earthquake, both units were safely shut down. The strong-motion records were processed to obtain velocity, displacement, Fourier and 5% damped response spectra. The basemat record demonstrated relatively high amplitudes of acceleration of 0.26 g and velocity of 13.8 cm/sec with a relatively short duration of strong motion of 2-3 sec. The recorded 5% damped response spectra exceed the Design Basis Earthquake for the existing Units 1 and 2, while comprehensive plant inspections performed by VEPCO and the U.S. Nuclear Regulatory Commission concluded that the damage to the plant was minimal, not affecting any structures or equipment significant to plant operation. This can be explained in part by the short duration of the earthquake ground motion at the plant. The North Anna NPP did not have free-field strong-motion instrumentation at the time of the earthquake. Since the containment is founded on rock, there is a tendency to consider the basemat record as an approximation of a free-field recording. However, comparisons of deck and basemat records demonstrate that the basemat recording is also affected by structural resonances at frequencies higher than 3 Hz. Structural resonances in the frequency range of 3-4 Hz can at least partially explain the significant exceedance of observed motions relative to ground motion calculated using ground motion prediction equations.
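
    The 5% damped response spectra referred to above are computed by driving a damped single-degree-of-freedom oscillator with the accelerogram at many periods and recording peak response. A minimal sketch using Newmark average-acceleration integration and pseudo-spectral acceleration follows; the input accelerogram is a stand-in, not the North Anna record.

```python
import numpy as np

def response_spectrum(acc, dt, periods, zeta=0.05):
    """Pseudo-acceleration spectrum Sa(T) = wn^2 * max|u| for a unit-mass damped
    SDOF oscillator u'' + 2*zeta*wn*u' + wn^2*u = -ag, integrated by the Newmark
    average-acceleration method (gamma = 1/2, beta = 1/4)."""
    gamma, beta = 0.5, 0.25
    sa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        c, k = 2.0 * zeta * wn, wn ** 2
        k_hat = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
        u = v = 0.0
        a = -acc[0]                              # initial relative acceleration
        u_max = 0.0
        for i in range(len(acc) - 1):
            dp = -(acc[i + 1] - acc[i])          # incremental effective load
            dp_hat = (dp + (1.0 / (beta * dt) + gamma * c / beta) * v
                      + (1.0 / (2.0 * beta)) * a)  # a-damping term vanishes here
            du = dp_hat / k_hat
            dv = (gamma / (beta * dt)) * du - (gamma / beta) * v
            da = du / (beta * dt ** 2) - v / (beta * dt) - a / (2.0 * beta)
            u, v, a = u + du, v + dv, a + da
            u_max = max(u_max, abs(u))
        sa.append(wn ** 2 * u_max)
    return np.array(sa)

# usage with a stand-in record sampled at 200 Hz:
# sa = response_spectrum(acc, 0.005, [0.1, 0.2, 0.5, 1.0, 2.0])
```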

  15. Earthquake parameters of historical earthquakes in Europe (Invited)

    NASA Astrophysics Data System (ADS)

    Stucchi, M.; Gomez Capera, A.; Musson, R.; Papaioannou, C. A.; Meletti, C.; Batllo-Ortiz, J.; Fäh, D.

    2009-12-01

    The assessment of earthquake parameters of historical earthquakes is a key issue for understanding the seismic potential and evaluating the seismic hazard of a region. The processing of historical data is a complicated affair, still performed according to subjective, non-repeatable procedures. In the last ten years, various methods using macroseismic datapoints have been developed: Bakun & Wentworth, or BW (Bakun and Wentworth, 1997), Boxer (Gasperini et al., 1999), and MEEP (Musson and Jimenez, 2008). These methods allow the assessment to be performed by means of computer codes and, therefore, results are independent of the operator to some extent. With the aim of re-assessing earthquake parameters for the main historical events (M>5) of the entire European region, the NA4 module (Distributed Archive of Historical Earthquake Data) of the European project NERIES started to calibrate the three methods, in a homogeneous way, in five areas: Aegean, Iberian, Italian, Great Britain and Switzerland. A dataset of about fifteen earthquakes of the 20th century was compiled according to homogeneous procedures for each area. Earthquakes were selected among those with both macroseismic datapoints and a reliable instrumental moment magnitude (Mw) and epicentre available, in such a way as to cover the largest possible magnitude range and geographical distribution of earthquakes. Boxer and MEEP come with codes which can provide calibrated coefficients from the input dataset. The BW method requires an intensity attenuation relation as a function of Mw and hypocentral distance. Such relations were obtained for each area by calibrating the coefficients of a log-linear function with regression procedures from the same dataset used for calibrating the other methods. Recently published relations were also considered. The obtained, calibrated coefficients were then validated by determining the parameters of another ten events. For Boxer and MEEP the validation exercise was devoted to survey whether
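
    The BW approach described above amounts to a grid search for the magnitude and epicentre that best fit intensity datapoints through an attenuation relation of the form I = a + b*Mw - c*log10(R). The sketch below assumes placeholder coefficients a, b, c rather than the project's regionally calibrated values, and uses a crude flat-earth distance.

```python
import numpy as np

def bw_grid_search(obs, lats, lons, mags, a=1.0, b=1.7, c=3.0, depth_km=10.0):
    """obs: list of (lat, lon, intensity) datapoints. Returns the best
    (lat, lon, Mw) over the trial grid and its RMS intensity misfit."""
    i_obs = np.array([o[2] for o in obs])
    best, best_rms = None, np.inf
    for lat0 in lats:
        for lon0 in lons:
            # rough epicentral distance in km, then hypocentral distance
            d_km = 111.2 * np.hypot(
                np.array([o[0] for o in obs]) - lat0,
                (np.array([o[1] for o in obs]) - lon0) * np.cos(np.radians(lat0)))
            r = np.hypot(d_km, depth_km)
            for m in mags:
                pred = a + b * m - c * np.log10(r)   # I = a + b*Mw - c*log10(R)
                rms = np.sqrt(np.mean((i_obs - pred) ** 2))
                if rms < best_rms:
                    best, best_rms = (lat0, lon0, m), rms
    return best, best_rms
```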

  16. Post-Earthquake Reconstruction — in Context of Housing

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Comprehensive rescue and relief operations are always launched without loss of time, with the active participation of the Army, governmental agencies, donor agencies, NGOs, and other voluntary organizations, after each natural disaster. Several natural disasters occur throughout the world round the year, and one of them is the earthquake. More than any other natural catastrophe, an earthquake represents the undoing of our most basic preconceptions of the earth as the source of stability, and the first distressing consequence of an earthquake is the collapse of our dwelling units. Earthquakes have affected buildings since people began constructing them. So after each earthquake a housing reconstruction program is essential, since housing is referred to as shelter, satisfying one of the so-called basic needs next to food and clothing. It is a well-known fact that resettlement (after an earthquake) is often accompanied by the creation of ghettos and ensuing problems in the provision of infrastructure and employment. In fact, a housing project after the Bhuj earthquake in Gujarat, India, illustrates all the negative aspects of resettlement in the context of reconstruction. The main theme of this paper is to consider a few issues associated with post-earthquake reconstruction in the context of housing, all of which are significant to communities that have had to rebuild after catastrophe or that will face such a need in the future. A few of them are as follows: (1) Why are rebuilding opportunities time consuming? (2) What are the causes of failure in post-earthquake resettlement? (3) How can holistic planning after an earthquake be carried out? (4) What criteria should be checked for sustainable building materials? (5) What are the criteria for success in post-earthquake resettlement? (6) How can mitigation in post-earthquake housing be achieved using appropriate repair, restoration, and strengthening concepts?

  17. When the snakes awake: animals and earthquake prediction

    SciTech Connect

    Tributsch, H.

    1982-01-01

    Helmut Tributsch, a physical chemist, searched the literature and collected anecdotal reports of abnormal animal behavior during 77 earthquakes, besides the 1976 Friuli earthquake that destroyed his home village in northern Italy. His findings are presented in ''When the Snakes Awake''. In the book he also describes observations of several other phenomena thought to precede earthquakes--fog, lights and sound--and discusses various hypotheses. On the basis of the available circumstantial evidence, he makes a strong argument for attributing these phenomena to an increase in the number of electrostatically charged particles in the atmosphere, generated by the tectonically stressed crust.

  18. Earthquakes; January-February 1982

    USGS Publications Warehouse

    Person, W.J.

    1982-01-01

    In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine. 

  19. Earthquakes, September-October 1986

    USGS Publications Warehouse

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  20. Tidal controls on earthquake size-frequency statistics

    NASA Astrophysics Data System (ADS)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
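
    The b-value whose tidal-stress dependence is reported here is conventionally estimated by maximum likelihood (Aki 1965, with Utsu's binning correction). A minimal sketch, assuming a plain magnitude array and a known completeness magnitude; the stress-binned comparison in the comment is hypothetical usage.

```python
import numpy as np

def b_value(mags, m_c, dm=0.1):
    """Maximum-likelihood Gutenberg-Richter b-value for events at or above the
    completeness magnitude m_c, with magnitude bin width dm."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# e.g. compare subsets binned by tidal shear stress (arrays assumed aligned):
# b_high = b_value(mags[tidal_stress > 0], m_c=2.0)
# b_low  = b_value(mags[tidal_stress <= 0], m_c=2.0)
```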

  1. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information and warnings. The app can distinguish earthquake shaking from daily human activities based on the different patterns behind the movements, and it can also be triggered by the traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movements, it sends the trigger information (time and location) back to our server; at the same time, it stores the waveform data on the local phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find the coherent signal that confirms an earthquake. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
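
    The app's earthquake/human-activity discrimination is a trained classifier on motion features; as a much simpler stand-in, the sketch below triggers on the classic short-term/long-term average (STA/LTA) ratio of the acceleration amplitude. Window lengths and the threshold are illustrative.

```python
import numpy as np

def sta_lta_trigger(acc_xyz, fs, sta_s=0.5, lta_s=10.0, threshold=4.0):
    """Return sample indices where the STA/LTA ratio of the squared acceleration
    amplitude exceeds the threshold. acc_xyz: (n, 3) samples; fs: sample rate in Hz."""
    amp2 = np.sum(np.asarray(acc_xyz, dtype=float) ** 2, axis=1)  # 3-axis energy
    n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(amp2, np.ones(n_sta) / n_sta, mode="same")  # short average
    lta = np.convolve(amp2, np.ones(n_lta) / n_lta, mode="same")  # long average
    return np.flatnonzero(sta / np.maximum(lta, 1e-12) > threshold)
```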

  2. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions if there is not a scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should be also developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance in a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent ions and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal.
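
    The dimensionless-number approach the paper builds on can be illustrated with a standard mass-transfer correlation: a Sherwood number of the form Sh = a·Re^b·Sc^(1/3) gives a boundary-layer mass-transfer coefficient, from which a limiting current density follows. The coefficients a and b below are placeholders, not the paper's fitted values, and the function is a generic sketch rather than the authors' model.

```python
def limiting_current_density(u, d_h, nu, diff, conc_mol_m3, z=1, F=96485.0,
                             t_mem=1.0, t_sol=0.5, a=0.5, b=0.6):
    """Estimate the limiting current density i_lim (A/m^2) in an ED channel.

    u            : flow velocity (m/s)        d_h  : hydraulic diameter (m)
    nu           : kinematic viscosity (m^2/s) diff : ion diffusivity (m^2/s)
    conc_mol_m3  : bulk concentration (mol/m^3)
    t_mem, t_sol : counter-ion transport numbers in membrane and solution
    """
    re = u * d_h / nu                      # Reynolds number
    sc = nu / diff                         # Schmidt number
    sh = a * re ** b * sc ** (1.0 / 3.0)   # Sherwood correlation (illustrative)
    k = sh * diff / d_h                    # mass-transfer coefficient (m/s)
    return z * F * k * conc_mol_m3 / (t_mem - t_sol)

# e.g. brackish feed at 0.05 m/s in a 1 mm channel:
print(limiting_current_density(0.05, 1e-3, 1e-6, 1.5e-9, 20.0))
```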

  3. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of our in-use earthquake design methods. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out for the neighborhood of Valparaiso harbor. In this study the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  4. Improvements of the offshore earthquake locations in the Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Chen, Ta-Yi; Hsu, Hsin-Chih

    2017-04-01

    Since 2014 the Earthworm Based Earthquake Alarm Reporting (eBEAR) system has been in operation and used to issue warnings to schools. In 2015 the system started to provide warnings to the public in Taiwan via television and cell phones. Online performance of the eBEAR system indicates that the average reporting times afforded by the system are approximately 15 and 28 s for inland and offshore earthquakes, respectively. The eBEAR system can on average provide more warning time than the current EEW system (3.2 s and 5.5 s for inland and offshore earthquakes, respectively). However, offshore earthquakes are usually located poorly because only P-wave arrivals are used in the eBEAR system. Additionally, in the early stage of an earthquake early warning, only a few stations are available; the poor station coverage may explain why offshore earthquakes are difficult to locate accurately. In the Geiger inversion procedure for earthquake location, an initial hypocenter and origin time must be supplied to the location program. For the initial hypocenter, we defined a set of test locations in the offshore area instead of using the average of the locations of triggered stations. We ran 20 instances of the location program concurrently, each applying Geiger's method with a different pre-defined initial position. We assume that if a program's pre-defined initial position is close to the true earthquake location, its processing time during the iteration procedure of Geiger's method should be less than the others'. The results show that using pre-defined trial hypocenters in the inversion procedure improves the accuracy of offshore earthquake locations. Especially for an EEW system, using only 3 or 5 stations to locate earthquakes in the initial stage may lead to poor results because of poor station coverage. In this study, the pre-defined trial locations provide a feasible way to improve the estimations of
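
    The multi-start strategy described above can be sketched as follows: run Geiger's iterative least-squares location from each pre-defined trial hypocenter and keep the best-fitting run. The sketch assumes a homogeneous half-space and P arrivals only, which keeps it far simpler than the operational system; all names and values are illustrative.

```python
import numpy as np

def geiger_locate(stations, t_obs, start, vp=6.0, n_iter=20):
    """Locate one event from P arrivals. stations: (n, 3) station coordinates in
    km; t_obs: (n,) arrival times in s; start: (x, y, z) trial hypocenter in km.
    Returns ([x, y, z, t0], rms residual in s)."""
    x = np.array([*start, 0.0])                  # unknowns: position + origin time
    res = np.zeros(len(t_obs))
    for _ in range(n_iter):
        d = np.maximum(np.linalg.norm(stations - x[:3], axis=1), 1e-6)
        res = t_obs - (x[3] + d / vp)            # travel-time residuals
        # partial derivatives of predicted time w.r.t. (x, y, z, t0)
        G = np.column_stack([(x[:3] - stations) / (vp * d[:, None]),
                             np.ones(len(d))])
        dx, *_ = np.linalg.lstsq(G, res, rcond=None)
        x += dx
    return x, float(np.sqrt(np.mean(res ** 2)))

def multi_start_locate(stations, t_obs, trial_points):
    """Run the inversion from each pre-defined trial hypocenter; keep the
    lowest-RMS solution."""
    return min((geiger_locate(stations, t_obs, p) for p in trial_points),
               key=lambda sol: sol[1])
```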

  5. Triggered Earthquakes Following Parkfield?

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2004-12-01

    When the M5.0 Arvin earthquake struck approximately 30 hours after the 28 September 2004 M6.0 Parkfield earthquake, it seemed likely if not obvious that the latter had triggered the former. The odds of a M5.0 or greater event occurring by random chance in a given 2-day window are low, on the order of 2%. However, previously published results suggest that remotely triggered earthquakes are observed only following much larger mainshocks, typically M7 or above. Moreover, using a standard beta-statistic approach, one finds no pervasive regional increase of seismicity in the weeks following the Parkfield mainshock. (Neither were any moderate events observed at regional distances following the 1934 and 1966 Parkfield earthquakes.) Was Arvin a remotely triggered earthquake? To address this issue further I compare the seismicity rate changes following the Parkfield mainshock with those following 14 previous M5.3-7.1 earthquakes in central and southern California. I show that, on average, seismicity increased to a distance of at least 120 km following these events. For all but the M7.1 Hector Mine mainshock, this is well beyond the radius of what would be considered a traditional aftershock zone. Average seismicity rates also increase, albeit more weakly, to a distance of about 220 km. These results suggest that even moderate mainshocks in central and southern California do trigger seismicity at distances up to 220 km, supporting the inference that Arvin was indeed a remotely triggered earthquake. In general, only weak triggering is expected following moderate (M5.5-6.5) mainshocks. However, as illustrated by Arvin and, in retrospect, the 1986 M5.5 Oceanside earthquake, which struck just 5 days after the M5.9 North Palm Springs earthquake, triggered events can sometimes be large enough to generate public interest, and anxiety.
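
    The beta statistic referenced above measures whether the event count in a post-mainshock window is anomalous given the background rate. A standard formulation is sketched below; the numbers in the example are hypothetical.

```python
import math

def beta_statistic(n_after, n_total, t_after, t_total):
    """beta = (n_a - N*tau) / sqrt(N*tau*(1 - tau)), where tau is the fraction
    of the catalog duration occupied by the post-mainshock window."""
    tau = t_after / t_total
    expected = n_total * tau
    return (n_after - expected) / math.sqrt(n_total * tau * (1.0 - tau))

# beta > ~2 is commonly read as a significant rate increase:
print(beta_statistic(n_after=12, n_total=100, t_after=14.0, t_total=365.0))
```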

  6. On a Riesz basis of exponentials related to the eigenvalues of an analytic operator and application to a non-selfadjoint problem deduced from a perturbation method for sound radiation

    NASA Astrophysics Data System (ADS)

    Ellouz, Hanen; Feki, Ines; Jeribi, Aref

    2013-11-01

    In the present paper, we prove that the family of exponentials associated to the eigenvalues of the perturbed operator T(ε) ≔ T₀ + εT₁ + ε²T₂ + … + εᵏTₖ + … forms a Riesz basis in L²(0, T), T > 0, where ε ∈ ℂ, T₀ is a closed densely defined linear operator on a separable Hilbert space H with domain D(T₀) having isolated eigenvalues with multiplicity one, while T₁, T₂, … are linear operators on H having the same domain D ⊃ D(T₀) and satisfying a specific growing inequality. After that, we generalize this result using a H-Lipschitz function. As an application, we consider a non-selfadjoint problem deduced from a perturbation method for sound radiation.
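
    For readers outside functional analysis, the standard notion of a Riesz basis invoked here (a textbook definition, not specific to this paper) is the following:

```latex
% A family {f_n} in a separable Hilbert space H is a Riesz basis if f_n = U e_n
% for some orthonormal basis {e_n} and a bounded invertible operator U on H;
% equivalently, {f_n} is complete and there exist constants A, B > 0 such that
\[
  A \sum_n |c_n|^2 \;\le\; \Bigl\| \sum_n c_n f_n \Bigr\|_H^2 \;\le\; B \sum_n |c_n|^2
  \qquad \text{for all finite sequences } (c_n).
\]
% In the paper, the family is one of exponentials (presumably of the form
% e^{i \lambda_n t}) in L^2(0,T), with \lambda_n the eigenvalues of the
% perturbed operator T(\varepsilon).
```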

  7. On a Riesz basis of exponentials related to the eigenvalues of an analytic operator and application to a non-selfadjoint problem deduced from a perturbation method for sound radiation

    SciTech Connect

    Ellouz, Hanen; Feki, Ines; Jeribi, Aref

    2013-11-15

    In the present paper, we prove that the family of exponentials associated to the eigenvalues of the perturbed operator T(ε) ≔ T{sub 0} + εT{sub 1} + ε{sup 2}T{sub 2} + … + ε{sup k}T{sub k} + … forms a Riesz basis in L{sup 2}(0, T), T > 0, where ε∈C, T{sub 0} is a closed densely defined linear operator on a separable Hilbert space H with domain D(T{sub 0}) having isolated eigenvalues with multiplicity one, while T{sub 1}, T{sub 2}, … are linear operators on H having the same domain D⊃D(T{sub 0}) and satisfying a specific growing inequality. After that, we generalize this result using a H-Lipschitz function. As application, we consider a non-selfadjoint problem deduced from a perturbation method for sound radiation.

  9. The SCEC-USGS Dynamic Earthquake Rupture Code Comparison Exercise - Simulations of Large Earthquakes and Strong Ground Motions

    NASA Astrophysics Data System (ADS)

    Harris, R.

    2015-12-01

    I summarize the progress by the Southern California Earthquake Center (SCEC) and U.S. Geological Survey (USGS) Dynamic Rupture Code Comparison Group, which examines whether the results produced by multiple researchers' earthquake simulation codes agree with each other when computing benchmark scenarios of dynamically propagating earthquake ruptures. These types of computer simulations have no analytical solutions with which to compare, so we use qualitative and quantitative inter-code comparisons to check that they are operating satisfactorily. To date we have tested the codes against benchmark exercises that incorporate a range of features, including single and multiple planar faults, single rough faults, slip-weakening, rate-state, and thermal-pressurization friction, elastic and visco-plastic off-fault behavior, complete stress drops that lead to extreme ground motion, heterogeneous initial stresses, and heterogeneous material (rock) structure. Our goal is reproducibility, and we focus on the types of earthquake-simulation assumptions that have been or will be used in basic studies of earthquake physics, or in direct applications to specific earthquake hazard problems. Our group's goal is to make sure that when our earthquake-simulation codes simulate these types of earthquake scenarios, along with the resulting simulated strong ground shaking, the codes are operating as expected. For more introductory information about our group and our work, please see our group's overview papers, Harris et al., Seismological Research Letters, 2009, and Harris et al., Seismological Research Letters, 2011, along with our website, scecdata.usc.edu/cvws.
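
    One simple quantitative inter-code check of the kind used in such exercises is to difference the rupture-time fields that two codes produce on the same fault grid. The sketch below is illustrative only; the field names and the tolerance are made up, not the group's actual benchmark criteria.

```python
import numpy as np

def compare_rupture_times(t_code_a, t_code_b, tol_s=0.1):
    """Both inputs: 2-D arrays of rupture-front arrival times (s) on the same
    fault grid. Reports RMS difference, worst-case disagreement, and a pass flag."""
    diff = t_code_a - t_code_b
    rms = float(np.sqrt(np.mean(diff ** 2)))
    worst = float(np.max(np.abs(diff)))
    return {"rms_s": rms, "max_abs_s": worst, "within_tol": worst <= tol_s}
```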

  10. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  11. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated other efforts in Peru in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  12. Properties of "started" earthquakes

    NASA Astrophysics Data System (ADS)

    Babeshko, V. A.; Evdokimova, O. V.; Babeshko, O. M.

    2016-04-01

    The properties of earthquakes called "started" in [1] are studied. The problems associated with the method of revealing them, the expected behavior of the event, and the determination of its place, time, and intensity are discussed. Certain characteristic properties of real earthquakes are compared with the modeled ones. It is emphasized that there are no data on earthquakes of a similar type in scientific publications. A method employing high-performance computation is proposed, embedding the investigation in topological spaces that have a wider spectrum of properties than function spaces.

  13. Earthquake history of Tennessee

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    The western part of the State was shaken strongly by the New Madrid, Mo., earthquakes of 1811-12 and by earthquakes in 1843 and 1895. The area has also experienced minor shocks. Additional activity has occurred in the eastern part of the State, near the North Carolina border. Forty shocks of intensity V (Modified Mercalli scale) or greater have been cataloged as occurring within the State. Many other earthquakes centered in bordering States have affected points in Tennessee. The following summary covers only those shocks of intensity VI or greater.

  14. On near-source earthquake triggering

    USGS Publications Warehouse

    Parsons, T.; Velasco, A.A.

    2009-01-01

    When one earthquake triggers others nearby, what connects them? Two processes are observed: static stress change from fault offset and dynamic stress changes from passing seismic waves. In the near-source region (r ~ 50 km for M ~ 5 sources) both processes may be operating, and since both mechanisms are expected to raise earthquake rates, it is difficult to isolate them. We thus compare explosions with earthquakes because only earthquakes cause significant static stress changes. We find that large explosions at the Nevada Test Site do not trigger earthquakes at rates comparable to similar magnitude earthquakes. Surface waves are associated with regional and long-range dynamic triggering, but we note that surface waves with low enough frequency to penetrate to depths where most aftershocks of the 1992 M = 5.7 Little Skull Mountain main shock occurred (~12 km) would not have developed significant amplitude within a 50-km radius. We therefore focus on the best candidate phases to cause local dynamic triggering, direct waves that pass through observed near-source aftershock clusters. We examine these phases, which arrived at the nearest (200-270 km) broadband station before the surface wave train and could thus be isolated for study. Direct comparison of spectral amplitudes of presurface wave arrivals shows that M ~ 5 explosions and earthquakes deliver the same peak dynamic stresses into the near-source crust. We conclude that a static stress change model can readily explain observed aftershock patterns, whereas it is difficult to attribute near-source triggering to a dynamic process because of the dearth of aftershocks near large explosions.

  15. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most popular global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services to attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses at our website follow the propagation of the generated seismic waves and that, therefore, eyewitnesses can be considered as ground motion sensors. Flashsourcing discriminates felt
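
    As an illustration of the flashsourcing idea described above, the sketch below flags a candidate felt event when per-second website hits exceed a fixed multiple of the background rate. The log format, the detect_surge helper, and the threshold factor are all hypothetical assumptions; the abstract does not describe EMSC's actual pipeline.

```python
# Minimal flashsourcing sketch (illustrative only): flag a felt event when
# per-second website hits exceed a baseline by a fixed factor.
from collections import Counter

def detect_surge(hit_times, baseline_rate, factor=5.0):
    """hit_times: iterable of hit timestamps rounded to whole seconds (ints).
    Returns the first second whose hit count exceeds factor * baseline_rate."""
    counts = Counter(hit_times)
    for second in sorted(counts):
        if counts[second] > factor * baseline_rate:
            return second  # candidate felt-event onset
    return None

# Example: background of ~2 hits/s, then a surge of 30 hits at t = 120 s
hits = [0, 1, 1, 2, 3] + list(range(100, 119)) + [120] * 30
print(detect_surge(hits, baseline_rate=2.0))  # -> 120
```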

  16. Seafloor earthquake measurement system, SEMS IV

    SciTech Connect

    Platzbecker, M.R.; Ehasz, J.P.; Franco, R.J.

    1997-07-01

    Staff of the Telemetry Technology Development Department (2664) have, in support of the U.S. Department of the Interior Minerals Management Service (MMS), developed and deployed the Seafloor Earthquake Measurement System IV (SEMS IV). The result of this development project is a series of three fully operational seafloor seismic monitor systems located at offshore platforms: Eureka, Grace, and Irene. The instrument probes are embedded three to seven feet into the seafloor and hardwired to seismic data recorders installed topside at the offshore platforms. The probes and underwater cables were designed to survive the seafloor environment over an operational life of five years. The units have been operational for two years and have produced recordings of several minor earthquakes in that time. Sandia Labs will transfer operation of SEMS IV to MMS contractors in the coming months. 29 figs., 25 tabs.

  17. Building losses assessment for Lushan earthquake utilization multisource remote sensing data and GIS

    NASA Astrophysics Data System (ADS)

    Nie, Juan; Yang, Siquan; Fan, Yida; Wen, Qi; Xu, Feng; Li, Lingling

    2015-12-01

    On 20 April 2013, a catastrophic earthquake of magnitude 7.0 struck Lushan County, northwestern Sichuan Province, China; this event is known in China as the Lushan earthquake. The Lushan earthquake damaged many buildings, and the extent of building losses is one basis for emergency relief and reconstruction, so the building losses of the Lushan earthquake must be assessed. Remote sensing data and geographic information systems (GIS) can be employed to assess the building losses of the Lushan earthquake. The building loss assessment results for the Lushan earthquake disaster, obtained using multisource remote sensing data and GIS, are reported in this paper. The assessment results indicated that 3.2% of buildings in the affected areas completely collapsed, while 12% and 12.5% of buildings were heavily damaged and slightly damaged, respectively. The completely collapsed, heavily damaged, and slightly damaged buildings were mainly located in Danling County, Hongya County, Lushan County, Mingshan County, Qionglai County, Tianquan County, and Yingjing County.

  18. Metrics for comparing dynamic earthquake rupture simulations

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
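
    The paper defines its own metrics; as a minimal stand-in, the sketch below compares two codes' rupture-time fields on a shared fault grid with a normalized RMS misfit. The grid, rupture speed, and the rupture_time_misfit helper are illustrative assumptions, not the metrics of Barall and Harris (2014).

```python
import numpy as np

def rupture_time_misfit(t_a, t_b):
    """Normalized RMS difference between two codes' rupture-time fields,
    sampled on the same fault-plane grid (2-D arrays, seconds)."""
    rms = np.sqrt(np.mean((t_a - t_b) ** 2))
    scale = np.sqrt(np.mean(((t_a + t_b) / 2.0) ** 2))  # typical rupture time
    return rms / scale

# Example: two hypothetical codes agreeing to within ~1% in rupture time
x, y = np.meshgrid(np.linspace(0, 30, 61), np.linspace(0, 15, 31))
t_code1 = np.hypot(x - 15, y - 7.5) / 3.0  # circular rupture at 3 km/s
t_code2 = t_code1 * 1.01                   # second code 1% slower
print(f"misfit = {rupture_time_misfit(t_code1, t_code2):.4f}")
```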

  19. Earthquake swarms on Mount Erebus, Antarctica

    NASA Astrophysics Data System (ADS)

    Kaminuma, Katsutada; Baba, Megumi; Ueki, Sadato

    1986-12-01

    Mount Erebus (3794 m), located on Ross Island in McMurdo Sound, is one of the few active volcanoes in Antarctica. A high-sensitivity seismic network has been operated by Japanese and US parties on and around the volcano since December 1980. The results of these observations show two kinds of seismic activity on Ross Island: activity concentrated near the summit of Mount Erebus associated with Strombolian eruptions, and micro-earthquake activity spread through Mount Erebus and the surrounding area. Seismicity on Mount Erebus has been quite high, usually exceeding 20 volcanic earthquakes per day. They frequently occur in swarms with daily counts exceeding 100 events. Sixteen earthquake swarms with more than 250 events per day were recorded by the seismic network during the three-year period 1982-1984, and three notable earthquake swarms out of the sixteen were recognized: in October 1982 (named 82-C), March-April 1984 (84-B), and July 1984 (84-F). Swarms 84-B and 84-F had a large total number of earthquakes and a large Ishimoto-Iida "m"; hence these two swarms are presumed to constitute one of the precursor phenomena to the new eruption, which took place on 13 September 1984 and lasted a few months.

  20. Historical Earthquakes and Active Structure for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashivli, Otar

    2014-05-01

    Long-term seismic history is an important foundation for reliable assessment of seismic hazard and risk. Therefore, the completeness of earthquake catalogues in their longest, historical part is very important. Surviving historical sources, as well as special researches from institutes, museums, libraries and archives in Georgia, the Caucasus and the Middle East, indicate a high level of seismicity that entailed numerous human casualties and destruction on the territory of Georgia during the historical period. The study and detailed analysis of these original documents and researches have allowed us to create a new catalogue of historical earthquakes of Georgia from 1250 BC to 1900 AD. The method of the study is based on a multidisciplinary approach, i.e. on the joint use of methods of history and paleoseismology, archeoseismology, seismotectonics, geomorphology, etc. We present here a new parametric catalogue of 44 historical earthquakes of Georgia and a full "descriptor" of all the phenomena described in it. The summary map of the distribution of maximum damage in the historical period (before 1900) on the territory of Georgia, constructed on its basis, clearly shows the main features of the seismic field during this period. In particular, in the axial part and on the southern slope of the Greater Caucasus there is a seismic gap, which was filled in 1991 by the strongest earthquake in Racha and its aftershocks. In addition, it is also obvious that the very high seismic activity in the central and eastern parts of the Javakheti highland is not described in historical materials, and this fact requires further searches for various kinds of sources that contain data about historical earthquakes. We hope that this catalogue will enable the creation of a new joint (instrumental and historical) parametric earthquake catalogue of Georgia and will serve to assess the real seismic hazard and risk in the country.

  1. Dynamic stress changes during earthquake rupture

    USGS Publications Warehouse

    Day, S.M.; Yu, G.; Wald, D.J.

    1998-01-01

    We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.

  2. Source processes of strong earthquakes in the North Tien-Shan region

    NASA Astrophysics Data System (ADS)

    Kulikova, G.; Krueger, F.

    2013-12-01

    The Tien-Shan region attracts the attention of scientists worldwide due to its complexity and tectonic uniqueness. A series of very strong, destructive earthquakes occurred in Tien-Shan at the turn of the 19th and 20th centuries. Such large intraplate earthquakes are rare in seismology, which increases the interest in the Tien-Shan region. The presented study focuses on the source processes of large earthquakes in Tien-Shan. The amount of seismic data is limited for those early times. In 1889, when a major earthquake occurred in Tien-Shan, seismic instruments were installed in very few locations in the world, and these analog records have not survived to the present day. Although around a hundred seismic stations were operating worldwide at the beginning of the 20th century, it is not always possible to get high-quality analog seismograms. Digitizing seismograms is a very important step in the work with analog seismic records. While working with historical seismic records one has to take into account all the aspects and uncertainties of manual digitizing and the lack of accurate timing and instrument characteristics. In this study, we develop an easy-to-handle and fast digitization program on the basis of already existing software, which speeds up the digitizing process and accounts for all the recording-system uncertainties. Owing to the lack of absolute timing for the historical earthquakes (due to the absence of a universal clock at that time), we used time differences between P and S phases to relocate the earthquakes in North Tien-Shan and the body-wave amplitudes to estimate their magnitudes. Combining our results with geological data, five earthquakes in North Tien-Shan were precisely relocated. The digitizing of records can introduce steps into the seismograms, which makes restitution (removal of instrument response) undesirable. To avoid the restitution, we simulated historic seismograph recordings with given values for damping and free period of the respective instrument and
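
    A minimal sketch of the S-minus-P constraint mentioned above: with no absolute clock, the S-P time still fixes the epicentral distance through d/vs - d/vp = tS - tP; intersecting such distance circles from three or more stations yields a relocation. The velocities below are assumed average crustal values, not those used in the study.

```python
# Distance from an S-P time, independent of absolute timing.
VP, VS = 6.0, 3.5  # km/s (assumed average crustal P and S velocities)

def epicentral_distance_km(sp_seconds):
    """Distance implied by an S-P time: d/VS - d/VP = t_S - t_P."""
    return sp_seconds / (1.0 / VS - 1.0 / VP)

print(epicentral_distance_km(10.0))  # ~84 km for a 10 s S-P time
```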

  3. Earthquake history of Wisconsin

    USGS Publications Warehouse

    von Hake, C. A.

    1978-01-01

    Only one earthquake of intensity V on the Modified Mercalli Intensity Scale (MM) or greater has occurred within Wisconsin during historic times. Some shocks originating in Illinois, Michigan, Missouri, Ohio, and Canada have been felt. 

  4. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  5. Using a pruned basis, a non-product quadrature grid, and the exact Watson normal-coordinate kinetic energy operator to solve the vibrational Schrödinger equation for C2H4

    NASA Astrophysics Data System (ADS)

    Avila, Gustavo; Carrington, Tucker

    2011-08-01

    In this paper we propose and test a method for computing numerically exact vibrational energy levels of a molecule with six atoms. We use a pruned product basis, a non-product quadrature, the Lanczos algorithm, and the exact normal-coordinate kinetic energy operator (KEO) with the π^t μ π term. The Lanczos algorithm is applied to a Hamiltonian with a KEO for which μ is evaluated at equilibrium. Eigenvalues and eigenvectors obtained from this calculation are used as a basis to obtain the final energy levels. The quadrature scheme is designed so that integrals for the most important terms in the potential will be exact. The procedure is tested on C2H4. All 12 coordinates are treated explicitly. We need only ~1.52 × 10^8 quadrature points. A product Gauss grid with which one could calculate the same energy levels has at least 5.67 × 10^13 points.
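
    The abstract applies the Lanczos algorithm to a pruned-basis Hamiltonian; as a reminder of the core iteration only, here is a generic Lanczos sketch for a dense real symmetric matrix. It has no reorthogonalization and none of the paper's quadrature machinery; the matrix size and iteration count are arbitrary illustrative choices.

```python
import numpy as np

def lanczos(A, v0, m):
    """Minimal Lanczos iteration for a real symmetric matrix A. Returns the
    eigenvalues (Ritz values) of the m-by-m tridiagonal projection, which
    approximate the extremal eigenvalues of A. No reorthogonalization."""
    v = v0 / np.linalg.norm(v0)
    v_prev = np.zeros_like(v)
    alpha, beta, b = [], [], 0.0
    for _ in range(m):
        w = A @ v
        a = v @ w
        w = w - a * v - b * v_prev     # three-term recurrence
        b = np.linalg.norm(w)
        alpha.append(a)
        beta.append(b)
        if b == 0.0:
            break
        v_prev, v = v, w / b
    T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
    return np.linalg.eigvalsh(T)

rng = np.random.default_rng(0)
M = rng.standard_normal((200, 200)); M = (M + M.T) / 2
ritz = lanczos(M, rng.standard_normal(200), 40)
print(ritz[0], np.linalg.eigvalsh(M)[0])  # lowest Ritz value vs exact
```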

  6. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  7. The New Madrid earthquakes

    SciTech Connect

    Obermeier, S.F.

    1989-01-01

    Two interpreted 1811-12 epicenters generally agree well with zones of seismicity defined by modern, small earthquakes. Bounds on accelerations are placed at the limits of sand blows, generated by the 1811-12 earthquakes in the St. Francis Basin. Conclusions show how the topstratum thickness, sand size of the substratum, and thickness of alluvium affected the distribution of sand blows in the St. Francis Basin.

  8. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  9. Earthquake prediction, societal implications

    NASA Astrophysics Data System (ADS)

    Aki, Keiiti

    1995-07-01

    "If I were a brilliant scientist, I would be working on earthquake prediction." This is a statement from a Los Angeles radio talk show I heard just after the Northridge earthquake of January 17, 1994. Five weeks later, at a monthly meeting of the Southern California Earthquake Center (SCEC), where more than two hundred scientists and engineers gathered to exchange notes on the earthquake, a distinguished French geologist who works on earthquake faults in China envied me for working now in southern California. This place is like northeastern China 20 years ago, when high seismicity and research activities led to the successful prediction of the Haicheng earthquake of February 4, 1975 with magnitude 7.3. A difficult question still haunting us [Aki, 1989] is whether the Haicheng prediction was founded on the physical reality of precursory phenomena or on the wishful thinking of observers subjected to the political pressure which encouraged precursor reporting. It is, however, true that a successful life-saving prediction like the Haicheng prediction can only be carried out by the coordinated efforts of decision makers and physical scientists.

  10. Injection-induced earthquakes.

    PubMed

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  11. Induced earthquake during the 2016 Kumamoto earthquake (Mw7.0): Importance of real-time shake monitoring for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Ogiso, M.

    2016-12-01

    The sequence of the 2016 Kumamoto earthquakes (Mw6.2 on April 14, Mw7.0 on April 16, and many aftershocks) caused devastating damage in Kumamoto and Oita prefectures, Japan. During the Mw7.0 event, just after the direct S waves passed central Oita, another M6-class event occurred there, more than 80 km away from the Mw7.0 event. The M6 event is interpreted as an induced earthquake, but it brought stronger shaking to central Oita than the Mw7.0 event did. We will discuss the induced earthquake from the viewpoint of Earthquake Early Warning. In terms of ground shaking such as PGA and PGV, the contribution of the Mw7.0 event at central Oita is much smaller than that of the M6 induced earthquake (for example, a factor of 8 smaller in PGA at station OIT009), so it is easy to discriminate the two events. However, the PGD of the Mw7.0 event is larger than that of the induced earthquake, and it appears just before the occurrence of the induced earthquake. It is quite difficult to recognize the induced earthquake from displacement waveforms only, because the displacement is strongly contaminated by that of the preceding Mw7.0 event. In many methods of EEW (including the current JMA EEW system), magnitude is used for prediction of ground shaking through a Ground Motion Prediction Equation (GMPE), and the magnitude is often estimated from displacement. However, displacement magnitude is not necessarily the best quantity for predicting ground shaking such as PGA and PGV. In the case of the induced earthquake during the Kumamoto earthquake, displacement magnitude could not be estimated because of the strong contamination; indeed, the JMA EEW system could not recognize the induced earthquake. One of the important lessons we have learned from eight years' operation of EEW is the issue of multiple simultaneous earthquakes, such as aftershocks of the 2011 Mw9.0 Tohoku earthquake. Based on this lesson, we have proposed enhancement of real-time monitoring of ground shaking itself instead of rapid estimation of
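
    To see why a nearby M6 can out-shake a distant Mw7.0, consider a generic GMPE of the form log10(PGA) = a·M - b·log10(R + c) + d. The coefficients in the sketch below are invented purely for illustration; operational EEW systems use regionally calibrated equations.

```python
import math

# Illustrative ground-motion prediction with a generic GMPE functional form.
# Coefficients A, B, C, D are invented for this example only.
A, B, C, D = 0.5, 1.0, 10.0, -1.0

def predicted_pga(magnitude, distance_km):
    """Predicted PGA (arbitrary units) from magnitude and source distance."""
    return 10 ** (A * magnitude - B * math.log10(distance_km + C) + D)

# A distant Mw 7.0 versus a nearby M 6.0 induced event, at the same site:
print(predicted_pga(7.0, 80.0))  # far mainshock  -> ~3.5
print(predicted_pga(6.0, 10.0))  # near induced event -> ~5.0 (stronger)
```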

  12. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As well as being the 200th anniversary of Darwin's birth, 2009 also marked 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after the event. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  13. An efficient repeating signal detector to investigate earthquake swarms

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2016-08-01

    Repetitive earthquake swarms have been recognized as key signatures in fluid injection induced seismicity, precursors to volcanic eruptions, and slow slip events preceding megathrust earthquakes. We investigate earthquake swarms by developing a Repeating Signal Detector (RSD), a computationally efficient algorithm utilizing agglomerative clustering to identify similar waveforms buried in years of seismic recordings using a single seismometer. Instead of relying on existing earthquake catalogs of larger earthquakes, RSD identifies characteristic repetitive waveforms by rapidly identifying signals of interest above a low signal-to-noise ratio and then grouping based on spectral and time domain characteristics, resulting in dramatically shorter processing time than more exhaustive autocorrelation approaches. We investigate seismicity in four regions using RSD: (1) volcanic seismicity at Mammoth Mountain, California, (2) subduction-related seismicity in Oaxaca, Mexico, (3) induced seismicity in Central Alberta, Canada, and (4) induced seismicity in Harrison County, Ohio. In each case, RSD detects a similar or larger number of earthquakes than existing catalogs created using more time-intensive methods. In Harrison County, RSD identifies 18 seismic sequences that correlate temporally and spatially to separate hydraulic fracturing operations, 15 of which were previously unreported. RSD utilizes a single seismometer for earthquake detection, which enables seismicity to be quickly identified in poorly instrumented regions at the expense of relying on another method to locate the new detections. Due to its smaller computational overhead and success at distances up to ~50 km, RSD is well suited for real-time detection of low-magnitude earthquake swarms with permanent regional networks.
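
    A toy version of the clustering step: normalized zero-lag cross-correlation between waveform snippets, followed by single-linkage agglomerative clustering on a 1 - CC distance. The actual RSD also screens detections by signal-to-noise ratio and uses spectral features; the threshold, linkage choice, and group_repeaters helper below are assumptions for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def group_repeaters(waveforms, cc_threshold=0.8):
    """Group equal-length waveform snippets (rows of a 2-D array) whose
    normalized zero-lag cross-correlation exceeds cc_threshold, using
    single-linkage agglomerative clustering on a 1 - CC distance."""
    w = waveforms - waveforms.mean(axis=1, keepdims=True)
    w /= np.linalg.norm(w, axis=1, keepdims=True)
    cc = np.clip(w @ w.T, -1.0, 1.0)           # zero-lag correlation matrix
    dist = squareform(1.0 - cc, checks=False)  # condensed distance vector
    tree = linkage(dist, method="single")
    return fcluster(tree, t=1.0 - cc_threshold, criterion="distance")

# Example: two noisy copies of one signal plus one unrelated trace
t = np.linspace(0, 1, 200)
rng = np.random.default_rng(1)
s = np.sin(40 * t) * np.exp(-3 * t)
batch = np.vstack([s + 0.05 * rng.standard_normal(200),
                   s + 0.05 * rng.standard_normal(200),
                   rng.standard_normal(200)])
print(group_repeaters(batch))  # first two traces share one cluster label
```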

  14. Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment

    USGS Publications Warehouse

    Lin, K.-W.; Wald, D.J.

    2012-01-01

    When an earthquake occurs, the U.S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements to the ShakeMap and ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
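
    A sketch of the fragility logic: each component follows a lognormal fragility curve P(damage | IM = im) = Φ(ln(im/θ)/β), and component curves are combined here under an assumed independent "any component damaged" rule. The combination rule and the coefficients are illustrative assumptions, not ShakeCast's own pre-defined logic.

```python
import math

def fragility(im, theta, beta):
    """Lognormal fragility: P(damage | intensity measure im), with median
    capacity theta and logarithmic standard deviation beta."""
    return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2))))

def inspection_priority(im, components):
    """Probability that at least one component is damaged, assuming
    independent components (one plausible combination rule)."""
    p_none = 1.0
    for theta, beta in components:
        p_none *= 1.0 - fragility(im, theta, beta)
    return 1.0 - p_none

# Example: PGA of 0.3 g against components with medians 0.4 g and 0.6 g
print(inspection_priority(0.3, [(0.4, 0.5), (0.6, 0.4)]))  # ~0.31
```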

  15. Trial application of guidelines for nuclear plant response to an earthquake. Final report

    SciTech Connect

    Schmidt, W.; Oliver, R.; O`Connor, W.

    1993-09-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. These guidelines are published in EPRI report NP-6695, "Guidelines for Nuclear Plant Response to an Earthquake," dated December 1989. This report includes two sets of nuclear plant procedures which were prepared to implement the guidelines of EPRI report NP-6695. The first set was developed by the Toledo Edison Company Davis-Besse plant. Davis-Besse is a pressurized water reactor (PWR) and contains relatively standard seismic monitoring instrumentation typical of many domestic nuclear plants. The second set of procedures was prepared by Yankee Atomic Electric Company for the Vermont Yankee facility. This plant is a boiling water reactor (BWR) with state-of-the-art seismic monitoring and PC-based data processing equipment, and software developed specifically to implement the OBE Exceedance Criterion presented in EPRI report NP-5930, "A Criterion for Determining Exceedance of the Operating Basis Earthquake." The two sets of procedures are intended to demonstrate how two different nuclear utilities have interpreted and applied the EPRI guidance given in report NP-6695.

  16. Earthquakes; January-March 1976

    USGS Publications Warehouse

    Person, W.J.

    1976-01-01

    The year 1976 started out quite active, seismically. Four major earthquakes occurred in different parts of the world during the first 3 months of the year. Three earthquakes rattled the western rim of the Pacific Ocean from the Kuril Islands to the Kermadec Islands. The fourth major earthquake struck Guatemala, killing thousands of people, injuring many, and leaving thousands homeless. Earthquakes in Kentucky and Arkansas caused little damage but were felt in several States. Arizona experienced a sharp earthquake in Chino Valley, which caused very little damage. Other States experienced earthquakes, but none caused damage.

  17. Large magnitude (M > 7.5) offshore earthquakes in 2012: few examples of absent or little tsunamigenesis, with implications for tsunami early warning

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Tinti, Stefano

    2013-04-01

    We consider some examples of offshore earthquakes that occurred worldwide in 2012 and were characterised by a "large" magnitude (Mw equal to or larger than 7.5) but produced no or little tsunami effect. Here, "little" means "lower than expected on the basis of the parent earthquake magnitude". The examples we analyse include three earthquakes that occurred along the Pacific coasts of Central America (20 March, Mw=7.8, Mexico; 5 September, Mw=7.6, Costa Rica; 7 November, Mw=7.5, Mexico), the Mw=7.6 and Mw=7.7 earthquakes that occurred on 31 August and 28 October offshore the Philippines and offshore Alaska, respectively, and the two Indian Ocean earthquakes registered on a single day (11 April) and characterised by Mw=8.6 and Mw=8.2. For each event, we approach the problem of its tsunamigenic potential from two different perspectives. The first can be considered purely scientific and coincides with the question: why was the ensuing tsunami so weak? The answer can be related partly to the particular tectonic setting in the source area, partly to the particular position of the source with respect to the coastline, and finally to the focal mechanism of the earthquake and to the slip distribution on the ruptured fault. The first two pieces of information are available soon after the earthquake occurrence, while the third requires time periods in the order of tens of minutes. The second perspective is more "operational" and coincides with the tsunami early warning perspective, for which the question is: will the earthquake generate a significant tsunami and, if so, where will it strike? The Indian Ocean events of 11 April 2012 are perfect examples of the fact that information on the earthquake magnitude and position alone may not be sufficient to produce reliable tsunami warnings. We emphasise that it is of utmost importance that the focal mechanism determination is obtained in the future much more quickly than it is at present and that this

  18. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
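
    A simplified stand-in for PAGER's exposure calculation: given co-registered grids of estimated shaking intensity and population, sum the population falling in each intensity bin. The grid construction and intensity-decay model below are toy assumptions for illustration only.

```python
import numpy as np

def exposure_by_intensity(mmi_grid, pop_grid, levels=range(4, 11)):
    """Population exposed at each (rounded) MMI level, given co-registered
    grids of intensity and population counts."""
    mmi = np.rint(mmi_grid).astype(int)
    return {lvl: int(pop_grid[mmi == lvl].sum()) for lvl in levels}

# Toy grids: intensity decaying away from an epicenter, uniform population
x, y = np.meshgrid(np.linspace(-100, 100, 201), np.linspace(-100, 100, 201))
mmi = np.clip(9.0 - 0.05 * np.hypot(x, y), 1.0, 9.0)
pop = np.full_like(mmi, 25.0)  # 25 people per grid cell
print(exposure_by_intensity(mmi, pop))
```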

  19. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  20. The earthquake potential of the New Madrid seismic zone

    USGS Publications Warehouse

    Tuttle, M.P.; Schweig, E.S.; Sims, J.D.; Lafferty, R.H.; Wolf, L.W.; Haynes, M.L.

    2002-01-01

    The fault system responsible for New Madrid seismicity has generated temporally clustered very large earthquakes in A.D. 900 ± 100 years and A.D. 1450 ± 150 years as well as in 1811-1812. Given the uncertainties in dating liquefaction features, the time between the past three New Madrid events may be as short as 200 years and as long as 800 years, with an average of 500 years. This advance in understanding the Late Holocene history of the New Madrid seismic zone and thus, the contemporary tectonic behavior of the associated fault system was made through studies of hundreds of earthquake-induced liquefaction features at more than 250 sites across the New Madrid region. We have found evidence that prehistoric sand blows, like those that formed during the 1811-1812 earthquakes, are probably compound structures resulting from multiple earthquakes closely clustered in time or earthquake sequences. From the spatial distribution and size of sand blows and their sedimentary units, we infer the source zones and estimate the magnitudes of earthquakes within each sequence and thereby characterize the detailed behavior of the fault system. It appears that fault rupture was complex and that the central branch of the seismic zone produced very large earthquakes during the A.D. 900 and A.D. 1450 events as well as in 1811-1812. On the basis of a minimum recurrence rate of 200 years, we are now entering the period during which the next 1811-1812-type event could occur.

  1. Persistent earthquake clusters and gaps from slip on irregular faults

    NASA Astrophysics Data System (ADS)

    Parsons, Tom

    2008-01-01

    Earthquake-producing fault systems like the San Andreas fault in California show self-similar structural variation; earthquakes cluster in space, leaving aseismic gaps between clusters. Whether gaps represent overdue earthquakes or signify diminished risk is a question with which seismic-hazard forecasters wrestle. Here I use spectral analysis of the spatial distribution of seismicity along the San Andreas fault (for earthquakes that are at least magnitude 2), which reveals that it obeys a power-law relationship, indicative of self-similarity in clusters across a range of spatial scales. To determine whether the observed clustering of earthquakes is the result of a heterogeneous stress distribution, I use a finite-element method to simulate the motion of two rigid blocks past each other along a model fault surface that shows three-dimensional complexity on the basis of mapped traces of the San Andreas fault. The results indicate that long-term slip on the model fault generates a temporally stable, spatially variable distribution of stress that shows the same power-law relationship as the earthquake distribution. At the highest rates of San Andreas fault slip (40 mm/yr), the stress patterns produced are stable over a minimum of 25,000 years before the model fault system evolves into a new configuration. These results suggest that although gaps are not immune to rupture propagation, they are less likely to be nucleation sites for earthquakes.
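
    A minimal sketch of the spectral test described above: take the power spectrum of an along-fault earthquake-density profile and fit its log-log slope; an approximately linear log-log spectrum over a range of scales indicates power-law (self-similar) clustering. The profile, bin size, and fitting range are illustrative assumptions, not the paper's actual data or procedure.

```python
import numpy as np

def spectral_slope(density, dx):
    """Least-squares log-log slope of the power spectrum of an along-fault
    earthquake-density profile (events per bin of width dx, in km)."""
    power = np.abs(np.fft.rfft(density - density.mean())) ** 2
    freq = np.fft.rfftfreq(len(density), d=dx)
    keep = freq > 0  # drop the zero-frequency (mean) term
    logf, logp = np.log10(freq[keep]), np.log10(power[keep])
    return np.polyfit(logf, logp, 1)[0]

# Toy profile: events clustered in patches along a 500 km fault
rng = np.random.default_rng(2)
profile = np.zeros(500)
for center in (60, 180, 210, 400):
    profile[center - 10:center + 10] += rng.poisson(5.0, 20)
print(spectral_slope(profile, dx=1.0))  # negative slope => clustering
```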

  2. Earthquake forecasting studies using radon time series data in Taiwan

    NASA Astrophysics Data System (ADS)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous time series radon data for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP) to create a website that can effectively display and help us manage the real-time database.

  3. Applications of Multi-Cycle Earthquake Simulations to Earthquake Hazard

    NASA Astrophysics Data System (ADS)

    Gilchrist, Jacquelyn Joan

    This dissertation seeks to contribute to earthquake hazard analyses and forecasting by conducting a detailed study of the processes controlling the occurrence, and particularly the clustering, of large earthquakes, the probabilities of these large events, and the dynamics of their ruptures. We use the multi-cycle earthquake simulator RSQSim to investigate several fundamental aspects of earthquake occurrence in order to improve the understanding of earthquake hazard. RSQSim, a 3D, boundary element code that incorporates rate- and state-friction to simulate earthquakes in fully interacting, complex fault systems has been successful at modeling several aspects of fault slip and earthquake occurrence. Multi-event earthquake models with time-dependent nucleation based on rate- and state-dependent friction, such as RSQSim, provide a viable physics-based method for modeling earthquake processes. These models can provide a better understanding of earthquake hazard by improving our knowledge of earthquake processes and probabilities. RSQSim is fast and efficient, and therefore is able to simulate very long sequences of earthquakes (from hundreds of thousands to millions of events). This makes RSQSim an ideal instrument for filling in the current gaps in earthquake data, from short and incomplete earthquake catalogs to unrealistic initial conditions used for dynamic rupture models. RSQSim catalogs include foreshocks, aftershocks, and occasional clusters of large earthquakes, the statistics of which are important for the estimation of earthquake probabilities. Additionally, RSQSim finds a near optimal nucleation location that enables ruptures to propagate at minimal stress conditions and thus can provide suites of heterogeneous initial conditions for dynamic rupture models that produce reduced ground motions compared to models with homogeneous initial stresses and arbitrary forced nucleation locations.

  4. Summary of the September 19, 1985 Mexico earthquake

    SciTech Connect

    Burdick, R.B.

    1986-01-10

    The Lawrence Livermore National Laboratory (LLNL) maintains a Post-Earthquake Inspection Team. The team is composed of professionals specializing in mechanical, structural, and civil engineering, and seismology. The primary focus of the team is the structural and operating behavior of industrial facilities. The September 19, 1985 earthquake in central Mexico presented a unique opportunity for a member of LLNL's inspection team to participate in a multi-organizational, multi-disciplinary post-earthquake investigation. This document presents a summary of the observations made by that team.

  5. Seismic survey probes urban earthquake hazards in Pacific Northwest

    USGS Publications Warehouse

    Fisher, M.A.; Brocher, T.M.; Hyndman, R.D.; Trehu, A.M.; Weaver, C.S.; Creager, K.C.; Crosson, R.S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B.C.; Hammer, P.T.; Childs, J. R.; Cochrane, G.R.; Chopra, S.; Walia, R.

    1999-01-01

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  6. Seismic survey probes urban earthquake hazards in Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Fisher, M. A.; Brocher, T. M.; Hyndman, R. D.; Trehu, A. M.; Weaver, C. S.; Creager, K. C.; Crosson, R. S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B. C.; Hammer, P. T.; ten Brink, U.; Pratt, T. L.; Miller, K. C.; Childs, J. R.; Cochrane, G. R.; Chopra, S.; Walia, R.

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  7. Estimating surface faulting impacts from the shakeout scenario earthquake

    USGS Publications Warehouse

    Treiman, J.A.; Ponti, D.J.

    2011-01-01

    An earthquake scenario, based on a kinematic rupture model, has been prepared for a Mw 7.8 earthquake on the southern San Andreas Fault. The rupture distribution, in the context of other historic large earthquakes, is judged reasonable for the purposes of this scenario. This model is used as the basis for generating a surface rupture map and for assessing potential direct impacts on lifelines and other infrastructure. Modeling the surface rupture involves identifying fault traces on which to place the rupture, assigning slip values to the fault traces, and characterizing the specific displacements that would occur to each lifeline impacted by the rupture. Different approaches were required to address variable slip distribution in response to a variety of fault patterns. Our results, involving judgment and experience, represent one plausible outcome and are not predictive because of the variable nature of surface rupture. © 2011, Earthquake Engineering Research Institute.

  8. Reducing the Risks of Nonstructural Earthquake Damage: A Practical Guide. Earthquake Hazards Reduction Series 1.

    ERIC Educational Resources Information Center

    Reitherman, Robert

    The purpose of this booklet is to provide practical information to owners, operators, and occupants of office and commercial buildings on the vulnerabilities posed by earthquake damage to nonstructural items and the means available to deal with these potential problems. Examples of dangerous nonstructural damages that have occurred in past…

  9. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  10. Earthquake number forecasts testing

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
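
    The Poisson-versus-NBD comparison can be illustrated with empirical moments of window counts: Poisson counts satisfy variance ≈ mean, while clustered catalogues are overdispersed (variance >> mean), favouring the negative-binomial distribution. The sketch below uses synthetic counts with matched means; all parameters are arbitrary illustrative choices, not the paper's fits.

```python
import numpy as np

def count_moments(counts):
    """Empirical mean, variance, skewness and excess kurtosis of
    earthquake numbers per time window."""
    c = np.asarray(counts, dtype=float)
    m, v = c.mean(), c.var()
    skew = ((c - m) ** 3).mean() / v ** 1.5
    kurt = ((c - m) ** 4).mean() / v ** 2 - 3.0
    return m, v, skew, kurt

# For a Poisson law, variance = mean and skewness = 1/sqrt(mean); the
# negative-binomial counts below have the same mean (10) but variance 60.
rng = np.random.default_rng(3)
poisson_counts = rng.poisson(10.0, 10000)
nbd_counts = rng.negative_binomial(2, 1.0 / 6.0, 10000)
print(count_moments(poisson_counts))
print(count_moments(nbd_counts))  # overdispersed: variance >> mean
```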

  11. Rupture, waves and earthquakes.

    PubMed

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  12. Rupture, waves and earthquakes

    PubMed Central

    UENISHI, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but “extraordinary” phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable. PMID:28077808

  13. Rupture, waves and earthquakes

    NASA Astrophysics Data System (ADS)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  14. Earthquakes, September-October 1984

    USGS Publications Warehouse

    Person, W.J.

    1985-01-01

    In the United States, Wyoming experienced a couple of moderate earthquakes, and off the coast of northern California, a strong earthquake shook much of the northern coast of California and parts of the Oregon coast. 

  15. Earthquakes, July-August 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There was one major earthquake during this reporting period: a magnitude 7.1 shock off the coast of Northern California on August 17. Earthquake-related deaths were reported from Indonesia, Romania, Peru, and Iraq.

  16. Distribution of similar earthquakes in aftershocks of inland earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, M.; Hiramatsu, Y.; Group for the Aftershock Observations of the 2007 Noto Hanto Earthquake

    2010-12-01

    Frictional properties control the slip behavior on a fault surface, such as seismic slip and aseismic slip. An asperity, as a seismic slip area, is characterized by strong coupling in the interseismic period and large coseismic slip. On the other hand, steady slip or afterslip occurs in the aseismic slip area around the asperity. If an afterslip area includes small asperities, repeated rupture of a single asperity can generate similar earthquakes due to the stress accumulation caused by the afterslip. Here we investigate the detailed distribution of similar earthquakes in the aftershocks of the 2007 Noto Hanto earthquake (Mjma 6.9) and the 2000 Western Tottori earthquake (Mjma 7.3), large inland earthquakes in Japan. We use the data obtained by the group for the aftershock observations of the 2007 Noto Hanto earthquake and by the group for the aftershock observations of the 2000 Western Tottori earthquake. First, we select pairs of aftershocks whose cross-correlation coefficients, in a 10 s time window of band-pass-filtered waveforms of 1-4 Hz, are greater than 0.95 at more than 5 stations, and divide these into groups linked by the cross-correlation coefficients. Second, we reexamine the arrival times of P and S waves and the maximum amplitude for the earthquakes of each group and apply the double-difference method (Waldhauser and Ellsworth, 2000) to relocate them. As a result of the analysis, we find 24 groups of similar earthquakes in the aftershocks on the source fault of the 2007 Noto Hanto earthquake and 86 groups in the aftershocks on the source fault of the 2000 Western Tottori earthquake. Most of them are distributed around or outside the asperity of the main shock. Geodetic studies reported that postseismic deformation was detected for both earthquakes (Sagiya et al., 2002; Hashimoto et al., 2008). The source areas of similar earthquakes seem to correspond to the afterslip area. These features suggest that the similar earthquakes observed
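
    The grouping step summarized above (pairing events whose band-pass-filtered waveforms correlate above a fixed threshold, then linking pairs into groups) can be sketched compactly. The Python fragment below is an illustration written for this summary, not code from the study: it checks a single station, whereas the study required the 0.95 criterion to hold at more than five stations, and all names are hypothetical.

    ```python
    import numpy as np

    def correlation_coefficient(a: np.ndarray, b: np.ndarray) -> float:
        """Zero-lag normalized cross-correlation of two equal-length windows."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def group_similar_events(waveforms: dict, threshold: float = 0.95) -> list:
        """Link events whose pairwise coefficient exceeds the threshold and
        return the connected components (groups of similar earthquakes)."""
        ids = list(waveforms)
        adj = {i: set() for i in ids}          # threshold-linkage adjacency
        for k, i in enumerate(ids):
            for j in ids[k + 1:]:
                if correlation_coefficient(waveforms[i], waveforms[j]) >= threshold:
                    adj[i].add(j)
                    adj[j].add(i)
        groups, seen = [], set()
        for i in ids:                          # depth-first component search
            if i in seen:
                continue
            stack, comp = [i], set()
            while stack:
                n = stack.pop()
                if n not in comp:
                    comp.add(n)
                    stack.extend(adj[n] - comp)
            seen |= comp
            if len(comp) > 1:
                groups.append(sorted(comp))
        return groups
    ```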

  17. Aseismic blocks and destructive earthquakes in the Aegean

    NASA Astrophysics Data System (ADS)

    Stiros, Stathis

    2017-04-01

    Aseismic areas are identified not only in vast, geologically stable regions, but also within regions of active, intense, distributed deformation such as the Aegean. In the latter, "aseismic blocks" about 200 km wide were recognized in the 1990s on the basis of the absence of instrumentally derived earthquake foci, in contrast to surrounding areas. This pattern was supported by the available historical seismicity data, as well as by geologic evidence. Interestingly, GPS evidence indicates that such blocks are among the areas characterized by small deformation rates relative to surrounding areas of higher deformation. Still, the largest and most destructive earthquake of the 1990s, a 1995 M6.6 event, occurred at the center of one of these "aseismic" zones in the northern part of Greece, which had been left unprotected against seismic hazard. This case was indeed a repeat of the tsunami-associated 1956 M7.4 Amorgos Island earthquake, the largest 20th-century event in the Aegean back-arc region: the 1956 earthquake occurred at the center of a geologically distinct region (the Cyclades Massif in the central Aegean), until then assumed aseismic. Interestingly, after 1956 the overall idea of aseismic regions remained valid, though a "promontory" of earthquake-prone areas intruding into the aseismic central Aegean was assumed. Exploitation of archaeological excavation evidence and careful, combined analysis of historical and archaeological data and other palaeoseismic, mostly coastal, data indicated that destructive and major earthquakes have left their traces in these previously assumed aseismic blocks. In the latter, earthquakes typically occur with relatively long recurrence intervals, >200-300 years, i.e., at much lower rates than in adjacent active areas. Interestingly, areas assumed aseismic in antiquity are among the most active in the last centuries, while areas hit by major earthquakes in the past are usually classified as areas of low seismic risk in official maps. Some reasons

  18. Earthquakes; January-February 1977

    USGS Publications Warehouse

    Person, W.J.

    1977-01-01

    There were no major earthquakes (7.0-7.9) during the first 2 months of the year, and no fatalities were reported. Three strong earthquakes occurred: in New Guinea, the Tadzhik S.S.R., and the Aleutian Islands. The Tadzhik earthquake on January 31 caused considerable damage and possible injuries. The United States experienced a number of earthquakes, but only very minor damage was reported.

  19. Damaging earthquakes: A scientific laboratory

    USGS Publications Warehouse

    Hays, Walter W.

    1996-01-01

    This paper reviews the principal lessons learned from multidisciplinary postearthquake investigations of damaging earthquakes throughout the world during the past 15 years. The unique laboratory provided by a damaging earthquake in culturally different but tectonically similar regions of the world has increased fundamental understanding of earthquake processes, added perishable scientific, technical, and socioeconomic data to the knowledge base, and led to changes in public policies and professional practices for earthquake loss reduction.

  20. Earthquakes; January-February, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The first major earthquake (magnitude 7.0 to 7.9) of the year struck in southeastern Alaska in a sparsely populated area on February 28. On January 16, Iran experienced the first destructive earthquake of the year, causing a number of casualties and considerable damage. Peru was hit by a destructive earthquake on February 16 that left casualties and damage. A number of earthquakes were experienced in parts of the United States, but only minor damage was reported.

  1. [Infectious disease surveillance in Miyagi after the Great East Japan Earthquake].

    PubMed

    Kim, Mihyun; Kamigaki, Taro; Mimura, Satoshi; Oshitani, Hitoshi

    2013-10-01

    The Great East Japan Earthquake, which occurred on March 11, 2011, damaged many health facilities and compelled many inhabitants to live in evacuation centers. For the purpose of monitoring infectious disease outbreaks, infectious disease surveillance targeting evacuation centers was established in Miyagi Prefecture. In this study, we summarized the monitoring of infectious diseases through this surveillance after the earthquake. Infectious disease surveillance was implemented from March 18 to November 6, 2011. The surveillance consisted of two phases (hereafter, Surveillance 1 and 2), reflecting differences in the frequency of reporting as well as in the number of targeted diseases. Surveillance 1 operated between March 18 and May 13, 2011, and Surveillance 2 operated between May 10 and November 6, 2011. We reviewed the number of cases reported, the number of evacuation centers, and the demographic information of evacuees under the surveillance. In Surveillance 1, there were 8,737 reported cases; 84% of them were acute respiratory symptoms, and 16% were acute digestive symptoms. Only 4.4% of evacuation centers were covered by the surveillance one week after the earthquake. In Surveillance 2, 1,339 cases were reported; 82% of them were acute respiratory symptoms, and 13% were acute digestive symptoms. Surveillance 2 revealed that the proportion of children aged 5 years and younger was lower than that of other age groups for all targeted diseases. No particular outbreaks were detected through these surveillances. Infectious disease surveillance operated from one week after the earthquake to the closure of all evacuation centers in Miyagi Prefecture. No outbreaks were detected in that period. However, the low coverage of evacuation centers just after the earthquake, as well as the skewed frequencies of reported syndromes, points to the need to improve the early warning system. It is important to coordinate with the medical aid team that visits the evacuation centers

  2. InSAR observations of the 2009 Racha earthquake, Georgia

    NASA Astrophysics Data System (ADS)

    Nikolaeva, Elena; Walter, Thomas R.

    2016-09-01

    Central Georgia is an area strongly affected by earthquake and landslide hazards. On 29 April 1991 a major earthquake (Mw = 7.0) struck the Racha region in Georgia, followed by aftershocks and significant afterslip. The same region was hit by another major event (Mw = 6.0) on 7 September 2009. The aim of the study reported here was to utilize interferometric synthetic aperture radar (InSAR) data to improve knowledge about the spatial pattern of deformation due to the 2009 earthquake. No earthquake in Georgia had previously been observed by InSAR. We considered all available SAR images from different space agencies. However, owing to its long wavelength and frequent acquisitions, only the multi-temporal ALOS L-band SAR data allowed us to produce interferograms spanning the 2009 earthquake. We detected a local uplift of around 10 cm (along the line of sight) in the interferogram near the earthquake's epicenter, whereas evidence of surface ruptures could not be found in the field along the active thrust fault. We simulated the deformation signal that could be created by the 2009 Racha earthquake on the basis of local seismic records and an elastic dislocation model. We compared our modeled fault surface for the September 2009 event with the April 1991 Racha earthquake fault surfaces and identified the same fault, or a sub-parallel fault of the same system, as the origin. The patch that was active in 2009 is directly adjacent to the 1991 patch, indicating a possible mainly westward propagation direction, with important implications for future earthquake hazards.
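
    The comparison between a modeled deformation field and the InSAR observation hinges on projecting the predicted three-dimensional surface displacement onto the radar line of sight. A minimal sketch of that projection follows; the geometry values and sign convention are illustrative assumptions, not the ALOS parameters used in the study.

    ```python
    import numpy as np

    def los_displacement(u_east, u_north, u_up, incidence_deg, heading_deg):
        """Project an east/north/up displacement (m) onto the radar line of
        sight. heading_deg is the flight azimuth clockwise from north; a
        right-looking geometry is assumed. Positive output = motion toward
        the satellite (range decrease). Sign conventions differ between
        processors, so this should be checked against the InSAR software."""
        inc, head = np.radians(incidence_deg), np.radians(heading_deg)
        e = -np.sin(inc) * np.cos(head)   # east component of ground-to-satellite
        n = np.sin(inc) * np.sin(head)    # north component
        u = np.cos(inc)                   # up component
        return u_east * e + u_north * n + u_up * u

    # mostly vertical uplift of 10 cm seen at ~39 deg incidence: ~0.07 m in LOS
    print(los_displacement(0.01, 0.0, 0.10, 39.0, -10.0))
    ```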

  3. Turkish Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a type of natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  4. Earthquakes, March-April 1978

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    Earthquakes caused fatalities in Mexico and Sicily; injuries and damage were sustained in the eastern Kazakh SSR and Yugoslavia. There were four major earthquakes: one south of Honshu, Japan; two in the Kuril Islands region; and one in the Soviet Union. The United States experienced a number of earthquakes, but only very minor damage was reported.

  5. Earthquakes, May-June 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage. 

  6. Earthquakes, March-April 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    Two major earthquakes (7.0-7.9) occurred during this reporting period: a magnitude 7.6 in Costa Rica on April 22 and a magnitude 7.0 in the USSR on April 29. Destructive earthquakes hit northern Peru on April 4 and 5. There were no destructive earthquakes in the United States during this period.

  7. Earthquakes, May-June 1984

    USGS Publications Warehouse

    Person, W.J.

    1984-01-01

    No major earthquakes (7.0-7.9) occurred during this reporting period. Earthquake-related deaths were reported from Italy, the Dominican Republic, and Yugoslavia. A number of earthquakes occurred in the United States, but none caused casualties or any significant damage.

  8. Earthquakes, September-October 1993

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    The fatalities in the United States were caused by two earthquakes in southern Oregon on September 21. These earthquakes, both with magnitude 6.0 and separated in time by about 2 hours, led to the deaths of two people. One of these deaths was apparently due to a heart attack induced by the earthquake.

  9. Organizational changes at Earthquakes & Volcanoes

    USGS Publications Warehouse

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  10. Earthquakes, March-April, 1993

    USGS Publications Warehouse

    Person, Waverly J.

    1993-01-01

    Worldwide, only one major earthquake (7.0≤M<8.0) occurred during this reporting period. The earthquake, a magnitude 7.2 shock, struck the Santa Cruz Islands region in the South Pacific on March 6. Earthquake-related deaths occurred in the Fiji Islands, China, and Peru.

  11. Earthquakes, January-February 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In terms of seismic activity, the first two months of 1992 were somewhat quiet. There was one major earthquake (7.0-7.9) during this reporting period: a magnitude 7.1 earthquake in the Vanuatu Islands. There were no earthquake-related deaths during the first two months.

  12. Earthquakes, September-October 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The months of September and October were somewhat quiet seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

  13. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
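
    The core of the loss computation described above is an exposure-times-rate sum over shaking intensity bins. A toy Python version follows; the exposure counts and fatality rates are made-up placeholders, not the calibrated, country-specific PAGER models.

    ```python
    # Toy version of the exposure-times-rate computation (illustrative numbers,
    # not PAGER's calibrated, country-specific loss models).
    exposure = {6: 2_000_000, 7: 500_000, 8: 80_000, 9: 5_000}  # people per MMI bin
    fatality_rate = {6: 1e-6, 7: 1e-5, 8: 1e-4, 9: 1e-3}        # deaths per person exposed

    fatalities = sum(exposure[mmi] * fatality_rate[mmi] for mmi in exposure)
    print(f"estimated fatalities: {fatalities:.0f}")  # 2 + 5 + 8 + 5 = 20
    ```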

  14. Subducted sediment thickness and Mw 9 earthquakes

    NASA Astrophysics Data System (ADS)

    Seno, Tetsuzo

    2017-01-01

    I measure the thickness of subducted sediment (Δss) beneath the décollement in the fore-arc wedge and show that the average value of Δss over a subduction zone segment (Δss̄) is greater than 1.3 km in segments where Mw ≥ 9 earthquakes have occurred and less than 1.2 km in segments without such large earthquakes. In a previous study, I showed that the stress drop (Δσ) of large earthquakes (Mw ≥ 7) averaged over a subduction zone segment (Δσ̄) is larger in segments where Mw ≥ 9 earthquakes have occurred than in segments without such an event. It has also been shown that Δσ̄ is linearly related to 1 - λ (λ = the pore fluid pressure ratio on the interplate megathrust). In this study, I revise the previous estimates of Δσ̄ and λ and show that there is a positive correlation between Δss̄, Δσ̄, and 1 - λ. I present a model that relates Δss to 1 - λ based on the porous flow of H2O in the subducted sediments, which gives a theoretical basis for the correlation between Δss̄ and Δσ̄. The combination of these parameters thus provides a better indicator for identifying segments where Mw ≥ 9 earthquakes may occur. On this basis, I propose that the tectonic environments where such huge events are likely to occur are (1) near collision zones, (2) near subduction of spreading centers, and (3) erosive margins with compressional fore arcs. Near the Japanese islands, SE Hokkaido is prone to such an event, but the Nankai Trough is not.

  15. WEST Physics Basis

    NASA Astrophysics Data System (ADS)

    Bourdelle, C.; Artaud, J. F.; Basiuk, V.; Bécoulet, M.; Brémond, S.; Bucalossi, J.; Bufferand, H.; Ciraolo, G.; Colas, L.; Corre, Y.; Courtois, X.; Decker, J.; Delpech, L.; Devynck, P.; Dif-Pradalier, G.; Doerner, R. P.; Douai, D.; Dumont, R.; Ekedahl, A.; Fedorczak, N.; Fenzi, C.; Firdaouss, M.; Garcia, J.; Ghendrih, P.; Gil, C.; Giruzzi, G.; Goniche, M.; Grisolia, C.; Grosman, A.; Guilhem, D.; Guirlet, R.; Gunn, J.; Hennequin, P.; Hillairet, J.; Hoang, T.; Imbeaux, F.; Ivanova-Stanik, I.; Joffrin, E.; Kallenbach, A.; Linke, J.; Loarer, T.; Lotte, P.; Maget, P.; Marandet, Y.; Mayoral, M. L.; Meyer, O.; Missirlian, M.; Mollard, P.; Monier-Garbet, P.; Moreau, P.; Nardon, E.; Pégourié, B.; Peysson, Y.; Sabot, R.; Saint-Laurent, F.; Schneider, M.; Travère, J. M.; Tsitrone, E.; Vartanian, S.; Vermare, L.; Yoshida, M.; Zagorski, R.; Contributors, JET

    2015-06-01

    With WEST (Tungsten Environment in Steady State Tokamak) (Bucalossi et al 2014 Fusion Eng. Des. 89 907-12), the Tore Supra facility and team expertise (Dumont et al 2014 Plasma Phys. Control. Fusion 56 075020) is used to pave the way towards ITER divertor procurement and operation. It consists in implementing a divertor configuration and installing ITER-like actively cooled tungsten monoblocks in the Tore Supra tokamak, taking full benefit of its unique long-pulse capability. WEST is a user facility platform, open to all ITER partners. This paper describes the physics basis of WEST: the estimated heat flux on the divertor target, the planned heating schemes, the expected behaviour of the L-H threshold and of the pedestal and the potential W sources. A series of operating scenarios has been modelled, showing that ITER-relevant heat fluxes on the divertor can be achieved in WEST long pulse H-mode plasmas.

  16. Forecasters of earthquakes

    NASA Astrophysics Data System (ADS)

    Maximova, Lyudmila

    1987-07-01

    For the first time, Soviet scientists have set up a bioseismological proving ground that will stage a systematic, extensive experiment using birds, ants, and mountain rodents such as marmots, which can burrow into the Earth's interior to a depth of 50 meters, for the purpose of earthquake forecasting. Biologists have accumulated extensive experimental data on the impact of various electromagnetic fields, including fields of weak intensity, on living organisms. As far as mammals are concerned, electromagnetic waves with frequencies close to the brain's biorhythms have the strongest effect. How these observations can be used to forecast earthquakes is discussed.

  17. The 1976 Tangshan earthquake

    USGS Publications Warehouse

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had magnitude 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  18. Earthquakes in New England

    USGS Publications Warehouse

    Fratto, E. S.; Ebel, J.E.; Kadinsky-Cade, K.

    1990-01-01

    New England has a long history of earthquakes. Some of the first explorers were startled when they experienced strong shaking and rumbling of the earth below their feet. They soon learned from the Indians that this was not an uncommon occurrence in the New World. The Plymouth Pilgrims felt their first earthquake in 1638. That first shock rattled dishes, doors, and buildings. The shaking so frightened those working in the fields that they threw down their tools and ran panic-stricken through the countryside.

  19. Characterisation of active faulting and earthquake hazard in the Mongolian Altay Mountains based on previously unknown ancient earthquake surface ruptures

    NASA Astrophysics Data System (ADS)

    Gregory, L. C.; Walker, R.; Nissen, E.; Mac Niocaill, C.; Gantulga, B.; Amgalan, B.

    2012-12-01

    Earthquakes in continental collision zones are typically distributed across a region that may be several thousands of kilometres away from the main collisional margin. This far-field deformation is poorly understood in terms of how strain is distributed onto upper crustal faults, particularly because active faults can be difficult to identify in regions where historical seismicity is sparse. The collision between India and Asia forms the most impressive example of active continental deformation on Earth, with several zones of faulting and uplift extending across a region over 2500 km wide. The Altay Mountains, in western Mongolia, are at the northern edge of the India-Asia collision zone. Active dextral strike-slip faults in the Altay have produced M 8 earthquakes (such as the 1931 Fu Yun earthquake), and according to GPS measurements, the region accommodates approximately 7 mm/yr of shortening. Surface ruptures of prehistoric earthquakes are exceptionally well preserved due to the cold and arid climate of the Altay. Observed surface ruptures are an effective extension of the historical seismic record, because the size and expression of ruptures may reveal important characteristics of the Altay active faults, such as typical earthquake magnitudes and definitive locations of active faults. We present observations of previously unknown surface ruptures and active faulting in the central Altay. The moment magnitudes of the ancient earthquakes are estimated based on the length of the ruptures using classic earthquake scaling laws. The newly discovered ruptures are combined with previously described earthquake ruptures to estimate the combined strike-slip rate of the Altay faults over the past ~1000 years on the basis of total moment release. This strike-slip rate will be discussed in the context of modern-day estimates of the shortening rate and the implications for earthquake hazard in western Mongolia.
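
    The magnitude-from-rupture-length step mentioned above typically uses an empirical regression; a commonly cited one is the Wells and Coppersmith (1994) all-slip-type relation between surface rupture length and moment magnitude. Whether the authors used this particular regression is an assumption here; the sketch only illustrates the form of such scaling laws.

    ```python
    import math

    # Wells and Coppersmith (1994), all slip types:
    # Mw = 5.08 + 1.16 * log10(surface rupture length in km)
    def magnitude_from_rupture_length(length_km: float) -> float:
        return 5.08 + 1.16 * math.log10(length_km)

    print(round(magnitude_from_rupture_length(160.0), 1))  # a 160 km rupture -> Mw ~7.6
    ```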

  20. The Doctrinal Basis for Medical Stability Operations

    DTIC Science & Technology

    2010-01-01

    security forces (HNSFs). As most developing countries have poor health systems, all levels and branches of the health sector should be targeted including...actors will be available in more permissive environments, perhaps by orders of magnitude. Poor coordination fragments efforts, weakens health systems...further services unless there is immediate threat to life, limb, or eyesight. MEDCAPs may have a role in rural areas without services but government

  1. Seismogeodesy for rapid earthquake and tsunami characterization

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2016-12-01

    dozens of seismogeodetic stations available through the Pacific Northwest Seismic Network (University of Washington), the Plate Boundary Observatory (UNAVCO) and the Pacific Northwest Geodetic Array (Central Washington University) as the basis for local tsunami warnings for a large subduction zone earthquake in Cascadia.

  2. Earthquake science in resilient societies

    NASA Astrophysics Data System (ADS)

    Stahl, T.; Clark, M. K.; Zekkos, D.; Athanasopoulos-Zekkos, A.; Willis, M.; Medwedeff, William; Knoper, Logan; Townsend, Kirk; Jin, Jonson

    2017-04-01

    Earthquake science is critical in reducing vulnerability to a broad range of seismic hazards. Evidence-based studies drawing from several branches of the Earth sciences and engineering can effectively mitigate losses experienced in earthquakes. Societies that invest in this research have lower fatality rates in earthquakes and can recover more rapidly. This commentary explores the scientific pathways through which earthquake-resilient societies are developed. We highlight recent case studies of evidence-based decision making and how modern research is improving the way societies respond to earthquakes.

  3. Earthquakes, November-December 1992

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.

  4. Earthquakes; March-April 1975

    USGS Publications Warehouse

    Person, W.J.

    1975-01-01

    There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela, and approximately 35 earthquake-related injuries were reported around the world. In the United States, a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971.

  5. Monitoring road losses for Lushan 7.0 earthquake disaster utilization multisource remote sensing images

    NASA Astrophysics Data System (ADS)

    Huang, He; Yang, Siquan; Li, Suju; He, Haixia; Liu, Ming; Xu, Feng; Lin, Yueguan

    2015-12-01

    Earthquakes are among the major natural disasters in the world. At 8:02 on 20 April 2013, a catastrophic earthquake of surface-wave magnitude Ms 7.0 occurred in Sichuan Province, China. The epicenter of this earthquake was located in the administrative region of Lushan County, and the event was named the Lushan earthquake. The Lushan earthquake caused heavy casualties and property losses in Sichuan Province. After the earthquake, various emergency relief supplies had to be transported to the affected areas. The transportation network is the basis for emergency relief supply transportation and allocation; thus, the road losses of the Lushan earthquake had to be monitored. The road losses for the Lushan earthquake disaster, monitored using multisource remote sensing images, are reported in this paper. The monitoring results indicated that 166 meters of national roads, 3,707 meters of provincial roads, 3,396 meters of county roads, 7,254 meters of township roads, and 3,943 meters of village roads were damaged during the Lushan earthquake disaster. The damaged roads were mainly located in Lushan County, Baoxing County, Tianquan County, Yucheng County, Mingshan County, and Qionglai County. The results can also be used as a decision-making information source for disaster management authorities in China.

  6. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    NASA Astrophysics Data System (ADS)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in the tectonic regions of the world, especially in Japan. Earthquakes often cause damage to crucial life services such as water, gas and electricity supply systems, and even the sewage system, in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for drinking, cooking, and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a megacity like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books, and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of a cooperative during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing continuous monitoring of groundwater conditions, for both quantity and quality, during non-emergency periods.

  7. Study on Earthquake Emergency Evacuation Drill Trainer Development

    NASA Astrophysics Data System (ADS)

    ChangJiang, L.

    2016-12-01

    With the advance of China's urbanization, ensuring that people survive earthquakes requires scientific, routine emergency evacuation drills. Drawing on cellular automata, shortest-path algorithms, and collision avoidance, we designed a model of earthquake emergency evacuation drills for school scenes. Based on this model, we built simulation software for earthquake emergency evacuation drills. The software performs the simulation by building a spatial structural model and selecting people's location information based on the actual conditions of the buildings. Using the simulation data, drills can be run in the same building; a minimal sketch of the shortest-path ingredient appears below. RFID technology can be used here for drill data collection, reading personal information and sending it to the evacuation simulation software via WiFi. The simulation software then contrasts the simulated data with information from the actual evacuation process, such as evacuation time, evacuation paths, and congestion nodes, and finally provides a comparative analysis report with assessment results and an optimization proposal. We hope the earthquake emergency evacuation drill software and trainer can provide a whole-process concept for earthquake emergency evacuation drills in assembly occupancies. The trainer can make earthquake emergency evacuation more orderly, efficient, reasonable, and scientific, and thus improve the coping capacity of cities facing hazards.
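
    The following Python sketch illustrates the shortest-path ingredient of the drill model: breadth-first search on a grid-based ("cellular") floor plan. The grid encoding and function names are illustrative, not taken from the simulation software described in the abstract.

    ```python
    from collections import deque

    def evacuation_path(grid, start, exit_cell):
        """Shortest walkable route (breadth-first search) on a grid floor plan,
        where 0 = walkable cell and 1 = wall; returns a list of cells or None."""
        rows, cols = len(grid), len(grid[0])
        prev = {start: None}
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            if (r, c) == exit_cell:            # reconstruct the route
                path, node = [], exit_cell
                while node is not None:
                    path.append(node)
                    node = prev[node]
                return path[::-1]
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] == 0 and (nr, nc) not in prev):
                    prev[(nr, nc)] = (r, c)
                    queue.append((nr, nc))
        return None

    floor = [[0, 0, 0, 1],
             [1, 1, 0, 1],
             [0, 0, 0, 0]]
    print(evacuation_path(floor, (0, 0), (2, 3)))
    ```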

  8. An experiment in earthquake control at Rangely, Colorado.

    PubMed

    Raleigh, C B; Healy, J H; Bredehoeft, J D

    1976-03-26

    An experiment in an oil field at Rangely, Colorado, has demonstrated the feasibility of earthquake control. Variations in seismicity were produced by controlled variations in the fluid pressure in a seismically active zone. Precise earthquake locations revealed that the earthquakes clustered about a fault trending through a zone of high pore pressure produced by secondary recovery operations. Laboratory measurements of the frictional properties of the reservoir rocks and an in situ stress measurement made near the earthquake zone were used to predict the fluid pressure required to trigger earthquakes on preexisting fractures. Fluid pressure was controlled by alternately injecting and recovering water from wells that penetrated the seismic zone. Fluid pressure was monitored in observation wells, and a computer model of the reservoir was used to infer the fluid pressure distributions in the vicinity of the injection wells. The results of this experiment confirm the predicted effect of fluid pressure on earthquake activity and indicate that earthquakes can be controlled wherever we can control the fluid pressure in a fault zone.
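
    The prediction step described above rests on the Coulomb failure criterion: slip on a pre-existing fracture begins when the shear stress τ reaches c + μ(σn - p), so raising the pore pressure p weakens the fault. Solving for the triggering pressure gives p_crit = σn - (τ - c)/μ. The numbers in the sketch below are illustrative, not the measured Rangely values.

    ```python
    def critical_pore_pressure(sigma_n_mpa, tau_mpa, mu, cohesion_mpa=0.0):
        """Fluid pressure at which tau = c + mu * (sigma_n - p) is satisfied."""
        return sigma_n_mpa - (tau_mpa - cohesion_mpa) / mu

    # e.g. 55 MPa normal stress, 24 MPa shear stress, friction coefficient 0.8:
    print(critical_pore_pressure(55.0, 24.0, 0.8))  # -> 25.0 MPa to trigger slip
    ```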

  9. Force and pressure characteristics for a series of nose inlets at Mach numbers from 1.59 to 1.99 V : analysis and comparison on basis of ram-jet aircraft range and operational characteristics

    NASA Technical Reports Server (NTRS)

    Howard, E; Luidens, R W; Allen, J L

    1951-01-01

    The performance of four experimentally investigated axially symmetric spike-type nose inlets is compared on the basis of ram-jet aircraft range and operational problems. At design conditions, calculated peak engine efficiencies varied by 25 percent from the highest value, which indicates the importance of inlet design. Calculations for a typical supersonic aircraft indicate a possible increase in range if the engine is flown at a moderate angle of attack and the resulting engine lift is utilized. For engines with a fixed exhaust nozzle, propulsive thrust increases with increasing heat addition in the subcritical flow region in spite of increasing additive drag. For the perforated inlet, there is a range of increasing total-temperature ratios in the subcritical flow region that does not yield an increase in propulsive thrust. The effects of inlet characteristics on the speed stability of a typical aircraft for three types of fuel control are discussed.

  10. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing of all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Some pre-1932 earthquakes 4 5, before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932, are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al., 1993) and geodetic measurements (Thatcher et al., 1997) have been used to help determine magnitudes.

  11. Baja Earthquake Perspective View

    NASA Image and Video Library

    2010-04-05

    The topography surrounding the Laguna Salada Fault in the Mexican state of Baja California is shown in this combined radar image and topographic view with data from NASA's Shuttle Radar Topography Mission; a magnitude 7.2 earthquake struck there on April 4, 2010.

  12. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, is a principal cause of liquefaction-related earthquake damage caused by the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  13. Earthquake damage to schools

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

    These unusual slides show earthquake damage to school and university buildings around the world. They graphically illustrate the potential danger to our schools, and to the welfare of our children, that results from major earthquakes. The slides range from Algeria, where a collapsed school roof is held up only by students' desks; to Anchorage, Alaska, where an elementary school structure has split in half; to California and other areas, where school buildings have sustained damage to walls, roofs, and chimneys. Interestingly, all the United States earthquakes depicted in this set of slides occurred either on a holiday or before or after school hours, except the 1935 tremor in Helena, Montana, which occurred at 11:35 am. It undoubtedly would have caused casualties had the schools not been closed days earlier by Helena city officials because of a damaging foreshock. Students in Algeria, the People's Republic of China, Armenia, and other stricken countries were not so fortunate. This set of slides represents 17 destructive earthquakes that occurred in 9 countries, and covers more than a century--from 1886 to 1988. Two of the tremors, both of which occurred in the United States, were magnitude 8+ on the Richter Scale, and four were magnitude 7-7.9. The events represented by the slides (see table below) claimed more than a quarter of a million lives.

  14. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  15. HOMOGENEOUS CATALOGS OF EARTHQUAKES*

    PubMed Central

    Knopoff, Leon; Gardner, J. K.

    1969-01-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967. PMID:16578700
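
    The idea of testing the randomness of successive magnitudes can be illustrated with a simple serial-dependence check: raise the trial cutoff until successive magnitudes above it show no detectable correlation. The sketch below uses a lag-1 Spearman rank correlation as a stand-in statistic; it is not the test used in the paper, and the function name and thresholds are illustrative.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def lowest_unbiased_cutoff(magnitudes, cutoffs, alpha=0.05, min_events=30):
        """Return the lowest trial cutoff above which successive magnitudes
        show no detectable lag-1 rank correlation (a crude randomness check)."""
        mags = np.asarray(magnitudes)
        for m_c in sorted(cutoffs):
            sub = mags[mags >= m_c]
            if len(sub) < min_events:
                continue
            rho, p_value = spearmanr(sub[:-1], sub[1:])
            if p_value > alpha:          # no evidence of serial dependence
                return m_c
        return None

    # usage: lowest_unbiased_cutoff(catalog_mags, cutoffs=[2.5, 3.0, 3.5, 4.0])
    ```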

  16. Earthquake history of Missouri

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Most of Missouri's earthquake activity has been concentrated in the southeast corner of the State, which lies within the New Madrid seismic zone. As recently as March 29, 1972, the region was jolted by a magnitude 3.7 shock that was felt over a 168,000 square kilometre area including parts of Arkansas, Illinois, Mississippi, and Tennessee.

  17. Fractal dynamics of earthquakes

    SciTech Connect

    Bak, P.; Chen, K.

    1995-05-01

    Many objects in nature, from mountain landscapes to electrical breakdown and turbulence, have a self-similar fractal spatial structure. It seems obvious that to understand the origin of self-similar structures, one must understand the nature of the dynamical processes that created them: temporal and spatial properties must necessarily be completely interwoven. This is particularly true for earthquakes, which have a variety of fractal aspects. The distribution of energy released during earthquakes is given by the Gutenberg-Richter power law. The distribution of epicenters appears to be fractal with dimension D ≈ 1-1.3. The number of aftershocks decays as a function of time according to the Omori power law. There have been several attempts to explain the Gutenberg-Richter law by starting from a fractal distribution of faults or stresses, but this is a hen-and-egg approach: to explain the Gutenberg-Richter law, one assumes the existence of another power law--the fractal distribution. The authors present results of a simple stick-slip model of earthquakes, which evolves to a self-organized critical state. Emphasis is on demonstrating that empirical power laws for earthquakes indicate that the Earth's crust is at the critical state, with no typical time, space, or energy scale. Of course the model is tremendously oversimplified; however, in analogy with equilibrium phenomena, they do not expect criticality to depend on details of the model (universality).
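
    As a small illustration of the Gutenberg-Richter law mentioned above, the b-value of a catalog can be estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - M_min). The snippet below applies it to synthetic magnitudes; it demonstrates the power law, not the stick-slip model itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    m_min = 3.0
    # Under Gutenberg-Richter, magnitudes above the cutoff are exponentially
    # distributed with rate b*ln(10); simulate a catalog with b = 1.0:
    mags = m_min + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=5000)

    # Aki's maximum-likelihood estimator: b = log10(e) / (mean(M) - M_min)
    b_value = np.log10(np.e) / (mags.mean() - m_min)
    print(f"estimated b-value: {b_value:.2f}")  # close to the simulated b = 1.0
    ```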

  18. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    NASA Astrophysics Data System (ADS)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system in Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. The unbroken part of the North Anatolian Fault Zone crosses the north of the Armutlu Peninsula in an east-west direction. This branch also lies quite close to Istanbul, a megacity with high population and great economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity, southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events; on August 3, 2014 a local magnitude (Ml) 4.1 event occurred, and more than 1000 events followed until August 31, 2014. We therefore tentatively call this swarm-like activity. Investigation of the micro-earthquake activity of the Armutlu Peninsula has thus become important for understanding the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005 and currently equipped with 27 active seismic stations operated by the Kocaeli University Earth and Space Sciences Research Center (ESSRC) and the Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network able to record even micro-earthquakes in this region. In the 30-day period of August 2-31, 2014, the Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, but ARNET provided more than 1000 earthquakes for analysis in the same time period. In this study, earthquakes in the swarm area and its vicinity determined by ARNET were investigated. The focal mechanism of the August 3, 2014 22:22:42 (GMT) earthquake with local magnitude (Ml) 4.0 was obtained by moment tensor solution. The solution indicates normal faulting with a dextral component. The obtained focal mechanism solution is

  1. Detailed source process of the 2007 Tocopilla earthquake.

    NASA Astrophysics Data System (ADS)

    Peyrat, S.; Madariaga, R.; Campos, J.; Asch, G.; Favreau, P.; Bernard, P.; Vilotte, J.

    2008-05-01

    We investigated the detailed rupture process of the Tocopilla earthquake (Mw 7.7) of 14 November 2007, and of the main aftershocks that occurred in the southern part of the North Chile seismic gap, using strong-motion data. The earthquake happened in the middle of the permanent broadband and strong-motion network IPOC, newly installed by GFZ and IPGP, and of a digital strong-motion network operated by the University of Chile. The Tocopilla earthquake is the latest large thrust subduction earthquake to have occurred there since the major 1877 Iquique earthquake, which produced a destructive tsunami. The Arequipa (2001) and Antofagasta (1995) earthquakes had already ruptured the northern and southern parts of the gap, and the intraplate intermediate-depth Tarapaca earthquake (2005) may have changed the tectonic loading of this part of the Peru-Chile subduction zone. For large earthquakes, the depth of the seismic rupture is bounded by the depth of the seismogenic zone. What controls the horizontal extent of the rupture for large earthquakes is less clear. Factors that influence the extent of the rupture include fault geometry, variations of material properties, and stress heterogeneities inherited from the previous rupture history. For subduction zones where structures are not well known, what may have stopped the rupture is not obvious. One crucial problem raised by the Tocopilla earthquake is to understand why the earthquake did not extend further north and, to the south, what role the Mejillones Peninsula, which seems to act as a barrier, plays. The focal mechanism was determined using teleseismic waveform inversion and a geodetic analysis (cf. Campos et al.; Bejarpi et al., in the same session). We studied the detailed source process using the available strong-motion data. This earthquake ruptured the interplate seismic zone over more than 150 km and generated several large aftershocks, mainly located south of the rupture area. The strong-motion data show clearly two S

  2. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and to warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate with an identifiable precursor. Secondly, there is neither a networked array for finding epicentral locations, nor have there been any attempts to build one. This methodology needs to be dismissed, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances can detect. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and the laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February 2013 detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  3. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risk in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, the increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and the growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions of the globe. Such fast, global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are therefore asked to come up with new solutions that no longer aim exclusively at the best possible quantification of present risks but also keep an eye on how these risks change with time and allow them to be projected into the future. This applies not only to the vulnerability component of earthquake risk but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  4. The HayWired earthquake scenario—Earthquake hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-01-01

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth's surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  5. Rationalizing Hybrid Earthquake Probabilities

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Reasenberg, P.; Beeler, N.; Cocco, M.; Belardinelli, M.

    2003-12-01

    An approach to including stress transfer and frictional effects in estimates of the probability of failure of a single fault affected by a nearby earthquake has been suggested in Stein et al. (1997). This 'hybrid' approach combines conditional probabilities, which depend on the time elapsed since the last earthquake on the affected fault, with Poissonian probabilities that account for friction and depend only on the time since the perturbing earthquake. The latter are based on the seismicity rate change model developed by Dieterich (1994) to explain the temporal behavior of aftershock sequences in terms of rate-state frictional processes. The model assumes an infinite population of nucleation sites that are near failure at the time of the perturbing earthquake. In the hybrid approach, assuming the Dieterich model can lead to significant transient increases in failure probability. We explore some of the implications of applying the Dieterich model to a single fault and its impact on the hybrid probabilities. We present two interpretations that we believe can rationalize the use of the hybrid approach. In the first, a statistical distribution representing uncertainties in elapsed and/or mean recurrence time on the fault serves as a proxy for Dieterich's population of nucleation sites. In the second, we imagine a population of nucleation patches distributed over the fault with a distribution of maturities. In both cases we find that the probability depends on the time since the last earthquake. In particular, the size of the transient probability increase may only be significant for faults already close to failure.
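
    To make the hybrid construction concrete, the sketch below combines a renewal-model conditional probability with a transient Poisson probability whose rate is amplified after the perturbing earthquake. The lognormal recurrence model, the rate-change factor, and the way the two terms are merged are simplifying assumptions for illustration, not the Stein et al. (1997) or Dieterich (1994) formulations.

    ```python
    import math
    from scipy.stats import lognorm

    def conditional_probability(t_elapsed, dt, median_recur, sigma=0.5):
        """P(failure in [t, t+dt] | survival to t), lognormal renewal model."""
        dist = lognorm(s=sigma, scale=median_recur)   # scale = median recurrence
        return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

    def transient_poisson_probability(background_rate, rate_factor, dt):
        """Poisson failure probability with the background rate amplified by a
        stress-transfer factor (> 1 after a positive stress step)."""
        return 1.0 - math.exp(-background_rate * rate_factor * dt)

    p_cond = conditional_probability(t_elapsed=120.0, dt=30.0, median_recur=150.0)
    p_tran = transient_poisson_probability(1.0 / 150.0, rate_factor=3.0, dt=30.0)
    # merging the two as independent contributions is itself an assumption here:
    p_hybrid = 1.0 - (1.0 - p_cond) * (1.0 - p_tran)
    print(round(p_cond, 3), round(p_tran, 3), round(p_hybrid, 3))
    ```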

  6. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities, including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services; 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  7. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    of accelerometric and other macroseismic data (similar to the USGS ShakeMap system). The building inventory data for the Level 2 analysis will consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships to be used can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be applied if necessary. ELER Level 2 analysis will include calculation of direct monetary losses as a result of building damage, allowing for repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons using different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of the post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte-Carlo type simulations and earthquake insurance applications.
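
    The casualty bookkeeping described above reduces to a sum over building types and damage states. The following Python sketch shows the arithmetic for a single geo-cell; the type names, damage fractions and casualty rates are placeholders, not ELER defaults.

        # Casualties per geo-cell = sum over building types and damage states of
        # (number of buildings) x (fraction in that damage state) x (casualty rate).
        N_BUILDINGS = {"masonry": 1200, "rc_frame": 800}  # buildings per type in the cell
        DAMAGE_FRACTIONS = {  # fraction of each type reaching each damage state
            "masonry":  {"slight": 0.30, "moderate": 0.15, "extensive": 0.06, "complete": 0.02},
            "rc_frame": {"slight": 0.20, "moderate": 0.10, "extensive": 0.03, "complete": 0.01},
        }
        CASUALTY_RATES = {  # casualties per damaged building, by type and damage state
            "masonry":  {"slight": 0.0, "moderate": 0.010, "extensive": 0.10, "complete": 1.5},
            "rc_frame": {"slight": 0.0, "moderate": 0.005, "extensive": 0.05, "complete": 1.0},
        }

        def cell_casualties(n_buildings, damage_fractions, casualty_rates):
            return sum(n * frac * casualty_rates[btype][state]
                       for btype, n in n_buildings.items()
                       for state, frac in damage_fractions[btype].items())

        print(cell_casualties(N_BUILDINGS, DAMAGE_FRACTIONS, CASUALTY_RATES))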

  8. Improving the RST Approach for Earthquake Prone Areas Monitoring: Results of Correlation Analysis among Significant Sequences of TIR Anomalies and Earthquakes (M>4) occurred in Italy during 2004-2014

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Coviello, I.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2015-12-01

    Looking toward the assessment of a multi-parametric system for dynamically updating seismic hazard estimates and earthquake short-term (days to weeks) forecasts, a preliminary step is to identify those parameters (chemical, physical, biological, etc.) whose anomalous variations can be, to some extent, associated with the complex process of preparation of a big earthquake. Among the different parameters, fluctuations of the Earth's thermally emitted radiation, as measured by satellite sensors operating in the Thermal Infra-Red (TIR) spectral range, have long been proposed as potential earthquake precursors. Since 2001, a general approach called Robust Satellite Techniques (RST) has been used to discriminate anomalous thermal signals possibly associated with seismic activity from normal fluctuations of the Earth's thermal emission related to other causes (e.g. meteorological) independent of earthquake occurrence. Thanks to its full exportability to different satellite packages, RST has been implemented on TIR images acquired by polar (e.g. NOAA-AVHRR, EOS-MODIS) and geostationary (e.g. MSG-SEVIRI, NOAA-GOES/W, GMS-5/VISSR) satellite sensors, in order to verify the presence (or absence) of TIR anomalies in the presence (absence) of earthquakes (with M>4) in different seismogenic areas around the world (e.g. Italy, Turkey, Greece, California, Taiwan, etc.). In this paper, a refined RST data analysis approach and the RETIRA (Robust Estimator of TIR Anomalies) index were used to identify Significant Sequences of TIR Anomalies (SSTAs) during eleven years (from May 2004 to December 2014) of TIR satellite records, collected over Italy by the geostationary satellite sensor MSG-SEVIRI. On the basis of specific validation rules (mainly based on physical models and results obtained by applying the RST approach to several earthquakes all around the world) the level of space-time correlation among SSTAs and earthquakes (with M≥4
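
    The RETIRA index is, in essence, a pixel-wise standardized anomaly of the TIR signal against a historical reference. The Python sketch below shows one plausible reading of that normalization; the exact definition (e.g. the handling of the scene's spatial mean and of cloudy pixels) follows the RST literature and should be checked against it.

        import numpy as np

        def retira_index(tir_stack, current):
            # tir_stack: (n_times, ny, nx) homogeneous historical TIR images.
            # current:   (ny, nx) image under test; cloudy pixels masked upstream.
            # Delta_T(x, t) = T(x, t) - spatial mean of the scene at time t;
            # index = (Delta_T_now - historical mean of Delta_T) / historical std.
            delta_hist = tir_stack - tir_stack.mean(axis=(1, 2), keepdims=True)
            mu = delta_hist.mean(axis=0)
            sigma = delta_hist.std(axis=0)
            delta_now = current - current.mean()
            return (delta_now - mu) / sigma

        # Pixels whose index exceeds a fixed threshold would be flagged as TIR
        # anomalies and then screened by the space-time validation rules.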

  9. The health effects of earthquakes in the mid-1990s.

    PubMed

    Alexander, D

    1996-09-01

    This paper gives an overview of the global pattern of casualties in earthquakes which occurred during the 30-month period from 1 September 1993 to 29 February 1996. It also describes some of the behavioural and logistical regularities associated with mortality and morbidity in these events. Of 83 earthquakes studied, there were casualties in 49. Lethal earthquakes occurred in rapid succession in Indonesia, China, Colombia and Iran. In the events studied, a disproportionate number of deaths and injuries occurred during the first six hours of the day and in earthquakes with magnitudes between 6.5 and 7.4. Ratios of death to injury varied markedly (though with some averages close to 1:3), as did the nature and causes of mortality and morbidity and the proportion of serious to slight injuries. As expected on the basis of previous knowledge, few problems were caused by post-earthquake illness and disease. Also, as expected, building collapse was the principal source of casualties: tsunamis, landslides, debris flows and bridge collapses were the main secondary causes. In addition, new findings are presented on the temporal sequence of casualty estimates after seismic disaster. In synthesis, though mortality in earthquakes may have been low in relation to long-term averages, the interval of time studied was probably typical of other periods in which seismic catastrophes were relatively limited in scope.

  10. Smartphone-Based Earthquake and Tsunami Early Warning in Chile

    NASA Astrophysics Data System (ADS)

    Brooks, B. A.; Baez, J. C.; Ericksen, T.; Barrientos, S. E.; Minson, S. E.; Duncan, C.; Guillemot, C.; Smith, D.; Boese, M.; Cochran, E. S.; Murray, J. R.; Langbein, J. O.; Glennie, C. L.; Dueitt, J.; Parra, H.

    2016-12-01

    Many locations around the world face high seismic hazard but do not have the resources required to establish traditional earthquake and tsunami early warning systems (E/TEW) that utilize scientific-grade seismological sensors. MEMS accelerometers and GPS chips embedded in, or added inexpensively to, smartphones are sensitive enough to provide robust E/TEW if they are deployed in sufficient numbers. We report on a pilot project in Chile, one of the most productive earthquake regions worldwide. There, magnitude 7.5+ earthquakes occurring roughly every 1.5 years and larger tsunamigenic events pose significant local and trans-Pacific hazard. The smartphone-based network described here is being deployed in parallel to the build-out of a scientific-grade network for E/TEW. Our sensor package comprises a smartphone with internal MEMS and an external GPS chipset that provides satellite-based augmented positioning and phase-smoothing. Each station is independent of local infrastructure: stations are solar-powered and rely on cellular SIM cards for communications. An Android app performs initial onboard processing and transmits both accelerometer and GPS data to a server employing the FinDer-BEFORES algorithm to detect earthquakes, producing an acceleration-based line source model for smaller magnitude earthquakes or a joint seismic-geodetic finite-fault distributed slip model for sufficiently large magnitude earthquakes. Either source model provides accurate ground shaking forecasts, while distributed slip models for larger offshore earthquakes can be used to infer seafloor deformation for local tsunami warning. The network will comprise 50 stations by Sept. 2016 and 100 stations by Dec. 2016. Since Nov. 2015, batch processing has detected, located, and estimated the magnitude for Mw>5 earthquakes. Operational since June 2016, we have successfully detected two earthquakes > M5 (M5.5, M5.1) that occurred within 100 km of our network while producing zero false alarms.

  11. Scientific aspects of the Tohoku earthquake and Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Koketsu, Kazuki

    2016-04-01

    We investigated the 2011 Tohoku earthquake, the accident at the Fukushima Daiichi nuclear power plant, and the assessments of earthquake and tsunami potential conducted beforehand for the Pacific offshore region of the Tohoku District. The results of our investigation show that all the assessments failed to foresee the earthquake and its related tsunami, which was the main cause of the accident. Therefore, the disaster caused by the earthquake and the accident were scientifically unforeseeable at the time. However, for a zone neighboring the reactors, a 2008 assessment showed tsunamis higher than the plant height. As a lesson learned from the accident, companies operating nuclear power plants should be prepared using even such assessment results for neighboring zones.

  12. Development of a High-Power Wideband Amplifier on the Basis of a Free-Electron Maser Having an Operating Frequency Near 30 GHz: Modeling and Results of the Initial Experiments

    NASA Astrophysics Data System (ADS)

    Bandurkin, I. V.; Donets, D. E.; Kaminsky, A. K.; Kuzikov, S. V.; Perel'shteyn, E. A.; Peskov, N. Yu.; Savilov, A. V.; Sedykh, S. N.

    2017-01-01

    We develop a high-power wideband amplifier based on a free-electron maser for particle acceleration, which will be operated in the 30 GHz frequency band, on the basis of the LIU-3000 linear induction accelerator forming an electron beam with an electron energy of 0.8 MeV, a current of 250 A, and a pulse duration of 200 ns. As the operating regime, we chose the regime of grazing of dispersion curves, since, according to the modeling performed, it allows one to ensure an instantaneous amplification band of about 5-7% in an undulator with regular winding for an output radiation power at a level of 20 MW and a gain of 30-35 dB. The results of the first experiments studying this FEM-based scheme are presented, in which the specified power level is achieved in the range around 30 GHz, and fast tuning of ±0.5 GHz in the band of variations in the frequency of the master magnetron is demonstrated. Modeling shows that the use of the non-resonance trapping/braking regime, which is realized in an undulator with profiled parameters, allows one to expect an increase in the radiation power of up to 35-40 MW with simultaneous widening of the amplification band up to 30% under the conditions of the LIU-3000 experiments.
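
    The quoted figures are easy to cross-check: with gain_dB = 10*log10(P_out/P_in), an output of 20 MW at a gain of 30-35 dB implies a master-oscillator drive power of roughly 6-20 kW. A short Python check (powers and gains from the abstract; the computation itself is the standard decibel relation):

        p_out = 20e6  # output power, W
        for gain_db in (30.0, 35.0):
            p_in = p_out / 10 ** (gain_db / 10)  # P_in = P_out / 10^(gain_dB / 10)
            print(f"{gain_db:.0f} dB gain -> drive power ~ {p_in / 1e3:.1f} kW")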

  13. Listening to Earthquakes with Infrasound

    NASA Astrophysics Data System (ADS)

    Mucek, A. E.; Langston, C. A.

    2011-12-01

    A tripartite infrasound array was installed to listen to earthquakes occurring along the Guy-Greenbrier fault in Arkansas. The active earthquake swarm is believed to be caused by deep wastewater injection and will allow us to explain the mechanisms causing earthquake "booms" that have been heard during an earthquake. The array has an aperture of 50 meters and is installed next to the X301 seismograph station run by the Center for Earthquake Research and Information (CERI). This arrangement allows simultaneous recording of seismic and acoustic signals from the arrival of an earthquake. Other acoustic and seismic sources that have been found include thunder from thunderstorms, gunshots, quarry explosions and hydraulic fracturing activity from the local gas wells. The duration of the experiment is from the last week of June to the last week of September 2011. During the first month and a half, seven local earthquakes were recorded, along with numerous occurrences of the other infrasound sources. Phase arrival times of the recorded waves allow us to estimate wave slowness and azimuth of infrasound events. Using these two properties, we can determine whether earthquake "booms" occur at a site from the arrival of the P-wave or whether they occur elsewhere and travel through the atmosphere. Preliminary results show that the infrasound correlates well with the ground motion during an earthquake for frequencies below 15 Hertz.
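
    For a small array like this, the slowness and azimuth estimation is a plane-wave least-squares fit to the relative arrival times. The Python sketch below shows the standard form of that computation; it is an assumed generic method, not code from this study.

        import numpy as np

        def plane_wave_fit(coords, delays):
            # coords: (n, 2) sensor east/north offsets in meters from a reference sensor;
            # delays: (n,) arrival-time delays in seconds relative to that sensor.
            # Solves delays = coords @ s for the horizontal slowness vector s.
            s, *_ = np.linalg.lstsq(coords, delays, rcond=None)
            apparent_speed = 1.0 / np.linalg.norm(s)  # m/s
            prop_azimuth = np.degrees(np.arctan2(s[0], s[1])) % 360.0  # clockwise from north
            return apparent_speed, prop_azimuth  # the source lies opposite prop_azimuth

        # ~340 m/s indicates an acoustic (infrasound) arrival; several km/s indicates
        # ground-coupled seismic energy such as the local P wave.
        coords = np.array([[50.0, 0.0], [-25.0, 43.3], [-25.0, -43.3]])
        delays = np.array([0.147, -0.074, -0.074])
        print(plane_wave_fit(coords, delays))  # ~339 m/s, azimuth ~90 degrees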

  14. Global Review of Induced and Triggered Earthquakes

    NASA Astrophysics Data System (ADS)

    Foulger, G. R.; Wilson, M.; Gluyas, J.; Julian, B. R.; Davies, R. J.

    2016-12-01

    Natural processes associated with very small incremental stress changes can modulate the spatial and temporal occurrence of earthquakes. These processes include tectonic stress changes, the migration of fluids in the crust, Earth tides, surface ice and snow loading, heavy rain, atmospheric pressure, sediment unloading and groundwater loss. It is thus unsurprising that large anthropogenic projects which may induce stress changes of a similar size also modulate seismicity. As human development accelerates and industrial projects become larger in scale and more numerous, the number of such cases is increasing. That mining and water-reservoir impoundment can induce earthquakes has been accepted for several decades. Now, concern is growing about earthquakes induced by activities such as hydraulic fracturing for shale-gas extraction and waste-water disposal via injection into boreholes. As hydrocarbon reservoirs enter their tertiary phases of production, seismicity may also increase there. The full extent of human activities thought to induce earthquakes is, however, much wider than generally appreciated. We have assembled as near complete a catalog as possible of cases of earthquakes postulated to have been induced by human activity. Our database contains a total of 705 cases and is probably the largest compilation made to date. We include all cases where reasonable arguments have been made for anthropogenic induction, even where these have been challenged in later publications. Our database presents the results of our search but leaves judgment about the merits of individual cases to the user. We divide anthropogenic earthquake-induction processes into: a) Surface operations, b) Extraction of mass from the subsurface, c) Introduction of mass into the subsurface, and d) Explosions. Each of these categories is divided into sub-categories. In some cases, categorization of a particular case is tentative because more than one anthropogenic activity may have preceded or been

  15. VLF/LF EM emissions as main precursor of earthquakes and their searching possibilities for Georgian s/a region

    NASA Astrophysics Data System (ADS)

    Kachakhidze, Manana; Kachakhidze, Nino

    2016-04-01

    The authors have developed a model, based on electrodynamics, of the generation of the Earth's electromagnetic emissions detected during earthquake preparation. The model gives a qualitative explanation of the mechanism generating the electromagnetic waves emitted in the earthquake preparation period. In addition, a methodological scheme for earthquake forecasting is proposed, based on an avalanche-like unstable model of fault formation and an analogous model of an electromagnetic circuit, whose synthesis is rather harmonious. According to the authors, electromagnetic emission in the radio band is a more universal and reliable precursor than the anomalous variations of other geophysical phenomena in the earthquake preparation period. Moreover, VLF/LF electromagnetic emission may be regarded as the main earthquake precursor, since it could prove very useful for predicting large (M ≥ 5) inland earthquakes and for tracing processes in the lithosphere-atmosphere-ionosphere coupling (LAIC) system. Since the other geophysical phenomena that may accompany the earthquake preparation process and manifest themselves several months, weeks, or days before an earthquake are less informative for forecasting, it is reasonable to regard them as earthquake indicators. The physical mechanisms of these phenomena are explained on the basis of the model of pre-earthquake electromagnetic emission, in which the preparation and realization of an earthquake are considered taking into account the properties of distributed and conservative systems. Until recently no electromagnetic emission detection network existed in Georgia. European colleagues (Prof. Dr. PF Biagi and Prof. Dr. Aydın BÜYÜKSARAÇ) helped us and made the installation of a receiver possible. We are going to develop the network and contribute our share to the earthquake prediction problem. Participation in conference is supported by financial

  16. Earthquake triggering at alaskan volcanoes following the 3 November 2002 denali fault earthquake

    USGS Publications Warehouse

    Moran, S.C.; Power, J.A.; Stihler, S.D.; Sanchez, J.J.; Caplan-Auerbach, J.

    2004-01-01

    The 3 November 2002 Mw 7.9 Denali fault earthquake provided an excellent opportunity to investigate triggered earthquakes at Alaskan volcanoes. The Alaska Volcano Observatory operates short-period seismic networks on 24 historically active volcanoes in Alaska, 247-2159 km distant from the mainshock epicenter. We searched for evidence of triggered seismicity by examining the unfiltered waveforms for all stations in each volcano network for ~1 hr after the Mw 7.9 arrival time at each network and for significant increases in located earthquakes in the hours after the mainshock. We found compelling evidence for triggering only at the Katmai volcanic cluster (KVC, 720-755 km southwest of the epicenter), where small earthquakes with distinct P and S arrivals appeared within the mainshock coda at one station and a small increase in located earthquakes occurred for several hours after the mainshock. Peak dynamic stresses of ~0.1 MPa at Augustine Volcano (560 km southwest of the epicenter) are significantly lower than those recorded in Yellowstone and Utah (>3000 km southeast of the epicenter), suggesting that strong directivity effects were at least partly responsible for the lack of triggering at Alaskan volcanoes. We describe other incidents of earthquake-induced triggering in the KVC, and outline a qualitative magnitude/distance-dependent triggering threshold. We argue that triggering results from the perturbation of magmatic-hydrothermal systems in the KVC and suggest that the comparative lack of triggering at other Alaskan volcanoes could be a result of differences in the nature of magmatic-hydrothermal systems.

  17. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
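
    For binned rate forecasts of this kind, the likelihood in question is usually the joint Poisson likelihood over all bins. A minimal Python sketch of that score follows (the generic form, not the RELM code itself):

        import numpy as np
        from scipy.special import gammaln

        def poisson_log_likelihood(forecast_rates, observed_counts):
            # Joint log-likelihood of observed bin counts n_i under forecast rates
            # lam_i, assuming an independent Poisson distribution in each bin:
            # log L = sum_i (-lam_i + n_i * log(lam_i) - log(n_i!)).
            lam = np.asarray(forecast_rates, dtype=float)
            n = np.asarray(observed_counts, dtype=float)
            return np.sum(-lam + n * np.log(lam) - gammaln(n + 1.0))

        # Consistency is then judged by where the observed log-likelihood falls
        # within the distribution of scores for catalogs simulated from the forecast.
        print(poisson_log_likelihood([0.1, 0.02, 0.005], [0, 1, 0]))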

  18. Earthquake Activity in the North Greenland Region

    NASA Astrophysics Data System (ADS)

    Larsen, Tine B.; Dahl-Jensen, Trine; Voss, Peter H.

    2017-04-01

    Many local and regional earthquakes are recorded on a daily basis in northern Greenland. The majority of the earthquakes originate at the Arctic plate boundary between the Eurasian and the North American plates. Particularly active regions away from the plate boundary are found in NE Greenland and in northern Baffin Bay. The seismograph coverage in the region is sparse, with the main seismograph stations located at the military outpost Station Nord (NOR), the weather station outpost Danmarkshavn (DAG), Thule Airbase (TULEG), and the former ice core drilling camp (NEEM) in the middle of the Greenland ice sheet. Furthermore, data are available from Alert (ALE), Resolute (RES), and other seismographs in northern Canada, as well as from a temporary deployment of broadband seismographs along the north coast of Greenland from 2004 to 2007. The recorded earthquakes range in magnitude from less than 2 to a 4.8 event, the largest in NE Greenland, and a 5.7 event, the largest recorded in northern Baffin Bay. The larger events are recorded widely in the region, allowing focal mechanisms to be calculated. Only a few existing focal mechanisms for the region can be found in the ISC bulletin: two in NE Greenland representing primarily normal faulting and one in Baffin Bay resulting from reverse faulting. New calculations of focal mechanisms for the region will be presented, as well as improved hypocenters resulting from analysis involving temporary stations and regional stations that are not included in routine processing.

  19. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2x10^5 m3), produced major damage to buildings and infrastructure and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7x10^5 m3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km3 and 12 km3 produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with a volume of less than 1x10^3 m3. The present work aims to define the relationship between the above-described earthquake intensity, size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provided useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  20. Pipeline experiment co-located with USGS Parkfield earthquake prediction project

    SciTech Connect

    Isenberg, J.; Richardson, E.

    1995-12-31

    A field experiment to investigate the response of buried pipelines to lateral offsets and traveling waves has been operational since June 1988 at the Owens' Pasture site near Parkfield, CA, where the US Geological Survey has predicted a M6 earthquake. Although the predicted earthquake has not yet occurred, the 1989 Loma Prieta earthquake and the 1992 M4.7 earthquake near Parkfield produced measurable response at the pipeline experiment. The present paper describes upgrades to the experiment, introduced after Loma Prieta, which performed successfully in the 1992 event.

  1. The Earthquake That Tweeted

    NASA Astrophysics Data System (ADS)

    Petersen, D.

    2011-12-01

    Advances in mobile technology and social networking are enabling new behaviors that were not possible even a few short years ago. When people experience a tiny earthquake, it's more likely they're going to reach for their phones and tell their friends about it than actually take cover under a desk. With 175 million Twitter accounts, 750 million Facebook users and more than five billion mobile phones in the world today, people are generating terrific amounts of data simply by going about their everyday lives. Given the right tools and guidance these connected individuals can act as the world's largest sensor network, doing everything from reporting on earthquakes to anticipating global crises. Drawing on the author's experience as a user researcher and experience designer, this presentation will discuss these trends in crowdsourcing the collection and analysis of data, and consider their implications for how the public encounters the earth sciences in their everyday lives.

  2. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the direction in which EM should be investigated. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013. The antennae have mobility and observations were noted for

  3. Distant, delayed and ancient earthquake-induced landslides

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Torgoev, Almaz; Braun, Anika; Schlögel, Romy; Micu, Mihai

    2016-04-01

    On the basis of a new classification of seismically induced landslides we outline particular effects related to the delayed and distant triggering of landslides, which cannot be predicted by state-of-the-art methods. First, for about a dozen events the 'predicted' extension of the affected area is clearly underestimated. The most problematic cases are those for which far-distant triggering of landslides has been reported, such as for the 1988 Saguenay earthquake. In Central Asia, reports of such cases are known for areas marked by a thick cover of loess. One possible contributing effect could be a low-frequency resonance of the thick soils induced by distant earthquakes, especially those in the Pamir - Hindu Kush seismic region. Such deep-focus and high magnitude (>>7) earthquakes are also found in Europe, first of all in the Vrancea region (Romania). For this area and others in Central Asia we computed landslide event sizes related to scenario earthquakes with M>7.5. The second particular and challenging type of triggering is the one delayed with respect to the main earthquake event: case histories have been reported for the Racha earthquake in 1991, when several larger landslides only started moving 2 or 3 days after the main shock. Similar observations were also made after other earthquake events in the U.S., such as after the 1906 San Francisco, the 1949 Tacoma, the 1959 Hebgen Lake and the 1983 Borah Peak earthquakes. Here, we will present a series of detailed examples of (partly monitored) mass movements in Central Asia that mainly developed after earthquakes, some even several weeks after the main shock: e.g. the Tektonik and Kainama landslides triggered in 1992 and 2004, respectively. We believe that the development of the massive failures is a consequence of the opening of tension cracks during the seismic shaking and their filling with water during the precipitation that followed the earthquakes. The third particular aspect analysed here is the use of large

  4. Tien Shan Geohazards Database: Earthquakes and landslides

    NASA Astrophysics Data System (ADS)

    Havenith, H. B.; Strom, A.; Torgoev, I.; Torgoev, A.; Lamair, L.; Ischuk, A.; Abdrakhmatov, K.

    2015-11-01

    In this paper we present new, and review already existing, landslide and earthquake data for a large part of the Tien Shan, Central Asia. For the same area, only partial databases for sub-regions have been presented previously; these were compiled and new data were added to fill the gaps between them. Major new inputs are products of the Central Asia Seismic Risk Initiative (CASRI): a tentative digital map of active faults (with indication of characteristic or possible maximum magnitude) and the earthquake catalogue of Central Asia until 2009, now updated with USGS data (to May 2014). The newly compiled landslide inventory contains existing records of 1600 previously mapped mass movements and more than 1800 new landslide records. Considering presently available seismo-tectonic and landslide data, a target region of 1200 km (E-W) by 600 km (N-S) was defined for the production of more or less continuous geohazards information. This target region includes the entire Kyrgyz Tien Shan, the South-Western Tien Shan in Tajikistan, the Fergana Basin (Kyrgyzstan, Tajikistan and Uzbekistan) as well as the Western part in Uzbekistan, the North-Easternmost part in Kazakhstan and a small part of the Eastern Chinese Tien Shan (for the zones outside Kyrgyzstan and Tajikistan, only limited information was available and compiled). On the basis of the new landslide inventory and the updated earthquake catalogue, the link between landslide and earthquake activity is analysed. First, size-frequency relationships are studied for both types of geohazards, in terms of the Gutenberg-Richter law for the earthquakes and in terms of a probability density function for the landslides. For several regions and major earthquake events, case histories are presented to further outline the close connection between earthquake and landslide hazards in the Tien Shan. From this study, we conclude first that a major hazard component is still insufficiently known for both types of geohazards
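
    For the earthquake side, the size-frequency analysis mentioned above rests on the Gutenberg-Richter law, log10 N(>=M) = a - b*M. Below is a minimal Python sketch of the standard maximum-likelihood b-value estimate (Aki, 1965), shown for illustration rather than as the authors' procedure:

        import numpy as np

        def gr_b_value(magnitudes, m_min):
            # Maximum-likelihood b-value for a catalog complete above m_min:
            # b = log10(e) / (mean(M) - m_min). A correction of half the magnitude
            # bin width is often subtracted from m_min; omitted here for brevity.
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= m_min]
            return np.log10(np.e) / (m.mean() - m_min)

        print(gr_b_value([4.1, 4.3, 4.0, 5.2, 4.6, 4.8, 4.2, 6.0], m_min=4.0))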

  5. Do Earthquakes Shake Stock Markets?

    PubMed

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  7. Testing Earthquake Source Inversion Methodologies

    NASA Astrophysics Data System (ADS)

    Page, Morgan; Mai, P. Martin; Schorlemmer, Danijel

    2011-03-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  8. Human casualties in earthquakes: modelling and mitigation

    USGS Publications Warehouse

    Spence, R.J.S.; So, E.K.M.

    2011-01-01

    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  9. Evaluation of earthquake and tsunami on JSFR

    SciTech Connect

    Chikazawa, Y.; Enuma, Y.; Kisohara, N.; Yamano, H.; Kubo, S.; Hayafune, H.; Sagawa, H.; Okamura, S.; Shimakawa, Y.

    2012-07-01

    The effects of earthquake and tsunami on JSFR have been evaluated. For seismic design, safety components are confirmed to maintain their functions even against recent strong earthquakes. As for tsunami, some parts of the reactor building might be submerged, including the component cooling water system, whose final heat sink is sea water. However, in the JSFR design, safety-grade components are independent of the component cooling water system (CCWS). The JSFR emergency power supply adopts a gas turbine system with air cooling, since JSFR basically does not require quick start-up of the emergency power supply thanks to the natural-convection DHRS. Even in case of a long station blackout, the DHRS could be activated by emergency batteries or manually, and be operated continuously by natural convection. (authors)

  10. Pain after earthquake

    PubMed Central

    2012-01-01

    Introduction: On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives: This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods: 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results: About a third of patients reported pain, a prevalence of 34.6%. More than half of the pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions: This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations. PMID:22747796

  11. Application of the Titius-Bode law in earthquakes study

    NASA Astrophysics Data System (ADS)

    Hu, H.; Malkin, Z.; Wang, R.

    2015-08-01

    This article introduces the application of the commensurability revealed by the Titius-Bode law to earthquake (EQ) prediction studies. The results show that the occurrence of most of the world's major earthquakes is not accidental: they occurred at commensurable points on the time axis. As an example, both the M7.0 EQ in Lushan, China on 2013-04-20 and the M8.2 EQ in Iquique, Chile on 2014-04-01 occurred at their commensurable epochs. This provides an important scientific basis for the prediction of major EQs that will occur in a given area in the future.

  12. Earthquake early warning for the 2016 Kumamoto earthquake: performance evaluation of the current system and the next-generation methods of the Japan Meteorological Agency

    NASA Astrophysics Data System (ADS)

    Kodera, Yuki; Saitou, Jun; Hayashimoto, Naoki; Adachi, Shimpei; Morimoto, Masahiko; Nishimae, Yuji; Hoshiba, Mitsuyuki

    2016-12-01

    The 2016 Kumamoto earthquake (Kumamoto earthquake sequence) is an extremely high-seismicity event that has been occurring across Kumamoto and Oita Prefectures in Japan since April 14, 2016 (JST). The earthquake early warning system of the Japan Meteorological Agency (JMA) issued warnings for 19 events in the Kumamoto earthquake sequence from April 14 to 19, under some of the heaviest loading conditions since the system began operating in 2007. We analyzed the system performance for cases where a warning was issued and/or strong motion was actually observed. The results indicated that the system exhibited remarkable performance, especially for the most destructive earthquakes in the Kumamoto earthquake sequence. In addition, the system did not miss or seriously under-predict strong motion of any large earthquake from April 14 to 30. However, in four cases, the system issued over-predicted warnings due to the simultaneous occurrence of small earthquakes within a short distance, which implies a fundamental obstacle in trigger-data classifications based solely on arrival time. We also performed simulations using the integrated particle filter (IPF) and propagation of local undamped motion (PLUM) methods, which JMA plans to implement to address over-prediction for multiple simultaneous earthquakes and under-prediction for massive earthquakes with large rupture zones. The simulation results of the IPF method indicated that the IPF method is highly effective at minimizing over-prediction even for multiple simultaneous earthquakes within a short distance, since it adopts a trigger-data classification using velocity amplitude and hypocenter determinations using not-yet-arrived data. The simulation results of the PLUM method demonstrated that the PLUM method is capable of issuing warnings for destructive inland earthquakes more rapidly than the current system owing to the use of additional seismometers that can only be incorporated by this method.

  13. Sand Volcano Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed loose water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

  14. Foreshocks of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Guglielmi, A. V.; Sobisevich, L. E.; Sobisevich, A. L.; Lavrov, I. P.

    2014-07-01

    The specific enhancement of ultra-low-frequency (ULF) electromagnetic oscillations a few hours prior to the strong earthquakes, which was previously mentioned in the literature, motivated us to search for the distinctive features of the mechanical (foreshock) activity of the Earth's crust in the epicentral zones of the future earthquakes. Activation of the foreshocks three hours before the main shock is revealed, which is roughly similar to the enhancement of the specific electromagnetic ULF emission. It is hypothesized that the round-the-world seismic echo signals from the earthquakes, which form the peak of energy release 2 h 50 min before the main events, act as the triggers of the main shocks due to the cumulative action of the surface waves converging to the epicenter. It is established that the frequency of the fluctuations in the foreshock activity decreases at the final stages of the preparation of the main shocks, which probably testifies to the so-called mode softening at the approach of the failure point according to the catastrophe theory.

  15. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  17. Probabilistic modeling of earthquakes

    NASA Astrophysics Data System (ADS)

    Duputel, Z.; Jolivet, R.; Jiang, J.; Simons, M.; Rivera, L. A.; Ampuero, J. P.; Gombert, B.; Minson, S. E.

    2015-12-01

    By exploiting increasing amounts of geophysical data we are able to produce increasingly sophisticated fault slip models. Such detailed models, while they are essential ingredients towards better understanding fault mechanical behavior, can only inform us in a meaningful way if we can assign uncertainties to the inferred slip parameters. This talk will present our recent efforts to infer fault slip models with realistic error estimates. Bayesian analysis is a useful tool for this purpose as it handles uncertainty in a natural way. One of the biggest obstacles to significant progress in observational earthquake source modeling arises from imperfect predictions of geodetic and seismic data due to uncertainties in the material parameters and fault geometries used in our forward models - the impact of which is generally overlooked. We recently developed physically based statistics for the model prediction error and showed how to account for inaccuracies in the Earth model elastic parameters. We will present applications of this formalism to recent large earthquakes such as the 2014 Pisagua earthquake. We will also discuss novel approaches to integrate the large amount of information available from GPS, InSAR, tide-gauge, tsunami and seismic data.
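
    The key ingredient described above is a misfit covariance that adds a model-prediction-error term to the usual data errors. The Python sketch below shows a Gaussian log-posterior built that way for a linear forward model; the matrices are placeholders chosen for illustration, not the authors' operators.

        import numpy as np

        def log_posterior(slip, d_obs, G, C_d, C_p, C_m, m_prior):
            # Total misfit covariance = data errors (C_d) + model prediction errors (C_p),
            # the latter representing uncertain elastic parameters and fault geometry.
            C_chi = C_d + C_p
            r = d_obs - G @ slip          # residual for a linear forward model G
            dm = slip - m_prior
            return -0.5 * (r @ np.linalg.solve(C_chi, r) + dm @ np.linalg.solve(C_m, dm))

        # Tiny synthetic example; in practice the posterior is explored with a
        # sampler (e.g. MCMC), yielding slip estimates with credible intervals.
        rng = np.random.default_rng(0)
        G = rng.normal(size=(6, 2))
        d = G @ np.array([1.0, 0.5]) + 0.05 * rng.normal(size=6)
        print(log_posterior(np.array([1.0, 0.5]), d, G,
                            0.0025 * np.eye(6), 0.01 * np.eye(6), np.eye(2), np.zeros(2)))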

  19. Performed Surgical Interventions After the 1999 Marmara Earthquake in Turkey, and Their Importance Regarding Nursing Practices.

    PubMed

    Gul, Asiye; Andsoy, Isil Isik

    2015-01-01

    Effectively dealing with earthquakes is especially important for the people who live in areas prone to earthquakes such as the country of Turkey. Trauma related to earthquakes has specific relevance to nursing practice. The purpose of this review was to describe the types of surgical interventions after the Marmara earthquake and to evaluate the implications for nursing care. English and Turkish articles about the Marmara earthquake were reviewed between May and July 2013. A total of 7 studies were evaluated. The number of patients admitted to the units, types of injuries, and surgical treatments were recorded, with a total of 2378 patients with earthquake-related injuries. The most commonly traumatized parts of the body were the extremities. Fasciotomy operations were performed on 286 patients and 75 patients underwent extremity amputations. Predetermining surgical problems and interventions may be useful in planning for possible future problems in the case of a disaster.

  20. Earthquakes, September-October 1980

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    There were two major (magnitudes 7.0-7.9) earthquakes during this reporting period; a magnitude (M) 7.3 in Algeria where many people were killed or injured and extensive damage occurred, and an M=7.2 in the Loyalty Islands region of the South Pacific. Japan was struck by a damaging earthquake on September 24, killing two people and causing injuries. There were no damaging earthquakes in the United States. 

  1. Earthquakes; July-August 1977

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    July and August were somewhat active seismically speaking, compared to previous months of this year. There were seven earthquakes having magnitudes of 6.5 or greater. The largest was a magnitude 8.0 earthquake south of Sumbawa Island on August 19 that killed at least 111. The United States experienced a number of earthquakes during this period, but only one, in California, caused some minor damage.

  2. Space geodesy and earthquake prediction

    NASA Technical Reports Server (NTRS)

    Bilham, Roger

    1987-01-01

    Earthquake prediction is discussed from the point of view of a new development in geodesy known as space geodesy, which involves the use of extraterrestrial sources or reflectors to measure earth-based distances. Space geodesy is explained, and its relation to terrestrial geodesy is examined. The characteristics of earthquakes are reviewed, and the ways that they can be exploited by space geodesy to predict earthquakes is demonstrated.

  4. Earthquakes, November-December 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were three major earthquakes (7.0-7.9) during the last two months of the year: a magnitude 7.0 on November 19 in Colombia, a magnitude 7.4 in the Kuril Islands on December 22, and a magnitude 7.1 in the South Sandwich Islands on December 27. Earthquake-related deaths were reported in Colombia, Yemen, and Iran. There were no significant earthquakes in the United States during this reporting period.

  5. Relevance of seismicity in Kumaun-Garhwal Himalaya in context of recent 25th April 2015 Mw7.8 Nepal earthquake

    NASA Astrophysics Data System (ADS)

    Paul, Ajay; Singh, Rakesh

    2017-06-01

    Due to the continuous underthrusting of the Indian plate beneath the Eurasian plate, strain energy accumulates and is released substantially in the form of great earthquakes. It is yet to be released in the unruptured zone that lies between the rupture zones of two great earthquakes, the 1934 Bihar-Nepal earthquake and the 1905 Kangra earthquake. This zone has been termed the Central Seismic Gap (CSG). The occurrence of previous great earthquakes and the recent Nepal earthquakes gives a clue to the probable location of a future great earthquake. On the basis of seismicity analysis in the locked portion and stress-drop analysis, the CSG has been redefined and the probable zone of a future great earthquake demarcated.

  6. Earthquake prediction: Simple methods for complex phenomena

    NASA Astrophysics Data System (ADS)

    Luen, Bradley

    2010-09-01

    Earthquake predictions are often either based on stochastic models, or tested using stochastic models. Tests of predictions often tacitly assume predictions do not depend on past seismicity, which is false. We construct a naive predictor that, following each large earthquake, predicts another large earthquake will occur nearby soon. Because this "automatic alarm" strategy exploits clustering, it succeeds beyond "chance" according to a test that holds the predictions fixed. Some researchers try to remove clustering from earthquake catalogs and model the remaining events. There have been claims that the declustered catalogs are Poisson on the basis of statistical tests we show to be weak. Better tests show that declustered catalogs are not Poisson. In fact, there is evidence that events in declustered catalogs do not have exchangeable times given the locations, a necessary condition for the Poisson. If seismicity followed a stochastic process, an optimal predictor would turn on an alarm when the conditional intensity is high. The Epidemic-Type Aftershock (ETAS) model is a popular point process model that includes clustering. It has many parameters, but is still a simplification of seismicity. Estimating the model is difficult, and estimated parameters often give a non-stationary model. Even if the model is ETAS, temporal predictions based on the ETAS conditional intensity are not much better than those of magnitude-dependent automatic (MDA) alarms, a much simpler strategy with only one parameter instead of five. For a catalog of Southern Californian seismicity, ETAS predictions again offer only slight improvement over MDA alarms.
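
    The automatic alarm idea is simple enough to state in a few lines. The Python sketch below is a temporal-only toy version (spatial windows are omitted, and the magnitude-dependent duration rule is an assumed illustrative form, not the paper's calibrated one): after each event at or above a trigger magnitude, an alarm is declared whose length can grow with magnitude.

        def alarm_intervals(catalog, m_trig=5.5, tau_days=30.0, mag_scaling=0.0):
            # catalog: iterable of (time_in_days, magnitude) pairs.
            # mag_scaling = 0 gives fixed-length alarms; > 0 makes the duration
            # grow as tau_days * 10 ** (mag_scaling * (m - m_trig)).
            windows = sorted((t, t + tau_days * 10.0 ** (mag_scaling * (m - m_trig)))
                             for t, m in catalog if m >= m_trig)
            merged = []
            for start, end in windows:
                if merged and start <= merged[-1][1]:
                    merged[-1] = (merged[-1][0], max(merged[-1][1], end))
                else:
                    merged.append((start, end))
            return merged  # success rate vs. total alarm time gauges the predictor

        print(alarm_intervals([(0.0, 6.1), (2.0, 5.6), (400.0, 7.0)], mag_scaling=0.5))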

  7. The great Lisbon earthquake and tsunami of 1755: lessons from the recent Sumatra earthquakes and possible link to Plato's Atlantis

    NASA Astrophysics Data System (ADS)

    Gutscher, M.-A.

    2006-05-01

    Great earthquakes and tsunami can have a tremendous societal impact. The Lisbon earthquake and tsunami of 1755 caused tens of thousands of deaths in Portugal, Spain and NW Morocco. Felt as far as Hamburg and the Azores islands, its magnitude is estimated to be 8.5-9. However, because of the complex tectonics in Southern Iberia, the fault that produced the earthquake has not yet been clearly identified. Recently acquired data from the Gulf of Cadiz area (tomography, seismic profiles, high-resolution bathymetry, sampled active mud volcanoes) provide strong evidence for an active east dipping subduction zone beneath Gibraltar. Eleven out of 12 of the strongest earthquakes (M>8.5) of the past 100 years occurred along subduction zone megathrusts (including the December 2004 and March 2005 Sumatra earthquakes). Thus, it appears likely that the 1755 earthquake and tsunami were generated in a similar fashion, along the shallow east-dipping subduction fault plane. This implies that the Cadiz subduction zone is locked (like the Cascadia and Nankai/Japan subduction zones), with great earthquakes occurring over long return periods. Indeed, the regional paleoseismic record (contained in deep-water turbidites and shallow lagoon deposits) suggests great earthquakes off South West Iberia every 1500-2000 years. Tsunami deposits indicate an earlier great earthquake struck SW Iberia around 200 BC, as noted by Roman records from Cadiz. A written record of even older events may also exist. According to Plato's dialogues The Critias and The Timaeus, Atlantis was destroyed by ‘strong earthquakes and floods … in a single day and night’ at a date given as 11,600 BP. A 1 m thick turbidite deposit, containing coarse grained sediments from underwater avalanches, has been dated at 12,000 BP and may correspond to the destructive earthquake and tsunami described by Plato. The effects on a paleo-island (Spartel) in the straits of Gibraltar would have been devastating, if inhabited, and may

  8. Earthquakes, January-February 1974

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    During the first 2 months of 1974, earthquakes caused fatalities in Peru and Turkey. The largest earthquake during the period was a magnitude 7.2 shock in the New Hebrides Islands. A local tsunami was generated by a magnitude 7.0 earthquake in the Solomon Islands. The relative quiet that characterized world seismicity during the last year continued through the period. There have been no great earthquakes (magnitude 8.0 or larger) since January 10, 1971, when a magnitude 8.1 shock occurred in western New Guinea. 

  9. Radon in earthquake prediction research.

    PubMed

    Friedmann, H

    2012-04-01

    The observation of anomalies in the radon concentration in soil gas and ground water before earthquakes initiated systematic investigations on earthquake precursor phenomena. The questions of what is needed for a meaningful earthquake prediction and what types of precursory effects can be expected are briefly discussed. The basic ideas of the dilatancy theory are presented, which in principle can explain the occurrence of earthquake forerunners. The reasons for radon anomalies in soil gas and in ground water are clarified and a possible classification of radon anomalies is given.
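
    The abstract above refers to classifying radon anomalies; one generic operational definition, assumed here purely for illustration and not necessarily the paper's scheme, flags readings that deviate by more than k standard deviations from a trailing baseline.

```python
import statistics

def radon_anomalies(series, window=30, k=2.0):
    """Indices where a radon reading deviates by more than k standard
    deviations from the mean of the preceding `window` samples.
    series: equally spaced radon concentrations (e.g. in Bq/m^3)."""
    flags = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu = statistics.fmean(past)
        sigma = statistics.stdev(past)
        if sigma > 0 and abs(series[i] - mu) > k * sigma:
            flags.append(i)
    return flags
```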

  10. ViscoSim Earthquake Simulator

    USGS Publications Warehouse

    Pollitz, Fred

    2012-01-01

    Synthetic seismicity simulations have been explored by the Southern California Earthquake Center (SCEC) Earthquake Simulators Group in order to guide long‐term forecasting efforts related to the Unified California Earthquake Rupture Forecast (Tullis et al., 2012a). In this study I describe the viscoelastic earthquake simulator (ViscoSim) of Pollitz (2009). Recapitulating to a large extent material previously presented by Pollitz (2009, 2011), I describe its implementation of synthetic ruptures and how it differs from other simulators being used by the group.

  11. Earthquakes, July-August 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake of June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  12. Earthquakes in the United States

    USGS Publications Warehouse

    Stover, C.

    1977-01-01

    To supplement data in the report Preliminary Determination of Epicenters (PDE), the National Earthquake Information Service (NEIS) also publishes a quarterly circular, Earthquakes in the United States. This provides information on the felt area of U.S. earthquakes and their intensity. The main purpose is to describe the larger effects of these earthquakes so that they can be used in seismic risk studies, in site evaluations for nuclear power plants, and in answering inquiries from the general public.

  13. Earthquakes, May-June 1981

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was one major earthquake (7.0-7.9) off the west coast of South Island, New Zealand. The most destructive earthquake during this reporting period was in southern Iran on June 11, which caused fatalities and extensive damage. Peru also experienced a destructive earthquake on June 22, which caused fatalities and damage. In the United States, a number of earthquakes were experienced, but none caused significant damage.

  14. Modified mercalli intensities for nine earthquakes in central and western Washington between 1989 and 1999

    USGS Publications Warehouse

    Brocher, Thomas M.; Dewey, James W.; Cassidy, John F.

    2017-08-15

    We determine Modified Mercalli (Seismic) Intensities (MMI) for nine onshore earthquakes of magnitude 4.5 and larger that occurred in central and western Washington between 1989 and 1999, on the basis of effects reported in postal questionnaires, the press, and professional collaborators. The earthquakes studied include four earthquakes of M5 and larger: the M5.0 Deming earthquake of April 13, 1990, the M5.0 Point Robinson earthquake of January 29, 1995, the M5.4 Duvall earthquake of May 3, 1996, and the M5.8 Satsop earthquake of July 3, 1999. The MMI are assigned using data and procedures that evolved at the U.S. Geological Survey (USGS) and its Department of Commerce predecessors and that were used to assign MMI to felt earthquakes occurring in the United States between 1931 and 1986. We refer to the MMI assigned in this report as traditional MMI, because they are based on responses to postal questionnaires and on newspaper reports, and to distinguish them from MMI calculated from data contributed by the public by way of the internet. Maximum traditional MMI documented for the M5 and larger earthquakes are VII for the 1990 Deming earthquake, V for the 1995 Point Robinson earthquake, VI for the 1996 Duvall earthquake, and VII for the 1999 Satsop earthquake; the five other earthquakes were variously assigned maximum intensities of IV, V, or VI. Starting in 1995, the Pacific Northwest Seismic Network (PNSN) published MMI maps for four of the studied earthquakes, based on macroseismic observations submitted by the public by way of the internet. With the availability now of the traditional USGS MMI interpreted for all the sites from which USGS postal questionnaires were returned, the four Washington earthquakes join a rather small group of earthquakes for which both traditional USGS MMI and some type of internet-based MMI have been assigned. The values and distributions of the traditional MMI are broadly similar to the internet-based PNSN intensities; we discuss some

  15. Some differences in seismic hazard assessment for natural and fluid-induced earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2014-12-01

    Although there is little doubt that fluid-induced earthquakes contribute significantly to the seismic hazard in some parts of the United States, assessing this contribution in ways consistent with hazard assessment for natural earthquakes is proving to be challenging. For natural earthquakes, the hazard is considered to be independent of time whereas for fluid-induced seismicity there is considerable time dependence as evidenced, for instance, by the dramatic increase in recent years of the seismicity in Oklahoma. Case histories of earthquakes induced by the development of Enhanced Geothermal Systems and wastewater injection at depth illustrate a few of the problems. Analyses of earthquake sequences induced by these operations indicate that the rate of earthquake occurrence is proportional to the rate of injection, a factor that, on a broad scale, depends on the level of energy production activities. For natural earthquakes, in contrast, the rate of earthquake occurrence depends on time-independent tectonic factors including the long-term slip rates across known faults. Maximum magnitude assessments for natural and fluid-induced earthquake sources also show a contrast in behavior. For a natural earthquake source, maximum magnitude is commonly assessed from empirical relations between magnitude and the area of a potentially-active fault. The same procedure applied to fluid-induced earthquakes yields magnitudes that are systematically higher than what is observed. For instance, the maximum magnitude estimated from the fault area of the Prague, OK, main shock of 6 November 2011 is 6.2 whereas the magnitude measured from seismic data is 5.65 (Sun and Hartzell, 2014). For fluid-induced earthquakes, maximum magnitude appears to be limited according to the volume of fluid injected before the largest earthquake. This implies that for a given fluid-injection project, the upper limit on magnitude increases as long as injection continues.
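
    The volume-based limit described in the final sentences can be made concrete: McGarr's scaling takes the maximum seismic moment to be the modulus of rigidity times the injected volume. The shear-modulus value and the Hanks-Kanamori conversion below are standard numbers, but treat the sketch as illustrative rather than as the paper's exact procedure.

```python
import math

G = 3.0e10  # shear modulus of crustal rock, Pa (typical value)

def max_induced_magnitude(injected_volume_m3):
    """Upper-bound moment magnitude for fluid-induced seismicity,
    assuming max seismic moment M0 = G * dV (McGarr-type scaling) and
    the Hanks-Kanamori relation Mw = (2/3) * (log10 M0 - 9.1)."""
    m0 = G * injected_volume_m3  # N*m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

print(round(max_induced_magnitude(1.0e5), 2))  # ~4.25 for 1e5 m^3 injected
```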

  16. Atmospheric Baseline Monitoring Data Losses Due to the Samoa Earthquake

    NASA Astrophysics Data System (ADS)

    Schnell, R. C.; Cunningham, M. C.; Vasel, B. A.; Butler, J. H.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA) operates an Atmospheric Baseline Observatory at Cape Matatula on the north-eastern point of American Samoa, opened in 1973. The manned observatory conducts continuous measurements of a wide range of climate-forcing and atmospheric-composition data, including greenhouse gas concentrations, solar radiation, CFC and HFC concentrations, aerosols, and ozone, as well as less frequent measurements of many other parameters. The onset of the September 29, 2009, earthquake is clearly visible in the continuous data streams in a variety of ways. The station electrical generator came online when the Samoa power grid failed, so instruments were powered during and subsequent to the earthquake. Some instruments ceased operation in a spurt of spurious data followed by silence. Other instruments stopped sending data abruptly when the shaking from the earthquake broke a data or power link, or when an integral part of the instrument was damaged. Others survived the shaking but were put out of calibration. Still others suffered damage after the earthquake as heaters ran uncontrolled or rotating shafts continued operating in a damaged environment, grinding away until they seized up or chewed out a new operating space. Some instruments operated as if there had been no earthquake; others were brought back online within a few days. Many of the more complex (and, in most cases, most expensive) instruments will be out of service for 6 months or more. This presentation will show these results and discuss the impact of the earthquake on long-term measurements of climate forcing agents and other critical climate measurements.

  17. Did an Earthquake Trigger the Eruption of the Sidoarjo (Lusi) Mud Volcano?

    NASA Astrophysics Data System (ADS)

    Brumm, M.; Manga, M.; Davies, R. J.

    2007-12-01

    On May 29, 2006, a mud volcano started to erupt in the Porong District of Sidoarjo. The volcano, now known as "Lusi", has displaced tens of thousands of people. It also offers a unique opportunity to observe the processes that initiate and sustain mud volcano eruptions. Three trigger mechanisms have been proposed: (a) the May 27, 2006, Yogyakarta earthquake, (b) well drilling operations in progress near the initial eruption site at the time of the eruption, and (c) a combination of earthquake and drilling operations. Here we consider possibility (a). We compare the distance and magnitude of the Yogyakarta earthquake to the relationship between distance and magnitude of historical earthquakes that caused liquefaction and triggered the eruption of mud volcanoes elsewhere. We also evaluate the static stress changes caused by the Yogyakarta earthquake, and compare the strength of ground shaking with that caused by other regional earthquakes. We find that (1) the Yogyakarta earthquake was smaller and farther away than earthquakes that have been observed to trigger liquefaction in other settings, (2) the static stress changes caused by the Yogyakarta earthquake were much smaller than changes in stress caused by tides or barometric pressure changes, (3) in the past 35 years, tens to hundreds of other earthquakes caused stronger ground shaking at the site of the eruption but did not trigger an eruption, and (4) the period immediately preceding the eruption was seismically quieter than average, suggesting that previous earthquakes did not bring the subsurface into a critical state. Based on these results, we conclude that the Yogyakarta earthquake, by itself, was not sufficient to trigger an eruption.
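
    The magnitude-distance comparison in point (1) is often expressed through seismic energy density. The empirical relation below is from Wang (2007), and the ~0.1 J/m^3 liquefaction threshold is a commonly quoted figure; both, and the ~250 km distance, are assumptions used here only to illustrate the argument, not the paper's exact calculation.

```python
import math

def seismic_energy_density(magnitude, distance_km):
    """Seismic energy density e (J/m^3) at distance r (km), obtained by
    inverting the empirical relation of Wang (2007):
        log10 r = 0.48*M - 0.33*log10 e - 1.4"""
    log_e = (0.48 * magnitude - 1.4 - math.log10(distance_km)) / 0.33
    return 10 ** log_e

# Yogyakarta, M6.3, roughly 250 km from the eruption site (assumed values):
print(f"{seismic_energy_density(6.3, 250.0):.1e} J/m^3")
# ~5e-03, well below the ~1e-1 J/m^3 often associated with liquefaction
```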

  18. Catalog of earthquakes along the San Andreas fault system in Central California, April-June 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period April - June, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). A catalog for the first quarter of 1972 has been prepared by Wesson and others (1972). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 910 earthquakes in Central California. A substantial portion of the earthquakes reported in this catalog represents a continuation of the sequence of earthquakes in the Bear Valley area which began in February, 1972 (Wesson and others, 1972). Arrival times at 126 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 101 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement

  19. Catalog of earthquakes along the San Andreas fault system in Central California: January-March, 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Meagher, K.L.

    1973-01-01

    Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period January - March, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1,718 earthquakes in Central California. Of particular interest is a sequence of earthquakes in the Bear Valley area which contained single shocks with local magnitudes of 5.0 and 4.6. Earthquakes from this sequence make up roughly 66% of the total and are currently the subject of an interpretative study. Arrival times at 118 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 94 are telemetered stations operated by NCER. Readings from the remaining 24 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the

  20. Automated Radar Image of Deformation for Amatrice, Italy Earthquake

    NASA Image and Video Library

    2016-08-31

    The August 2016 Amatrice earthquake in central Italy caused widespread building damage to several towns throughout the region. This earthquake was the strongest in that area since the 2009 earthquake that destroyed the city of L'Aquila. The Advanced Rapid Imaging and Analysis (ARIA) data system, a collaborative project between NASA's Jet Propulsion Laboratory, Pasadena, California, and the California Institute of Technology in Pasadena, automatically generated interferometric synthetic aperture radar images from the Copernicus Sentinel 1A satellite operated by the European Space Agency (ESA) for the European Commission to calculate a map of the deformation of Earth's surface caused by the quake. This false-color map shows the amount of permanent surface movement, as viewed by the satellite, during a 12-day interval between two Sentinel 1 images acquired on Aug. 15, 2016, and Aug. 27, 2016. The movement was caused almost entirely by the earthquake. In this map, the colors of the surface displacements are proportional to the surface motion. The red and pink tones show the areas where the land moved toward the satellite by up to 2 inches (5 centimeters). The area with various shades of blue moved away from the satellite, mostly downward, by as much as 8 inches (20 centimeters). Contours on the surface motion are at intervals of 2 inches (5 centimeters). The green star shows the epicenter where the earthquake started as located by the U.S. Geological Survey National Earthquake Information Center. Black dots show town locations. Scientists use these maps to build detailed models of the fault slip at depth and associated land movements to better understand the impact on future earthquake activity. The map shows that the fault or faults that moved in the earthquake extend about 14 miles (22 kilometers) between Amatrice and Norcia and slope to the west beneath the area that moved downward. http://photojournal.jpl.nasa.gov/catalog/PIA20896
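
    The deformation map described above is derived from interferometric phase; the conversion from unwrapped phase to line-of-sight motion is standard, though the sign convention varies between processors. A sketch under those assumptions:

```python
import math

WAVELENGTH_M = 0.0555  # Sentinel-1 C-band radar wavelength, ~5.55 cm

def los_displacement_m(unwrapped_phase_rad):
    """Line-of-sight displacement from unwrapped interferometric phase.
    One fringe (2*pi) corresponds to half a wavelength of motion because
    the radar path is two-way; positive output here means motion toward
    the satellite (sign conventions differ between processors)."""
    return -WAVELENGTH_M / (4.0 * math.pi) * unwrapped_phase_rad

# Three fringes of phase correspond to ~8.3 cm of line-of-sight motion:
print(abs(los_displacement_m(3 * 2 * math.pi)))  # ~0.083 m
```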

  1. Source Rupture Process of the 2005 Tarapaca Intermediate Depth Earthquake

    NASA Astrophysics Data System (ADS)

    Peyrat, S.; Favreau, P.; de Chabalier, J.; Bouin, M.

    2007-12-01

    We investigate the details of the rupture process of the large (Mw 7.7) intermediate-depth earthquake that occurred on 13 June 2005 in the Tarapaca region of the Northern Chile seismic gap, using different data sets and different methods. The high quality and variety of seismic and geodetic data available for this event provided an unprecedented opportunity to study its source in detail. This earthquake is a slab-pull event with a down-dip extensional source mechanism. The aftershock distribution, determined from a post-seismic temporary array, indicates a sub-horizontal fault plane lying between the upper and lower planes of the double seismic zone. This earthquake was also recorded by a permanent digital strong-motion network operated by the University of Chile. These records have absolute time and high dynamic range, so they contain direct information about the rupture process. We used a systematic, fully nonlinear inversion method based on the neighbourhood algorithm to invert for the kinematic slip distribution using the accelerometric data set. This low-frequency inversion provides a relatively smooth image of the rupture history. The kinematic inversion shows that the earthquake occurred by the rupture of two asperities. Based on the kinematic inversion result, we propose dynamic rupture models in order to quantify the dynamic rupture process. We simulate the dynamic rupture process and the strong ground motion using a 3D finite-difference method. In our simulation, dynamic rupture grows under the simultaneous control of initial stress and rupture resistance by friction. We constrain the dynamic rupture parameters of the Tarapaca earthquake by simple trial and error. Large intraplate earthquakes in subduction zones are quite common, although very few have been studied in detail. These earthquakes occur at depths where the mechanism by which they are triggered remains poorly understood. Consequently, the determination of source rupture for intermediate

  2. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    NASA Astrophysics Data System (ADS)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Los Angelinos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US, illustrate how smartphones can contribute to the system, and review applications of the information to reduce future losses.
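
    The seconds-to-minutes warning quoted above comes from the gap between fast P-wave detection and the later arrival of strong S-wave shaking. The velocity, depth, and latency below are illustrative round numbers, not ShakeAlert's actual parameters.

```python
def warning_time_s(epicentral_km, depth_km=8.0, vs_km_s=3.5,
                   detection_latency_s=5.0):
    """Rough early-warning budget: seconds between the alert and the
    arrival of S-wave shaking at a site. Real systems use travel-time
    models and measured telemetry delays rather than these constants."""
    hypocentral_km = (epicentral_km ** 2 + depth_km ** 2) ** 0.5
    return hypocentral_km / vs_km_s - detection_latency_s

print(round(warning_time_s(100.0), 1))  # ~23.7 s of warning at 100 km
```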

  3. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan (the western, eastern, and northeastern Taiwan regions) using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, high probability values are usually produced by clustered events, such as events with foreshocks and events that occur within a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts; they also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake, and the probability obtained from the activation model increases as large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.
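
    The ROC analysis mentioned above compares hit rates against false-alarm rates as an alarm threshold sweeps over the probability map; the sketch below shows that bookkeeping for assumed array inputs, not the study's actual pipeline.

```python
import numpy as np

def roc_curve(prob_map, outcome_map, n_thresholds=101):
    """ROC points for a gridded probability forecast.
    prob_map: forecast probability per cell; outcome_map: True where a
    target earthquake occurred. Returns (false_alarm_rate, hit_rate)."""
    p = np.asarray(prob_map, dtype=float).ravel()
    y = np.asarray(outcome_map).ravel().astype(bool)
    far, hit = [], []
    for th in np.linspace(0.0, 1.0, n_thresholds):
        alarm = p >= th
        hit.append((alarm & y).sum() / max(y.sum(), 1))
        far.append((alarm & ~y).sum() / max((~y).sum(), 1))
    return np.array(far), np.array(hit)

# A skillful forecast's curve lies above the diagonal hit = false-alarm rate.
```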

  4. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction on time frames of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that the new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified some pre-earthquake events which are now recognized as confirmed warning signs (precursors) of past known earthquakes. With the advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now a regular task. Data providers are supplying researchers from all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere, has frequently been reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, the leak of radon gas as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising method for future successful earthquake prediction, without proper and thorough knowledge of the geological setting, atmospheric factors, and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. Active faulting is a key factor in identification of the

  5. Ensemble Model Earthquake Forecasts During the 2010-12 Canterbury, New Zealand, Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Taroni, M.; Werner, M. J.; Marzocchi, W.; Zechar, J. D.

    2016-12-01

    Ensemble models provide at least two advantages for earthquake forecasting. First, they circumvent the question of which model to choose for operational purposes by objectively and transparently merging all available ones. Second, the optimally merged model may provide better forecasts than any single model. In addition, complementary or inconsistent models, hypotheses, and input data sets can easily be combined according to defined rules. Our purpose here is to investigate ensemble modeling techniques in the context of the 2010-12 Canterbury, New Zealand, earthquake sequence, for which over a dozen time-dependent earthquake forecast models have been developed and are currently under evaluation by the Collaboratory for the Study of Earthquake Predictability (CSEP). The models include recently refined and developed physics-based Coulomb stress models as well as new statistical models and hybrid models that employ physics-based components within a statistical framework. Here, we explore ensemble modeling techniques to create optimal forecasts by merging all available model forecasts. The mixing of the models is determined by a dynamic, weighted average over all forecasts, where the weights are determined according to a continually updated measure of past predictive skill. The ensemble model thus changes from day to day, giving greater weight to more informative models. We compare several methods for assembling ensemble models in terms of their predictive skill during the sequence and compare the optimal models with the individual best models.
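
    One way to realize the skill-weighted averaging described above is a Bayesian-model-average style weighting by exponentiated past log-likelihood; the merging rule below is one of several reasonable choices and is not necessarily the study's.

```python
import numpy as np

def ensemble_forecast(forecasts, past_log_likelihoods):
    """Merge gridded rate forecasts using weights derived from past skill.
    forecasts: (n_models, n_cells) expected rates for the next period.
    past_log_likelihoods: (n_models,) log-likelihood of each model on
    data observed so far; re-evaluating these daily makes the mixture
    shift toward the more informative models over time."""
    ll = np.asarray(past_log_likelihoods, dtype=float)
    w = np.exp(ll - ll.max())  # subtract the max for numerical stability
    w /= w.sum()
    return w @ np.asarray(forecasts, dtype=float)
```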

  6. Neoliberalism and criticisms of earthquake insurance arrangements in New Zealand.

    PubMed

    Hay, I

    1996-03-01

    Global collapse of the Fordist-Keynesian regime of accumulation and an attendant philosophical shift in New Zealand politics to neoliberalism have prompted criticisms of, and changes to, the Earthquake and War Damage Commission. Earthquake insurance arrangements made 50 years ago in an era of collectivist, welfarist political action are now set in an environment in which emphasis is given to competitive relations and individualism. Six specific criticisms of the Commission are identified, each of which is founded in the rhetoric and ideology of a neoliberal political project which has underpinned radical social and economic changes in New Zealand since the early 1980s. On the basis of those criticisms, and in terms of the Earthquake Commission Act 1993, the Commission has been restructured. The new Commission is withdrawing from its primary position as the nation's non-residential property hazards insurer and is restricting its coverage of residential properties.

  7. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    NASA Astrophysics Data System (ADS)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award winning ASIGN protocol initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging System (MMS), RICHTER allows access to high definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation) or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are compressed to 20-30KB of data typically for fast transfer and to avoid network overload. Full size images can be requested by the EMSC either fully automatically, or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations notably for the damage assessment of the January 12, 2010 Haiti earthquake where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. The EMSC is the second

  8. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    USGS Publications Warehouse

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
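
    The central-server correlation step can be illustrated with a toy associator: declare a detection when enough distinct stations trigger within a short time window. Real QCN processing also checks spatial coherence and ground-motion amplitudes; the window length and station count below are made-up values.

```python
from collections import deque

def detect_events(triggers, window_s=3.0, min_stations=4):
    """Declare an event when >= min_stations distinct stations trigger
    within window_s seconds. triggers: time-sorted (time_s, station_id)."""
    events, recent = [], deque()
    for t, sta in triggers:
        recent.append((t, sta))
        while recent and t - recent[0][0] > window_s:
            recent.popleft()
        if len({s for _, s in recent}) >= min_stations:
            events.append(t)
            recent.clear()  # avoid duplicate declarations for one event
    return events

print(detect_events([(0.0, "A"), (0.5, "B"), (1.2, "C"), (1.9, "D")]))  # [1.9]
```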

  9. Integrated Seismicity Model to Detect Pairs of Possible Interdependent Earthquakes and Its Application to Aftershocks of the 2011 Tohoku-Oki Earthquake and Sequence of the 2014 Kermadec and Rat Islands Earthquakes

    NASA Astrophysics Data System (ADS)

    Miyazawa, M.; Tamura, R.

    2015-12-01

    We introduce an integrated seismicity model to stochastically evaluate the time intervals of consecutive earthquakes at global scales, making it possible to detect a pair of earthquakes that are remotely located and possibly related to each other. The model includes seismicity in non-overlapping areas and comprehensively explains the seismicity on the basis of point process models, which include the stationary Poisson model, the aftershock decay model following Omori-Utsu's law, and/or the epidemic-type aftershock sequence (ETAS) model. By use of this model, we examine the possibility of remote triggering of the 2011 M6.4 eastern Shizuoka earthquake in the vicinity of Mt. Fuji, which occurred 4 days after the Mw9.0 Tohoku-Oki earthquake and 4 minutes after the M6.2 off-Fukushima earthquake located about 400 km away, and that of the 2014 Mw7.9 Rat Islands earthquake, which occurred within one hour after the Mw6.7 Kermadec earthquake located about 9,000 km away and followed two large (Mw6.9, 6.5) earthquakes in the region. Both target earthquakes occurred during the passage of surface waves propagating from the previous large events. We estimated the probability that the time interval is shorter than that between consecutive events and obtained dynamic stress changes on the faults. The results indicate that the M6.4 eastern Shizuoka event may rather have been triggered by the static stress changes from the Tohoku-Oki earthquake and that the Mw7.9 Rat Islands event may have been remotely triggered by the Kermadec events, possibly via cyclic fatigue.
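
    The ETAS conditional intensity referred to above has a compact closed form; the parameter values in this sketch are placeholders, not values fitted to any catalog used in the study.

```python
import math

def etas_intensity(t, history, mu=0.2, K=0.05, c=0.01, p=1.1,
                   alpha=1.0, m0=4.0):
    """ETAS conditional intensity (events/day):
    lambda(t) = mu + sum over past events (t_i, m_i), t_i < t, of
                K * exp(alpha*(m_i - m0)) / (t - t_i + c)**p,
    i.e. a Poisson background plus Omori-Utsu aftershock decay scaled by
    a productivity that grows exponentially with magnitude."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# Rate one day after an M6.0 event on an otherwise quiet background:
print(round(etas_intensity(1.0, [(0.0, 6.0)]), 3))  # ~0.565 events/day
```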

  10. Analysis of worldwide earthquake mortality using multivariate demographic and seismic data.

    PubMed

    Gutiérrez, E; Taucer, F; De Groeve, T; Al-Khudhairy, D H A; Zaldivar, J M

    2005-06-15

    In this paper, mortality in the immediate aftermath of an earthquake is studied on a worldwide scale using multivariate analysis. A statistical method is presented that analyzes reported earthquake fatalities as a function of a heterogeneous set of parameters selected on the basis of their presumed influence on earthquake mortality. The ensemble was compiled from demographic, seismic, and reported fatality data culled from available records of past earthquakes organized in a geographic information system. The authors consider the statistical relation between earthquake mortality and the available data ensemble, analyze the validity of the results in view of the parametric uncertainties, and propose a multivariate mortality analysis prediction method. The analysis reveals that, although the highest mortality rates are expected in poorly developed rural areas, high fatality counts can result from a wide range of mortality ratios that depend on the effective population size.

  11. Who cares about Mid-Ocean Ridge Earthquakes? And Why?

    NASA Astrophysics Data System (ADS)

    Tolstoy, M.

    2004-12-01

    Every day the surface of our planet is being slowly ripped apart by the forces of plate tectonics. Much of this activity occurs underwater and goes unnoticed except by a few marine seismologists who avidly follow the creaks and groans of the ocean floor in an attempt to understand the spreading and formation of oceanic crust. Are marine seismologists really the only ones that care? As it turns out, deep beneath the ocean surface, earthquakes play a fundamental role in a myriad of activity centered on mid-ocean ridges, where new crust forms and breaks on a regular basis. This activity takes the form of exotic geological structures hosting roasting hot fluids and bizarre chemosynthetic life forms. One of the fundamental drivers for this other world on the seafloor is earthquakes. Earthquakes provide cracks that allow seawater to penetrate the rocks, heat up, and resurface as hydrothermal vent fluids, thus providing chemicals to feed a thriving biological community. Earthquakes can cause pressure changes along cracks that can fundamentally alter fluid flow rates and paths. Thus earthquakes can both cut off existing communities from their nutrient source and provide new oases on the seafloor around which life can thrive. This poster will present some of the fundamental physical principles of how earthquakes can impact fluid flow, and hence life, on the seafloor. Using these other-worldly landscapes and alien-like life forms to woo the unsuspecting passerby, we will sneak geophysics into the picture and tell the story of why earthquakes are so fundamental to life on the seafloor, and perhaps life elsewhere in the universe.

  12. Statistical Earthquake Focal Mechanism Forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.

    2013-12-01

    A new whole-Earth focal mechanism forecast, based on the GCMT catalog, has been created. In the present forecast, the sum of normalized seismic moment tensors within a 1000 km radius is calculated, and the P- and T-axes for the focal mechanism are evaluated on the basis of the sum. Simultaneously we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms. This average angle shows the tectonic complexity of a region and indicates the accuracy of the prediction. The method was originally proposed by Kagan and Jackson (1994, JGR). Recent interest by CSEP and GEM has motivated some improvements, particularly to extend the previous forecast to polar and near-polar regions. The major problem in extending the forecast is the focal mechanism calculation on a spherical surface. In the previous forecast, when the average focal mechanism was computed, it was assumed that longitude lines are approximately parallel within the 1000 km radius. This is largely accurate in equatorial and near-equatorial areas. However, when one approaches 75 degrees latitude, the longitude lines are no longer parallel: the bearing (azimuthal) difference at points separated by 1000 km reaches about 35 degrees. In most situations a forecast point where we calculate an average focal mechanism is surrounded by earthquakes, so the bias should not be strong, because the differences largely cancel. But if we move into polar regions, the bearing difference can approach 180 degrees. In the modified program, focal mechanisms are projected onto a plane tangent to the sphere at the forecast point. New longitude axes, which are parallel in the tangent plane, are corrected for the bearing difference. A comparison with the old 75S-75N forecast shows that in equatorial regions the forecasted focal mechanisms are almost the same, and the difference in the forecasted focal mechanism rotation angle is close to zero. However, though the forecasted focal mechanisms are similar
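
    The bearing effect described above is easy to reproduce with the standard initial-bearing formula for a great circle; the two test points below are an illustrative pair roughly 1000 km apart along the 75 N parallel.

```python
import math

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees
    clockwise from north (standard spherical formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360.0

# Two points ~1000 km apart on the 75 N parallel: the great circle leaves
# at ~73 degrees and arrives at ~107 degrees, a swing of roughly 34
# degrees, consistent with the ~35 degrees quoted in the abstract.
print(initial_bearing_deg(75.0, 0.0, 75.0, 35.0))  # ~73.1
```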

  13. Make an Earthquake: Ground Shaking!

    ERIC Educational Resources Information Center

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  14. Earthquakes Threaten Many American Schools

    ERIC Educational Resources Information Center

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  15. Heavy tails and earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.

    2012-01-01

    The 21st century has already seen its share of devastating earthquakes, some of which have been labeled as “unexpected,” at least in the eyes of some seismologists and more than a few journalists. A list of seismological surprises could include the 2004 Sumatra-Andaman Islands; 2008 Wenchuan, China; 2009 Haiti; 2011 Christchurch, New Zealand; and 2011 Tohoku, Japan, earthquakes.

  16. Earthquakes, July-August, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

    In the United States, on August 6, central California experienced a moderately strong earthquake, which injured several people and caused some damage. A number of earthquakes occurred in other parts of the United States but caused very little damage. 

  17. Earthquakes; May-June 1977

    USGS Publications Warehouse

    Person, W.J.

    1977-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was only one significant earthquake, a magnitude 7.2 on June 22 in the Tonga Islands. In the United States, the two largest earthquakes occurred in California and on Hawaii.

  19. Earthquake prediction; fact and fallacy

    USGS Publications Warehouse

    Hunter, R.N.

    1976-01-01

    Earthquake prediction is a young and growing area in the field of seismology. Only a few years ago, experts in seismology were declaring flatly that it was impossible. Now, some successes have been achieved and more are expected. Within a few years, earthquakes may be predicted as routinely as the weather, and possibly with greater accuracy. 

  1. Earthquakes March-April 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of March and April were quite active, seismically speaking. There was one major earthquake (7.0≤M<8.0). Earthquake-related deaths were reported in Iran, Costa Rica, Turkey, and Germany.

  2. Self-Organized Earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.; Klein, W.

    2011-12-01

    Self-Organized Criticality (SOC) was proposed by Per Bak et al. [1] as a means of explaining scaling laws observed in driven natural systems, usually in (slowly) driven threshold systems. The example used by Bak was a simple cellular automaton model of a sandpile, in which grains of sand were slowly dropped (randomly) onto a flat plate. After a period of time, during which the 'critical state' was approached, a series of self-similar avalanches would begin. Scaling exponents for the frequency-area statistics of the sandpile avalanches were found to be approximately 1, a value that characterizes 'flicker noise' in natural systems. SOC is associated with a critical point in the phase diagram of the system, and it was found that the usual 2-scaling field theory applies. A model related to SOC is the Self-Organized Spinodal (SOS), or intermittent criticality, model. Here a slow but persistent driving force leads to quasi-periodic approach to, and retreat from, the classical limit of stability, or spinodal. Scaling exponents for this model can be related to Gutenberg-Richter and Omori exponents observed in earthquake systems. In contrast to SOC models, nucleation, both classical and non-classical, is possible in SOS systems. Tunneling or nucleation rates can be computed from Langer-Klein-Landau-Ginzburg theories for comparison to observations. Nucleating droplets play a role similar to characteristic earthquake events. Simulations of these systems reveal much of the phenomenology associated with earthquakes and other types of "burst" dynamics. Whereas SOC is characterized by the full scaling spectrum of avalanches, SOS is characterized by both system-size events above the nominal frequency-size scaling curve and scaling of small events. Applications to other systems, including integrate-and-fire neural networks and financial crashes, will be discussed. [1] P. Bak, C. Tang and K. Wiesenfeld, Self-Organized Criticality, Phys. Rev. Lett., 59, 381 (1987).
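
    The sandpile automaton of [1] is short enough to sketch in full; the grid size and grain count here are arbitrary, and after an initial transient the avalanche sizes develop the heavy-tailed statistics the abstract discusses.

```python
import random

def btw_sandpile(n=20, grains=5000, seed=1):
    """Bak-Tang-Wiesenfeld sandpile: drop grains on random cells of an
    n x n grid; any cell reaching 4 grains topples, sending one grain to
    each neighbor (grains fall off the open boundary). Returns the
    avalanche size (number of topplings) for each dropped grain."""
    random.seed(seed)
    z = [[0] * n for _ in range(n)]
    sizes = []
    for _ in range(grains):
        i, j = random.randrange(n), random.randrange(n)
        z[i][j] += 1
        size, unstable = 0, [(i, j)]
        while unstable:
            a, b = unstable.pop()
            if z[a][b] < 4:
                continue  # cell already relaxed by an earlier topple
            z[a][b] -= 4
            size += 1
            for x, y in ((a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)):
                if 0 <= x < n and 0 <= y < n:
                    z[x][y] += 1
                    unstable.append((x, y))
            if z[a][b] >= 4:
                unstable.append((a, b))
        sizes.append(size)
    return sizes
```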

  3. Towards Modelling slow Earthquakes with Geodynamics

    NASA Astrophysics Data System (ADS)

    Regenauer-Lieb, K.; Yuen, D. A.

    2006-12-01

    We explore a new, properly scaled, thermal-mechanical geodynamic model [1] that can generate timescales now very close to those of earthquakes and of the same order as slow earthquakes. In our simulations we encounter two basically different bifurcation phenomena: one in which the shear zone nucleates in the ductile field, and a second that is fully associated with elasto-plastic (brittle, pressure-dependent) displacements. A quartz/feldspar composite slab has both modes operating simultaneously at three different depth levels. The bottom of the crust is predominantly controlled by the elasto-visco-plastic mode while the top is controlled by the elasto-plastic mode. The exchange of the two modes appears to communicate on a sub-horizontal layer in a flip-flop fashion, which may yield a fractal-like signature in time and collapses onto a critical temperature, which for crustal rocks is around 500-580 K, in the middle of the brittle-ductile transition zone. Near the critical temperature, stresses close to the ideal strength can be reached locally at the meter scale. Investigations of the thermal-mechanical properties under such extreme conditions are pivotal for understanding the physics of earthquakes. 1. Regenauer-Lieb, K., Weinberg, R. & Rosenbaum, G. The effect of energy feedbacks on continental strength. Nature 442, 67-70 (2006).

  4. Earthquake Catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of a regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA), "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus." The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  5. Earthquake Information System

    NASA Astrophysics Data System (ADS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  6. The Lusi mud eruption was not triggered by an earthquake

    NASA Astrophysics Data System (ADS)

    Manga, M.; Rudolph, M. L.; Tingay, M. R.; Davies, R.; Wang, C.; Shirzaei, M.; Fukushima, Y.

    2013-12-01

    The Lusi mud eruption in East Java, Indonesia has displaced tens of thousands of people, with economic costs that exceed $4 billion USD to date. Consequently, understanding the cause and future of the eruption is important. There has been considerable debate as to whether the eruption was triggered by the MW 6.3 Yogyakarta earthquake, which struck two days prior to the eruption, or by drilling operations at a gas exploration well (BJP-1) 200 m from the 700 m lineament along which mud first erupted. A recent letter by Lupi et al. (Nature Geoscience, 2013) argues for an earthquake trigger, invoking the presence of a seismically fast structure that amplifies seismic shaking in the mud source region. The absence of an eruption during larger and closer earthquakes reveals that an earthquake trigger is unlikely. Furthermore, the seismic velocities central to the model of Lupi et al. are impossibly high and are primarily artifacts associated with steel casing installed in the well where the velocities were measured. Finally, the stress changes caused by drilling operations greatly exceeded those produced by the earthquake. Assuming no major changes in plumbing, we conclude by using satellite InSAR to reveal the evolution of surface deformation caused by the eruption and predict a 10-fold decrease in discharge in the next 5 years.

  7. Exaggerated Claims About Earthquake Predictions

    NASA Astrophysics Data System (ADS)

    Kafka, Alan L.; Ebel, John E.

    2007-01-01

    The perennial promise of successful earthquake prediction captures the imagination of a public hungry for certainty in an uncertain world. Yet, given the lack of any reliable method of predicting earthquakes [e.g., Geller, 1997; Kagan and Jackson, 1996; Evans, 1997], seismologists regularly have to explain news stories of a supposedly successful earthquake prediction when it is far from clear just how successful that prediction actually was. When journalists and public relations offices report the latest 'great discovery' regarding the prediction of earthquakes, seismologists are left with the much less glamorous task of explaining to the public the gap between the claimed success and the sober reality that there is no scientifically proven method of predicting earthquakes.

  8. Earthquake Simulator Finds Tremor Triggers

    ScienceCinema

    Johnson, Paul

    2016-07-12

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves-the sounds radiated from earthquakes-can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials-like the type found along certain fault lines across the globe-and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  9. Earthquake Simulator Finds Tremor Triggers

    SciTech Connect

    Johnson, Paul

    2015-03-27

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves-the sounds radiated from earthquakes-can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials-like the type found along certain fault lines across the globe-and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  10. Are Earthquakes a Critical Phenomenon?

    NASA Astrophysics Data System (ADS)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena as critical, which implies that they are also unpredictable. This work revisits these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the probability density function of the avalanche-size distribution, and comparing it with the one obtained at a critical point, a condition of criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, one single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
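
    The OFC model mentioned above is compact enough to sketch directly. The version below assumes the standard non-conservative nearest-neighbor variant with open boundaries and a uniform slow drive; the lattice size, dissipation parameter alpha, and iteration count are illustrative choices, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
L, alpha, threshold = 32, 0.20, 1.0   # alpha < 0.25 => non-conservative
stress = rng.uniform(0.0, threshold, size=(L, L))

def relax(stress):
    """Topple all over-threshold sites; return the avalanche size."""
    size = 0
    unstable = np.argwhere(stress >= threshold)
    while unstable.size:
        for i, j in unstable:
            s = stress[i, j]
            stress[i, j] = 0.0
            size += 1
            # Pass a fraction alpha of the released stress to each neighbor;
            # stress crossing an open boundary is lost (dissipation).
            for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                if 0 <= ni < L and 0 <= nj < L:
                    stress[ni, nj] += alpha * s
        unstable = np.argwhere(stress >= threshold)
    return size

sizes = []
for _ in range(5000):
    stress += threshold - stress.max()   # uniform slow drive to the next event
    sizes.append(relax(stress))

# Avalanche-size statistics: a roughly straight line on log-log axes is the
# power-law behaviour the abstract refers to.
counts, edges = np.histogram(sizes, bins=np.logspace(0, 4, 17))
print(counts)
```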

  11. Early Earthquakes of the Americas

    NASA Astrophysics Data System (ADS)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks: did indigenous native cultures (Indians of the Pacific Northwest, Aztecs, Mayas, and Incas) document their natural history? Some events have been explicitly documented, for example in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  12. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  13. Intermediate-term earthquake prediction

    USGS Publications Warehouse

    Knopoff, L.

    1990-01-01

    The problems in predicting earthquakes have been attacked by phenomenological methods from pre-historic times to the present. The associations of presumed precursors with large earthquakes have often been remarked upon. The difficulty in identifying whether such correlations are due to chance coincidence or are real precursors is that one usually notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is also found in the rather long intervals that follow large earthquakes, or is in fact absent in these post-earthquake intervals. Given enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes.
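
    Knopoff's point, that a presumed precursor must also be checked against the quiet intervals after large events, lends itself to a simple counting experiment. The sketch below compares how often a candidate precursor appears in windows before versus after mainshocks; the synthetic catalogs and the window length are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic catalogs (times in years): candidate precursor observations and
# large earthquakes. Real catalogs would replace these arrays.
precursors = np.sort(rng.uniform(0.0, 100.0, 60))
mainshocks = np.sort(rng.uniform(0.0, 100.0, 8))
window = 2.0  # years

def hit_rate(events, anchors, offset, width):
    """Fraction of anchors with >= 1 event in (anchor+offset, anchor+offset+width]."""
    hits = sum(
        np.any((events > t + offset) & (events <= t + offset + width))
        for t in anchors
    )
    return hits / len(anchors)

before = hit_rate(precursors, mainshocks, -window, window)
after = hit_rate(precursors, mainshocks, 0.0, window)
# A genuine precursor should score high "before" and low "after";
# similar rates point to chance coincidence.
print(f"seen before {before:.0%} of mainshocks, after {after:.0%}")
```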

  14. Testing earthquake source inversion methodologies

    USGS Publications Warehouse

    Page, M.; Mai, P.M.; Schorlemmer, D.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010. Nowadays, earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, the assumed fault geometry and velocity structure, and the chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  15. Testing an earthquake prediction algorithm

    USGS Publications Warehouse

    Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

    1997-01-01

    A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate-term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991, the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in only 2.87% of such trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward-predicted five of nine earthquakes in the test area, a success ratio that would have been achieved in 53% of random trials under the null hypothesis.
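
    The quoted 2.87% is the kind of number a simple Monte Carlo null test produces. The sketch below assumes the null hypothesis reduces to each earthquake independently landing inside random alarms that cover a fixed fraction of the space-time volume; that coverage fraction is an assumed input here (the real test uses M8's actual alarm coverage).

```python
import numpy as np

rng = np.random.default_rng(42)

n_quakes = 10          # strong earthquakes in the retroactive period
coverage = 0.45        # assumed fraction of space-time occupied by alarms
n_trials = 200_000

# Under the null, each quake is "predicted" with probability equal to the
# alarm coverage, independently of the others.
hits = rng.binomial(n_quakes, coverage, size=n_trials)
p_value = (hits >= 8).mean()
print(f"P(>= 8 of {n_quakes} hits by chance) ~ {p_value:.4f}")
# With coverage ~0.45 this comes out near 0.027, close to the quoted 2.87%;
# the coverage value here is a guess, chosen only to show the mechanics.
```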

  16. Review of variations in Mw < 7 earthquake motions on position and TEC (Mw = 6.5 Aegean Sea earthquake sample)

    NASA Astrophysics Data System (ADS)

    Yildirim, Omer; Inyurt, Samed; Mekik, Cetin

    2016-02-01

    Turkey is a country located in the middle latitude zone, where tectonic activity is intensive. Recently, an earthquake of magnitude Mw 6.5 occurred offshore in the Aegean Sea on 24 May 2014 at 09:25 UTC and lasted about 40 s. In addition to Turkey, the earthquake was also felt in Greece, Romania, and Bulgaria. In recent years, ionospheric anomaly detection studies related to seismicity have been carried out using total electron content (TEC) computed from global navigation satellite system (GNSS) signal delays, and several interesting findings have been published. In this study, both TEC and positional variations were examined separately following this moderate-size earthquake in the Aegean Sea. The correlation of the aforementioned ionospheric variation with the positional variation was also investigated. For this purpose, a total of 15 stations were used, including four continuously operating reference stations in Turkey (CORS-TR) located in the seismic zone (AYVL, CANA, IPSA, and YENC), as well as international GNSS service (IGS) and European reference frame permanent network (EPN) stations. The ionospheric and positional variations of the AYVL, CANA, IPSA, and YENC stations were examined using Bernese v5.0 software. When the precise point positioning TEC (PPP-TEC) values were examined, it was observed that, 3 days before the earthquake at 08:00 and 10:00 UTC, the TEC values at the four stations located in Turkey were approximately 4 TECU (total electron content units) above the upper-limit TEC value. At the same stations, on the day before the earthquake at 06:00, 08:00, and 10:00 UTC, the TEC values were approximately 5 TECU below the lower-limit TEC value. The global ionosphere model TEC (GIM-TEC) values published by the Centre for Orbit Determination in Europe (CODE) were also examined. Three days before the earthquake, at all stations, it was observed that the TEC values in the time period between 08:00 and 10:00 UTC were approximately 2 TECU
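
    The upper- and lower-limit TEC values referred to here are typically built from a running background statistic. One common recipe in the TEC-precursor literature, sketched below with assumed parameters, takes a sliding median over the preceding days and flags epochs outside median ± k times the interquartile range; the study's exact bounds may differ.

```python
import numpy as np

def tec_anomalies(tec, window=180, k=1.5):
    """Flag TEC epochs outside sliding median +/- k * interquartile range.

    tec    : 1-D array of TEC values (TECU), one per epoch (e.g. 2-hourly,
             so window=180 spans roughly 15 days).
    window : number of preceding epochs forming the background.
    Returns boolean arrays (above_upper, below_lower).
    """
    tec = np.asarray(tec, dtype=float)
    above = np.zeros(tec.size, dtype=bool)
    below = np.zeros(tec.size, dtype=bool)
    for i in range(window, tec.size):
        past = tec[i - window:i]
        med = np.median(past)
        q1, q3 = np.percentile(past, [25, 75])
        above[i] = tec[i] > med + k * (q3 - q1)   # above upper limit
        below[i] = tec[i] < med - k * (q3 - q1)   # below lower limit
    return above, below
```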

  17. What to Expect from the Virtual Seismologist: Delay Times and Uncertainties of Initial Earthquake Alerts in California

    NASA Astrophysics Data System (ADS)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Racine, R.; Meier, M.; Cauzzi, C.

    2013-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an ongoing effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland, western Greece, and Istanbul. As part of the ongoing EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS installations in southern Italy, Romania, and Iceland are planned or underway. The possible use cases for an EEW system are determined by the speed and reliability of its earthquake source parameter estimates, so a thorough understanding of both is essential to evaluate the usefulness of VS. For California, we present state-wide theoretical alert times for hypothetical earthquakes by analyzing the time delays introduced by the different components of the VS EEW system. Taking advantage of the fully probabilistic formulation of the VS algorithm, we further present an improved way to describe the uncertainty of each magnitude estimate by evaluating the width and shape of the probability density function that relates waveform envelope amplitudes to magnitude. We evaluate these new uncertainty values for past seismicity in California through off-line playbacks and compare them to the previously defined static uncertainties based on real-time detections. Our results indicate where VS alerts are most useful in California and also suggest where most effective improvements to the VS EEW system
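
    To make the "width and shape of the probability density function" concrete, the sketch below computes a single-station magnitude posterior by combining a Gaussian envelope-amplitude likelihood with a Gutenberg-Richter prior, in the spirit of a Bayesian EEW estimate. The attenuation coefficients, noise level, and prior b-value are placeholders, not the calibrated VS relationships.

```python
import numpy as np

def magnitude_posterior(log_amp, dist_km, a=1.0, b=-1.5, sigma=0.4, gr_b=1.0):
    """Posterior mean/std of magnitude given one envelope amplitude.

    Assumed amplitude model: log10(amp) = a*M + b*log10(dist) + N(0, sigma).
    The Gutenberg-Richter prior 10**(-gr_b * M) favours smaller magnitudes.
    """
    mags = np.linspace(2.0, 8.0, 601)
    dm = mags[1] - mags[0]
    pred = a * mags + b * np.log10(dist_km)
    likelihood = np.exp(-0.5 * ((log_amp - pred) / sigma) ** 2)
    post = likelihood * 10.0 ** (-gr_b * mags)
    post /= post.sum() * dm                     # normalize the pdf
    mean = (mags * post).sum() * dm
    std = np.sqrt(((mags - mean) ** 2 * post).sum() * dm)
    return mean, std   # the std is the alert-uncertainty measure

print(magnitude_posterior(log_amp=2.0, dist_km=30.0))
```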

  18. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction, and fault rupture) database was developed to validate, remove discrepancies from, and greatly expand upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be addressed. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data on over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage in 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan, and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, and the trend should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (gross domestic product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to allow comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  19. Performance Basis for Airborne Separation

    NASA Technical Reports Server (NTRS)

    Wing, David J.

    2008-01-01

    Emerging applications of Airborne Separation Assistance System (ASAS) technologies make possible new and powerful methods in Air Traffic Management (ATM) that may significantly improve the system-level performance of operations in the future ATM system. These applications typically involve the aircraft managing certain components of its Four-Dimensional (4D) trajectory within the degrees of freedom defined by a set of operational constraints negotiated with the Air Navigation Service Provider. It is hypothesized that reliable individual performance by many aircraft will translate into higher total system-level performance. To actually realize this improvement, the new capabilities must be brought into high-demand, high-complexity regions where high ATM performance is critical. Operational approval for use in such environments will require participating aircraft to be certified to rigorous and appropriate performance standards. Currently, no formal basis exists for defining these standards. This paper provides a context for defining the performance basis for 4D-ASAS operations. The trajectory constraints to be met by the aircraft are defined, categorized, and assessed for performance requirements. A proposed extension of the existing Required Navigation Performance (RNP) construct into a dynamic standard (Dynamic RNP) is outlined. Sample data are presented from an ongoing high-fidelity batch simulation series that is characterizing the performance of an advanced 4D-ASAS application. Data of this type will contribute to the evaluation and validation of the proposed performance basis.

  20. MyShake: A smartphone seismic network for earthquake early warning and beyond

    PubMed Central

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis; Kwon, Young-Woo

    2016-01-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics. PMID:26933682
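
    As a toy version of the on-phone detection step, the sketch below runs a classic short-term-average/long-term-average (STA/LTA) trigger over an acceleration trace. MyShake's actual classifier combines trained features beyond a plain STA/LTA, so treat the window lengths and threshold here as illustrative assumptions.

```python
import numpy as np

def sta_lta_trigger(accel, fs=25.0, sta_s=0.5, lta_s=10.0, threshold=3.0):
    """Return (triggered, max_ratio) for an acceleration trace.

    accel : 1-D acceleration samples; fs : sampling rate in Hz.
    A short-window average rising well above the long-window average is the
    classic signature used to separate transient shaking from the slowly
    varying background of everyday phone motion.
    """
    x = np.abs(np.asarray(accel, dtype=float))
    n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
    c = np.concatenate(([0.0], np.cumsum(x)))
    sta = (c[n_sta:] - c[:-n_sta]) / n_sta   # running short-term mean
    lta = (c[n_lta:] - c[:-n_lta]) / n_lta   # running long-term mean
    m = min(sta.size, lta.size)              # align both windows at the end
    ratio = sta[-m:] / np.maximum(lta[-m:], 1e-9)
    return bool(np.any(ratio > threshold)), float(ratio.max())
```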