Science.gov

Sample records for operating basis earthquake

  1. Study on Cumulative Absolute Velocity as Operating Basis Earthquake Criteria to Nuclear Power Plants in Korea

    NASA Astrophysics Data System (ADS)

    Park, D.; Yun, K.; Chang, C.; Cho, S.; Choi, W.

    2013-12-01

In recognition of the need to develop new criteria for determining when the operating basis earthquake (OBE) has been exceeded at nuclear power plants in Korea, cumulative absolute velocity (CAV) is introduced in this paper. The CAV OBE threshold was determined as the minimum CAV value for modified Mercalli intensity (MMI) VII, based on the relation between CAV and seismic intensity. MMI VII can be defined as the ground-motion level that could cause minor damage to a well-designed structure. Therefore, if the minimum CAV value is used as the threshold of the OBE exceedance criteria, no damage is expected to the more rugged NPP structures, which are reinforced against earthquakes. In deriving the CAV OBE exceedance criteria, it is necessary to generate a suite of simulated ground motions for a range of earthquake magnitudes and calibrated distances to the site. It is also necessary to use an instrumental MMI based on Fourier acceleration spectra (FAS MMI), because there are no strong ground-motion records or experienced-intensity data from damaging earthquakes in Korea. The empirical Green's function method and the stochastic ground-motion simulation method are used to simulate ground motion. Based on the relation between the CAV values for a specific NPP site and the corresponding instrumental intensities (FAS MMI), the CAV OBE value was calculated as 0.16 g·sec. However, since this result is based entirely on simulation, the CAV threshold retains a margin that must account for the characteristics of real strong ground-motion records. In future work, data on the limited earthquake damage reported in Korea will be used to validate the simulated CAV values.
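The exceedance check described above reduces to integrating the absolute acceleration over the record and comparing the result with the 0.16 g·sec threshold. A minimal sketch (the function name and the synthetic record are illustrative; standardized CAV variants used in practice add windowing and noise thresholds not shown here):

```python
import numpy as np

def cav(accel_g, dt):
    """Cumulative absolute velocity: integral of |a(t)| dt.

    accel_g : acceleration time series in units of g
    dt      : sample interval in seconds
    Returns CAV in g*sec.
    """
    return np.trapz(np.abs(accel_g), dx=dt)

# Synthetic 20 s record at 100 Hz (illustrative only)
t = np.arange(0.0, 20.0, 0.01)
rng = np.random.default_rng(0)
accel = 0.05 * rng.standard_normal(t.size) * np.exp(-t / 5.0)

cav_value = cav(accel, dt=0.01)
obe_exceeded = cav_value > 0.16   # compare with the paper's 0.16 g*sec threshold
```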

  2. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).
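The time-dependent probabilities that OEF disseminates are typically derived from a forecast rate; assuming events in the window follow a Poisson process with that rate, the conversion and the resulting probability gain look like this (the rates below are invented for illustration, not from the paper):

```python
import math

def window_probability(rate_per_day, days):
    """P(at least one event in the window), assuming a Poisson process."""
    return 1.0 - math.exp(-rate_per_day * days)

# Assumed long-term background: one M >= 5 event per 10 years in a region
p_background = window_probability(1.0 / 3650.0, 7)

# During an aftershock sequence the short-term rate may be ~100x higher
p_clustered = window_probability(100.0 / 3650.0, 7)

probability_gain = p_clustered / p_background
```

Note that even with a large gain, the clustered weekly probability remains well below one, which is the "low-probability environment" discussed elsewhere in this collection.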

  3. [Autism after an earthquake: the experience of L'Aquila (Central Italy) as a basis for an operative guideline].

    PubMed

    Valenti, Marco; Di Giovanni, Chiara; Mariano, Melania; Pino, Maria Chiara; Sconci, Vittorio; Mazza, Monica

    2016-01-01

People with autism, their families, and their specialised caregivers are a social group at high health risk after a disruptive earthquake. They need emergency assistance and immediate structured support according to definite protocols and quality standards. We recommend establishing national guidelines for the care of people with autism after an earthquake. The adaptive behaviour of participants with autism declined dramatically in the first months after the earthquake in all the dimensions examined (i.e., communication, daily living, socialisation, and motor skills). After relatively stable conditions returned, and with immediate and intensive post-disaster intervention, children and adolescents with autism showed a trend towards partial recovery of adaptive functioning. As to the impact on services, this study indicates the need to support exposed caregivers at high risk of burnout over the first two years after the disaster and to reorganise person-tailored services immediately. PMID:27291209

  4. Earthquake!

    ERIC Educational Resources Information Center

    Markle, Sandra

    1987-01-01

    A learning unit about earthquakes includes activities for primary grade students, including making inferences and defining operationally. Task cards are included for independent study on earthquake maps and earthquake measuring. (CB)

  5. The potential uses of operational earthquake forecasting

    USGS Publications Warehouse

    Field, Ned; Jordan, Thomas; Jones, Lucille; Michael, Andrew; Blanpied, Michael L.

    2016-01-01

    This article reports on a workshop held to explore the potential uses of operational earthquake forecasting (OEF). We discuss the current status of OEF in the United States and elsewhere, the types of products that could be generated, the various potential users and uses of OEF, and the need for carefully crafted communication protocols. Although operationalization challenges remain, there was clear consensus among the stakeholders at the workshop that OEF could be useful.

  6. Linking earthquakes and hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2013-01-01

    Hydraulic fracturing, also known as fracking, to extract oil and gas from rock, has been a controversial but increasingly common practice; some studies have linked it to groundwater contamination and induced earthquakes. Scientists discussed several studies on the connection between fracking and earthquakes at the AGU Fall Meeting in San Francisco in December.

  7. The Bender-Dunne basis operators as Hilbert space operators

    SciTech Connect

    Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph

    2014-02-15

The Bender-Dunne basis operators, T_{-m,n} = 2^{-n} Σ_{k=0}^{n} C(n,k) q^k p^{-m} q^{n-k}, where q and p are the position and momentum operators, respectively, are formal integral operators in position representation on the entire real line R for positive integers n and m. We show, by explicit construction of a dense domain, that the operators T_{-m,n} are densely defined operators in the Hilbert space L^2(R).
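As a purely formal check of the definition, the basis elements can be built with noncommuting symbols (this treats q and p as formal symbols with no canonical commutation relation imposed; sympy is assumed available):

```python
import sympy as sp

# Formal position and momentum operators (no commutator imposed)
q = sp.Symbol('q', commutative=False)
p = sp.Symbol('p', commutative=False)

def T(m, n):
    """Bender-Dunne element T_{-m,n} = 2^{-n} * sum_k C(n,k) q^k p^{-m} q^{n-k}."""
    return sp.Rational(1, 2**n) * sum(
        sp.binomial(n, k) * q**k * p**(-m) * q**(n - k) for k in range(n + 1)
    )

# n = m = 1 gives the symmetrised product (q p^{-1} + p^{-1} q) / 2
t11 = sp.expand(T(1, 1))
```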

  8. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    SciTech Connect

    Not Available

    1991-03-01

This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  9. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.

  10. Minimization of Basis Risk in Parametric Earthquake Cat Bonds

    NASA Astrophysics Data System (ADS)

    Franco, G.

    2009-12-01

A catastrophe (cat) bond is an instrument used by insurance and reinsurance companies, by governments, or by groups of nations to cede catastrophic risk to the financial markets, which are capable of supplying cover for highly destructive events that surpass the typical capacity of traditional reinsurance contracts. Parametric cat bonds, a specific type of cat bond, use trigger mechanisms or indices that depend on physical event parameters published by respected third parties to determine whether part or all of the bond principal is to be paid for a certain event. First-generation cat bonds, or cat-in-a-box bonds, use a trigger mechanism that consists of a set of geographic zones in which certain conditions need to be met by an earthquake's magnitude and depth in order to trigger payment of the bond principal. Second-generation cat bonds use an index formulation that typically consists of a sum of products of a set of weights by a polynomial function of the ground-motion variables reported by a geographically distributed seismic network. These instruments are especially appealing to developing countries with incipient insurance industries wishing to cede catastrophic losses to the financial markets, because the payment trigger mechanism is transparent and does not involve the parties ceding or accepting the risk, significantly reducing moral hazard. In order to be successful in the market, however, parametric cat bonds have typically been required to specify relatively simple trigger conditions. The consequence of such simplifications is an increase in basis risk. This risk represents the possibility that the trigger mechanism fails to accurately capture the actual losses of a catastrophic event: either it does not trigger for a highly destructive event or, vice versa, payment of the bond principal is caused by an event that produced insignificant losses. The first case disfavors the sponsor who was seeking cover for its losses while the
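A first-generation ("cat-in-a-box") trigger of the kind described can be sketched as a set of boxes with magnitude and depth conditions; all coordinates, thresholds, and payout fractions below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Box:
    """One trigger zone with the payout fraction it releases when hit."""
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float
    min_magnitude: float
    max_depth_km: float
    payout_fraction: float

def payout(boxes, lat, lon, magnitude, depth_km):
    """Cat-in-a-box trigger: the largest payout fraction among zones whose
    conditions the event parameters satisfy (0.0 if no zone triggers)."""
    hit = [
        b.payout_fraction
        for b in boxes
        if b.lat_min <= lat <= b.lat_max
        and b.lon_min <= lon <= b.lon_max
        and magnitude >= b.min_magnitude
        and depth_km <= b.max_depth_km
    ]
    return max(hit, default=0.0)

# Hypothetical single-zone bond: full principal for a shallow M >= 7.0 event
zones = [Box(18.0, 20.0, -104.0, -102.0, 7.0, 60.0, 1.0)]
```

The transparency noted in the abstract comes from the fact that every input to `payout` is a published event parameter, not a loss figure reported by either party.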

  11. Retrospective tests of hybrid operational earthquake forecasting models for Canterbury

    NASA Astrophysics Data System (ADS)

    Rhoades, D. A.; Liukis, M.; Christophersen, A.; Gerstenberger, M. C.

    2016-01-01

The Canterbury, New Zealand, earthquake sequence, which began in September 2010, occurred in a region of low crustal deformation and previously low seismicity. Because the ensuing seismicity in the region is likely to remain above previous levels for many years, a hybrid operational earthquake forecasting model for Canterbury was developed to inform decisions on building standards and urban planning for the rebuilding of Christchurch. The model estimates occurrence probabilities for magnitudes M ≥ 5.0 in the Canterbury region for each of the next 50 yr. It combines two short-term, two medium-term and four long-term forecasting models. The weight accorded to each individual model in the operational hybrid was determined by an expert elicitation process. A retrospective test of the operational hybrid model and of an earlier, informally developed hybrid model has been carried out over the whole New Zealand region. The individual and hybrid models were installed in the New Zealand Earthquake Forecast Testing Centre and used to make retrospective annual forecasts of earthquakes with magnitude M > 4.95 from 1986 on, for time-lags up to 25 yr. All models underpredict the number of earthquakes because the testing period since 2008 contains an abnormally large number of earthquakes compared to the learning period. However, the operational hybrid model is more informative than any of the individual time-varying models for nearly all time-lags. Its information gain relative to a reference model of least information decreases as the time-lag increases, becoming zero at a time-lag of about 20 yr. An optimal hybrid model with the same mathematical form as the operational hybrid model was computed for each time-lag from the 26-yr test period. The time-varying component of the optimal hybrid is dominated by the medium-term models for time-lags up to 12 yr and has hardly any impact on the optimal hybrid model for greater time-lags. The optimal hybrid model is considerably more
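The abstract does not give the hybrid's mathematical form; one common choice, shown here purely as an assumption, is a weighted linear mixture of the individual models' per-cell earthquake rates, with weights taken from the expert elicitation:

```python
import numpy as np

def hybrid_rate(model_rates, weights):
    """Additive hybrid: weighted average of per-cell forecast rates.

    model_rates : (n_models, n_cells) array of per-cell earthquake rates
    weights     : one weight per model, e.g. from expert elicitation
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalise so total rate is a true average
    return w @ np.asarray(model_rates, dtype=float)

# Two toy models over two spatial cells (rates are illustrative)
rates = [[1.0, 2.0],
         [3.0, 4.0]]
equal = hybrid_rate(rates, [1.0, 1.0])
```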

  12. Operational earthquake forecasting in the South Iceland Seismic Zone: improving the earthquake catalogue

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Vogfjörd, Kristin; Zechar, J. Douglas; Eberhard, David

    2014-05-01

A major earthquake sequence is ongoing in the South Iceland Seismic Zone (SISZ), where experts expect earthquakes of up to MW = 7.1 in the coming years to decades. The historical seismicity in this region is well known and many major faults here and on Reykjanes Peninsula (RP) have already been mapped. The faults are predominantly N-S with right-lateral strike-slip motion, while the overall motion in the SISZ is E-W oriented left-lateral motion. The area that we propose for operational earthquake forecasting (OEF) contains both the SISZ and the RP. The earthquake catalogue considered for OEF, called the SIL catalogue, spans the period from 1991 until September 2013 and contains more than 200,000 earthquakes. Some of these events have a large azimuthal gap between stations, and some have large horizontal and vertical uncertainties. We are interested in building seismicity models using high-quality data, so we filter the catalogue using the criteria proposed by Gomberg et al. (1990) and Bondar et al. (2004). The resulting filtered catalogue contains around 130,000 earthquakes. Magnitude estimates in the Iceland catalogue also require special attention. The SIL system uses two methods to estimate magnitude. The first method is based on an empirical local magnitude (ML) relationship. The other magnitude scale is a so-called "local moment magnitude" (MLW), originally constructed by Slunga et al. (1984) to agree with local magnitude scales in Sweden. In the SIL catalogue, there are two main problems with the magnitude estimates, and consequently it is not immediately possible to convert MLW to moment magnitude (MW). These problems are: (i) immediate aftershocks of large events are assigned magnitudes that are too high; and (ii) the seismic moment of large earthquakes is underestimated. For this reason the magnitude values in the catalogue must be corrected before developing an OEF system. To obtain a reliable MW estimate, we calibrate a magnitude relationship based on
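The quality filtering described can be sketched as threshold cuts on a structured catalogue; the field names and cut-off values here are illustrative stand-ins, not the actual Gomberg et al. (1990) or Bondar et al. (2004) criteria:

```python
import numpy as np

dtype = [("mag", float), ("az_gap_deg", float), ("h_err_km", float), ("v_err_km", float)]

def filter_catalogue(events, max_gap=180.0, max_h_err=2.0, max_v_err=5.0):
    """Keep only events with a small station azimuthal gap and small
    horizontal/vertical location uncertainties (thresholds assumed)."""
    keep = (
        (events["az_gap_deg"] <= max_gap)
        & (events["h_err_km"] <= max_h_err)
        & (events["v_err_km"] <= max_v_err)
    )
    return events[keep]

catalogue = np.array(
    [(1.2, 90.0, 0.5, 1.0),    # well constrained: kept
     (2.3, 250.0, 0.4, 0.9),   # large azimuthal gap: dropped
     (3.1, 120.0, 6.0, 2.0)],  # large horizontal error: dropped
    dtype=dtype,
)
good = filter_catalogue(catalogue)
```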

  13. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  14. The Establishment of an Operational Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Lombardi, Anna Maria; Casarotti, Emanuele

    2014-05-01

Just after the Mw 6.2 earthquake that hit L'Aquila on April 6, 2009, the Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF) that paved the way for the development of Operational Earthquake Forecasting (OEF), defined as the "procedures for gathering and disseminating authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes". In this paper we introduce the first official OEF system in Italy, developed by the newly established Centro di Pericolosità Sismica at the Istituto Nazionale di Geofisica e Vulcanologia. The system provides a daily update of the weekly probabilities of ground shaking over the whole Italian territory. In this presentation, we describe in detail the philosophy behind the system, the scientific details, and the output format that has been preliminarily defined in agreement with Civil Protection. To our knowledge, this is the first operational system that fully satisfies the ICEF guidelines. Probably the most sensitive issue is the communication of this kind of message to the population. Acknowledging this inherent difficulty, in agreement with Civil Protection we are planning pilot tests to be carried out in a few selected areas in Italy; the purpose of these tests is to check the effectiveness of the message and to receive feedback.

  15. FB Line Basis for Interim Operation

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The safety analysis of the FB-Line Facility indicates that the operation of FB-Line to support the current mission does not present undue risk to the facility and co-located workers, general public, or the environment.

  16. Design basis for the NRC Operations Center

    SciTech Connect

    Lindell, M.K.; Wise, J.A.; Griffin, B.N.; Desrosiers, A.E.; Meitzler, W.D.

    1983-05-01

    This report documents the development of a design for a new NRC Operations Center (NRCOC). The project was conducted in two phases: organizational analysis and facility design. In order to control the amount of traffic, congestion and noise within the facility, it is recommended that information flow in the new NRCOC be accomplished by means of an electronic Status Information Management System. Functional requirements and a conceptual design for this system are described. An idealized architectural design and a detailed design program are presented that provide the appropriate amount of space for operations, equipment and circulation within team areas. The overall layout provides controlled access to the facility and, through the use of a zoning concept, provides each team within the NRCOC the appropriate balance of ready access and privacy determined from the organizational analyses conducted during the initial phase of the project.

  17. Geological and seismological survey for new design-basis earthquake ground motion of Kashiwazaki-Kariwa NPS

    NASA Astrophysics Data System (ADS)

    Takao, M.; Mizutani, H.

    2009-05-01

At about 10:13 on July 16, 2007, a strong earthquake named the 'Niigata-ken Chuetsu-oki Earthquake', of Mj 6.8 on the Japan Meteorological Agency's scale, occurred offshore of Niigata prefecture in Japan. Nevertheless, all of the nuclear reactors at Kashiwazaki-Kariwa Nuclear Power Station (KKNPS) in Niigata prefecture, operated by Tokyo Electric Power Company, shut down safely. In other words, the automatic safety functions of shutdown, cooling and containment worked as designed immediately after the earthquake. During the earthquake, the peak acceleration of the ground motion exceeded the design-basis ground motion (DBGM), but the force the earthquake applied to safety-significant facilities was about the same as or less than the design basis taken into account as static seismic force. In order to reassess the safety of the nuclear power plants, we have evaluated a new DBGM after conducting geomorphological, geological, geophysical, and seismological surveys and analyses. [Geomorphological, geological and geophysical survey] In the land area, aerial photograph interpretation was performed within at least a 30 km radius, as a geomorphological survey, to extract landforms that could possibly be tectonic reliefs. After that, geological reconnaissance was conducted to confirm whether the extracted landforms are tectonic reliefs or not. In particular, we carefully investigated the Nagaoka Plain Western Boundary Fault Zone (NPWBFZ), which consists of the Kakuda-Yahiko, Kihinomiya and Katakai faults, because the NPWBFZ is one of the active faults in Japan with the potential for Mj 8-class events. In addition to the geological survey, seismic reflection prospecting of approximately 120 km in total length was completed to evaluate the geological structure of the faults and to assess the continuity of the component faults of the NPWBFZ.
As a result of the geomorphological, geological and geophysical surveys, we evaluated that the three component faults of the NPWBFZ are independent of each other from the

  18. Solid waste retrieval. Phase 1, Operational basis

    SciTech Connect

    Johnson, D.M.

    1994-09-30

This document describes the operational requirements, procedures, and options for execution of the retrieval of the waste containers placed in buried storage in Burial Ground 218W-4C, Trench 04, as TRU waste or suspect TRU waste under the activity levels defining this waste that were in effect at the time of placement. Trench 04 in Burial Ground 218W-4C is dedicated entirely to storage of retrievable TRU waste containers or retrievable suspect TRU waste containers and has not been used for any other purpose.

  19. Lessons of L'Aquila for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. 
Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  20. Operational real-time GPS-enhanced earthquake early warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.; Johanson, I. A.; Allen, R. M.

    2014-10-01

    Moment magnitudes for large earthquakes (Mw≥7.0) derived in real time from near-field seismic data can be underestimated due to instrument limitations, ground tilting, and saturation of frequency/amplitude-magnitude relationships. Real-time high-rate GPS resolves the buildup of static surface displacements with the S wave arrival (assuming nonsupershear rupture), thus enabling the estimation of slip on a finite fault and the event's geodetic moment. Recently, a range of high-rate GPS strategies have been demonstrated on off-line data. Here we present the first operational system for real-time GPS-enhanced earthquake early warning as implemented at the Berkeley Seismological Laboratory (BSL) and currently analyzing real-time data for Northern California. The BSL generates real-time position estimates operationally using data from 62 GPS stations in Northern California. A fully triangulated network defines 170+ station pairs processed with the software trackRT. The BSL uses G-larmS, the Geodetic Alarm System, to analyze these positioning time series and determine static offsets and preevent quality parameters. G-larmS derives and broadcasts finite fault and magnitude information through least-squares inversion of the static offsets for slip based on a priori fault orientation and location information. This system tightly integrates seismic alarm systems (CISN-ShakeAlert, ElarmS-2) as it uses their P wave detections to trigger its processing; quality control runs continuously. We use a synthetic Hayward Fault earthquake scenario on real-time streams to demonstrate recovery of slip and magnitude. Reanalysis of the Mw7.2 El Mayor-Cucapah earthquake tests the impact of dynamic motions on offset estimation. Using these test cases, we explore sensitivities to disturbances of a priori constraints (origin time, location, and fault strike/dip).
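The core of the G-larmS step described, a least-squares inversion of static offsets for slip on an a priori fault, can be sketched synthetically (the Green's function matrix, noise level, rigidity, and patch geometry are all invented here; a real system would compute G from an elastic dislocation model):

```python
import numpy as np

# Forward model d = G s: d stacks static offsets at GPS stations, G maps
# slip on fault patches to surface offsets, s is slip per patch (synthetic)
rng = np.random.default_rng(1)
n_obs, n_patches = 12, 4
G = rng.standard_normal((n_obs, n_patches))          # stand-in Green's functions
true_slip = np.array([0.5, 1.2, 0.8, 0.1])           # metres (assumed)
d = G @ true_slip + 0.01 * rng.standard_normal(n_obs)  # offsets + observation noise

# Least-squares slip estimate from the noisy offsets
slip, *_ = np.linalg.lstsq(G, d, rcond=None)

# Geodetic moment M0 = mu * patch_area * total slip, then moment magnitude
mu, patch_area = 30e9, 10e3 * 10e3                   # Pa, m^2 (assumed)
M0 = mu * patch_area * slip.sum()
Mw = (2.0 / 3.0) * (np.log10(M0) - 9.05)             # Hanks-Kanamori, M0 in N*m
```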

  1. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a ...

  3. Is there a basis for preferring characteristic earthquakes over a Gutenberg-Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2009-01-01

The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice, however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg-Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg-Richter model satisfies the key data constraints used for earthquake forecasting as well as a characteristic model does. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
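Extrapolating a Gutenberg-Richter fit to a fault zone yields large-earthquake rates directly; a minimal sketch with assumed a- and b-values (not taken from the paper):

```python
def gr_rate(a, b, m):
    """Annual rate of events with magnitude >= m from log10 N(>=m) = a - b*m."""
    return 10.0 ** (a - b * m)

# Illustrative fault-zone catalogue fit (values assumed)
a, b = 4.0, 1.0
rate_m7 = gr_rate(a, b, 7.0)      # events/yr with M >= 7
recurrence_yr = 1.0 / rate_m7     # mean recurrence interval in years
```

With these assumed values the extrapolation implies one M ≥ 7 event per thousand years; the point of the paper is that such catalog-based rates carry quantifiable uncertainty, unlike segment-by-segment characteristic models.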

  5. Scientific and non-scientific challenges for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2015-12-01

Tracking the time evolution of seismic hazard in time windows shorter than the usual 50 years of long-term hazard models may offer additional opportunities to reduce seismic risk. This is the target of operational earthquake forecasting (OEF). During the development of OEF in Italy we identified several challenges, ranging from pure science to the more practical interface of science with society. From a scientific point of view, although earthquake clustering is the clearest empirical evidence about earthquake occurrence, and OEF clustering models are the most (successfully) tested hazard models in seismology, we note that some seismologists are still reluctant to accept their scientific reliability. After exploring the motivations behind these scientific doubts, we also look into an issue that is often overlooked in this discussion: in any kind of hazard analysis, we do not use a model because it is the true one, but because it is better than anything else we can think of. The non-scientific aspects are mostly related to the fact that OEF usually provides weekly probabilities of large earthquakes smaller than 1%. These probabilities are considered by some seismologists too small to be of interest or use. However, in a recent collaboration with engineers we show that such earthquake probabilities may lead to an intolerable individual risk of death. Interestingly, this debate calls for a better definition of the still-fuzzy boundaries among the different kinds of expertise required for the whole risk-mitigation process. The last and probably most pressing challenge is communication to the public. In fact, a wrong message could be useless or even counterproductive. Here we show some progress that we have made in this field working with communication experts in Italy.

  6. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  7. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  8. Circuit breaker operation and potential failure modes during an earthquake

    SciTech Connect

    Lambert, H.E.; Budnitz, R.J.

    1987-01-01

    This study addresses the effect of a strong-motion earthquake on circuit breaker operation. It focuses on the loss of offsite power (LOSP) transient caused by a strong-motion earthquake at the Zion Nuclear Power Plant. Numerous circuit breakers important to plant safety, such as circuit breakers to diesel generators and engineered safety systems (ESS), must open and/or close during this transient while strong motion is occurring. Potential seismically induced circuit-breaker failure modes were uncovered while the study was conducted. These failure modes include: circuit breaker fails to close; circuit breaker trips inadvertently; circuit breaker fails to reclose after a trip. The causes of these failure modes include: relay chatter that trips the circuit breaker; relay chatter that seals in anti-pumping relays, preventing automatic closure of circuit breakers; and load-sequencer failures. This paper also describes the operator action necessary to prevent core melt if these circuit breaker failure modes occur simultaneously on three 4.16 kV buses, and discusses the incorporation of these failure modes, as well as other instrumentation and control failures, into a limited-scope seismic probabilistic risk assessment.

  9. Ground motion following selection of SRS design basis earthquake and associated deterministic approach. Final report: Revision 1

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  10. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  11. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or, equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture-length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning-parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type problems can be constrained from scaling relations and finite extents. [Figure: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate local acceleration (elevated hazard).]

  12. 1/f and the Earthquake Problem: Scaling constraints to facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Rundle, J. B.; Glasscoe, M. T.

    2013-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or '1/f', nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this '1/f problem,' it can be argued that catalog selection (or, equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture-length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area), in combination with a metric that quantifies rate trends in local seismicity, to the local earthquake magnitude potential - the magnitudes of earthquakes the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning-parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type problems can be constrained from scaling relations and finite extents.
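    The Omori scaling these two abstracts build on is usually written as the modified Omori law, n(t) = K / (c + t)^p, whose closed-form integral gives expected aftershock counts over a time window. A small sketch with illustrative parameter values (not the authors'):

```python
import math

def omori_rate(t, K, c, p):
    """Modified Omori law: aftershock rate n(t) = K / (c + t)**p,
    with t in days since the mainshock."""
    return K / (c + t) ** p

def expected_aftershocks(t1, t2, K, c, p):
    """Expected number of aftershocks in [t1, t2]: the closed-form
    integral of the modified Omori law."""
    if p == 1.0:
        return K * (math.log(c + t2) - math.log(c + t1))
    return K * ((c + t2) ** (1.0 - p) - (c + t1) ** (1.0 - p)) / (1.0 - p)

# Illustrative parameters; in ETAS-type models K scales with mainshock magnitude
K, c, p = 100.0, 0.05, 1.1
rate_day1 = omori_rate(1.0, K, c, p)
n_month = expected_aftershocks(1.0, 30.0, K, c, p)
print(f"rate one day after mainshock: {rate_day1:.1f} events/day")
print(f"expected events, day 1-30:    {n_month:.0f}")
```

    Estimating K, c, and p as functions of the parent magnitude, rather than fitting them per sequence, is the kind of constraint the abstracts describe.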

  13. Post Test Analysis of a PCCV Model Dynamically Tested Under Simulated Design-Basis Earthquakes

    SciTech Connect

    Cherry, J.; Chokshi, N.; James, R.J.; Rashid, Y.R.; Tsurumaki, S.; Zhang, L.

    1998-11-09

    In a collaborative program between the United States Nuclear Regulatory Commission (USNRC) and the Nuclear Power Engineering Corporation (NUPEC) of Japan, under sponsorship of the Ministry of International Trade and Industry, the seismic behavior of Prestressed Concrete Containment Vessels (PCCV) is being investigated. A 1:10 scale PCCV model has been constructed by NUPEC and subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory. A primary objective of the testing program is to demonstrate the capability of the PCCV to withstand design basis earthquakes with a significant safety margin against major damage or failure. As part of the collaborative program, Sandia National Laboratories (SNL) is conducting research in state-of-the-art analytical methods for predicting the seismic behavior of PCCV structures, with the eventual goal of understanding, validating, and improving calculations related to containment structure performance under design and severe seismic events. With the increased emphasis on risk-informed regulatory focus, more accurate characterization (less uncertainty) of containment structural and functional integrity is desirable. This paper presents results of post-test calculations conducted at ANATECH to simulate the design-level scale model tests.

  14. Earthquake Response Modeling for a Parked and Operating Megawatt-Scale Wind Turbine

    SciTech Connect

    Prowell, I.; Elgamal, A.; Romanowitz, H.; Duggan, J. E.; Jonkman, J.

    2010-10-01

    Demand parameters for turbines, such as tower moment demand, are primarily driven by wind excitation and dynamics associated with operation. For that purpose, computational simulation platforms have been developed, such as FAST, maintained by the National Renewable Energy Laboratory (NREL). For seismically active regions, building codes also require the consideration of earthquake loading. Historically, it has been common to use simple building code approaches to estimate the structural demand from earthquake shaking as an independent loading scenario. Currently, International Electrotechnical Commission (IEC) design requirements include the consideration of earthquake shaking while the turbine is operating. Numerical and analytical tools used to consider earthquake loads for buildings and other static civil structures are not well suited for modeling simultaneous wind and earthquake excitation in conjunction with operational dynamics. Through the addition of seismic loading capabilities to FAST, it is possible to simulate earthquake shaking in the time domain, which allows consideration of non-linear effects such as structural nonlinearities, aerodynamic hysteresis, control system influence, and transients. This paper presents a FAST model of a modern 900-kW wind turbine, which is calibrated based on field vibration measurements. With this calibrated model, both coupled and uncoupled simulations are conducted looking at the structural demand for the turbine tower. Response is compared under the conditions of normal operation and potential emergency shutdown due to earthquake-induced vibrations. The results highlight the availability of a numerical tool for conducting such studies, and provide insights into the combined wind-earthquake loading mechanism.
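    The time-domain seismic simulation the abstract describes is implemented in FAST; a much-reduced sketch of the underlying idea is a single-degree-of-freedom tower model under base acceleration, integrated with the central-difference method. The tower frequency, damping ratio, and shaking amplitude below are hypothetical, not the 900-kW turbine's:

```python
import math

def sdof_response(accel, dt, freq_hz, zeta=0.05):
    """Relative displacement of a damped single-degree-of-freedom oscillator
    under base acceleration (m/s^2), via the central-difference method.
    Equation of motion: u'' + 2*zeta*wn*u' + wn^2*u = -a_ground(t)."""
    wn = 2.0 * math.pi * freq_hz
    c, k = 2.0 * zeta * wn, wn ** 2
    a0 = 1.0 / dt ** 2 + c / (2.0 * dt)
    a1 = k - 2.0 / dt ** 2
    a2 = 1.0 / dt ** 2 - c / (2.0 * dt)
    u_prev, u, out = 0.0, 0.0, []
    for a_g in accel:
        u_next = (-a_g - a1 * u - a2 * u_prev) / a0
        out.append(u_next)
        u_prev, u = u, u_next
    return out

# 0.1 g sinusoidal base shaking at 0.5 Hz, near a hypothetical 0.6 Hz first
# bending frequency of the tower, for 20 s at 100 Hz sampling
dt = 0.01
accel = [0.1 * 9.81 * math.sin(2.0 * math.pi * 0.5 * i * dt) for i in range(2000)]
disp = sdof_response(accel, dt, freq_hz=0.6)
peak = max(abs(u) for u in disp)
print(f"peak relative tower-top displacement: {peak:.3f} m")
```

    FAST adds the aerodynamic, control-system, and operational dynamics that such a static-structure model cannot capture, which is the abstract's point.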

  15. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  16. M6.0 South Napa Earthquake Forecasting on the basis of jet stream precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.

    2014-12-01

    Current earthquake-prediction research draws on crustal deformation, radon concentration, well water levels, animal behavior, very high frequency (VHF) signals, GPS/TEC ionospheric variations, and thermal infrared radiation (TIR) anomalies. Before major earthquakes (M > 6), the jet stream above the epicentral area is interrupted or its velocity flow lines cross; that is, the atmospheric pressure at high altitude drops suddenly over 6-12 hours before the earthquake occurs (Wu & Tikhonov, 2014). This technique has been used to predict strong earthquakes in real time, with predictions pre-registered on a website: for example, the M6.0 Northern California earthquake of 2014/08/24 (figure 1) and the M6.6 Russia earthquake of 2013/10/12 (figure 2). For the 2014/08/24 California earthquake, the front end of the 60-knot speed line was over San Francisco at 12:00 on 2014/06/16, and the earthquake occurred 69 days later. On 2014/07/16 we predicted a magnitude larger than 5.5, although with a window of only 30 days. The predicted epicenter deviated from the actual one by about 70 km. A lithosphere-atmosphere-ionosphere (LAI) coupling model may explain this phenomenon: ionization of the air is produced by increased radon emanation at the epicenter; water molecules in the air react with these ions and release heat, raising the air temperature; this is also accompanied by a large-scale change in atmospheric pressure and in jet-stream morphology. We obtain satisfactory accuracy in estimating the epicenter location, and we can define a short alarm period; these are the positive aspects of our forecasts. However, the magnitude estimates derived from the jet stream still carry large uncertainty. Reference: H.C. Wu, I.N. Tikhonov, 2014, "Jet streams anomalies as possible short-term precursors of earthquakes with M>6.0", Research in Geophysics, DOI: http://dx.doi.org/10.4081/rg.2014.4939 http://www.pagepress.org/journals/index.php/rg/article/view/rg.2014.4939

  17. Earthquake Early Warning using a Seismogeodetic Approach: An operational plan for Cascadia

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Bodin, P.; Vidale, J. E.; Schmidt, D. A.; Melbourne, T. I.; Scrivner, C. W.; Santillan, V. M.; Szeliga, W. M.; Minson, S. E.; Bock, Y.; Melgar, D.

    2013-12-01

    We present an operational plan for implementing combined seismic and geodetic time series in an earthquake early warning system for Cascadia. The Cascadia subduction zone presents one of the greatest risks for a megaquake in the continental United States. Ascertaining the full magnitude and extent of large earthquakes is problematic for earthquake early warning systems due to instability when double integrating strong-motion records to ground displacement. This problem can be mitigated by augmenting earthquake early warning systems with real-time GPS data, allowing the progression and spatial extent of large earthquakes to be better resolved thanks to GPS's ability to measure both dynamic and permanent displacements. The Pacific Northwest Seismic Network (PNSN) at the University of Washington is implementing an integrated seismogeodetic approach to earthquake early warning. Regional GPS data are provided by the Pacific Northwest Geodetic Array (PANGA) at Central Washington University. Precise Point Positioning (PPP) solutions are sent from PANGA to the PNSN through JSON-formatted streams and processed with a Python-based quality control (QC) module. The QC module also ingests accelerations from PNSN seismic stations through the Earthworm seismic acquisition and processing system for the purpose of detecting outliers and Kalman filtering where collocated instruments exist. The QC module outputs time-aligned and cleaned displacement waveforms to ActiveMQ, an XML-based messaging broker that is currently used in seismic early warning architecture. Earthquake characterization modules read displacement information from ActiveMQ when triggered by warnings from the ElarmS earthquake early warning algorithm. Peak ground displacement and P-wave scaling relationships from Kalman-filtered waveforms provide initial magnitude estimates. Additional modules perform more complex source modeling such as centroid moment tensors and slip inversions that characterize the full size and
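    The instability from double-integrating strong-motion records, which motivates the GPS augmentation, is easy to demonstrate: a tiny constant baseline offset in the recorded acceleration grows quadratically in the derived displacement. A sketch using trapezoidal integration (the offset value is hypothetical):

```python
def double_integrate(accel, dt):
    """Trapezoidal double integration of acceleration (m/s^2) to
    displacement (m)."""
    vel = [0.0]
    for i in range(1, len(accel)):
        vel.append(vel[-1] + 0.5 * (accel[i - 1] + accel[i]) * dt)
    disp = [0.0]
    for i in range(1, len(vel)):
        disp.append(disp[-1] + 0.5 * (vel[i - 1] + vel[i]) * dt)
    return disp

dt, n = 0.01, 6000        # a 60 s record sampled at 100 Hz
bias = 0.001              # 1 mm/s^2 constant baseline offset -- tiny
quiet_ground = [0.0] * n  # true ground motion: none
recorded = [a + bias for a in quiet_ground]
drift = double_integrate(recorded, dt)[-1]
print(f"apparent displacement after 60 s: {drift:.2f} m")
```

    The apparent displacement grows as 0.5 * bias * t^2, here roughly 1.8 m of spurious offset from a 1 mm/s^2 bias over one minute; GPS sidesteps this entirely because it observes displacement directly.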

  18. PBO Southwest Region: Baja Earthquake Response and Network Operations

    NASA Astrophysics Data System (ADS)

    Walls, C. P.; Basset, A.; Mann, D.; Lawrence, S.; Jarvis, C.; Feaux, K.; Jackson, M. E.

    2011-12-01

    The SW region of the Plate Boundary Observatory consists of 455 continuously operating GPS stations located principally along the transform system of the San Andreas fault and Eastern California Shear Zone. In the past year network uptime exceeded an average of 97% with greater than 99% data acquisition. Communications range from CDMA modem (307), radio (92), VSAT (30), DSL/T1/other (25) to manual downloads (1). Sixty-three stations stream 1 Hz data over the VRS3Net typically with <0.5 second latency. Over 620 maintenance activities were performed during 316 onsite visits out of approximately 368 engineer field days. Within the past year there have been 7 incidents ranging from minor (attempted theft) to moderate vandalism (solar panel stolen), with one total loss of receiver and communications gear. Security was enhanced at these sites through fencing and more secure station configurations. In the past 12 months, 4 new stations were installed to replace removed stations or to augment the network at strategic locations. Following the M7.2 El Mayor-Cucapah earthquake, CGPS station P796, a deep-drilled braced monument, was constructed in San Luis, AZ along the border within 5 weeks of the event. In addition, UNAVCO participated in a successful University of Arizona-led RAPID proposal for the installation of six continuous GPS stations for post-seismic observations. Six stations are installed and telemetered through a UNAM relay at the Sierra San Pedro Martir. Four of these stations have Vaisala WXT520 meteorological sensors. An additional site in the Sierra Cucapah (PTAX) that was built by CICESE, an Associate UNAVCO Member institution in Mexico, and Caltech has been integrated into PBO dataflow. The stations will be maintained as part of the PBO network in coordination with CICESE. UNAVCO is working with NOAA to upgrade PBO stations with WXT520 meteorological sensors and communications systems capable of streaming real-time GPS and

  19. Operational earthquake forecasting in California: A prototype system combining UCERF3 and CyberShake

    NASA Astrophysics Data System (ADS)

    Milner, K. R.; Jordan, T. H.; Field, E. H.

    2014-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about time-dependent earthquake probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To attain this goal, OEF must provide a complete description of the seismic hazard—ground motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic hazard analysis. We have combined the Third Uniform California Earthquake Rupture Forecast (UCERF3) of the Working Group on California Earthquake Probabilities (Field et al., 2014) with the CyberShake ground-motion model of the Southern California Earthquake Center (Graves et al., 2011; Callaghan et al., this meeting) into a prototype OEF system for generating time-dependent hazard maps. UCERF3 represents future earthquake activity in terms of fault-rupture probabilities, incorporating both Reid-type renewal models and Omori-type clustering models. The current CyberShake model comprises approximately 415,000 earthquake rupture variations to represent the conditional probability of future shaking at 285 geographic sites in the Los Angeles region (~236 million horizontal-component seismograms). This combination provides significant probability gains relative to OEF models based on empirical ground-motion prediction equations (GMPEs), primarily because the physics-based CyberShake simulations account for the rupture directivity, basin effects, and directivity-basin coupling that are not represented by the GMPEs.
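    Combining rupture probabilities with conditional shaking probabilities into a site-level hazard, as the UCERF3-CyberShake prototype does at scale, reduces in the simplest independent-rupture approximation to P = 1 - prod_k(1 - p_k * q_k). A sketch with entirely hypothetical numbers (not UCERF3 or CyberShake values):

```python
def site_exceedance_prob(ruptures):
    """Probability that shaking exceeds a threshold at one site, combining
    rupture probabilities p_k with conditional exceedance probabilities q_k
    from a ground-motion model, assuming independent ruptures."""
    p_no_exceed = 1.0
    for p_rup, q_exceed in ruptures:
        p_no_exceed *= 1.0 - p_rup * q_exceed
    return 1.0 - p_no_exceed

# Hypothetical ruptures: (30-day rupture probability, P(SA > 0.3 g | rupture))
ruptures = [(0.001, 0.8), (0.005, 0.3), (0.02, 0.05)]
p_site = site_exceedance_prob(ruptures)
print(f"30-day exceedance probability at the site: {p_site:.4f}")
```

    In the prototype, the q_k terms come from the ~236 million simulated seismograms rather than from a GMPE, which is where the probability gains from directivity and basin effects enter.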

  20. Circuit breaker operation and potential failure modes during an earthquake: a preliminary investigation

    SciTech Connect

    Lambert, H.E.

    1984-04-09

    This study addresses the effect of a strong-motion earthquake on circuit breaker operation. It focuses on the loss of offsite power (LOSP) transient caused by a strong-motion earthquake at the Zion Nuclear Power Plant. This report also describes the operator action necessary to prevent core melt if the identified circuit breaker failure modes occur simultaneously on three 4.16 kV buses. Numerous circuit breakers important to plant safety, such as circuit breakers to diesel generators and engineered safety systems (ESS), must open and/or close during this transient while strong motion is occurring. Nearly 500 electrical drawings were examined to address the effects of earthquakes on circuit breaker operation. Due to the complexity of the problem, this study is not intended to be definitive but serves as a focusing tool for further work. 5 references, 9 figures, 3 tables.

  1. Earthquake.

    PubMed

    Cowen, A R; Denney, J P

    1994-04-01

    On January 25, 1 week after the most devastating earthquake in Los Angeles history, the Southern California Hospital Council released the following status report: 928 patients evacuated from damaged hospitals. 805 beds available (136 critical, 669 noncritical). 7,757 patients treated/released from EDs. 1,496 patients treated/admitted to hospitals. 61 dead. 9,309 casualties. Where do we go from here? We are still waiting for the "big one." We'll do our best to be ready when Mother Nature shakes, rattles and rolls. The efforts of Los Angeles City Fire Chief Donald O. Manning cannot be overstated. He maintained department command of this major disaster and is directly responsible for implementing the fire department's Disaster Preparedness Division in 1987. Through the chief's leadership and ability to forecast consequences, the city of Los Angeles was better prepared than ever to cope with this horrendous earthquake. We also pay tribute to the men and women who are out there each day, where "the rubber meets the road." PMID:10133439

  2. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases, and the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and databases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds a very large number of combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because no credit is given for operator recovery. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology.
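    The finding that at least one min-cut-set combination occurs with near-unity likelihood can be reproduced qualitatively with the standard min-cut upper bound, 1 - prod(1 - P(cut set)), under an independence assumption. The relay count and chatter probability below are hypothetical, not the SSMRP fragility values:

```python
def cut_set_prob(cut_set, p_chatter):
    """Probability that every relay in one minimal cut set chatters."""
    prob = 1.0
    for relay in cut_set:
        prob *= p_chatter[relay]
    return prob

def system_failure_prob(cut_sets, p_chatter):
    """Min-cut upper bound: P(at least one cut set occurs), treating the
    cut sets as independent (an approximation when they share relays)."""
    p_none = 1.0
    for cs in cut_sets:
        p_none *= 1.0 - cut_set_prob(cs, p_chatter)
    return 1.0 - p_none

# Hypothetical: 20 relays, each chattering with probability 0.3 given the
# earthquake, and 500 three-relay minimal cut sets
p = {f"R{i}": 0.3 for i in range(20)}
cut_sets = [(f"R{i}", f"R{j}", f"R{k}")
            for i in range(20)
            for j in range(i + 1, 20)
            for k in range(j + 1, 20)][:500]
p_fail = system_failure_prob(cut_sets, p)
print(f"P(at least one cut set occurs) = {p_fail:.6f}")
```

    Even though each 3-relay cut set has probability of only 0.027 here, five hundred of them push the combined probability to essentially 1, echoing the "of the order of unity" conclusion.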

  3. Large Earthquakes at the Ibero-Maghrebian Region: Basis for an EEWS

    NASA Astrophysics Data System (ADS)

    Buforn, Elisa; Udías, Agustín; Pro, Carmen

    2015-09-01

    Large earthquakes (Mw > 6, Imax > VIII) occur at the Ibero-Maghrebian region, extending from a point (12ºW) southwest of Cape St. Vincent to Tunisia, with different characteristics depending on their location, which cause considerable damage and casualties. Seismic activity at this region is associated with the boundary between the lithospheric plates of Eurasia and Africa, which extends from the Azores Islands to Tunisia. The boundary at Cape St. Vincent, which has a clear oceanic nature in the westernmost part, experiences a transition from an oceanic to a continental boundary, with the interaction of the southern border of the Iberian Peninsula, the northern border of Africa, and the Alboran basin between them, corresponding to a wide area of deformation. Further to the east, the plate boundary recovers its oceanic nature following the northern coast of Algeria and Tunisia. The region has been divided into four zones with different seismic characteristics. From west to east, large earthquake occurrence, focal depth, total seismic moment tensor, and average seismic slip velocities for each zone along the region show the differences in seismic release of deformation. This must be taken into account in developing an EEWS for the region.

  4. Plutonium uranium extraction (PUREX) end state basis for interim operation (BIO) for surveillance and maintenance

    SciTech Connect

    DODD, E.N.

    1999-05-12

    This Basis for Interim Operation (BIO) was developed for the PUREX end state condition following completion of the deactivation project. The deactivation project has removed or stabilized the hazardous materials within the facility structure and equipment to reduce the hazards posed by the facility during the surveillance and maintenance (S and M) period, and to reduce the costs associated with the S and M. This document serves as the authorization basis for the PUREX facility, excluding the storage tunnels, railroad cut, and associated tracks, for the deactivated end state condition during the S and M period. The storage tunnels, and associated systems and areas, are addressed in WHC-SD-HS-SAR-001, Rev. 1, PUREX Final Safety Analysis Report. During S and M, the mission of the facility is to maintain the conditions and equipment in a manner that ensures the safety of the workers, environment, and the public. The S and M phase will continue until the final decontamination and decommissioning (D and D) project and activities are begun. Based on the methodology of DOE-STD-1027-92, Hazards Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports, the final facility hazards category is identified as hazards category 2. This determination considers the remaining material inventories, the form and distribution of the material, and the energies present to initiate events of concern. Given the current facility configuration, conditions, and authorized S and M activities, there are no operational events identified resulting in significant hazard to any of the target receptor groups (e.g., workers, public, environment). The only accident scenarios identified with consequences to the onsite co-located workers were based on external natural phenomena, specifically an earthquake. The dose consequences of these events are within the current risk evaluation guidelines and are consistent with the expectations for a hazards category 2

  5. The establishment of a standard operation procedure for psychiatric service after an earthquake.

    PubMed

    Su, Chao-Yueh; Chou, Frank Huang-Chih; Tsai, Kuan-Yi; Lin, Wen-Kuo

    2011-07-01

    This study presents information on the design and creation of a standard operation procedure (SOP) for psychiatric service after an earthquake. The strategies employed focused on the detection of survivors who developed persistent psychiatric illness, particularly post-traumatic stress and major depressive disorders. In addition, the study attempted to detect the risk factors for psychiatric illness. A Disaster-Related Psychological Screening Test (DRPST) was designed by five psychiatrists and two public health professionals for rapidly and simply interviewing 4,223 respondents within six months of the September 1999 Chi-Chi earthquake. An SOP was established through a systemic literature review, action research, and two years of data collection. Despite the limited time and resources inherent to a disaster situation, it is necessary to develop an SOP for psychiatric service after an earthquake in order to assist the high number of survivors suffering from subsequent psychiatric impairment. PMID:21410747

  6. Development of Site-Specific Soil Design Basis Earthquake (DBE) Parameters for the Integrated Waste Treatment Unit (IWTU)

    SciTech Connect

    Payne, Suzette

    2008-08-01

    Horizontal and vertical PC 3 (2,500 yr) Soil Design Basis Earthquake (DBE) 5% damped spectra, corresponding time histories, and strain-compatible soil properties were developed for the Integrated Waste Treatment Unit (IWTU). The IWTU is located at the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Laboratory (INL). Mean and 84th percentile horizontal DBE spectra derived from site-specific site response analyses were evaluated for the IWTU. The horizontal and vertical PC 3 (2,500 yr) Soil DBE 5% damped spectra at the 84th percentile were selected for Soil Structure Interaction (SSI) analyses at IWTU. The site response analyses were performed consistent with applicable Department of Energy (DOE) Standards, recommended guidance of the Nuclear Regulatory Commission (NRC), American Society of Civil Engineers (ASCE) Standards, and recommendations of the Blue Ribbon Panel (BRP) and Defense Nuclear Facilities Safety Board (DNFSB).

  7. The G-FAST Geodetic Earthquake Early Warning System: Operational Performance and Synthetic Testing

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Schmidt, D. A.; Bodin, P.; Vidale, J. E.; Melbourne, T. I.; Santillan, V. M.

    2015-12-01

    The G-FAST (Geodetic First Approximation of Size and Timing) earthquake early warning module is part of a joint seismic and geodetic earthquake early warning system currently under development at the Pacific Northwest Seismic Network (PNSN). Our two-stage approach to earthquake early warning includes: (1) initial detection and characterization from PNSN strong-motion and broadband data with the ElarmS package within ShakeAlert, and then (2) modeling of GPS data from the Pacific Northwest Geodetic Array (PANGA). The two geodetic modeling modules are (1) a fast peak-ground-displacement magnitude and depth estimate and (2) a CMT-based finite fault inversion that utilizes coseismic offsets to compute earthquake extent, slip, and magnitude. The seismic and geodetic source estimates are then combined in a decision module currently under development. In this presentation, we first report on the operational performance during the first several months that G-FAST has been live with respect to magnitude estimates, timing information, and stability. Secondly, we report on the performance of the G-FAST test system using simulated displacements from plausible Cascadia earthquake scenarios. The test system permits us to: (1) replay segments of actual seismic waveform data recorded from the PNSN and neighboring networks to investigate both earthquakes and noise conditions, and (2) broadcast synthetic data into the system to simulate signals we anticipate from earthquakes for which we have no actual ground motion recordings. The test system also lets us simulate various error conditions (latent and/or out-of-sequence data, telemetry drop-outs, etc.) in order to explore how best to mitigate them. For example, we show for a replay of the 2001 M6.8 Nisqually earthquake that telemetry drop-outs create the largest variability and biases in magnitude and depth estimates, whereas latency only causes some variability towards the beginning of the recordings before quickly stabilizing.

  8. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  9. The Earthquake Prediction Experiment on the Basis of the Jet Stream's Precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.; Tikhonov, I. N.

    2014-12-01

    Simultaneous analysis of jet stream maps and data for earthquakes (EQs) of M > 6.0 was carried out; 58 EQs that occurred in 2006-2010 were studied. It was found that an interruption of the jet stream, or flow lines crossing above the epicenter, takes place 1-70 days prior to the event, with a duration of 6-12 hours. The assumption is that the jet stream goes up or down near an epicenter. In 45 cases the distance between the epicenter and the jet stream precursor did not exceed 90 km. The success rate for forecasts within 30 days before the EQ was 66.1% (Wu and Tikhonov, 2014). This technique has been used to predict strong EQs, with predictions pre-registered on a website (for example, the 23 October 2011, M 7.2 EQ (Turkey); the 20 May 2012, M 6.1 EQ (Italy); the 16 April 2013, M 7.8 EQ (Iran); the 12 November 2013, M 6.6 EQ (Russia); the 03 March 2014, M 6.7 Ryukyu EQ (Japan); the 21 July 2014, M 6.2 Kuril EQ). We obtain satisfactory accuracy in the epicenter location and a short alarm period; these are the positive aspects of the forecast. However, estimates of magnitude contain a large uncertainty. Reference: Wu, H.C., Tikhonov, I.N., 2014. Jet streams anomalies as possible short-term precursors of earthquakes with M > 6.0. Research in Geophysics, Special Issue on Earthquake Precursors. Vol. 4. No 1. doi:10.4081/rg.2014.4939. The precursor of the M9.0 Japan EQ of 2011/03/11 (fig1). A. M6.1 Italy EQ (2012/05/20, 44.80 N, 11.19 E, H = 5.1 km). Prediction: 2012/03/20~2012/04/20 (45.6 N, 10.5 E), M > 5.5 (fig2) http://ireport.cnn.com/docs/DOC-764800 B. M7.8 Iran EQ (2013/04/16, 28.11 N, 62.05 E, H = 82.0 km). Prediction: 2013/01/14~2013/02/04 (28.0 N, 61.3 E), M > 6.0 (fig3) http://ireport.cnn.com/docs/DOC-910919 C. M6.6 Russia EQ (2013/11/12, 54.68 N, 162.29 E, H = 47.2 km). Prediction: 2013/10/27~2013/11/13 (56.0 N, 162.9 E), M > 5.5 http://ireport.cnn.com/docs/DOC-1053599 D. M6.7 Japan EQ (2014/03/03, 27.41 N, 127.34 E, H = 111.2 km). Prediction: 2013/12/02~2014/01/15 (26.7 N, 128.1 E), M > 6.5 (fig4) http

  10. Planning a Preliminary program for Earthquake Loss Estimation and Emergency Operation by Three-dimensional Structural Model of Active Faults

    NASA Astrophysics Data System (ADS)

    Ke, M. C.

    2015-12-01

    Large-scale earthquakes often cause serious economic losses and many deaths. Because the magnitude, time, and location of earthquakes still cannot be predicted, pre-disaster risk modeling and post-disaster operations are essential for reducing earthquake damage. To understand the disaster risk posed by earthquakes, earthquake simulation is commonly used to build earthquake scenarios, with point sources, fault line sources, and fault plane sources serving as the seismic source models. The assessments these models produce for risk analysis and emergency operations are serviceable, but their accuracy can still be improved. This program invites experts and scholars from Taiwan University, National Central University, and National Cheng Kung University, and uses historical earthquake records, geological data, and geophysical data to build three-dimensional models of the underground structural planes of active faults, with the aim of replacing projected fault planes with fault planes close to their true underground geometry. This database can improve the accuracy of earthquake-prevention analyses, and the three-dimensional data will then be applied at different stages of disaster prevention. Before a disaster, earthquake risk analyses based on the three-dimensional fault plane data are closer to the real damage; during a disaster, the data can help infer the aftershock distribution and the most seriously damaged areas. In 2015 the program used 14 geological profiles to build three-dimensional data for the Hsinchu and Hsincheng faults. Other active faults will be completed in 2018 and applied to earthquake disaster prevention.

  11. Planning Matters: Response Operations following the 30 September 2009 Sumatran Earthquake

    NASA Astrophysics Data System (ADS)

    Comfort, L. K.; Cedillos, V.; Rahayu, H.

    2009-12-01

    Response operations following the 9/30/2009 West Sumatra earthquake tested extensive planning that had been done in Indonesia since the 26 December 2004 Sumatran Earthquake and Tsunami. After massive destruction in Aceh Province in 2004, the Indonesian National Government revised its national disaster management plans. A key component was to select six cities in Indonesia exposed to significant risk and make a focused investment of resources, planning activities, and public education to reduce the risk of major disasters. Padang City was selected as this national “showcase” for disaster preparedness, planning, and response. The question is whether planning improved governmental performance and coordination in practice. There is substantial evidence that the disaster preparedness planning and training initiated over the past four years had a positive effect on Padang in terms of disaster risk reduction. The National Disaster Management Agency (BNPB, 10/28/09) reported the following casualties: Padang City: deaths, 383; severe injuries, 431; minor injuries, 771. Province of West Sumatra: deaths, 1209; severe injuries, 67; minor injuries, 1179. These figures contrast markedly with the estimated losses following the 2004 Earthquake and Tsunami, when no training had been done: Banda Aceh: deaths, 118,000; Aceh Province: dead/missing, 236,169 (ID Health Ministry 2/22/05). The 2004 events were more severe, yet the comparable scale of loss was significantly lower in the 9/30/09 earthquake. Three factors contributed to reducing disaster risk in Padang and West Sumatra. First, annual training exercises for tsunami warning and evacuation had been organized by national agencies since 2004. In 2008, all exercises and training activities were placed under the newly established BNPB. The exercise held in Padang in February 2009 served as an organizing framework for response operations in the 9/30/09 earthquake. Public officers with key responsibilities for emergency operations

  12. TECHNICAL BASIS FOR VENTILATION REQUIREMENTS IN TANK FARMS OPERATING SPECIFICATIONS DOCUMENTS

    SciTech Connect

    BERGLIN, E J

    2003-06-23

    This report provides the technical basis for the high-efficiency particulate air (HEPA) filter limits for Hanford tank farm ventilation systems (sometimes known as heating, ventilation, and air conditioning [HVAC] systems) defined in Process Engineering Operating Specification Documents (OSDs). This technical basis includes a review of the older technical bases and provides clarifications and justification, as necessary, for limit revisions. The document provides an updated technical basis for tank farm ventilation systems related to the OSDs for double-shell tanks (DSTs), single-shell tanks (SSTs), double-contained receiver tanks (DCRTs), catch tanks, and various other miscellaneous facilities.

  13. Theoretical basis for operational ensemble forecasting of coronal mass ejections

    NASA Astrophysics Data System (ADS)

    Pizzo, V. J.; Koning, C.; Cash, M.; Millward, G.; Biesecker, D. A.; Puga, L.; Codrescu, M.; Odstrcil, D.

    2015-10-01

    We lay out the theoretical underpinnings for the application of the Wang-Sheeley-Arge-Enlil modeling system to ensemble forecasting of coronal mass ejections (CMEs) in an operational environment. In such models, there is no magnetic cloud component, so our results pertain only to CME front properties, such as transit time to Earth. Within this framework, we find no evidence that the propagation is chaotic, and therefore, CME forecasting calls for different tactics than employed for terrestrial weather or hurricane forecasting. We explore a broad range of CME cone inputs and ambient states to flesh out differing CME evolutionary behavior in the various dynamical domains (e.g., large, fast CMEs launched into a slow ambient, and the converse; plus numerous permutations in between). CME propagation in both uniform and highly structured ambient flows is considered to assess how much the solar wind background affects the CME front properties at 1 AU. Graphical and analytic tools pertinent to an ensemble approach are developed to enable uncertainties in forecasting CME impact at Earth to be realistically estimated. We discuss how uncertainties in CME pointing relative to the Sun-Earth line affects the reliability of a forecast and how glancing blows become an issue for CME off-points greater than about the half width of the estimated input CME. While the basic results appear consistent with established impressions of CME behavior, the next step is to use existing records of well-observed CMEs at both Sun and Earth to verify that real events appear to follow the systematic tendencies presented in this study.

  14. A century of oilfield operations and earthquakes in the greater Los Angeles Basin, southern California

    USGS Publications Warehouse

    Hauksson, Egill; Goebel, Thomas; Ampuero, Jean-Paul; Cochran, Elizabeth S.

    2015-01-01

    Most of the seismicity in the Los Angeles Basin (LA Basin) occurs at depth below the sediments and is caused by transpressional tectonics related to the big bend in the San Andreas fault. However, some of the seismicity could be associated with fluid extraction or injection in oil fields that have been in production for almost a century and cover ∼ 17% of the basin. In a recent study, first the influence of industry operations was evaluated by analyzing seismicity characteristics, including normalized seismicity rates, focal depths, and b-values, but no significant difference was found in seismicity characteristics inside and outside the oil fields. In addition, to identify possible temporal correlations, the seismicity and available monthly fluid extraction and injection volumes since 1977 were analyzed. Second, the production and deformation history of the Wilmington oil field were used to evaluate whether other oil fields are likely to experience similar surface deformation in the future. Third, the maximum earthquake magnitudes of events within the perimeters of the oil fields were analyzed to see whether they correlate with total net injected volumes, as suggested by previous studies. Similarly, maximum magnitudes were examined to see whether they exhibit an increase with net extraction volume. Overall, no obvious previously unidentified induced earthquakes were found, and the management of balanced production and injection of fluids appears to reduce the risk of induced-earthquake activity in the oil fields.

  15. Basis for Interim Operation for the K-Reactor in Cold Standby

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The Basis for Interim Operation (BIO) document for K Reactor in Cold Standby and the L- and P-Reactor Disassembly Basins was prepared in accordance with the draft DOE standard for BIO preparation (dated October 26, 1993).

  16. Ground motions associated with the design basis earthquake at the Savannah River Site, South Carolina, based on a deterministic approach

    SciTech Connect

    Youngs, R.R.; Coppersmith, K.J.; Stephenson, D.E.; Silva, W.

    1991-12-31

    Ground motion assessments are presented for evaluation of the seismic safety of K-Reactor at the Savannah River Site. Two earthquake sources are identified as the most significant to seismic hazard at the site: a M 7.5 earthquake occurring in Charleston, South Carolina, and a M 5 event occurring in the site vicinity. These events control the low-frequency and high-frequency portions of the spectrum, respectively. Three major issues were identified in the assessment of ground motions for the Savannah River site: specification of the appropriate stress drop for the Charleston source earthquake; specification of the appropriate levels of soil damping at large depths for site response analyses; and the appropriateness of western US recordings for specification of ground motions in the eastern US.

  17. Ground motions associated with the design basis earthquake at the Savannah River Site, South Carolina, based on a deterministic approach

    SciTech Connect

    Youngs, R.R.; Coppersmith, K.J.; Stephenson, D.E.; Silva, W.

    1991-01-01

    Ground motion assessments are presented for evaluation of the seismic safety of K-Reactor at the Savannah River Site. Two earthquake sources are identified as the most significant to seismic hazard at the site: a M 7.5 earthquake occurring in Charleston, South Carolina, and a M 5 event occurring in the site vicinity. These events control the low-frequency and high-frequency portions of the spectrum, respectively. Three major issues were identified in the assessment of ground motions for the Savannah River site: specification of the appropriate stress drop for the Charleston source earthquake; specification of the appropriate levels of soil damping at large depths for site response analyses; and the appropriateness of western US recordings for specification of ground motions in the eastern US.

  18. Moving towards the operational seismogeodesy component of earthquake and tsunami early warning

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Bock, Y.; Geng, J.; Melgar, D.; Crowell, B. W.; Squibb, M. B.

    2013-12-01

    deviation; 3) simultaneous solution for ground motion biases to mitigate errors due to accelerometer tilt; 4) real-time integration of accelerometer data to velocity and displacement without baseline corrections, providing the fundamental input for rapid finite fault source inversion; 5) low-frequency estimates of P-wave arrival displacement to support single-station earthquake early warning. The operational real-time GPS analysis was implemented in time to provide waveforms from the August 2012 Brawley, CA, seismic swarm. The full real-time seismogeodetic analysis is now operational for GPS sites we have upgraded with low-cost MEMS accelerometers, meteorological sensors, and in-house geodetic modules for efficient real-time data transmission. The analysis system does not yet incorporate an alert system but is currently available to serve as a complement to seismic-based early warning systems, increasing redundancy and robustness. It is anticipated to be especially useful for large earthquakes (> M7), where rapid determination of the fault parameters is critical for early assessment of the extent of damage in affected areas, or for rapid tsunami modeling.

  19. Anthropogenic Triggering of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2014-08-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor "foreshocks", since the induction may occur with a delay up to several years.
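
    The role of fluid overpressure in bringing a fault to nucleation can be illustrated with the simpler Coulomb effective-stress condition (a stand-in sketch, not the authors' Tresca-Von Mises poroelastic analysis; here μs is the static friction coefficient, σn the fault-normal stress, τ the shear stress, and P the pore pressure):

    ```latex
    % Coulomb failure condition, with pore pressure P reducing the
    % effective normal stress on the fault plane:
    \tau \ge \mu_s \left( \sigma_n - P \right)
    % A fault initially below failure is brought to nucleation once the
    % fluid overpressure \Delta P closes the remaining strength deficit:
    \Delta P \ge \sigma_n - \frac{\tau}{\mu_s}
    ```

    On a critically stressed active fault the strength deficit is small, which is why, in the authors' analysis, overpressures below 0.1 MPa can suffice even though the ambient deviatoric stresses are of the order of 1-10 MPa.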

  20. Anthropogenic triggering of large earthquakes.

    PubMed

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor "foreshocks", since the induction may occur with a delay up to several years. PMID:25156190

  1. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay up to several years. PMID:25156190

  2. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2010-07-01 2010-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  3. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2013-07-01 2013-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  4. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2012-07-01 2012-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  5. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2014-07-01 2014-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  6. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime... 29 Labor 3 2011-07-01 2011-07-01 false Operations customarily * * * paid on a piece rate basis * * *. 780.314 Section 780.314 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR...

  7. Technology basis for the Liquid Effluent Retention Facility Operating Specifications. Revision 3

    SciTech Connect

    Johnson, P.G.

    1995-05-17

    The Liquid Effluent Retention Facility (LERF) consists of three retention basins, each with a nominal storage capacity of 6.5 million gallons. LERF serves as interim storage of 242-A Evaporator process condensate for treatment in the Effluent Treatment Facility. This document provides the technical basis for the LERF Operating Specifications, OSD-T-151-00029.

  8. On the Physical Basis of Rate Law Formulations for River Evolution, and their Applicability to the Simulation of Evolution after Earthquakes

    NASA Astrophysics Data System (ADS)

    An, C.; Parker, G.; Fu, X.

    2015-12-01

    River morphology evolves in response to trade-offs among a series of environmental forcing factors, and this evolution is disturbed if such environmental factors change. One example of such a response is the intensive river evolution after earthquakes in the mountain areas of southwest China. When simulating river morphological response to environmental disturbance, an exponential rate law with a specified characteristic response time is often regarded as a practical tool for quantification. As conceptual models, empirical rate law formulations can be used to describe broad-brush morphological response, but their physical basis is not solid in that they do not consider the details of morphodynamic processes. Meanwhile, river evolution can also be simulated with physically based morphodynamic models which conserve sediment via the Exner equation. Here we study the links between the rate law formalism and the Exner equation by solving the Exner equation mathematically and numerically. The results show that, when implementing a very simplified form of a relation for bedload transport, the Exner equation can be reduced to the diffusion equation, the solution of which is a Gaussian function. This solution coincides with the solution associated with rate laws, thus providing a physical basis for such formulations. However, when the complexities of a natural river are considered, the solution of the Exner equation will no longer be a simple Gaussian function. Under such circumstances, the rate law becomes invalid, and a full understanding of the response of rivers to earthquakes requires a complete morphodynamic model.
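
    The reduction described above can be sketched as follows (a minimal 1-D illustration; the linear slope-dependent bedload closure with coefficient K stands in for the "very simplified form" of the transport relation, and is an assumption of this sketch rather than the authors' full model):

    ```latex
    % 1-D Exner sediment mass balance for bed elevation \eta
    % (\lambda_p is bed porosity, q_b the volumetric bedload flux):
    (1-\lambda_p)\,\frac{\partial \eta}{\partial t} = -\frac{\partial q_b}{\partial x}
    % Simplified closure: bedload flux proportional to bed slope
    q_b = -K\,\frac{\partial \eta}{\partial x}
    % Substituting yields a linear diffusion equation for the bed,
    \frac{\partial \eta}{\partial t} = \nu\,\frac{\partial^2 \eta}{\partial x^2},
    \qquad \nu = \frac{K}{1-\lambda_p},
    % whose fundamental solution is the Gaussian
    \eta(x,t) = \frac{A}{\sqrt{4\pi\nu t}}\,\exp\!\left(-\frac{x^2}{4\nu t}\right)
    ```

    In this limiting case the bed at any fixed station relaxes monotonically toward its final profile, which is why the empirical rate-law description coincides with the morphodynamic solution.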

  9. Real-time operative earthquake forecasting: the case of L'Aquila sequence

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Lombardi, A.

    2009-12-01

    A reliable earthquake forecast is one of the fundamental components required for reducing seismic risk. Despite very recent efforts devoted to testing the validity of available models, the present skill at forecasting the evolution of seismicity is still largely unknown. The recent Mw 6.3 earthquake that struck near the city of L'Aquila, Italy, on April 6, 2009, causing hundreds of deaths and vast damage, offered scientists a unique opportunity to test for the first time the forecasting capability in a real-time application. Here, we describe the results of this first prospective experiment. Immediately following the large event, we began producing daily one-day earthquake forecasts for the region, and we provided these forecasts to Civil Protection, the agency responsible for managing the emergency. The forecasts are based on a stochastic model that combines the Gutenberg-Richter distribution of earthquake magnitudes and power-law decay in space and time of triggered earthquakes. The results from the first month following the L'Aquila earthquake exhibit a good fit between forecasts and observations, indicating that accurate earthquake forecasting is now a realistic goal. Our experience with this experiment demonstrates an urgent need for a connection between probabilistic forecasts and decision-making in order to establish, before crises, quantitative and transparent protocols for decision support.
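
    A minimal sketch of the class of model described, combining a Gutenberg-Richter magnitude distribution with Omori-Utsu power-law decay of the triggered-event rate (all parameter values here are illustrative defaults, not those calibrated for the L'Aquila sequence):

    ```python
    import math
    import random

    def omori_rate(t, events, K=0.05, c=0.01, p=1.1, alpha=0.8, m_c=3.0):
        """Expected triggered-event rate at time t (days), summing the
        Omori-Utsu power-law decay over past events [(t_i, m_i)], with
        productivity growing exponentially with parent magnitude."""
        return sum(K * 10 ** (alpha * (m - m_c)) / (t - ti + c) ** p
                   for ti, m in events if ti < t)

    def sample_magnitude(b=1.0, m_min=3.0):
        """Draw a magnitude from the Gutenberg-Richter distribution,
        i.e. exponentially distributed above m_min with slope b."""
        u = 1.0 - random.random()  # uniform in (0, 1], avoids log10(0)
        return m_min - math.log10(u) / b

    mainshock = [(0.0, 6.3)]            # an M6.3 event at t = 0 days
    day1 = omori_rate(1.0, mainshock)   # expected events/day after 1 day
    day30 = omori_rate(30.0, mainshock) # rate has decayed as ~(t + c)**-p
    print(day1, day30)
    ```

    In an operational one-day forecast, the rate integrated over the next day is converted to an occurrence probability for each magnitude bin via the Gutenberg-Richter distribution.
    
    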

  10. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, which has an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses, and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The Turkish Catastrophe Insurance Pool (TCIP) system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and with approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification of the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. In such a model losses would not be indemnified, but would instead be calculated directly on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one would be the elimination of the claim processing

  11. Lessons Learned from Eight Years' Experience of Actual Operation, and Future Prospects of JMA Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Nishimae, Y.

    2015-12-01

    Since 2007, the Japan Meteorological Agency (JMA) has gained experience from actual operation of EEW. During this period, we have learned lessons from many M6- and M7-class earthquakes, and from the Mw9.0 Tohoku earthquake. During the Mw9.0 Tohoku earthquake, the JMA system functioned well: it issued a warning message more than 15 s before strong ground shaking in the Tohoku district (relatively near the epicenter). However, it was not perfect: in addition to the problem of the large extent of the fault rupture, some false warning messages were issued because the system was confused by multiple simultaneous aftershocks occurring across the wide rupture area. To address these problems, JMA will introduce two new methods into the operational system this year to begin testing them, aiming at practical operation within a couple of years. One is the Integrated Particle Filter (IPF) method, an integrated algorithm combining multiple hypocenter determination techniques with Bayesian estimation, in which amplitude information is also used for hypocenter determination. The other is the Propagation of Local Undamped Motion (PLUM) method, in which a warning message is issued when strong ground shaking is detected at stations near the target site (e.g., within 30 km); no hypocenter or magnitude is required in PLUM. Aiming at application several years from now, we are investigating a new approach in which the current wavefield is estimated in real time and the future wavefield is then predicted, evolving forward from the current situation using the physics of wave propagation. Here, hypocenter and magnitude are not necessarily required, but real-time observation of ground shaking is necessary. JMA also plans to predict long-period ground motion (up to 8 s) with the EEW system for earthquake damage mitigation in high-rise buildings. Testing will start on the operational system in the near future.
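
    The PLUM trigger logic described above can be sketched as follows (a minimal illustration; the intensity threshold, the number of required stations, and the coordinates are all hypothetical values chosen for this sketch, not JMA's operational settings):

    ```python
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in km."""
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = p2 - p1, math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def plum_warn(site, stations, intensity_threshold=4.0,
                  radius_km=30.0, n_required=2):
        """PLUM-style trigger: warn at `site` when at least `n_required`
        stations within `radius_km` currently observe shaking at or above
        `intensity_threshold`. No hypocenter or magnitude is needed."""
        hits = sum(1 for lat, lon, intensity in stations
                   if haversine_km(site[0], site[1], lat, lon) <= radius_km
                   and intensity >= intensity_threshold)
        return hits >= n_required

    site = (38.26, 140.87)  # hypothetical target site
    stations = [  # (lat, lon, current observed intensity)
        (38.30, 140.90, 5.1),  # ~5 km away, strong shaking
        (38.20, 140.80, 4.6),  # ~9 km away, strong shaking
        (37.00, 139.00, 6.0),  # far outside the 30 km radius
    ]
    print(plum_warn(site, stations))  # two nearby stations trigger a warning
    ```

    Because the decision uses only observed shaking near the target, this approach stays robust during complex sequences of simultaneous aftershocks, which is exactly the failure mode described for the source-parameter-based system.
    
    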

  12. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is also necessary to alert the California Department of Water Resources, the California Division of Mines and Geology, the U.S. Army Corps of Engineers, the Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  13. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of any logic program is defined as a function (Tp: I → I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single-step operators of logic programming in radial basis function neural networks. To do so, we proposed a new technique to generate the training data sets of single-step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
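    The single-step operator defined above can be made concrete for propositional normal programs: Tp(I) contains every clause head whose body is true under the interpretation I. The clause encoding below is our own illustrative choice, not the paper's representation.

```python
# Minimal sketch of the single-step operator T_P for a propositional
# normal logic program: T_P(I) = heads of clauses whose bodies hold in I.
def tp(program, interpretation):
    """program: list of (head, positive_body, negative_body) clauses.
    interpretation: set of atoms currently assigned True."""
    return {
        head
        for head, pos, neg in program
        if set(pos) <= interpretation and not (set(neg) & interpretation)
    }

def iterate_to_fixpoint(program, start=frozenset(), max_steps=100):
    """Iterate T_P from `start`; stop at a fixed point (steady state)."""
    current = set(start)
    for _ in range(max_steps):
        nxt = tp(program, current)
        if nxt == current:
            return current
        current = nxt
    return current

# p :- .   q :- p.   r :- q, not s.
program = [("p", [], []), ("q", ["p"], []), ("r", ["q"], ["s"])]
print(sorted(iterate_to_fixpoint(program)))  # → ['p', 'q', 'r']
```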

  14. Computing single step operators of logic programming in radial basis function neural networks

    SciTech Connect

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of any logic program is defined as a function (Tp: I → I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single-step operators of logic programming in radial basis function neural networks. To do so, we proposed a new technique to generate the training data sets of single-step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  15. The Mixed Waste Management Facility. Design basis integrated operations plan (Title I design)

    SciTech Connect

    1994-12-01

    The Mixed Waste Management Facility (MWMF) will be a fully integrated, pilot-scale facility for the demonstration of low-level, organic-matrix mixed waste treatment technologies. It will provide the bridge from bench-scale demonstrated technologies to the deployment and operation of full-scale treatment facilities. The MWMF is a key element in reducing the risk in deployment of effective and environmentally acceptable treatment processes for organic mixed-waste streams. The MWMF will provide the engineering test data, formal evaluation, and operating experience that will be required for these demonstration systems to become accepted by the EPA and deployable in waste treatment facilities. The deployment will also demonstrate how to approach the permitting process with the regulatory agencies and how to operate and maintain the processes in a safe manner. This document describes, at a high level, how the facility will be designed and operated to achieve this mission. It frequently refers the reader to additional documentation that provides more detail in specific areas. Effective evaluation of a technology consists of a variety of informal and formal demonstrations involving individual technology systems or subsystems, integrated technology system combinations, or complete integrated treatment trains. Informal demonstrations will typically be used to gather general operating information and to establish a basis for development of formal demonstration plans. Formal demonstrations consist of a specific series of tests that are used to rigorously demonstrate the operation or performance of a specific system configuration.

  16. Power systems after the Northridge earthquake: Emergency operations and changes in seismic equipment specifications, practice, and system configuration

    SciTech Connect

    Schiff, A.J.; Tognazzini, R.; Ostrom, D.

    1995-12-31

    The Northridge earthquake caused extensive damage to high-voltage substation equipment and, for the first time, the failure of transmission towers. Power was lost to much of the earthquake-impacted area; 93% of the customers were restored within 24 hours. To restore service, damaged monitoring, communication, and protective equipment, such as current-voltage transformers, wave traps, and lightning arresters, was removed or bypassed and operation restored. To improve performance, some porcelain members are being replaced with composite materials in bushings, current-voltage transformers, and lightning arresters. Interim seismic specifications for equipment have been instituted. Some substations are being re-configured, and rigid bus and conductors are being replaced with flexible conductors. Non-load-carrying conductors, such as those used on lightning arresters, are being reduced in size to reduce potential interaction problems. Better methods of documenting damage and repair costs are being considered.

  17. Representation of discrete Steklov-Poincare operator arising in domain decomposition methods in wavelet basis

    SciTech Connect

    Jemcov, A.; Matovic, M.D.

    1996-12-31

    This paper examines the sparse representation and preconditioning of a discrete Steklov-Poincare operator which arises in domain decomposition methods. A non-overlapping domain decomposition method is applied to a second-order self-adjoint elliptic operator (Poisson equation) with homogeneous boundary conditions as a model problem. It is shown that the discrete Steklov-Poincare operator allows a sparse representation with a bounded condition number in a wavelet basis if the transformation is followed by thresholding and rescaling. These two steps combined enable the effective use of Krylov subspace methods as an iterative solution procedure for the system of linear equations. Finding the solution of an interface problem in domain decomposition methods, known as a Schur complement problem, has been shown to be equivalent to the discrete form of the Steklov-Poincare operator. A common way to obtain the Schur complement matrix is to order the matrix of the discrete differential operator by subdomain node groups and then block-eliminate the interior nodes. The result is a dense matrix which corresponds to the interface problem. This is equivalent to reducing the original problem to several smaller differential problems and one boundary integral equation problem on the subdomain interfaces.
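    The transform-then-threshold step described above can be sketched generically. The example below compresses a smooth dense matrix in an orthonormal Haar wavelet basis and drops small coefficients; it is an illustration of the compression idea, not the paper's Steklov-Poincare operator, and the tolerance is arbitrary.

```python
# Sketch of dense-matrix compression in a wavelet basis: transform with an
# orthonormal Haar matrix, then zero out small coefficients. Generic
# illustration only; the test matrix and tolerance are our assumptions.
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar wavelet matrix of size n (n a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])               # averaging rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])  # detail rows
    return np.vstack([top, bot]) / np.sqrt(2.0)

def threshold(a, tol):
    """Drop coefficients below tol (relative to the largest magnitude)."""
    out = a.copy()
    out[np.abs(out) < tol * np.abs(out).max()] = 0.0
    return out

n = 8
x = np.linspace(0.0, 1.0, n)
a = 1.0 / (1.0 + np.add.outer(x, x))  # smooth, hence compressible, matrix

w = haar_matrix(n)
coeffs = w @ a @ w.T           # 2-D wavelet transform
sparse = threshold(coeffs, 1e-2)
approx = w.T @ sparse @ w      # reconstruct from thresholded coefficients

print("nonzeros kept:", np.count_nonzero(sparse), "of", n * n)
print("relative error:", np.linalg.norm(a - approx) / np.linalg.norm(a))
```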

  18. Real-time earthquake alert system for the greater San Francisco Bay Area: a prototype design to address operational issues

    SciTech Connect

    Harben, P.E.; Jarpe, S.; Hunter, S.

    1996-12-10

    The purpose of the earthquake alert system (EAS) is to outrun the seismic energy released in a large earthquake using a geographically distributed network of strong-motion sensors that telemeter data to a rapid CPU-processing station, which then issues an area-wide warning to a region before strong motion occurs. The warning times involved are short, from 0 to 30 seconds or so; consequently, most responses must be automated. The San Francisco Bay Area is particularly well suited for an EAS because (1) large earthquakes have relatively shallow hypocenters (10- to 20-kilometer depth), giving favorable ray-path geometries and larger warning times than deeper earthquakes would, and (2) the active faults are few in number and well characterized, which means far fewer geographically distributed strong-motion sensors are needed (about 50 in this region). An EAS prototype is being implemented in the San Francisco Bay Area. The system consists of four distinct subsystems: (1) a distributed strong-motion seismic network, (2) a central processing station, (3) a warning communications system, and (4) user receiver and response systems. We have designed a simple, reliable, and inexpensive strong-motion monitoring station that consists of a three-component Analog Devices ADXL05 accelerometer sensing unit, a vertical-component weak-motion sensor for system testing, a 16-bit digitizer with multiplexing, and communication output ports for RS232 modem or radio telemetry. The unit is battery-powered and will be sited in fire stations. The prototype central computer analysis system consists of a PC data-acquisition platform that pipes the incoming strong-motion data via Ethernet to Unix-based workstations for data processing. Simple real-time algorithms, particularly for magnitude estimation, are implemented to give estimates of the time since the earthquake's onset, its hypocenter location, its magnitude, and the reliability of the estimate. These parameters are calculated and transmitted
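    The 0-to-30-second warning window above follows from a simple time budget: the alert is useful only if the S wave reaches the user after the P-wave detection plus processing and telemetry latency. The velocities and latency below are illustrative assumptions, not the prototype's measured values.

```python
# Back-of-envelope sketch of the EAS warning-time budget: S-wave arrival
# at the user minus (P-wave detection at the nearest station + system
# latency). Velocities and latency are illustrative assumptions.
import math

def warning_time_s(hypo_depth_km, epi_dist_station_km, epi_dist_user_km,
                   vp=6.0, vs=3.5, latency_s=5.0):
    """Seconds of warning at the user site; negative means no warning."""
    d_station = math.hypot(hypo_depth_km, epi_dist_station_km)
    d_user = math.hypot(hypo_depth_km, epi_dist_user_km)
    t_detect = d_station / vp + latency_s   # P-wave pick + processing
    t_shaking = d_user / vs                 # S-wave arrival at the user
    return t_shaking - t_detect

# Shallow Bay Area-style event: nearest station 5 km from the epicenter,
# user 60 km away, 12 km focal depth.
print(round(warning_time_s(12.0, 5.0, 60.0), 1))  # → 10.3
```

    A user sited close to the epicenter gets a negative budget, i.e. the S wave arrives before the alert can be issued, which is why responses must be automated.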

  19. Review of the Technical Basis of the Hydrogen Control Limit for Operations in Hanford Tank Farms

    SciTech Connect

    Mahoney, Lenna A. ); Stewart, Charles W. )

    2002-11-30

    The waste in Hanford tanks generates a mixture of flammable gases and releases it into the tank headspace. The potential hazard resulting from flammable gas generation requires that controls be established to prevent ignition and to halt operations if gas concentrations reach levels of concern. In cases where only hydrogen is monitored, a control limit of 6,250 ppm hydrogen has been in use at Hanford for several years. The hydrogen-based control limit is intended to conservatively represent 25% of the lower flammability limit of the gas mixture, accounting for the presence of flammable gases other than hydrogen, with ammonia being the primary concern. This report reviews the technical basis of the current control limit based on observed and projected concentrations of hydrogen and ammonia representing a range of gas release scenarios. The conclusion supports the continued use of the current 6,250 ppm hydrogen control limit.
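    The "25% of the lower flammability limit" criterion above is commonly evaluated for gas mixtures with Le Chatelier's rule: a mixture is below 25% of its LFL when the sum of each species' concentration over its LFL is less than 0.25. The sketch below uses standard LFL values (hydrogen about 4 vol%, ammonia about 15 vol%); the 6,250 ppm figure itself comes from the report, and the ammonia concentration in the example is made up.

```python
# Sketch of a 25%-of-LFL flammability check for a hydrogen/ammonia mix
# using Le Chatelier's rule: sum(C_i / LFL_i) < 0.25. LFLs in ppm
# (hydrogen ~4 vol%, ammonia ~15 vol%) are standard handbook values.
LFL_PPM = {"hydrogen": 40_000, "ammonia": 150_000}

def fraction_of_lfl(concentrations_ppm):
    """Le Chatelier sum: the mixture's concentration as a fraction of LFL."""
    return sum(c / LFL_PPM[gas] for gas, c in concentrations_ppm.items())

def below_25pct_lfl(concentrations_ppm):
    return fraction_of_lfl(concentrations_ppm) < 0.25

# Hydrogen at the 6,250 ppm limit plus a hypothetical ammonia level:
mix = {"hydrogen": 6_250, "ammonia": 10_000}
print(round(fraction_of_lfl(mix), 3), below_25pct_lfl(mix))  # → 0.223 True
```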

  20. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters relies on arrival-time stacking algorithms. Examples are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problem of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data, including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.
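    The kernel-stacking idea behind such associators can be shown with a toy example: each phase pick back-projects to candidate origin times over a spatial grid, Gaussian kernels from all picks are stacked, and the grid cell with the largest stack approximates the hypocenter. The uniform velocity, kernel width, and geometry below are illustrative assumptions, not GLASS internals.

```python
# Toy sketch of kernel stacking for phase association/localization: stack
# Gaussian kernels of pick-time residuals over a (x, y, t0) grid and take
# the maximum. Velocity model and kernel width are illustrative.
import math

V_P = 6.0  # km/s, assumed uniform half-space

def stack_score(candidate, origin_time, picks, sigma_s=0.5):
    """Sum of Gaussian kernels over pick-time residuals at a candidate."""
    score = 0.0
    for (sx, sy, t_pick) in picks:
        dist = math.hypot(candidate[0] - sx, candidate[1] - sy)
        resid = t_pick - (origin_time + dist / V_P)
        score += math.exp(-0.5 * (resid / sigma_s) ** 2)
    return score

def locate(picks, grid_step=5.0, extent=100.0):
    """Brute-force grid search for the best (score, (x, y), t0)."""
    best = None
    for ix in range(int(extent / grid_step) + 1):
        for iy in range(int(extent / grid_step) + 1):
            cand = (ix * grid_step, iy * grid_step)
            for t0 in [t / 10.0 for t in range(0, 201)]:
                s = stack_score(cand, t0, picks)
                if best is None or s > best[0]:
                    best = (s, cand, t0)
    return best

# Synthetic event at (40, 55) km, origin time 3.0 s, four stations:
true_xy, t0 = (40.0, 55.0), 3.0
stations = [(0.0, 0.0), (80.0, 10.0), (20.0, 90.0), (90.0, 80.0)]
picks = [(sx, sy, t0 + math.hypot(true_xy[0] - sx, true_xy[1] - sy) / V_P)
         for sx, sy in stations]
score, loc, t_est = locate(picks)
print(loc, round(t_est, 1))  # → (40.0, 55.0) 3.0
```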

  1. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    SciTech Connect

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and their respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., the estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.
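    The fragility-curve step in such a PRA is commonly modeled with a lognormal fragility: the conditional failure probability of a component at ground-motion level a is P_f(a) = Φ(ln(a / A_m) / β), where A_m is the median capacity and β the logarithmic standard deviation. The capacities below are made-up values, and the series-system combination assumes independent component failures.

```python
# Sketch of lognormal fragility curves and a simple top-event combination
# (series system, independent failures). Median capacities and betas are
# illustrative, not PREPP values.
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fragility(a_g, median_capacity_g, beta):
    """P(failure | peak ground acceleration a_g), lognormal model."""
    return phi(math.log(a_g / median_capacity_g) / beta)

def series_system_failure(a_g, components):
    """Failure probability when any one component failing fails the system,
    assuming independent component failures."""
    p_survive = 1.0
    for median, beta in components:
        p_survive *= 1.0 - fragility(a_g, median, beta)
    return 1.0 - p_survive

components = [(1.2, 0.4), (0.9, 0.35)]   # (median capacity in g, beta)
print(round(fragility(0.6, 1.2, 0.4), 3))  # → 0.042
print(round(series_system_failure(0.6, components), 3))
```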

  2. Recent Experiences Operating a Large, International Network of Electromagnetic Earthquake Monitors

    NASA Astrophysics Data System (ADS)

    Bleier, T.; Dunson, J. C.; Lemon, J.

    2014-12-01

    QuakeFinder leads a 5-nation international collaboration and currently operates a network of 168 instruments, along with a Data Center that processes 10 GB of data each day, 7 days a week. Each instrument includes 3-axis induction magnetometers, positive and negative ion sensors, and a geophone. These ground instruments are augmented with GOES weather-satellite infrared monitoring of California (and, in the future, other countries). Detecting and identifying the signals that would enable forecasts of significant earthquakes (>M5) involves refining algorithms that both identify quake-related signals at some distance and remove a myriad of natural and anthropogenic noise sources. Maximum detection range was further investigated this year. An initial estimated maximum detection distance of 10 miles (16 km) was challenged with the onset of an M8.2 quake near Iquique, Chile on April 1, 2014. We will discuss the different strategies used to push the limits of detection for this quake, which was 93 miles (149 km) from an instrument that had been installed just 2 months before the quake. Identifying and masking natural and man-made noise to reduce the number of misses and false alarms, and to increase the number of "hits" in a limited earthquake data set, continues to be a top priority. Several novel approaches were tried, and the resulting progress will be discussed.

  3. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides, and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery (see figure). In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades.

  4. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1995-01-01

    Incineration as a method of treating radioactive or mixed waste is attractive because of volume reduction, but it may result in high concentrations of some hazardous components. For safety reasons during operation, and because of the environmental impact of the plant, it is important to know how these materials partition among the furnace slag, the fly ash, and the stack emission. The chemistry of about 50 elements is discussed, and through consideration of high-temperature thermodynamic equilibria an attempt is made to provide a basis for predicting how various radionuclides and heavy metals behave in a typical incinerator. The chemistry of the individual elements is first considered, and a prediction of the most stable chemical species in the typical incinerator atmosphere is made. The treatment emphasizes volatility, and the parameters considered are temperature, acidity, oxygen, sulfur, and halogen content, and the presence of several other key non-radioactive elements. A computer model is used to calculate equilibrium concentrations of many species in several systems at temperatures ranging from 500 to 1600 K. It is suggested that deliberate addition of various feed chemicals can have a major impact on the fate of many radionuclides and heavy metals. Several problems concerning limitations and application of the data are considered.

  5. The power of simplification: Operator interface with the AP1000® during design-basis and beyond design-basis events

    SciTech Connect

    Williams, M. G.; Mouser, M. R.; Simon, J. B.

    2012-07-01

    The AP1000® plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety, and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air, or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without the need for operator action, meeting the expectations provided in the European Utility Requirements and the Utility Requirement Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator's interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design-basis accident and, finally, to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures, including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs), and Alarm Response Procedures (ARPs), are used to mitigate design-basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible using components that have been

  6. Optimizing the Use of Chief Complaint & Diagnosis for Operational Decision Making: An EMR Case Study of the 2010 Haiti Earthquake

    PubMed Central

    Bambrick, Alexandra T.; Passman, Dina B.; Torman, Rachel M.; Livinski, Alicia A.; Olsen, Jennifer M.

    2014-01-01

    Introduction: Data from an electronic medical record (EMR) system can provide valuable insight regarding health consequences in the aftermath of a disaster. In January of 2010, the U.S. Department of Health and Human Services (HHS) deployed medical personnel to Haiti in response to a crippling earthquake. An EMR system was used to record patient encounters in real-time and to provide data for decision support during response activities. Problem: During the Haiti response, HHS monitored the EMR system by recoding diagnoses into seven broad categories. At the conclusion of the response, it was evident that a new diagnosis categorization process was needed to provide a better description of the patient encounters that were seen in the field. After examining the EMRs, researchers determined nearly half of the medical records were missing diagnosis data. The objective of this study was to develop and test a new method of categorization for patient encounters to provide more detailed data for decision making. Methods: A single researcher verified or assigned a new diagnosis for 8,787 EMRs created during the Haiti response. This created a new variable, the Operational Code, which was based on available diagnosis data and chief complaint. Retrospectively, diagnoses recorded in the field and Operational Codes were categorized into eighteen categories based on the ICD-9-CM diagnostic system. Results: Creating an Operational Code variable led to a more robust data set and a clearer depiction emerged of the clinical presentations seen at six HHS clinics set up in the aftermath of Haiti’s earthquake. The number of records with an associated ICD-9 code increased 106% from 4,261 to 8,787. The most frequent Operational Code categories during the response were: General Symptoms, Signs, and Ill-Defined Conditions (34.2%), Injury and Poisoning (18.9%), Other (14.7%), Respiratory (4.8%), and Musculoskeletal and Connective Tissue (4.8%). Conclusion: The Operational Code methodology
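    The recoding step described above (derive a category from the diagnosis when present, falling back to the chief complaint otherwise) can be sketched as a small lookup. The keyword-to-category mapping below is a made-up stand-in for the study's ICD-9-CM-based scheme; the category names follow the abstract.

```python
# Sketch of an "Operational Code" assignment: use the diagnosis field when
# present, otherwise fall back to the chief complaint. The keyword mapping
# is a hypothetical stand-in for the study's ICD-9-CM-based categories.
CATEGORY_KEYWORDS = {
    "Injury and Poisoning": ["fracture", "laceration", "crush", "burn"],
    "Respiratory": ["cough", "asthma", "pneumonia"],
    "Musculoskeletal and Connective Tissue": ["back pain", "joint"],
}
DEFAULT = "General Symptoms, Signs, and Ill-Defined Conditions"

def operational_code(record):
    """record: dict with optional 'diagnosis' and 'chief_complaint' keys."""
    text = (record.get("diagnosis") or record.get("chief_complaint") or "").lower()
    for category, words in CATEGORY_KEYWORDS.items():
        if any(w in text for w in words):
            return category
    return DEFAULT

records = [
    {"diagnosis": "Tibial fracture"},
    {"diagnosis": None, "chief_complaint": "Persistent cough"},  # fallback
    {"chief_complaint": "Fatigue"},
]
print([operational_code(r) for r in records])
# → ['Injury and Poisoning', 'Respiratory',
#    'General Symptoms, Signs, and Ill-Defined Conditions']
```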

  7. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  8. Transient Fluid Flow Along Basement Faults and Rupture Mechanics: Can We Expect Injection-Induced Earthquake Behavior to Correspond Directly With Injection Operations?

    NASA Astrophysics Data System (ADS)

    Norbeck, J. H.; Horne, R. N.

    2015-12-01

    We explored injection-induced earthquake behavior in geologic settings where basement faults are connected hydraulically to overlying saline aquifers targeted for wastewater disposal. Understanding how the interaction between natural geology and injection well operations affects the behavior of injection-induced earthquake sequences has important implications for characterizing seismic hazard risk. Numerical experiments were performed to investigate the extent to which seismicity is influenced by the migration of pressure perturbations along fault zones. Two distinct behaviors were observed: a) earthquake ruptures that were confined to the pressurized region of the fault and b) sustained earthquake ruptures that propagated far beyond the pressure front. These two faulting mechanisms have important implications for assessing the manner in which seismicity can be expected to respond to injection well operations. Based upon observations from the numerical experiments, we developed a criterion that can be used to classify the expected faulting behavior near wastewater disposal sites. The faulting criterion depends on the state of stress, the initial fluid pressure, the orientation of the fault, and the dynamic friction coefficient of the fault. If the initial ratio of shear to effective normal stress resolved on the fault (the prestress ratio) is less than the fault's dynamic friction coefficient, then earthquake ruptures will tend to be limited by the extent of the pressure front. In this case, parameters that affect seismic hazard assessment, like the maximum earthquake magnitude or earthquake recurrence interval, could correlate with injection well operational parameters. For example, the maximum earthquake magnitude might be expected to grow over time in a systematic manner as larger patches of the fault are exposed to significant pressure changes.
In contrast, if the prestress ratio is greater than dynamic friction, a stress drop can occur outside of the pressurized
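    The prestress-ratio criterion described above reduces to a one-line comparison: the initial shear stress divided by the initial effective normal stress versus the fault's dynamic friction coefficient. The stress values and friction coefficient in the sketch below are illustrative, not the study's numbers.

```python
# Sketch of the prestress-ratio faulting criterion: compare the initial
# ratio of shear to effective normal stress on the fault with its dynamic
# friction coefficient. All numbers are illustrative.
def faulting_regime(shear_stress_mpa, normal_stress_mpa,
                    init_pressure_mpa, dynamic_friction):
    """'pressure-confined' ruptures stay within the pressurized zone;
    'runaway' ruptures can propagate far beyond the pressure front."""
    prestress_ratio = shear_stress_mpa / (normal_stress_mpa - init_pressure_mpa)
    if prestress_ratio < dynamic_friction:
        return "pressure-confined"
    return "runaway"

# A critically stressed fault vs. a less-stressed one:
print(faulting_regime(35.0, 100.0, 40.0, 0.45))  # → runaway
print(faulting_regime(20.0, 100.0, 40.0, 0.45))  # → pressure-confined
```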

  9. LLNL earthquake impact analysis committee report on the Livermore, California, earthquakes of January 24 and 26, 1980

    SciTech Connect

    Not Available

    1980-07-15

    The overall effects of the earthquakes of January 24 and 26, 1980, at the Lawrence Livermore National Laboratory in northern California are outlined. The damage caused by those earthquakes and how employees responded are discussed. The immediate emergency actions taken by management and the subsequent measures to resume operations are summarized. Long-range plans for recovery and repair, the seismic history of the Livermore Valley region, various investigations concerning the design-basis earthquake (DBE), and seismic criteria for structures are reviewed. Following an analysis of the Laboratory's earthquake preparedness, emergency response, and related matters, a series of conclusions and recommendations is presented. Appendixes provide additional information, such as persons interviewed, seismic and site maps, and a summary of the estimated costs incurred from the earthquakes.

  10. DEVELOPMENT OF A MATHEMATICAL BASIS FOR RELATING SLUDGE PROPERTIES TO FGD-SCRUBBER OPERATING VARIABLES

    EPA Science Inventory

    The report gives results of research to investigate prospects for increasing the size of calcium sulfite sludge particles in flue gas desulfurization systems. The approach included four work packages: a literature survey and development of a mathematical basis for predicting calc...

  11. Martin Marietta Energy Systems, Inc. comprehensive earthquake management plan: Emergency Operations Center training manual

    SciTech Connect

    Not Available

    1990-02-28

    The objective of this training is to: describe the responsibilities, resources, and goals of the Emergency Operations Center and be able to evaluate and interpret this information to best direct and allocate emergency, plant, and other resources to protect life and the Paducah Gaseous Diffusion Plant.

  12. Everyday Earthquakes.

    ERIC Educational Resources Information Center

    Svec, Michael

    1996-01-01

    Describes methods to access current earthquake information from the National Earthquake Information Center. Enables students to build genuine learning experiences using real data from earthquakes that have recently occurred. (JRH)

  13. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant. PMID:21038753

  14. Duration and predictors of emergency surgical operations - basis for medical management of mass casualty incidents

    PubMed Central

    2009-01-01

    Background Hospitals have a critically important role in the management of mass casualty incidents (MCI), yet there is little information to assist emergency planners. A significant limiting factor of a hospital's capability to treat those affected is its surgical capacity. We therefore intended to provide data about the duration and predictors of life-saving operations. Methods The data of 20,815 predominantly blunt trauma patients recorded in the Trauma Registry of the German Trauma Society were retrospectively analyzed to calculate the duration of life-saving operations as well as their predictors. Inclusion criteria were an ISS ≥ 16 and the performance of relevant ICPM-coded procedures within 6 h of admission. Results From the 1,228 patients fulfilling the inclusion criteria, 1,793 operations could be identified as life-saving operations. Acute injuries to the abdomen accounted for 54.1%, followed by head injuries (26.3%), pelvic injuries (11.5%), thoracic injuries (5.0%) and major amputations (3.1%). The mean cut-to-suture time was 130 min (IQR 65-165 min). Logistic regression revealed 8 variables associated with an emergency operation: AIS of abdomen ≥ 3 (OR 4.00), ISS ≥ 35 (OR 2.94), hemoglobin level ≤ 8 mg/dL (OR 1.40), pulse rate on hospital admission < 40 or > 120/min (OR 1.39), blood pressure on hospital admission < 90 mmHg (OR 1.35), prehospital infusion volume ≥ 2000 ml (OR 1.34), GCS ≤ 8 (OR 1.32) and anisocoria (OR 1.28) on-scene. Conclusions The mean operation time of 130 min calculated for emergency life-saving surgical operations provides a realistic guideline for the prospective treatment capacity, which can be estimated and projected into an actual incident admission capacity. Knowledge of predictive factors for life-saving emergency operations helps to identify those patients who need the most urgent operative treatment in the case of a blunt MCI. PMID:20149987

  15. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-04-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for concepts of operations (ConOps) in advanced small modular reactor (aSMR) designs. In support of this objective, three important research areas were included: operating principles of multi-modular plants, functional allocation models and strategies that would affect the development of new, non-traditional concepts of operations, and the requirements for human performance, based upon work domain analysis and current regulatory requirements. As part of the approach for this report, we outline potential functions, including the theoretical and operational foundations for the development of a new functional allocation model and the identification of specific regulatory requirements that will influence the development of future concepts of operations. The report also highlights changes in research strategy prompted by confirmation of the importance of applying the work domain analysis methodology to a reference aSMR design. We describe how this methodology will enrich the findings from this phase of the project in the subsequent phases and help identify metrics and focused studies for the determination of human performance criteria that can be used to support the design process.

  16. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  17. Experience in Construction and Operation of the Distributed Information Systems on the Basis of the Z39.50 Protocol

    NASA Astrophysics Data System (ADS)

    Zhizhimov, Oleg; Mazov, Nikolay; Skibin, Sergey

    Questions concerning the construction and operation of distributed information systems based on the ANSI/NISO Z39.50 Information Retrieval Protocol are discussed in the paper. The paper is based on the authors' experience in developing the ZooPARK server. The architecture of distributed information systems, the reliability of such systems, the minimization of search time, and administration are examined. Problems in developing distributed information systems are also described.

  18. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-08-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for Concepts of Operations in advanced small modular reactor (AdvSMR) designs. AdvSMRs are nuclear power plants (NPPs), but unlike conventional large NPPs that are constructed on site, AdvSMRs systems and components will be fabricated in a factory and then assembled on site. AdvSMRs will also use advanced digital instrumentation and control systems, and make greater use of automation. Some AdvSMR designs also propose to be operated in a multi-unit configuration with a single central control room as a way to be more cost-competitive with existing NPPs. These differences from conventional NPPs not only pose technical and operational challenges, but they will undoubtedly also have regulatory compliance implications, especially with respect to staffing requirements and safety standards.

  19. A probabilistic risk assessment of the LLNL Plutonium Facility's evaluation basis fire operational accident. Revision 1

    SciTech Connect

    Brumburgh, G. P.

    1995-02-27

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous programmatic activities involving plutonium, including device fabrication, development of improved and/or unique fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed in July 1994 to address operational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  20. Light storage in a tripod medium as a basis for logical operations

    NASA Astrophysics Data System (ADS)

    Słowik, K.; Raczyński, A.; Zaremba, J.; Zielińska-Kaniasty, S.

    2012-05-01

    A photon serving as the carrier of a polarization qubit is stored inside an atomic medium in the tripod configuration in the form of atomic excitations. Such stored information can be processed in the atomic memory and carried away by the released photon. An implementation is proposed of single-qubit gates, e.g., phase, NOT, √NOT, and Hadamard, as well as of a two-qubit CNOT gate, operating on polarized photons and based on light storage.

  1. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1994-09-01

    For waste containing small amounts of radioactivity, rad waste (RW), or mixed waste (MW) containing both radioactive and chemically hazardous components, incineration is a logical management candidate because of inherent safety, waste volume reduction, and low costs. Successful operation requires that the facility be properly designed and operated to protect workers and to limit releases of hazardous materials. The large decrease in waste volume achieved by incineration also results in a higher concentration of most of the radionuclides and nonradioactive heavy metals in the ash products. These concentrations impact subsequent treatment and disposal. The various constituents (chemical elements) are not equal in concentration in the various incinerator feed materials, nor are they equal in their contribution to health risks on subsequent handling or accidental release. Thus, for management of the wastes it is important to be able to predict how the nuclides partition between the primary combustion residue, which may be an ash or a fused slag; the fine particulates or fly ash trapped in the burner off-gas by several different techniques; and the airborne fraction that escapes to the atmosphere. The objective of this report is to provide an estimate of how different elements of concern may behave in the chemical environment of the incinerator. The study briefly examines published incinerator operation data, then considers the properties of the elements of concern, and employs thermodynamic calculations to help predict the fate of these RW and MW constituents. Many types and configurations of incinerators have been designed and tested.

  2. Diagnostics of PF-1000 Facility Operation and Plasma Concentration on the Basis of Spectral Measurements

    SciTech Connect

    Skladnik-Sadowska, E.; Malinowski, K.; Sadowski, M. J.; Scholz, M.; Tsarenko, A. V.

    2006-01-15

    The paper concerns the monitoring of high-current pulse discharges and the determination of the plasma concentration within dense magnetized plasma by means of optical spectroscopy methods. In experiments with the large PF-1000 facility operated at IPPLM in Warsaw, Poland, attention was paid to the determination of the operational mode and the electron concentration under different experimental conditions. To measure the visible radiation (VR), a MECHELLE® 900 spectrometer equipped with a CCD readout was used. The VR emission, observed at 65° to the z-axis, originated from a part of the electrode surfaces, the collapsing current-sheath layer, and the dense plasma pinch region (40-50 mm from the electrode ends). Considerable differences were found in the optical spectra recorded for so-called 'good shots' and for cases of some failures. Estimates of the electron concentration, performed with different spectroscopic techniques, showed that it ranged from 5.56×10¹⁸ cm⁻³ to 4.8×10¹⁹ cm⁻³, depending on experimental conditions. A correlation between the fusion-neutron yield and the plasma density was demonstrated.

  3. Waste Encapsulation and Storage Facility (WESF) Basis for Interim Operation (BIO)

    SciTech Connect

    COVEY, L.I.

    2000-11-28

    The Waste Encapsulation and Storage Facility (WESF) is located in the 200 East Area adjacent to B Plant on the Hanford Site north of Richland, Washington. The current WESF mission is to receive and store the cesium and strontium capsules that were manufactured at WESF in a safe manner and in compliance with all applicable rules and regulations. The scope of WESF operations is currently limited to receipt, inspection, decontamination, storage, and surveillance of capsules, in addition to facility maintenance activities. The capsules are expected to be stored at WESF until the year 2017, at which time they will have been transferred for ultimate disposition. The WESF facility was designed and constructed to process, encapsulate, and store the extracted long-lived radionuclides ⁹⁰Sr and ¹³⁷Cs from wastes generated during the chemical processing of defense fuel on the Hanford Site, thus ensuring isolation of hazardous radioisotopes from the environment. The construction of WESF started in 1971 and was completed in 1973. Some of the ¹³⁷Cs capsules were leased by private irradiators or transferred to other programs. All leased capsules have been returned to WESF. Capsules transferred to other programs will not be returned, except for the seven powder and pellet Type W overpacks already stored at WESF.

  4. Modeling of the Reactor Core Isolation Cooling Response to Beyond Design Basis Operations - Interim Report

    SciTech Connect

    Ross, Kyle; Cardoni, Jeffrey N.; Wilson, Chisom Shawn; Morrow, Charles; Osborn, Douglas; Gauntt, Randall O.

    2015-12-01

    Efforts are being pursued to develop and qualify a system-level model of a reactor core isolation (RCIC) steam-turbine-driven pump. The model is being developed with the intent of employing it to inform the design of experimental configurations for full-scale RCIC testing. The model is expected to be especially valuable in sizing equipment needed in the testing. An additional intent is to use the model in understanding more fully how RCIC apparently managed to operate far removed from its design envelope in the Fukushima Daiichi Unit 2 accident. RCIC modeling is proceeding along two avenues that are expected to complement each other well. The first avenue is the continued development of the system-level RCIC model that will serve in simulating a full reactor system or full experimental configuration of which a RCIC system is part. The model reasonably represents a RCIC system today, especially given design operating conditions, but lacks specifics that are likely important in representing the off-design conditions a RCIC system might experience in an emergency situation such as a loss of all electrical power. A known specific lacking in the system model, for example, is the efficiency at which a flashing slug of water (as opposed to a concentrated jet of steam) could propel the rotating drive wheel of a RCIC turbine. To address this specific, the second avenue is being pursued wherein computational fluid dynamics (CFD) analyses of such a jet are being carried out. The results of the CFD analyses will thus complement and inform the system modeling. The system modeling will, in turn, complement the CFD analysis by providing the system information needed to impose appropriate boundary conditions on the CFD simulations. The system model will be used to inform the selection of configurations and equipment best suitable of supporting planned RCIC experimental testing. Preliminary investigations with the RCIC model indicate that liquid water ingestion by the turbine

  5. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small; 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings), with only a 2% deductible, is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a unit (and consequently its insurance premium) can be assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP could in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.
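    As a back-of-envelope illustration of the premium arithmetic quoted in the abstract (the flat-rate model and the US$70,000 sum insured are assumptions made here for illustration; TCIP's actual tariff rules are more detailed):

    ```python
    # Back-of-envelope TCIP-style premium arithmetic.
    # The flat-rate model and the assumed sum insured are illustrative only,
    # not TCIP's actual underwriting rules.

    def annual_premium(sum_insured_usd: float, rate: float) -> float:
        """Annual premium computed as a flat percentage of the sum insured."""
        return sum_insured_usd * rate

    rate_rc = 0.0013        # 0.13% average rate quoted for reinforced concrete
    sum_insured = 70_000.0  # assumed sum insured (USD), below the 90,000 USD cap

    premium = annual_premium(sum_insured, rate_rc)
    # With these assumptions the premium comes out near the ~US$90 figure
    # quoted in the abstract.
    ```

    A microzoned scheme of the kind the abstract advocates would replace the single `rate_rc` with a rate looked up from location-specific hazard and building vulnerability.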

  6. Earthquake prediction

    SciTech Connect

    Ma, Z.; Fu, Z.; Zhang, Y.; Wang, C.; Zhang, G.; Liu, D.

    1989-01-01

    Mainland China is situated at the eastern edge of the Eurasian seismic system and is the largest intra-continental region of shallow strong earthquakes in the world. Based on nine earthquakes with magnitudes between 7.0 and 7.9, the book provides observational data and discusses successes and failures of earthquake prediction. Observations of various phenomena and of seismic activity occurring before and after these individual earthquakes led to the establishment of some general characteristics valid for earthquake prediction.

  7. Hidden Earthquakes.

    ERIC Educational Resources Information Center

    Stein, Ross S.; Yeats, Robert S.

    1989-01-01

    Points out that large earthquakes can take place not only on faults that cut the earth's surface but also on blind faults under folded terrain. Describes four examples of fold earthquakes. Discusses the fold earthquakes using several diagrams and pictures. (YP)

  8. Debriefing of American Red Cross personnel: pilot study on participants' evaluations and case examples from the 1994 Los Angeles earthquake relief operation.

    PubMed

    Armstrong, K; Zatzick, D; Metzler, T; Weiss, D S; Marmar, C R; Garma, S; Ronfeldt, H; Roepke, L

    1998-01-01

    The Multiple Stressor Debriefing (MSD) model was used to debrief 112 American Red Cross workers individually or in groups after their participation in the 1994 Los Angeles earthquake relief effort. Two composite case examples are presented that illustrate individual and group debriefings using the MSD model. A questionnaire which evaluated workers' experience of debriefing, was completed by 95 workers. Results indicated that workers evaluated the debriefings in which they participated positively. In addition, as participant to facilitator ratio increased, workers shared less of their feelings and reactions about the disaster relief operation. These findings, as well as more specific issues about debriefing, are discussed. PMID:9579015

  9. Hidden earthquakes

    SciTech Connect

    Stein, R.S.; Yeats, R.S.

    1989-06-01

    Seismologists generally look for earthquakes to happen along visible fault lines, e.g., the San Andreas fault. The authors maintain that another source of dangerous quakes has been overlooked: the release of stress along a fault that is hidden under a fold in the earth's crust. The paper describes the differences between an earthquake that occurs on a visible fault and one that occurs under an anticline, and warns that Los Angeles' greatest earthquake threat may come from a small quake originating under downtown Los Angeles rather than from a larger earthquake occurring 50 miles away at the San Andreas fault.

  10. Earthquakes: A Teacher's Package for K-6.

    ERIC Educational Resources Information Center

    National Science Teachers Association, Washington, DC.

    Like rain, an earthquake is a natural occurrence which may be mild or catastrophic. Although an earthquake may last only a few seconds, the processes that cause it have operated within the earth for millions of years. Until recently, the cause of earthquakes was a mystery and the subject of fanciful folklore to people all around the world. This…

  11. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  12. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  13. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  14. Deep Earthquakes.

    ERIC Educational Resources Information Center

    Frohlich, Cliff

    1989-01-01

    Summarizes research to find the nature of deep earthquakes occurring hundreds of kilometers down in the earth's mantle. Describes further research problems in this area. Presents several illustrations and four references. (YP)

  15. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  16. Proposed plan/Statement of basis for the Grace Road Site (631-22G) operable unit: Final action

    SciTech Connect

    Palmer, E.

    1997-08-19

    This Statement of Basis/Proposed Plan is being issued by the U. S. Department of Energy (DOE), which functions as the lead agency for the Savannah River Site (SRS) remedial activities, with concurrence by the U. S. Environmental Protection Agency (EPA), and the South Carolina Department of Health and Environmental Control (SCDHEC). The purpose of this Statement of Basis/Proposed Plan is to describe the preferred alternative for addressing the Grace Road site (GRS) located at the Savannah River Site (SRS), in Aiken, South Carolina and to provide an opportunity for public input into the remedial action selection process.

  17. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable year which... made upon the final determination of the rate of absorption applicable to the taxable year....

  18. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  19. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  20. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  1. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  2. Earthquakes for Kids

    MedlinePlus


  3. Two grave issues concerning the expected Tokai Earthquake

    NASA Astrophysics Data System (ADS)

    Mogi, Kiyoo

    2004-08-01

    The possibility of a great shallow earthquake (M 8) in the Tokai region, central Honshu, in the near future was pointed out by Mogi in 1969 and by the Coordinating Committee for Earthquake Prediction (CCEP), Japan (1970). In 1978, the government enacted the Large-Scale Earthquake Countermeasures Law and began to set up intensified observations in this region for short-term prediction of the expected Tokai earthquake. In this paper, two serious issues are pointed out, which may contribute to catastrophic effects in connection with the Tokai earthquake: 1. The danger of black-and-white predictions: According to the scenario based on the Large-Scale Earthquake Countermeasures Law, if abnormal crustal changes are observed, the Earthquake Assessment Committee (EAC) will determine whether or not there is an imminent danger. The findings are reported to the Prime Minister who decides whether to issue an official warning statement. Administrative policy clearly stipulates the measures to be taken in response to such a warning, and because the law presupposes the ability to predict a large earthquake accurately, there are drastic measures appropriate to the situation. The Tokai region is a densely populated region with high social and economic activity, and it is traversed by several vital transportation arteries. When a warning statement is issued, all transportation is to be halted. The Tokyo capital region would be cut off from the Nagoya and Osaka regions, and there would be a great impact on all of Japan. I (the former chairman of EAC) maintained that in view of the variety and complexity of precursory phenomena, it was inadvisable to attempt a black-and-white judgment as the basis for a "warning statement". I urged that the government adopt a "soft warning" system that acknowledges the uncertainty factor and that countermeasures be designed with that uncertainty in mind. 2. The danger of nuclear power plants in the focal region: Although the possibility of the

  4. A computational method for solving stochastic Itô–Volterra integral equations based on stochastic operational matrix for generalized hat basis functions

    SciTech Connect

    Heydari, M.H.; Hooshmandasl, M.R.; Maalek Ghaini, F.M.; Cattani, C.

    2014-08-01

    In this paper, a new computational method based on the generalized hat basis functions is proposed for solving stochastic Itô–Volterra integral equations. In this way, a new stochastic operational matrix for generalized hat functions on the finite interval [0,T] is obtained. By using these basis functions and their stochastic operational matrix, such problems can be transformed into linear lower triangular systems of algebraic equations, which can be directly solved by forward substitution. The rate of convergence of the proposed method is also considered and shown to be O(1/n²). Further, in order to show the accuracy and reliability of the proposed method, the new approach is compared with the block pulse functions method on some examples. The obtained results reveal that the proposed method is more accurate and efficient than the block pulse functions method.
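    The abstract's final solution step, forward substitution on a linear lower triangular system, can be sketched as follows (a generic NumPy illustration, not the authors' code; the hat-function operational matrix that produces the system is not reproduced here):

    ```python
    import numpy as np

    def forward_substitution(L: np.ndarray, b: np.ndarray) -> np.ndarray:
        """Solve L x = b for a lower-triangular matrix L by forward substitution."""
        n = len(b)
        x = np.zeros(n)
        for i in range(n):
            # Subtract the already-known contributions, then divide by the diagonal.
            x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
        return x

    # Example: a small lower-triangular system of the kind the method produces.
    L = np.array([[2.0, 0.0, 0.0],
                  [1.0, 3.0, 0.0],
                  [4.0, 1.0, 5.0]])
    b = np.array([2.0, 5.0, 14.0])
    x = forward_substitution(L, b)
    # x satisfies L @ x = b, recovered in a single O(n^2) pass with no factorization.
    ```

    The direct pass with no matrix factorization is what makes the triangular structure attractive computationally.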

  5. A Schauder and Riesz basis criterion for non-self-adjoint Schrödinger operators with periodic and antiperiodic boundary conditions

    NASA Astrophysics Data System (ADS)

    Gesztesy, Fritz; Tkachenko, Vadim

    Under the assumption that V ∈ L²([0,π]; dx), we derive necessary and sufficient conditions in terms of spectral data for (non-self-adjoint) Schrödinger operators -d²/dx² + V in L²([0,π]; dx) with periodic and antiperiodic boundary conditions to possess a Riesz basis of root vectors (i.e., eigenvectors and generalized eigenvectors spanning the range of the Riesz projection associated with the corresponding periodic and antiperiodic eigenvalues). We also discuss the case of a Schauder basis for periodic and antiperiodic Schrödinger operators -d²/dx² + V in Lᵖ([0,π]; dx), p ∈ (1,∞).

  6. The Parkfield, California, earthquake prediction experiment.

    PubMed

    Bakun, W H; Lindh, A G

    1985-08-16

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment. PMID:17739363

  7. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways...

  8. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways...

  9. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated...

  10. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated...

  11. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways...

  12. Postseismic Transient after the 2002 Denali Fault Earthquake from VLBI Measurements at Fairbanks

    NASA Technical Reports Server (NTRS)

    MacMillan, Daniel; Cohen, Steven

    2004-01-01

The VLBI antenna (GILCREEK) at Fairbanks, Alaska, routinely observes twice a week in operational networks and on additional days with other networks on a more irregular basis. The Fairbanks antenna is located about 150 km north of the Denali fault and of the earthquake epicenter. We examine the transient behavior of the estimated VLBI position during the year following the earthquake to determine how the rate of postseismic deformation has changed. This is compared with what is seen in the GPS site position series.

  13. United States earthquakes, 1984

    SciTech Connect

    Stover, C.W.

    1988-01-01

The report contains information for earthquakes in the 50 states and Puerto Rico and the area near their shorelines. The data consist of earthquake locations (date, time, geographic coordinates, depth, and magnitudes), intensities, macroseismic information, and isoseismal and seismicity maps. Also included are sections detailing the activity of seismic networks operated by universities and other government agencies and a list of results from strong-motion seismograph records.

  14. The Effects of Degraded Digital Instrumentation and Control Systems on Human-system Interfaces and Operator Performance: HFE Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.; W. Gunther, G. Martinez-Guridi

    2010-02-26

    New and advanced reactors will use integrated digital instrumentation and control (I&C) systems to support operators in their monitoring and control functions. Even though digital systems are typically highly reliable, their potential for degradation or failure could significantly affect operator performance and, consequently, impact plant safety. The U.S. Nuclear Regulatory Commission (NRC) supported this research project to investigate the effects of degraded I&C systems on human performance and plant operations. The objective was to develop human factors engineering (HFE) review guidance addressing the detection and management of degraded digital I&C conditions by plant operators. We reviewed pertinent standards and guidelines, empirical studies, and plant operating experience. In addition, we conducted an evaluation of the potential effects of selected failure modes of the digital feedwater system on human-system interfaces (HSIs) and operator performance. The results indicated that I&C degradations are prevalent in plants employing digital systems and the overall effects on plant behavior can be significant, such as causing a reactor trip or causing equipment to operate unexpectedly. I&C degradations can impact the HSIs used by operators to monitor and control the plant. For example, sensor degradations can make displays difficult to interpret and can sometimes mislead operators by making it appear that a process disturbance has occurred. We used the information obtained as the technical basis upon which to develop HFE review guidance. The guidance addresses the treatment of degraded I&C conditions as part of the design process and the HSI features and functions that support operators to monitor I&C performance and manage I&C degradations when they occur. In addition, we identified topics for future research.

  15. America's faulty earthquake plans

    SciTech Connect

    Rosen, J

    1989-10-01

In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States as well as the level of preparedness of each region of the U.S. for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.

  16. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide. PMID:22410538

  17. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

Earthquakes occur at the following three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  18. Connecting slow earthquakes to huge earthquakes

    NASA Astrophysics Data System (ADS)

    Obara, Kazushige; Kato, Aitaro

    2016-07-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  19. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. PMID:27418504

  20. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

The 2011 Great East Japan Earthquake forced the policy of countermeasures against earthquake disasters, including earthquake hazard evaluations, to be changed in Japan. Before March 11, Japanese earthquake hazard evaluation was based on the history of earthquakes that repeatedly occur and the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using the renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of damages by a hypothetical Mw9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with the current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using the present techniques, based on the diversity of earthquake phenomena. These reports created sensations throughout the country and local governments are struggling to prepare countermeasures. These reports commented on large uncertainty in their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including authors, are involved in

  1. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  2. Development of information-and-control systems as a basis for modernizing the automated process control systems of operating power equipment

    NASA Astrophysics Data System (ADS)

    Shapiro, V. I.; Borisova, E. V.; Chausov, Yu. N.

    2014-03-01

The main drawbacks inherent in the hardware of outdated control systems of power stations are discussed. It is shown that economically efficient and reliable operation of the process equipment will be impossible if a certain part of these control systems is used further. It is pointed out that full retrofitting of outdated control systems on operating equipment in one go, with replacement of all technical facilities and cable connections by a modern computerized automation system, involves certain difficulties if such work is carried out under the conditions of limited financial resources or a limited period of time allotted for the work. A version of control system modernization is suggested that involves replacement of the most severely worn and outdated equipment (indicating and recording instruments, and local controllers) while retaining the existing cable routes and layout of board facilities. The modernization implies development of information-and-control systems constructed on the basis of a unified computerized automation system. Software and hardware products that have positively proven themselves in thermal power engineering are proposed for developing such an automation system. It is demonstrated that the proposed system has a considerable potential for its functional development and can become a basis for constructing a fully functional automated process control system.

  3. Stronger direction needed for the National Earthquake Program

    NASA Astrophysics Data System (ADS)

    1983-07-01

    The National Earthquake Hazards Reduction Program was established to mitigate the impact on communities. Emphasis is placed on: (1) the Federal Emergency Management Agency's (FEMA's) efforts to carry out its lead agency responsibilities for the program; (2) assistance provided to State and local governments in mitigating earthquake hazards; and (3) progress toward developing an operational earthquake prediction system.

  4. Earthquake Archaeology: a logical approach?

    NASA Astrophysics Data System (ADS)

    Stewart, I. S.; Buck, V. A.

    2001-12-01

Ancient earthquakes can leave their mark in the mythical and literary accounts of ancient peoples, the stratigraphy of their site histories, and the structural integrity of their constructions. Within this broad cross-disciplinary tramping ground, earthquake geologists have tended to focus on those aspects of the cultural record that are most familiar to them: the physical effects of seismic deformation on ancient constructions. One of the core difficulties with this 'earthquake archaeology' approach is that recent attempts to isolate structural criteria that are diagnostic or strongly suggestive of a seismic origin are undermined by the recognition that signs of ancient seismicity are generally indistinguishable from non-seismic mechanisms (poor construction, adverse geotechnical conditions). We illustrate the difficulties and inconsistencies in current proposed 'earthquake diagnostic' schemes by reference to two case studies of archaeoseismic damage in central Greece. The first concerns fallen columns at various Classical temple localities in mainland Greece (Nemea, Sounio, Olympia, Bassai) which, on the basis of observed structural criteria, are earthquake-induced but which are alternatively explained by archaeologists as the action of human disturbance. The second re-examines the almost type example of the Kyparissi site in the Atalanti region as a Classical stoa offset across a seismic surface fault, arguing instead for its deformation by ground instability. Finally, in highlighting the inherent ambiguity of archaeoseismic data, we consider the value of a logic-tree approach for qualifying and quantifying our uncertainties for seismic-hazard analysis.

  5. Response to “Comment on ‘Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set’” [J. Chem. Phys. 140, 177103 (2014)

    SciTech Connect

    Reuter, Matthew G.; Harrison, Robert J.

    2014-05-07

    The thesis of Brandbyge's comment [J. Chem. Phys. 140, 177103 (2014)] is that our operator decoupling condition is immaterial to transport theories, and it appeals to discussions of nonorthogonal basis sets in transport calculations in its arguments. We maintain that the operator condition is to be preferred over the usual matrix conditions and subsequently detail problems in the existing approaches. From this operator perspective, we conclude that nonorthogonal projectors cannot be used and that the projectors must be selected to satisfy the operator decoupling condition. Because these conclusions pertain to operators, the choice of basis set is not germane.

  6. On subduction zone earthquakes and the Pacific Northwest seismicity

    SciTech Connect

    Chung, Dae H.

    1991-12-01

A short review of subduction zone earthquakes and the seismicity of the Pacific Northwest region of the United States is provided as a basis for assessing issues related to earthquake hazard evaluations for the region. This review of seismotectonics regarding historical subduction zone earthquakes and more recent seismological studies pertaining to rupture processes of subduction zone earthquakes, with specific references to the Pacific Northwest, is made in this brief study. Subduction zone earthquakes tend to rupture updip and laterally from the hypocenter. Thus, the rupture surface tends to become more elongated as one considers larger earthquakes (there is limited updip distance that is strongly coupled, whereas rupture length can be quite large). The great Aleutian-Alaska earthquakes of 1957, 1964, and 1965 had rupture lengths of greater than 650 km. The largest earthquake observed instrumentally, the M{sub W} 9.5, 1960 Chile Earthquake, had a rupture length over 1000 km. However, earthquakes of this magnitude are very unlikely on Cascadia. The degree of surface shaking has a very strong dependency on the depth and style of rupture. The rupture surface during a great earthquake shows heterogeneous stress drop, displacement, energy release, etc. The high strength zones are traditionally termed asperities, and these asperities control when and how large an earthquake is generated. Mapping of these asperities in specific subduction zones is very difficult before an earthquake. They show up more easily in inversions of dynamic source studies of earthquake ruptures, after an earthquake. Because seismic moment is based on the total radiated energy from an earthquake, the moment-based magnitude M{sub W} is superior to all other magnitude estimates, such as M{sub L}, m{sub b}, M{sub bLg}, M{sub S}, etc. Probably, just to have a common language, non-moment magnitudes should be converted to M{sub W} in any discussions of subduction zone earthquakes.
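The abstract's recommendation to express magnitudes as M{sub W} can be illustrated with the standard Hanks-Kanamori relation between moment magnitude and seismic moment. This sketch is not from the reviewed report itself; it simply applies the widely used formula Mw = (2/3)(log10 M0 - 9.1) for M0 in newton-meters:

```python
import math

def moment_magnitude(m0_newton_meters: float) -> float:
    """Moment magnitude Mw from seismic moment M0 (N·m),
    using the Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# The 1960 Chile earthquake, with M0 of roughly 2e23 N·m,
# recovers the Mw 9.5 quoted in the abstract.
print(round(moment_magnitude(2.0e23), 1))
```

Because the relation is logarithmic, a tenfold increase in seismic moment raises Mw by only two-thirds of a unit, which is why moment-based magnitudes do not saturate the way M{sub L} or m{sub b} do for great earthquakes.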

  7. EARTHQUAKE HAZARDS IN THE OFFSHORE ENVIRONMENT.

    USGS Publications Warehouse

    Page, Robert A.; Basham, Peter W.

    1985-01-01

    This report discusses earthquake effects and potential hazards in the marine environment, describes and illustrates methods for the evaluation of earthquake hazards, and briefly reviews strategies for mitigating hazards. The report is broadly directed toward engineers, scientists, and others engaged in developing offshore resources. The continental shelves have become a major frontier in the search for new petroleum resources. Much of the current exploration is in areas of moderate to high earthquake activity. If the resources in these areas are to be developed economically and safely, potential earthquake hazards must be identified and mitigated both in planning and regulating activities and in designing, constructing, and operating facilities. Geologic earthquake effects that can be hazardous to marine facilities and operations include surface faulting, tectonic uplift and subsidence, seismic shaking, sea-floor failures, turbidity currents, and tsunamis.

  8. Chern-Simons gravity with (curvature){sup 2} and (torsion){sup 2} terms and a basis of degree-of-freedom projection operators

    SciTech Connect

    Helayeel-Neto, J. A.; Hernaski, C. A.; Pereira-Dias, B.; Vargas-Paredes, A. A.; Vasquez-Otoya, V. J.

    2010-09-15

The effects of (curvature){sup 2}- and (torsion){sup 2}-terms in the Einstein-Hilbert-Chern-Simons Lagrangian are investigated. The purposes are two-fold: (i) to show the efficacy of an orthogonal basis of degree-of-freedom projection operators recently proposed and to ascertain its adequacy for obtaining propagators of general parity-breaking gravity models in three dimensions; (ii) to analyze the role of the topological Chern-Simons term for the unitarity and the particle spectrum of the model with squared-curvature terms in connection with dynamical torsion. Our conclusion is that the Chern-Simons term does not influence the unitarity conditions imposed on the parameters of the Lagrangian but significantly modifies the particle spectrum.

  9. Automated Microwave Complex on the Basis of a Continuous-Wave Gyrotron with an Operating Frequency of 263 GHz and an Output Power of 1 kW

    NASA Astrophysics Data System (ADS)

Glyavin, M. Yu.; Morozkin, M. V.; Tsvetkov, A. I.; Lubyako, L. V.; Golubiatnikov, G. Yu.; Kuftin, A. N.; Zapevalov, V. E.; Kholoptsev, V. V.; Eremeev, A. G.; Sedov, A. S.; Malygin, V. I.; Chirkov, A. V.; Fokin, A. P.; Sokolov, E. V.; Denisov, G. G.

    2016-02-01

We study experimentally the automated microwave complex for microwave spectroscopy and diagnostics of various media, which was developed at the Institute of Applied Physics of the Russian Academy of Sciences in cooperation with GYCOM Ltd. on the basis of a gyrotron with a frequency of 263 GHz and operated at the first gyrofrequency harmonic. In the process of the experiments, a controllable output power of 0.1-1 kW was achieved with an efficiency of up to 17% in the continuous-wave generation regime. The measured radiation spectrum, with a relative width of about 10^-6, and the frequency values measured at various parameters of the device are presented. The results of measuring the parameters of the wave beam, which was formed by a built-in quasioptical converter, as well as the data obtained by measuring the heat loss in the cavity and the vacuum output window, are analyzed.

  10. Evaluation of near-field earthquake effects

    SciTech Connect

    Shrivastava, H.P.

    1994-11-01

    Structures and equipment, which are qualified for the design basis earthquake (DBE) and have anchorage designed for the DBE loading, do not require an evaluation of the near-field earthquake (NFE) effects. However, safety class 1 acceleration sensitive equipment such as electrical relays must be evaluated for both NFE and DBE since they are known to malfunction when excited by high frequency seismic motions.

  11. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions. PMID:2347628

  12. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information for the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever someone's own location is, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer the need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? What are the earthquakes that really matter to laypeople? One clue may be derived from some newspaper reports that show that a while after damaging earthquakes many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most for the public (and authorities). They are the ones of societal importance even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected

  13. Stress Drops for Potentially Induced Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

    Stress drop, the difference between shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motions. Hough [2014 and 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes, and interpreted them to be a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects. Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). Both the effects of path and linear site response should be cancelled out through the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas. The earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to tectonic earthquakes at Parkfield, California (the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
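The spectral-ratio workflow described above rests on two standard ingredients that can be sketched independently of the authors' implementation: the ratio of two Brune omega-square spectra (large event over eGf), and the Brune (1970) relation converting a fitted corner frequency and seismic moment into a stress drop. The function names, the example moment (about Mw 4), and the shear-wave speed of 3500 m/s are illustrative assumptions, not values from the study:

```python
import math

def brune_ratio(f: float, omega_ratio: float,
                fc_large: float, fc_small: float) -> float:
    """Ratio of two Brune omega-square spectra (large / small event).
    omega_ratio is the ratio of the low-frequency plateaus."""
    return omega_ratio * (1.0 + (f / fc_small) ** 2) / (1.0 + (f / fc_large) ** 2)

def brune_stress_drop(m0: float, fc: float, beta: float = 3500.0) -> float:
    """Brune (1970) stress drop in Pa from seismic moment m0 (N·m),
    corner frequency fc (Hz), and shear-wave speed beta (m/s)."""
    radius = 2.34 * beta / (2.0 * math.pi * fc)  # source radius (m)
    return 7.0 * m0 / (16.0 * radius ** 3)

# Illustrative values: M0 = 1e15 N·m (roughly Mw 4), fc = 5 Hz
# yields a stress drop on the order of tens of MPa.
print(brune_stress_drop(1.0e15, fc=5.0))
```

In practice the corner frequencies are obtained by fitting `brune_ratio` to the observed spectral ratio; because path and site terms divide out of the ratio, only the source terms remain to be fit.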

  14. A Simplified Approach to the Basis Functions of Symmetry Operations and Terms of Metal Complexes in an Octahedral Field with d[superscript 1] to d[superscript 9] Configurations

    ERIC Educational Resources Information Center

    Lee, Liangshiu

    2010-01-01

    The basis sets for symmetry operations of d[superscript 1] to d[superscript 9] complexes in an octahedral field and the resulting terms are derived for the ground states and spin-allowed excited states. The basis sets are of fundamental importance in group theory. This work addresses such a fundamental issue, and the results are pedagogically…

  15. Material contrast does not predict earthquake rupture propagation direction

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    2005-01-01

    Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

  16. Guidelines for earthquake ground motion definition for the eastern United States

    SciTech Connect

    Gwaltney, R.C.; Aramayo, G.A.; Williams, R.T.

    1985-01-01

    Guidelines for the determination of earthquake ground-motion definition for the eastern United States are established in this paper. Both far-field and near-field guidelines are given. The guidelines were based on an extensive review of the current procedures for specifying ground motion in the United States. Both empirical and theoretical procedures were used in establishing the guidelines because of the low seismicity in the eastern United States. Only a few large-to-great (M > 7.5) earthquakes have occurred in this region, no evidence of tectonic surface rupture related to historic or Holocene earthquakes has been found, and no currently active plate boundaries of any kind are known in this region. Very little instrumental data have been gathered in the East. Theoretical procedures are proposed so that a reasonable level of seismic ground motion can be assumed even in regions with almost no data. The guidelines are to be used to develop the Safe Shutdown Earthquake (SSE). A new procedure for establishing the Operating Basis Earthquake (OBE) is proposed, in particular for the eastern United States. The OBE would be developed using a probabilistic assessment of the geological conditions and the recurrence of seismic events at a site. These guidelines should be useful in developing seismic design requirements for future reactors. 17 refs., 2 figs., 1 tab.
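    The probabilistic OBE assessment described above rests on event recurrence: under a Poisson occurrence model, an annual exceedance rate for a ground-motion level translates directly into a probability of exceedance over the plant's life. A minimal sketch with illustrative targets only (the 50%-in-50-years and 10%-in-50-years levels are common conventions, not values from this paper):

```python
import math

def exceedance_probability(annual_rate, years):
    """Poisson probability of at least one exceedance of a ground-motion level in `years`."""
    return 1.0 - math.exp(-annual_rate * years)

def return_period(prob, years):
    """Return period (yr) of the motion with probability `prob` of exceedance in `years`."""
    return -years / math.log(1.0 - prob)

# Illustrative targets: an OBE-like level with a 50% chance of exceedance during a
# 50-year plant life, and the familiar 10%-in-50-years design level.
obe_return_period = return_period(0.50, 50.0)          # ~72 years
p_design = exceedance_probability(1.0 / 475.0, 50.0)   # ~0.10
```

    The site-specific part of the assessment, estimating the annual rate itself from geology and seismicity, is the hard step and is not sketched here.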

  17. Geodetic measurement of deformation in the Loma Prieta, California earthquake with Very Long Baseline Interferometry (VLBI)

    SciTech Connect

    Clark, T.A.; Ma, C.; Sauber, J.M.; Ryan, J.W.; Gordon, D.; Caprette, D.S.; Shaffer, D.B.; Vandenberg, N.R.

    1990-07-01

    Following the Loma Prieta earthquake, two mobile Very Long Baseline Interferometry (VLBI) systems operated by the NASA Crustal Dynamics Project and the NOAA National Geodetic Survey were deployed at three previously established VLBI sites in the earthquake area: Fort Ord (near Monterey), the Presidio (in San Francisco), and Point Reyes. From repeated VLBI occupations of these sites since 1983, the pre-earthquake rates of deformation have been determined with respect to a North American reference frame with 1σ formal standard errors of ~1 mm/yr. The VLBI measurements immediately following the earthquake showed that the Fort Ord site was displaced 49 ± 4 mm at an azimuth of 11 ± 4° and that the Presidio site was displaced 12 ± 5 mm at an azimuth of 148 ± 13°. No anomalous change was detected at Point Reyes, with a 1σ uncertainty of 4 mm. The estimated displacements at Fort Ord and the Presidio are consistent with the static displacements predicted on the basis of a coseismic slip model in which slip on the southern segment is shallower than slip on the more northern segment of the fault rupture. The authors also give the Cartesian positions at epoch 1990.0 of a set of VLBI fiducial stations and the three mobile sites in the vicinity of the earthquake.

  18. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.

  19. EARTHQUAKE CAUSED RELEASES FROM A NUCLEAR FUEL CYCLE FACILITY

    SciTech Connect

    Charles W. Solbrig; Chad Pope; Jason Andrus

    2014-08-01

    The fuel cycle facility (FCF) at the Idaho National Laboratory is a nuclear facility which must be licensed in order to operate. A safety analysis is required for a license. This paper describes the analysis of the Design Basis Accident for this facility. The analysis involves a model of the transient behavior of the FCF inert-atmosphere hot cell following an earthquake-initiated breach of pipes passing through the cell boundary. The hot cell is used to process spent metallic nuclear fuel. Such breaches allow the introduction of air and the subsequent burning of pyrophoric metals. The model predicts the pressure, temperature, volumetric releases, cell heat transfer, metal fuel combustion, heat generation rates, radiological releases, and other quantities. The results show that releases from the cell are minimal and satisfactory for safety. This analysis method should be useful for other facilities that have potential for damage from an earthquake and could eliminate the need to backfit facilities with earthquake-proof boundaries or lessen the cost of new facilities.

  20. From Earthquake Prediction Research to Time-Variable Seismic Hazard Assessment Applications

    NASA Astrophysics Data System (ADS)

    Bormann, Peter

    2011-01-01

    The first part of the paper defines the terms and classifications common in earthquake prediction research and applications. This is followed by short reviews of major earthquake prediction programs initiated since World War II in several countries, for example the former USSR, China, Japan, the United States, and several European countries. It outlines the underlying expectations, concepts, and hypotheses, introduces the technologies and methodologies applied and some of the results obtained, which include both partial successes and failures. Emphasis is laid on discussing the scientific reasons why earthquake prediction research is so difficult and demanding and why the prospects are still so vague, at least as far as short-term and imminent predictions are concerned. However, classical probabilistic seismic hazard assessments, widely applied during the last few decades, have also clearly revealed their limitations. In their simple form, they are time-independent earthquake rupture forecasts based on the assumption of stable long-term recurrence of earthquakes in the seismotectonic areas under consideration. Therefore, during the last decade, earthquake prediction research and pilot applications have focused mainly on the development and rigorous testing of long and medium-term rupture forecast models in which event probabilities are conditioned by the occurrence of previous earthquakes, and on their integration into neo-deterministic approaches for improved time-variable seismic hazard assessment. The latter uses stress-renewal models that are calibrated for variations in the earthquake cycle as assessed on the basis of historical, paleoseismic, and other data, often complemented by multi-scale seismicity models, the use of pattern-recognition algorithms, and site-dependent strong-motion scenario modeling. International partnerships and a global infrastructure for comparative testing have recently been developed, for example the Collaboratory for the Study of

  1. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  2. Estimating earthquake potential

    USGS Publications Warehouse

    Page, R.A.

    1980-01-01

    The hazards to life and property from earthquakes can be minimized in three ways. First, structures can be designed and built to resist the effects of earthquakes. Second, the location of structures and human activities can be chosen to avoid or to limit the use of areas known to be subject to serious earthquake hazards. Third, preparations for an earthquake in response to a prediction or warning can reduce the loss of life and damage to property as well as promote a rapid recovery from the disaster. The success of the first two strategies, earthquake engineering and land use planning, depends on being able to reliably estimate the earthquake potential. The key considerations in defining the potential of a region are the location, size, and character of future earthquakes and frequency of their occurrence. Both historic seismicity of the region and the geologic record are considered in evaluating earthquake potential. 

  3. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  4. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  5. Post-Earthquake Debris Management - An Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as of the waste generated during reconstruction, can pose significant challenges to national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  6. Generally Contracted Valence-Core/Valence Basis Sets for Use with Relativistic Effective Core Potentials and Spin-Orbit Coupling Operators

    SciTech Connect

    Ermler, Walter V.; Tilson, Jeffrey L.

    2012-12-15

    A procedure for structuring generally contracted valence-core/valence basis sets of Gaussian-type functions for use with relativistic effective core potentials (gcv-c/v-RECP basis sets) is presented. Large valence basis sets are enhanced using a compact basis set derived for outer core electrons in the presence of small-core RECPs. When core electrons are represented by relativistic effective core potentials (RECPs), and appropriate levels of theory, these basis sets are shown to provide accurate representations of atomic and molecular valence and outer-core electrons. Core/valence polarization and correlation effects can be calculated using these basis sets through standard methods for treating electron correlation. Calculations of energies and spectra for Ru, Os, Ir, In and Cs are reported. Spectroscopic constants for RuO2+, OsO2+, Cs2 and InH are calculated and compared with experiment.

  7. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  8. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment carried out by worldwide systems in emergency mode following strong earthquakes. Timely and correct action just after an event can yield significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge of the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge of the source of shaking; and so on. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. How the population inside the buildings at the time of shaking is affected by the physical damage to the buildings is, by far, not precisely known.
The paper analyzes the influence of uncertainties in strong-event parameter determination by alert seismological surveys, and of the simulation models used at all stages, from estimating shaking intensity
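    The compounding of uncertainties the paper describes can be illustrated by Monte Carlo propagation: sampling an uncertain intensity estimate and an uncertain exposed population through a vulnerability curve shows how modest input uncertainty fans out into a wide spread of estimated casualties. The curve and all numbers below are hypothetical, for illustration only:

```python
import random

random.seed(0)

def casualties(intensity, exposed_pop, vuln_midpoint=8.0, vuln_slope=1.2):
    """Toy logistic vulnerability curve: casualty fraction rises with shaking intensity."""
    frac = 1.0 / (1.0 + pow(10.0, -(intensity - vuln_midpoint) / vuln_slope))
    return frac * exposed_pop

# Uncertain inputs: intensity known to about +/- 0.5 units, population to about +/- 20%
N = 20000
samples = []
for _ in range(N):
    inten = random.gauss(7.5, 0.5)
    pop = max(random.gauss(100_000, 20_000), 0.0)
    samples.append(casualties(inten, pop))

samples.sort()
median = samples[N // 2]
p05, p95 = samples[int(0.05 * N)], samples[int(0.95 * N)]
spread = p95 / max(p05, 1.0)   # factor between the 5th and 95th percentile estimates
```

    Even with these modest input uncertainties, the 90% interval of the loss estimate spans several-fold, which is why emergency-mode estimates are best reported as ranges.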

  9. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  10. School Safety and Earthquakes.

    ERIC Educational Resources Information Center

    Dwelley, Laura; Tucker, Brian; Fernandez, Jeanette

    1997-01-01

    A recent assessment of earthquake risk to Quito, Ecuador, concluded that many of its public schools are vulnerable to collapse during major earthquakes. A subsequent examination of 60 buildings identified 15 high-risk buildings. These schools were retrofitted to meet standards that would prevent injury even during Quito's largest earthquakes. US…

  11. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  12. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt would be made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter.
For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
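    The bin-wise likelihood comparison described above can be made concrete: each forecast assigns a Poisson rate to every space-magnitude bin, the observed counts score each forecast by its log likelihood, and the log-likelihood ratio compares the pair. A minimal sketch with made-up rates and counts (the significance level alpha would come from simulating catalogs under each forecast, which is omitted here):

```python
import math

def poisson_log_likelihood(rates, counts):
    """Sum of log Poisson probabilities of the observed bin counts given forecast rates."""
    total = 0.0
    for lam, n in zip(rates, counts):
        total += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return total

# Two hypothetical forecasts over the same four space-magnitude bins, plus observations
forecast_a = [0.5, 1.0, 0.2, 0.8]   # expected counts per bin over the test period
forecast_b = [0.9, 0.4, 0.5, 0.7]
observed   = [1,   1,   0,   1]

ll_a = poisson_log_likelihood(forecast_a, observed)
ll_b = poisson_log_likelihood(forecast_b, observed)
# A positive log-likelihood ratio favors forecast A for these observations
llr = ll_a - ll_b
```

    Here forecast A scores higher; whether the margin is significant is exactly what the simulated alpha and beta error rates in the abstract are designed to assess.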

  13. A Century of Induced Earthquakes in Oklahoma

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Page, M. T.

    2015-12-01

    Seismicity rates have increased sharply since 2009 in the central and eastern United States, with especially high rates of activity in the state of Oklahoma. A growing body of evidence indicates that many of these events are induced, primarily by injection of wastewater in deep disposal wells. The upsurge in activity has raised the questions: What is the background rate of tectonic earthquakes in Oklahoma, and how much has the rate varied throughout historical and early instrumental times? We first review the historical catalog, including an assessment of the completeness level of felt earthquakes, and show that seismicity rates since 2009 surpass previously observed rates throughout the 20th century. Furthermore, several lines of evidence suggest that most of the significant (Mw > 3.5) earthquakes in Oklahoma during the 20th century were likely induced by wastewater injection and/or enhanced oil recovery operations. We show that there is a statistically significant temporal and spatial correspondence between earthquakes and disposal wells permitted during the 1950s. The intensity distributions of the 1952 Mw 5.7 El Reno earthquake and the 1956 Mw 3.9 Tulsa County earthquake are similar to those of recent induced earthquakes, with significantly lower shaking than predicted by a regional intensity-prediction equation. The rate of tectonic earthquakes is thus inferred to be significantly lower than previously estimated throughout most of the state, but it is difficult to estimate given scant incontrovertible evidence for significant tectonic earthquakes during the 20th century. We do find evidence for a low level of tectonic seismicity in southeastern Oklahoma associated with the Ouachita structural belt, and conclude that the 22 October 1882 Choctaw Nation earthquake, for which we estimate Mw 4.8, occurred in this zone.

  14. Safety Basis Report

    SciTech Connect

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  15. Virtual California: studying earthquakes through simulation

    NASA Astrophysics Data System (ADS)

    Sachs, M. K.; Heien, E. M.; Turcotte, D. L.; Yikilmaz, M. B.; Rundle, J. B.; Kellogg, L. H.

    2012-12-01

    Virtual California is a computer simulator that models earthquake fault systems. The design of Virtual California allows for fast execution so many thousands of events can be generated over very long simulated time periods. The result is a rich dataset, including simulated earthquake catalogs, which can be used to study the statistical properties of the seismicity on the modeled fault systems. We describe the details of Virtual California's operation and discuss recent results from Virtual California simulations.

  16. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  17. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  18. The Loma Prieta, California, Earthquake: An Anticipated Event.

    PubMed

    1990-01-19

    The first major earthquake on the San Andreas fault since 1906 fulfilled a long-term forecast for its rupture in the southern Santa Cruz Mountains. Severe damage occurred at distances of up to 100 kilometers from the epicenter in areas underlain by ground known to be hazardous in strong earthquakes. Stronger earthquakes will someday strike closer to urban centers in the United States, most of which also contain hazardous ground. The Loma Prieta earthquake demonstrated that meaningful predictions can be made of potential damage patterns and that, at least in well-studied areas, long-term forecasts can be made of future earthquake locations and magnitudes. Such forecasts can serve as a basis for action to reduce the threat major earthquakes pose to the United States. PMID:17735847

  19. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  20. Application of Seismic Array Processing to Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Meng, L.; Allen, R. M.; Ampuero, J. P.

    2013-12-01

    Earthquake early warning (EEW) systems that can issue warnings prior to the arrival of strong ground shaking during an earthquake are essential in mitigating seismic hazard. Many of the currently operating EEW systems work on the basis of empirical magnitude-amplitude/frequency scaling relations for a point source. This approach is of limited effectiveness for large events, such as the 2011 Tohoku-Oki earthquake, for which ignoring finite-source effects may result in underestimation of the magnitude. Here, we explore the concept of characterizing rupture dimensions in real time for EEW using clusters of dense low-cost accelerometers located near active faults. Back-tracing the waveforms recorded by such arrays allows the estimation of earthquake rupture size, duration, and directivity in real time, which enables EEW for M > 7 earthquakes. The concept is demonstrated with the 2004 Parkfield earthquake, one of the few big events (M > 6) that have been recorded by a local small-scale seismic array (the UPSAR array; Fletcher et al., 2006). We first test the approach against synthetic rupture scenarios constructed by superposition of empirical Green's functions. We find it important to correct for the bias in back azimuth induced by dipping structures beneath the array. We then applied the proposed methodology to the mainshock in a simulated real-time environment. After calibrating the dipping-layer effect with data from smaller events, we obtained an estimated rupture length of 9 km, consistent with the distance between the two main high-frequency subevents identified by back-projection using all local stations (Allman and Shearer, 2007). We propose to deploy small-scale arrays every 30 km along the San Andreas Fault. The array processing is performed in local processing centers at each array. The output is compared with finite-fault solutions based on the real-time GPS system and then incorporated into the standard ElarmS system.
The optimal aperture and array geometry is
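    The back-tracing idea can be sketched as a shift-and-stack (beamforming) grid search: for each candidate back azimuth, the traces from a small-aperture array are time-shifted by the corresponding plane-wave delays and stacked, and the azimuth that maximizes beam power wins. Everything below (array geometry, slowness, wavelet) is a synthetic toy, not the UPSAR configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical small-aperture array: station coordinates in km (east, north)
stations = np.array([[0.0, 0.0], [1.0, 0.2], [0.3, 1.1], [-0.8, 0.7], [0.5, -0.9]])

# Synthesize a plane wave crossing the array from a known back azimuth
true_baz = 60.0    # degrees clockwise from north
slowness = 0.25    # s/km (4 km/s apparent velocity)
unit = np.array([np.sin(np.radians(true_baz)), np.cos(np.radians(true_baz))])
delays = -stations @ unit * slowness   # per-station arrival-time shifts (s)

dt = 0.01
t = np.arange(0.0, 4.0, dt)
traces = np.array([np.exp(-((t - 1.5 - d) / 0.05) ** 2) for d in delays])
traces += 0.005 * rng.standard_normal(traces.shape)   # a little sensor noise

# Grid search: shift-and-stack at each candidate azimuth, keep the max beam power
best_baz, best_power = None, -1.0
for baz in np.arange(0.0, 360.0, 1.0):
    u = np.array([np.sin(np.radians(baz)), np.cos(np.radians(baz))])
    shifts = np.rint(stations @ u * slowness / dt).astype(int)
    beam = np.mean([np.roll(tr, s) for tr, s in zip(traces, shifts)], axis=0)
    power = float(np.sum(beam ** 2))
    if power > best_power:
        best_baz, best_power = float(baz), power
```

    Sweeping the slowness as well as the azimuth, and tracking how the best back azimuth migrates through time windows, is what turns this into a rupture-length and directivity estimate.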

  1. Earthquake source inversion for tsunami runup prediction

    NASA Astrophysics Data System (ADS)

    Sekar, Anusha

    Our goal is to study two inverse problems: using seismic data to invert for earthquake parameters, and using tide gauge data to invert for earthquake parameters. We focus on the feasibility of using a combination of these inverse problems to improve tsunami runup prediction. A considerable part of the thesis is devoted to studying the seismic forward operator and its modeling using immersed interface methods. We develop an immersed interface method for solving the variable-coefficient advection equation in one dimension with a propagating singularity and prove a convergence result for this method. We also prove a convergence result for the one-dimensional acoustic system of partial differential equations solved using immersed interface methods with internal boundary conditions. Such systems form the building blocks of the numerical model for the earthquake. For a simple earthquake-tsunami model, we observe a variety of possibilities in the recovery of the earthquake parameters and tsunami runup prediction. In some cases the data are insufficient either to invert for the earthquake parameters or to predict the runup. When more data are added, we are able to resolve the earthquake parameters with enough accuracy to predict the runup. We expect that this variety will hold in a real-world three-dimensional geometry as well.

  2. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed Central

    Grant, Rachel A.; Conlan, Hilary

    2013-01-01

    juvenile animals migrating away from their breeding pond, after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that the reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre), and were probably coincidental. Statistical analysis of the data indicated that frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised. PMID:26479746

  3. Effects of the 2011 Tohoku Earthquake on VLBI Geodetic Measurements

    NASA Astrophysics Data System (ADS)

    MacMillan, D.; Behrend, D.; Kurihara, S.

    2012-12-01

    The VLBI antenna TSUKUB32 at Tsukuba, Japan observes in 24-hour observing sessions once per week with the R1 operational network and on additional days with other networks on a more irregular basis. Further, the antenna is an endpoint of the single-baseline, 1-hr Intensive Int2 sessions observed on the weekends for the determination of UT1. TSUKUB32 returned to normal operational observing one month after the earthquake. The antenna is 160 km west and 240 km south of the epicenter of the Tohoku earthquake. We looked at the transient behavior of the TSUKUB32 position time series following the earthquake and found that significant deformation is continuing. The eastward rate relative to the long-term rate prior to the earthquake was about 20 cm/yr four months after the earthquake and 9 cm/yr after one year. The VLBI series agrees closely with the corresponding JPL (Jet Propulsion Laboratory) GPS series measured by the co-located GPS antenna TSUK. The co-seismic UEN displacement at Tsukuba as determined by VLBI was (-90 mm, 640 mm, 44 mm). We examined the effect of the variation of the TSUKUB32 position on EOP estimates and then used the GPS data to correct its position for the estimation of UT1 in the Tsukuba-Wettzell Int2 Intensive experiments. For this purpose and to provide operational UT1, the IVS scheduled a series of weekend Intensive sessions observing on the Kokee-Wettzell baseline immediately before each of the two Tsukuba-Wettzell Intensive sessions. Comparisons between the UT1 estimates from these weekend sessions and the USNO (United States Naval Observatory) combination series were used to validate the GPS correction to the TSUKUB32 position.

  4. Comment on “Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set” [J. Chem. Phys. 139, 114104 (2013)]

    SciTech Connect

    Brandbyge, Mads

    2014-05-07

    In a recent paper, Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an “implicit decoupling assumption,” leading to wrong results for the current, different from what would be obtained by using an orthogonal basis and dividing surfaces defined in real space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green's function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.

  5. Evidence for remotely triggered micro-earthquakes during salt cavern collapse

    NASA Astrophysics Data System (ADS)

    Jousset, P.; Rohmer, J.

    2012-04-01

    Micro-seismicity is a good indicator of the spatio-temporal evolution of physical properties of rocks prior to catastrophic events like volcanic eruptions or landslides, and may be triggered by a number of causes, including the dynamic characteristics of the processes in play and/or external forces. Micro-earthquake triggering has in recent years been the subject of intense research, and our work contributes further evidence of possible triggering of micro-earthquakes by remote large earthquakes. We show evidence of triggered micro-seismicity in the vicinity of an underground salt cavern prone to collapse by a remote M~7.2 earthquake, which occurred ~12000 kilometres away. We demonstrate the near-critical state of the cavern before the collapse by means of 2D axisymmetric elastic finite-element simulations. Pressure was lowered in the cavern by operations pumping brine out of the cavern. We demonstrate that a very small stress increase would be sufficient to break the overburden. High-dynamic broadband records reveal a remarkable time correlation between a dramatic increase of the local high-frequency micro-seismicity rate, associated with the break of the stiffest layer stabilizing the overburden, and the passage of low-frequency remote seismic waves, including body, Love and Rayleigh surface waves. Stress oscillations due to the seismic waves exceeded the strength required for rupture of the complex medium made of brine and rock, triggering micro-earthquakes and leading to damage of the overburden and eventually collapse of the salt cavern. The increment of stress necessary for failure of a dolomite layer is of the same order of magnitude as the maximum dynamic stress observed during the passage of the earthquake waves. On this basis, we discuss the possible contribution of the low-frequency Love and Rayleigh surface waves.

  6. Retrospective Evaluation of Earthquake Forecasts during the 2010-12 Canterbury, New Zealand, Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Marzocchi, W.; Taroni, M.; Zechar, J. D.; Gerstenberger, M.; Liukis, M.; Rhoades, D. A.; Cattania, C.; Christophersen, A.; Hainzl, S.; Helmstetter, A.; Jimenez, A.; Steacy, S.; Jordan, T. H.

    2014-12-01

    The M7.1 Darfield, New Zealand (NZ), earthquake triggered a complex earthquake cascade that provides a wealth of new scientific data to study earthquake triggering and the predictive skill of statistical and physics-based forecasting models. To this end, the Collaboratory for the Study of Earthquake Predictability (CSEP) is conducting a retrospective evaluation of over a dozen short-term forecasting models that were developed by groups in New Zealand, Europe and the US. The statistical model group includes variants of the Epidemic-Type Aftershock Sequence (ETAS) model, non-parametric kernel smoothing models, and the Short-Term Earthquake Probabilities (STEP) model. The physics-based model group includes variants of the Coulomb stress triggering hypothesis, which are embedded either in Dieterich's (1994) rate-state formulation or in statistical Omori-Utsu clustering formulations (hybrid models). The goals of the CSEP evaluation are to improve our understanding of the physical mechanisms governing earthquake triggering, to improve short-term earthquake forecasting models and time-dependent hazard assessment for the Canterbury area, and to understand the influence of poor-quality, real-time data on the skill of operational (real-time) forecasts. To assess the latter, we use the earthquake catalog data that the NZ CSEP Testing Center archived in near real-time during the earthquake sequence and compare the predictive skill of models using the archived data as input with the skill attained using the best available data today. We present results of the retrospective model comparison and discuss implications for operational earthquake forecasting.

  7. Operations

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.

    2013-01-01

    Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this…

  8. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that in the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate, and two districts (Al-Gomrok and Burg El-Arab) at low seismic risk. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, with 73 % of the expected damage reported there. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damage) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  9. Earthquakes and the office-based surgeon.

    PubMed Central

    Conover, W A

    1992-01-01

    A major earthquake may strike while a surgeon is performing an operation in an office surgical facility. A sudden major fault disruption will lead to thousands of casualties and widespread destruction. Surgeons who operate in offices can help lessen havoc by careful preparation. These plans should coordinate with other disaster plans for effective triage, evacuation, and the treatment of casualties. PMID:1413756

  10. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity, and the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset date of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted within a very good accuracy window (±1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a triggering planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method we used all >8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e. ±1 day of the target date) for >6.5R events. The successes are counted for each of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. 
We observe no improvement only when a planetary trigger coincided with
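    The date-generation step described above can be sketched as follows: take the seed date (an earthquake onset, or in the MFDL variant a preceding planetary trigger) and project forward by recurrence-sequence day offsets, then count events landing within ±1 day of a candidate date as hits. This is a hypothetical sketch; the "Dual" sequence and the exact offset rules of the published method are not specified in this abstract, so only Fibonacci and Lucas offsets are used:

```python
from datetime import date, timedelta

def recurrence_sequence(a, b, limit):
    """Terms of x[n] = x[n-1] + x[n-2], starting from (a, b), up to `limit`."""
    seq = []
    while a <= limit:
        seq.append(a)
        a, b = b, a + b
    return seq

def candidate_dates(seed, horizon_days=365):
    """Candidate dates: seed date plus Fibonacci/Lucas day offsets."""
    fib = recurrence_sequence(1, 2, horizon_days)     # 1, 2, 3, 5, 8, ...
    lucas = recurrence_sequence(1, 3, horizon_days)   # 1, 3, 4, 7, 11, ...
    offsets = sorted(set(fib) | set(lucas))
    return [seed + timedelta(days=n) for n in offsets]

def hits(candidates, events, window=1):
    """Events falling within +/- `window` days of any candidate date."""
    return [e for e in events
            if any(abs((e - c).days) <= window for c in candidates)]
```

    A hit-rate comparison between seeding on the earthquake date (FDL) and on the planetary trigger date (MFDL) would then reduce to calling `hits` with the two candidate lists against the same event catalog.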

  11. Astronomical tides and earthquakes

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoping; Mao, Wei; Huang, Yong

    2001-03-01

    A review of the studies of correlation between astronomical tides and earthquakes is given in three categories: (1) earthquakes and the relative locations of the sun, the moon and the earth, (2) earthquakes and the periods and phases of tides, and (3) earthquakes and the tidal stress. The first two categories mainly investigate whether or not there exists any dominant pattern of the relative locations of the sun, the moon and the earth during earthquakes, whether or not the occurrences of earthquakes are clustered in any special phase of a tidal period, and whether or not there exists any tidal periodic phenomenon in seismic activities. By emphasizing the tidal stress at the seismic focus, the third category investigates the relationship between various seismic faults and the triggering effects of tidal stress, which reaches the crux of the issue. Possible reasons for the various inconsistent investigation results obtained using various methods and samples are analyzed, and further investigations are proposed.

  12. Anthropogenic seismicity rates and operational parameters at the Salton Sea Geothermal Field.

    PubMed

    Brodsky, Emily E; Lajoie, Lia J

    2013-08-01

    Geothermal power is a growing energy source; however, efforts to increase production are tempered by concern over induced earthquakes. Although increased seismicity commonly accompanies geothermal production, induced earthquake rate cannot currently be forecast on the basis of fluid injection volumes or any other operational parameters. We show that at the Salton Sea Geothermal Field, the total volume of fluid extracted or injected tracks the long-term evolution of seismicity. After correcting for the aftershock rate, the net fluid volume (extracted-injected) provides the best correlation with seismicity in recent years. We model the background earthquake rate with a linear combination of injection and net production rates that allows us to track the secular development of the field as the number of earthquakes per fluid volume injected decreases over time. PMID:23845943
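    The background-rate model described here, a linear combination of injection and net production rates, amounts to an ordinary least-squares fit. Below is a minimal sketch; the variable names and the constant background term are assumptions for illustration, not the authors' exact formulation:

```python
import numpy as np

def fit_rate_model(injection, net_volume, quake_rate):
    """Least-squares fit of earthquake rate as a linear combination of
    injection rate and net produced volume, plus a constant background.
    Returns the coefficients (a, b, background)."""
    A = np.column_stack([injection, net_volume, np.ones_like(injection)])
    coef, *_ = np.linalg.lstsq(A, quake_rate, rcond=None)
    return coef
```

    In practice the aftershock correction mentioned in the abstract would be applied to the rate series before fitting, and the fit repeated over sliding windows to track the secular decrease in earthquakes per injected volume.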

  13. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER.

  14. Earthquake swarms in Greenland

    NASA Astrophysics Data System (ADS)

    Larsen, Tine B.; Voss, Peter H.; Dahl-Jensen, Trine

    2014-05-01

    Earthquake swarms occur primarily near active volcanoes and in areas with frequent tectonic activity. However, intraplate earthquake swarms are not an unknown phenomenon. They are located near zones of weakness, e.g. in regions with geological contrasts, where dynamic processes are active. An earthquake swarm is defined as a period of increased seismicity, in the form of a cluster of earthquakes of similar magnitude, occurring in the same general area during a limited time period. There is no obvious main shock among the earthquakes in a swarm. Earthquake swarms occur in Greenland, which is a tectonically stable, intraplate environment. The first earthquake swarms in Greenland were detected more than 30 years ago in Northern and North-Eastern Greenland. However, detection of these low-magnitude events is challenging due to the enormous distances and the relatively sparse network of seismographs. The seismograph coverage of Greenland has vastly improved since the international GLISN project was initiated in 2008. Greenland is currently covered by an open network of 19 BB seismographs, most of them transmitting data in real time. Additionally, earthquake activity in Greenland is monitored by seismographs in Canada, Iceland, on Jan Mayen, and on Svalbard. The time series of data from the GLISN network is still short, with the latest station added in NW Greenland in 2013. However, the network has already proven useful in detecting several earthquake swarms. In this study we will focus on two swarms: one occurring near/on the East Greenland coast in 2008, and another occurring in the Disko area near the west coast of Greenland in 2010. Both swarms consist of earthquakes with local magnitudes between 1.9 and 3.2. The areas where the swarms are located are regularly active with small earthquakes. The earthquake swarms are analyzed in the context of the general seismicity and the possible relationship to the local geological conditions.

  15. Earthquake at 40 feet

    USGS Publications Warehouse

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck. 

  16. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed

    Grant, Rachel A; Conlan, Hilary

    2013-01-01

    In short-term earthquake risk forecasting, the avoidance of false alarms is of utmost importance to preclude the possibility of unnecessary panic among populations in seismic hazard areas. Unusual animal behaviour prior to earthquakes has been reported for millennia but has rarely been scientifically documented. Recently large migrations or unusual behaviour of amphibians have been linked to large earthquakes, and media reports of large frog and toad migrations in areas of high seismic risk such as Greece and China have led to fears of a subsequent large earthquake. However, at certain times of year large migrations are part of the normal behavioural repertoire of amphibians. News reports of "frog swarms" from 1850 to the present day were examined for evidence that this behaviour is a precursor to large earthquakes. It was found that only two of 28 reported frog swarms preceded large earthquakes (Sichuan province, China in 2008 and 2010). All of the reported mass migrations of amphibians occurred in late spring, summer and autumn and appeared to relate to small juvenile anurans (frogs and toads). It was concluded that most reported "frog swarms" are actually normal behaviour, probably caused by juvenile animals migrating away from their breeding pond, after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre), and were probably co

  17. Role of Bioindicators In Earthquake Modelling

    NASA Astrophysics Data System (ADS)

    Zelinsky, I. P.; Melkonyan, D. V.; Astrova, N. G.

    On the basis of experimental research on the influence of sound waves on indicator bacteria, a model of earthquakes is constructed. It is revealed that the growth in the number of bacteria depends on the frequency of the sound wave acting on the bacterium (the lower the frequency of the sound wave, the faster the growth). It is shown that absorption of sound-wave energy by a bacterium increases the concentration of isopotential lines of the biodynamic field in the bacterium. This process leads to braking and heating of the bacterium. From the structure of the deformation of the biodynamic field lines it is possible to predict various geodynamic processes, including earthquakes.

  18. Earthquake fluctuations in wells in New Jersey

    USGS Publications Warehouse

    Austin, Charles R.

    1960-01-01

    New Jersey is fortunate to be situated in a region that is relatively stable, geologically. For this reason scientists believe, on the basis of the best scientific evidence available, that the chances of New Jersey experiencing a major earthquake are very small. The last major earthquake on the east coast occurred at Charleston, S. C., in 1886. Minor shocks have been felt in New Jersey, however, from time to time. Reports of dishes being rattled or even of plaster in buildings being cracked are not uncommon. These minor disturbances are generally restricted to relatively small areas.

  19. Compiling the 'Global Earthquake History' (1000-1903)

    NASA Astrophysics Data System (ADS)

    Albini, P.; Musson, R.; Locati, M.; Rovida, A.

    2013-12-01

    The study of historical earthquakes from historical sources, or historical seismology, is of wider interest than just to the seismic hazard and risk community. In the scope of the two-year project (October 2010-March 2013) "Global Earthquake History", developed in the framework of GEM, a reassessment of world historical seismicity was made from available published studies. The scope of the project is the time window 1000-1903, with magnitudes 7.0 and above. Events with lower magnitudes are included on a case-by-case, or region-by-region, basis. The Global Historical Earthquake Archive (GHEA) provides a complete account of the global situation in historical seismology. From GHEA, the Global Historical Earthquake Catalogue (GHEC, v1, available at http://www.emidius.eu/GEH/, under Creative Commons licence) was derived, i.e. a world catalogue of earthquakes for the period 1000-1903, with magnitude 7 and over, using publicly available materials, as for the Archive. This is intended to be the best global historical catalogue of large earthquakes presently available, with the best parameters selected, duplications and fakes removed, and, in some cases, new earthquakes discovered. GHEA and GHEC are conceived as providing a basis for coordinating future research into historical seismology in any part of the world and, hopefully, encouraging new historical earthquake research initiatives that will continue to improve the information available.

  20. Earthquakes and Plate Boundaries

    ERIC Educational Resources Information Center

    Lowman, Paul; And Others

    1978-01-01

    Contains the contents of the Student Investigation booklet of a Crustal Evolution Education Project (CEEP) instructional modules on earthquakes. Includes objectives, procedures, illustrations, worksheets, and summary questions. (MA)

  1. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.
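    A simple way to probe the kind of magnitude-correlation claim discussed here is to compare consecutive-magnitude differences in a catalog against the same statistic for shuffled catalogs, which destroy any time ordering. This is a minimal illustrative sketch, not the statistic used in the cited Letters:

```python
import numpy as np

def magnitude_clustering_stat(mags, n_shuffles=1000, seed=0):
    """Mean absolute difference between consecutive magnitudes, compared
    with the same statistic averaged over randomly shuffled catalogs.
    An observed value well below the shuffled mean would suggest that
    consecutive magnitudes are more similar than chance (clustering)."""
    rng = np.random.default_rng(seed)
    observed = float(np.mean(np.abs(np.diff(mags))))
    shuffled = [np.mean(np.abs(np.diff(rng.permutation(mags))))
                for _ in range(n_shuffles)]
    return observed, float(np.mean(shuffled))
```

    As the abstract notes, catalog incompleteness after large events can depress the observed statistic without any physical correlation, so such a test must be run on a completeness-filtered catalog to be meaningful.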

  2. Missing Great Earthquakes

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Martin, S.

    2013-12-01

    The occurrence of three earthquakes with Mw greater than 8.8, and six earthquakes larger than Mw8.5, since 2004 has raised interest in the long-term rate of great earthquakes. Past studies have focused on rates since 1900, which roughly marks the start of the instrumental era. Yet substantial information is available for earthquakes prior to 1900. A re-examination of the catalog of global historical earthquakes reveals a paucity of Mw ≥ 8.5 events during the 18th and 19th centuries compared to the rate during the instrumental era (Hough, 2013, JGR), suggesting that the magnitudes of some documented historical earthquakes have been underestimated, with approximately half of all Mw≥8.5 earthquakes missing or underestimated in the 19th century. Very large (Mw≥8.5) magnitudes have traditionally been estimated for historical earthquakes only from tsunami observations given a tautological assumption that all such earthquakes generate significant tsunamis. Magnitudes would therefore tend to be underestimated for deep megathrust earthquakes that generated relatively small tsunamis, deep earthquakes within continental collision zones, earthquakes that produced tsunamis that were not documented, outer rise events, and strike-slip earthquakes such as the 11 April 2012 Sumatra event. We further show that, where magnitudes of historical earthquakes are estimated from earthquake intensities using the Bakun and Wentworth (1997, BSSA) method, magnitudes of great earthquakes can be significantly underestimated. Candidate 'missing' great 19th century earthquakes include the 1843 Lesser Antilles earthquake, which recent studies suggest was significantly larger than initial estimates (Feuillet et al., 2012, JGR; Hough, 2013), and an 1841 Kamchatka event, for which Mw9 was estimated by Gusev and Shumilina (2004, Izv. Phys. Solid Ear.). We consider cumulative moment release rates during the 19th century compared to that during the 20th and 21st centuries, using both the Hough

  3. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

    Lee, W.H.K.; Steward, S. W.

    1979-01-01

    A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network will consist of several stations or as many as a few hundred. They are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the location of sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be very quickly deployed. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.  

  4. Investigations on Real-time GPS for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.; Aranha, M. A.; Melgar, D.; Allen, R. M.

    2015-12-01

    The Geodetic Alarm System (G-larmS) is a software system developed in a collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech (NMT) primarily for real-time Earthquake Early Warning (EEW). It currently uses high rate (1Hz), low latency (< ~5 seconds), accurate positioning (cm level) time series data from a regional GPS network and P-wave event triggers from existing EEW algorithms, e.g. ElarmS, to compute static offsets upon S-wave arrival. G-larmS performs a least squares inversion on these offsets to determine slip on a finite fault, which we use to estimate moment magnitude. These computations are repeated every second for the duration of the event. G-larmS has been in continuous operation at the BSL for over a year using event triggers from the California Integrated Seismic Network (CISN) ShakeAlert system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California. Pairs of stations are processed as baselines using trackRT (MIT software package). G-larmS produced good results in real-time during the South Napa (M 6.0, August 2014) earthquake as well as on several replayed and simulated test cases. We evaluate the performance of G-larmS for EEW by analysing the results using a set of well defined test cases to investigate the following: (1) using multiple fault regimes and concurrent processing with the ultimate goal of achieving model generation (slip and magnitude computations) within each 1 second GPS epoch on very large magnitude earthquakes (up to M 9.0), (2) the use of Precise Point Positioning (PPP) real-time data streams of various operators, accuracies, latencies and formats along with baseline data streams, (3) collaboratively expanding EEW coverage along the U.S. West Coast on a regional network basis for Northern California, Southern California and Cascadia.
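    The core G-larmS computation described above, a least-squares inversion of static offsets for finite-fault slip followed by a moment-magnitude estimate, can be sketched as below. The Green's function matrix, patch geometry, and shear modulus are placeholder assumptions for illustration, not values from the actual system:

```python
import numpy as np

def invert_slip(G, offsets):
    """Least-squares slip on fault patches from static offsets: d = G s.
    G maps per-patch slip (m) to predicted station offsets (m)."""
    slip, *_ = np.linalg.lstsq(G, offsets, rcond=None)
    return slip

def moment_magnitude(slip, patch_area_m2, mu=30e9):
    """Moment magnitude from per-patch slip (m), patch area (m^2), and
    shear modulus mu (Pa): Mw = (2/3) * (log10(M0) - 9.05), M0 in N*m."""
    m0 = mu * patch_area_m2 * np.sum(np.abs(slip))  # scalar seismic moment
    return (2.0 / 3.0) * (np.log10(m0) - 9.05)
```

    Repeating this pair of calls every second as new offsets arrive, as G-larmS does for the duration of an event, yields an evolving magnitude estimate that is not saturated by finite-source effects.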

  5. Gravity drives Great Earthquakes

    NASA Astrophysics Data System (ADS)

    Lister, Gordon; Forster, Marnie

    2010-05-01

    of the over-riding crust and mantle. This is possible because the crust and mantle above major subduction zones are mechanically weakened by the flux of heat and water associated with subduction zone processes. In consequence the lithosphere of the over-riding orogens can act more like a fluid than a rigid plate. Such fluid-like behaviour has been noted for the Himalaya and for the crust of the uplifted adjacent Tibetan Plateau, which appear to be collapsing. Similar conclusions as to the fluid-like behaviour of an orogen can also be reached for the crust and mantle of Myanmar and Indonesia, since here again there is evidence for arc-normal motion adjacent to rolling-back subduction zones. Prior to the Great Sumatran Earthquake of 2004 we had postulated such movements on geological time-scales, describing them as 'surges' driven by the gravitational potential energy of the adjacent orogen. But we considered time-scales that were very different to those that apply in the lead-up to, during, and subsequent to a catastrophic seismic event. The Great Sumatran Earthquake taught us quite differently. Data from satellites support the hypothesis that extension took place in a discrete increment, which we interpret to be the result of a gravitationally driven surge of the Indonesian crust westward over the weakened rupture during and after the earthquake. Mode II megathrusts are tsunamigenic for one very simple reason: the crust has been attenuated as the result of ongoing extension, so they can be overlain by large tracts of water, and they have a long rupture run time, allowing a succession of stress accumulations to be harvested. The after-slip beneath the Andaman Sea was also significant (in terms of moment) although non-seismogenic in its character. Operation of a Mode II megathrust prior to catastrophic failure may involve relatively quiescent motion with a mixture of normal faults and reverse faults, much like south of Java today. Ductile yield may produce steadily

  6. Earthquake activity in Oklahoma

    SciTech Connect

    Luza, K.V.; Lawson, J.E., Jr.

    1989-08-01

    Oklahoma is one of the most seismically active areas in the southern Mid-Continent. From 1897 to 1988, over 700 earthquakes are known to have occurred in Oklahoma. The earliest documented Oklahoma earthquake took place on December 2, 1897, near Jefferson, in Grant County. The largest known Oklahoma earthquake happened near El Reno on April 9, 1952. This magnitude 5.5 (mb) earthquake was felt from Austin, Texas, to Des Moines, Iowa, and covered a felt area of approximately 362,000 km². Prior to 1962, all 59 earthquakes in Oklahoma were known either from historical accounts or from seismograph stations outside the state. Over half of these events were located in Canadian County. In late 1961, the first seismographs were installed in Oklahoma. From 1962 through 1976, 70 additional earthquakes were added to the earthquake database. In 1977, a statewide network of seven semipermanent and three radio-telemetry seismograph stations was installed. The additional stations have improved earthquake detection and location in the state of Oklahoma. From 1977 to 1988, over 570 additional earthquakes were located in Oklahoma, mostly of magnitudes less than 2.5. Most of these events occurred on the eastern margin of the Anadarko basin along a zone 135 km long by 40 km wide that extends from Canadian County to the southern edge of Garvin County. Another general area of earthquake activity lies along and north of the Ouachita Mountains in the Arkoma basin. A few earthquakes have occurred in the shelves that border the Arkoma and Anadarko basins.

  7. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
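The detection idea in this abstract — a keyword's tweet rate jumping from under 1 per hour to roughly 150 per minute — can be sketched as a trailing-window anomaly test. This is a hypothetical illustration, not the USGS implementation; the window length and threshold are invented.

```python
from collections import deque

def spike_detector(counts_per_minute, window=60, threshold=10.0):
    """Flag minutes where the keyword-tweet count exceeds `threshold`
    times the trailing-window average (plus one, to avoid flagging
    single tweets against a zero background). Returns flagged indices."""
    history = deque(maxlen=window)
    flagged = []
    for i, count in enumerate(counts_per_minute):
        baseline = (sum(history) / len(history)) if history else 0.0
        if count > threshold * (baseline + 1.0):
            flagged.append(i)
        history.append(count)
    return flagged

# Quiet background (<1 "earthquake" tweet/min) followed by a burst of
# ~150/min, mimicking the Morgan Hill observation.
rates = [0, 1, 0, 0, 1, 0, 150, 140, 120]
print(spike_detector(rates))  # → [6]
```

Only the onset minute is flagged, because subsequent burst minutes raise the trailing baseline — a reasonable behavior for a detector meant to report each event once.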

  8. Investigating landslides caused by earthquakes - A historical review

    USGS Publications Warehouse

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  9. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
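The pipeline FAST describes — compact fingerprints, then hashing similar fingerprints into common buckets so that candidate pairs are found without all-pairs comparison — can be caricatured in a few lines. The toy fingerprint here (the sign pattern of successive sample differences) and the simple band-hashing stand in for FAST's actual spectrogram-derived fingerprints and locality-sensitive hashing; the window size, step, and band count are invented.

```python
import numpy as np
from collections import defaultdict

def fingerprint(window):
    # Toy stand-in for FAST's wavelet/spectrogram fingerprint:
    # keep only the sign pattern of successive sample differences.
    return tuple(np.sign(np.diff(window)).astype(int))

def candidate_pairs(data, win=8, step=4, bands=2):
    """Group windows whose fingerprint bands collide in a hash table,
    returning candidate similar window pairs without comparing all pairs."""
    windows = [data[i:i + win] for i in range(0, len(data) - win + 1, step)]
    fps = [fingerprint(w) for w in windows]
    band_len = len(fps[0]) // bands
    buckets = defaultdict(set)
    for idx, fp in enumerate(fps):
        for b in range(bands):
            buckets[(b, fp[b * band_len:(b + 1) * band_len])].add(idx)
    pairs = set()
    for members in buckets.values():
        for i in members:
            for j in members:
                if i < j:
                    pairs.add((i, j))
    return pairs

# A repeated waveform snippet at two offsets collides in every band,
# so the pair of matching windows appears among the candidates.
pattern = [0, 3, 1, 4, 1, 5, 9, 2]
print((0, 4) in candidate_pairs(pattern + [0] * 8 + pattern))
```

Real FAST then verifies candidates against a similarity threshold; this sketch stops at candidate generation, which is where the scalability gain over autocorrelation comes from.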

  10. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  11. Can we control earthquakes?

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    In 1966, it was discovered that high pressure injection of industrial waste fluids into the subsurface near Denver, Colo., was triggering earthquakes. While this was disturbing at the time, it was also exciting because there was immediate speculation that here at last was a mechanism to control earthquakes.  

  12. Earthquake history of Texas

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Seventeen earthquakes, intensity V or greater, have centered in Texas since 1882, when the first shock was reported. The strongest earthquake, a maximum intensity VIII, was in western Texas in 1931 and was felt over 1,165,000 km². Three shocks in the Panhandle region in 1925, 1936, and 1943 were widely felt.

  13. Earthquake research in China

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    The prediction of the Haicheng earthquake was an extraordinary achievement by the geophysical workers of the People's Republic of China, whose national program in earthquake research was less than 10 years old at the time. To study the background to this prediction, a delegation of 10 U.S. scientists, which I led, visited China in June 1976.

  14. Recent Progress and Development on Multi-parameters Remote Sensing Application in Earthquake Monitoring in China

    NASA Astrophysics Data System (ADS)

    Shen, Xuhui; Zhang, Xuemin; Hong, Shunying; Jing, Feng; Zhao, Shufan

    2014-05-01

    In the last ten years, several national research plans and scientific projects on remote sensing applications in earthquake monitoring have been implemented in China. Focusing on advancing earthquake monitoring capability and searching for a path toward earthquake prediction, satellite electromagnetic, satellite infrared, and D-InSAR technologies were developed systematically, and remarkable progress was achieved through statistical research on historical earthquakes, which initially summarized the space-based precursory characteristics and laid the foundation for gradually promoting practical use. On the basis of this work, the argumentation for the first space-based platform of China's earthquake stereoscopic observation system has been completed, and an integrated earthquake remote sensing application system has been designed comprehensively. Developing a space-based earthquake observation system has become a major trend in technological development for earthquake monitoring and prediction. Emphasis will be placed on constructing the space segment of China's earthquake stereoscopic observation system and on major upcoming scientific projects: an earthquake deformation observation system and application research combining InSAR, satellite gravity, and GNSS, aimed at medium- and long-term earthquake monitoring and forecasting; an infrared observation technical system and application research aimed at medium- and short-term earthquake monitoring and forecasting; and a satellite-based electromagnetic observation technical system and application system aimed at short-term and imminent earthquake monitoring.

  15. The USGS Earthquake Scenario Project

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Petersen, M. D.; Wald, L. A.; Frankel, A. D.; Quitoriano, V. R.; Lin, K.; Luco, N.; Mathias, S.; Bausch, D.

    2009-12-01

    The U.S. Geological Survey’s (USGS) Earthquake Hazards Program (EHP) is producing a comprehensive suite of earthquake scenarios for planning, mitigation, loss estimation, and scientific investigations. The Earthquake Scenario Project (ESP), though lacking clairvoyance, is a forward-looking project, estimating earthquake hazard and loss outcomes as they may occur one day. For each scenario event, fundamental input includes i) the magnitude and specified fault mechanism and dimensions, ii) regional Vs30 shear velocity values for site amplification, and iii) event metadata. A grid of standard ShakeMap ground motion parameters (PGA, PGV, and three spectral response periods) is then produced using the well-defined, regionally-specific approach developed by the USGS National Seismic Hazard Mapping Project (NSHMP), including recent advances in empirical ground motion predictions (e.g., the NGA relations). The framework also allows for numerical (3D) ground motion computations for specific, detailed scenario analyses. Unlike NSHMP ground motions, for ESP scenarios, local rock and soil site conditions and commensurate shaking amplifications are applied based on detailed Vs30 maps where available or based on topographic slope as a proxy. The scenario event set comprises primarily events selected from the NSHMP, though custom events are also allowed based on coordination of the ESP team with regional coordinators, seismic hazard experts, seismic network operators, and response coordinators. The event set will be harmonized with existing and future scenario earthquake events produced regionally or by other researchers. The event list includes approximately 200 earthquakes in CA, 100 in NV, dozens in each of NM, UT, and WY, and a smaller number in other regions. Systematic output will include all standard ShakeMap products, including HAZUS input, GIS, KML, and XML files used for visualization, loss estimation, ShakeCast, PAGER, and for other systems. All products will be

  16. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where parameters are functions of the magnitude of the previous earthquake. We use those two models, alternately, to generate the dynamics of earthquake occurrence and to estimate the probability of occurrence of several earthquakes within a year or a decade.
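The alternating conditional draws described here can be sketched directly: a Pareto magnitude whose tail index depends on the preceding waiting time, then a Gamma waiting time whose parameters depend on the new magnitude. The functional forms and coefficients below are illustrative placeholders, not the fitted models from the paper.

```python
import math
import random

def simulate(n, seed=42):
    """Alternately draw magnitude | previous waiting time and
    waiting time | magnitude, in the spirit of the paper's
    conditional models. Coefficients are invented for illustration."""
    rng = random.Random(seed)
    events, wait = [], 1.0
    for _ in range(n):
        # Pareto magnitude above a completeness threshold of 3.0;
        # tail index (placeholder form) grows slowly with waiting time.
        alpha = 1.5 + 0.1 * math.log1p(wait)
        mag = 3.0 + rng.paretovariate(alpha) - 1.0
        # Gamma waiting time whose scale increases with the magnitude
        # just drawn (another placeholder dependence).
        wait = rng.gammavariate(2.0, 0.5 * (mag - 2.0))
        events.append((wait, mag))
    return events

catalog = simulate(1000)
# Probability estimate: fraction of simulated waiting times under 1 unit.
print(sum(1 for w, _ in catalog if w < 1.0) / len(catalog))
```

Repeating the simulation many times and counting events per year (or decade) yields the occurrence-probability estimates the abstract mentions.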

  17. Earthquake Hazard and Risk Assessment for Turkey

    NASA Astrophysics Data System (ADS)

    Betul Demircioglu, Mine; Sesetyan, Karin; Erdik, Mustafa

    2010-05-01

    Using a GIS environment to present the results, seismic risk analysis is a helpful tool to support decision making for planning and prioritizing large-scale seismic retrofit intervention programs. The main ingredients of seismic risk analysis are seismic hazard, a regional inventory of buildings, and vulnerability analysis. In this study, the national earthquake hazard has been assessed using the NGA ground motion prediction models, and the results have been compared with those of previous models. Seismic risk has then been evaluated based on probabilistic intensity ground motion predictions for Turkey. Following the macroseismic approach of Giovinazzi and Lagomarsino (2005), two alternative vulnerability models have been used to estimate building damage. The vulnerability and ductility indices for Turkey have been taken from the study of Giovinazzi (2005). These two vulnerability models have been compared with the observed earthquake damage database, and good agreement between the curves has been observed. In addition to building damage, casualty estimates based on three different methods, for each return period and for each vulnerability model, are presented to evaluate earthquake loss. Using three different models of building replacement cost, the average annual loss (AAL) and probable maximum loss ratio (PMLR) due to regional earthquake hazard have been computed to form a basis for the improvement of the parametric insurance model and the determination of premium rates for the compulsory earthquake insurance in Turkey.
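The average annual loss (AAL) mentioned above is, in essence, the integral of loss over annual exceedance rate. A minimal sketch using trapezoidal integration over a loss exceedance curve follows; the return periods and loss values are entirely hypothetical, not the Turkish figures.

```python
def average_annual_loss(return_periods, losses):
    """Trapezoidal estimate of AAL from a loss exceedance curve:
    losses[i] is the loss with return period return_periods[i] (years),
    sorted from short to long return period. Loss units carry through."""
    rates = [1.0 / t for t in return_periods]  # annual exceedance rates
    aal = 0.0
    for i in range(len(rates) - 1):
        aal += (rates[i] - rates[i + 1]) * 0.5 * (losses[i] + losses[i + 1])
    aal += rates[-1] * losses[-1]  # contribution beyond the longest period
    return aal

# Hypothetical loss curve (losses in millions, arbitrary currency)
aal = average_annual_loss([72, 475, 975, 2475], [10.0, 60.0, 120.0, 250.0])
```

Dividing the AAL by the total replacement cost of the exposed building stock gives an average annual loss ratio, which is the quantity typically used to set insurance premium rates.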

  18. Maximum Earthquake Magnitude Assessments by Japanese Government Committees (Invited)

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2013-12-01

    The 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and such a gigantic earthquake was not foreseen around Japan. After the 2011 disaster, various government committees in Japan have discussed and assessed the maximum credible earthquake size around Japan, but their values vary without definite consensus. I will review them with earthquakes along the Nankai Trough as an example. The Central Disaster Management Council, under the Cabinet Office, set up a policy for future tsunami disaster mitigation. Possible future tsunamis are classified into two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with low frequency of occurrence, for which saving people's lives is the first priority, with soft measures such as tsunami hazard maps, evacuation facilities, or disaster education. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared. The assessment of L1 and L2 events is left to local governments. The CDMC also assigned M 9.1 as the maximum size of an earthquake along the Nankai trough, then computed the ground shaking and tsunami inundation for several scenario earthquakes. The estimated loss is about ten times the 2011 disaster, with maximum casualties of 320,000 and economic loss of 2 trillion dollars. The Headquarters for Earthquake Research Promotion, under MEXT, was set up after the 1995 Kobe earthquake and has made long-term forecasts of large earthquakes and published national seismic hazard maps. The future probability of earthquake occurrence, for example in the next 30 years, was calculated from past data of large earthquakes, on the basis of the characteristic earthquake model. The HERP recently revised the long-term forecast of the Nankai trough earthquake; while the 30-year probability (60 - 70 %) is similar to the previous estimate, they noted the size can be M 8 to 9, considering the variability of past

  19. The mass balance of earthquakes and earthquake sequences

    NASA Astrophysics Data System (ADS)

    Marc, O.; Hovius, N.; Meunier, P.

    2016-04-01

    Large, compressional earthquakes cause surface uplift as well as widespread mass wasting. Knowledge of their trade-off is fragmentary. Combining a seismologically consistent model of earthquake-triggered landsliding and an analytical solution of coseismic surface displacement, we assess how the mass balance of single earthquakes and earthquake sequences depends on fault size and other geophysical parameters. We find that intermediate-size earthquakes (Mw 6-7.3) may cause more erosion than uplift, controlled primarily by seismic source depth and landscape steepness, and less so by fault dip and rake. Such earthquakes can limit topographic growth, but our model indicates that both smaller and larger earthquakes (Mw < 6, Mw > 7.3) systematically cause mountain building. Earthquake sequences with a Gutenberg-Richter distribution have a greater tendency to lead to predominant erosion than repeating earthquakes of the same magnitude, unless a fault can produce earthquakes of Mw 8 or larger.

  20. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is done in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since for the 1985 earthquake accelerograms were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  1. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.98, Mb 9.0, EQ count 376). The right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.
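The claimed 45-degree relation reduces to a simple computable check: the time of day mapped onto 0-360 degrees should match (EMD+SEM). A sketch of that check follows; the tolerance is an arbitrary choice of ours, not a value from the paper.

```python
def gmt_degrees(hour_utc):
    """Map a 0-24 h UTC time onto 0-360 degrees."""
    return (hour_utc / 24.0) * 360.0

def fits_claimed_line(hour_utc, emd_plus_sem, tol=15.0):
    """Check the paper's claimed 45-degree relation: (EMD + SEM) should
    track the Sun's position expressed in degrees, within `tol` degrees
    (circular distance, so 359° and 1° count as 2° apart)."""
    residual = (emd_plus_sem - gmt_degrees(hour_utc)) % 360.0
    return min(residual, 360.0 - residual) <= tol

# An event at 06:00 UTC would be expected to have EMD+SEM near 90 degrees.
print(fits_claimed_line(6.0, 90.0))
```

Applied to a catalog, the fraction of events passing this test would be directly comparable to the 98% figure quoted in the abstract.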

  2. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information, and warnings. The app can distinguish earthquake shaking from everyday human activity based on the different patterns behind the movements. It can also be triggered by a traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movement, it sends the trigger information, containing the time and location of the trigger, back to our server; at the same time, it stores the waveform data locally on the phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals that confirm earthquakes. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings of large earthquakes in many regions around the world.
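As a rough illustration of the on-phone trigger logic (the real MyShake classifier is a trained model on statistical features of the accelerometer stream; the features and thresholds below are invented for this sketch), sustained moderate shaking can be separated from a single sharp jolt such as a dropped phone:

```python
import math

def looks_like_earthquake(accel):
    """Crude stand-in for an on-phone shaking classifier.
    Heuristic (invented): earthquake-like shaking is sustained
    (large fraction of samples near the peak amplitude) and not
    dominated by one isolated spike. Thresholds are arbitrary."""
    amp = [abs(a) for a in accel]
    peak = max(amp)
    mean = sum(amp) / len(amp)
    # Fraction of the record with amplitude above 10% of the peak.
    duration_frac = sum(1 for a in amp if a > 0.1 * peak) / len(amp)
    return peak > 0.05 and duration_frac > 0.5 and peak / (mean + 1e-9) < 10.0

# Sustained oscillation (earthquake-like) vs. a single sharp jolt.
shaking = [0.2 * math.sin(i / 3.0) for i in range(100)]
jolt = [0.0] * 50 + [2.0] + [0.0] * 49
print(looks_like_earthquake(shaking), looks_like_earthquake(jolt))
```

In the deployed system, per-phone triggers like this are only provisional; the server-side coherence check across many phones is what confirms an earthquake.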

  3. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J.

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated other efforts in Peru in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  4. AGU develops earthquake curriculum

    NASA Astrophysics Data System (ADS)

    Blue, Charles

    AGU, in cooperation with the Federal Emergency Management Agency (FEMA), announces the production of a new curriculum package for grades 7-12 on the engineering and geophysical aspects of earthquakes.According to Frank Ireton, AGU's precollege education manager, “Both AGU and FEMA are working to promote the understanding of earthquake processes and their impact on the built environment. We are designing a program that involves students in learning how science, mathematics, and social studies concepts can be applied to reduce earthquake hazards.”

  5. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is a high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions if there is not a scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should also be developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance over a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory-scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal. PMID:27108213
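Dimensionless-number correlations of this kind typically take a power-law form such as Sh = a·Re^b·Sc^c, which becomes linear in log space and can be fitted by least squares. A sketch on synthetic data follows; the generating coefficients (0.3, 0.5, 0.33) are invented for illustration and are not from the paper.

```python
import numpy as np

def fit_power_law(re, sc, sh):
    """Least-squares fit of log Sh = log a + b log Re + c log Sc,
    the generic form used when correlating mass transfer with
    hydraulic conditions via dimensionless numbers."""
    X = np.column_stack([np.ones(len(re)), np.log(re), np.log(sc)])
    coef, *_ = np.linalg.lstsq(X, np.log(sh), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2]  # a, b, c

# Synthetic, noise-free data generated from Sh = 0.3 * Re^0.5 * Sc^0.33
rng = np.random.default_rng(0)
re = rng.uniform(50, 500, 40)
sc = rng.uniform(400, 1200, 40)
sh = 0.3 * re**0.5 * sc**0.33
a, b, c = fit_power_law(re, sc, sh)
```

With noise-free data the fit recovers the generating exponents exactly; with experimental data the same procedure yields the correlation coefficients, and the fitted exponents indicate which operating variables dominate the mass transfer.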

  6. Cooperative earthquake research between the United States and the People's Republic of China

    SciTech Connect

    Russ, D.P.; Johnson, L.E.

    1986-01-01

    This paper describes cooperative research by scientists of the US and the People's Republic of China (PRC) which has resulted in important new findings concerning the fundamental characteristics of earthquakes and new insight into mitigating earthquake hazards. There have been over 35 projects cooperatively sponsored by the Earthquake Studies Protocol in the past 5 years. The projects are organized into seven annexes, including investigations in earthquake prediction, intraplate faults and earthquakes, earthquake engineering and hazards investigation, deep crustal structure, rock mechanics, seismology, and data exchange. Operational earthquake prediction experiments are currently being developed at two primary sites: western Yunnan Province near the town of Xiaguan, where there are several active faults, and the northeast China plain, where the devastating 1976 Tangshan earthquake occurred.

  7. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  8. Forecasting southern california earthquakes.

    PubMed

    Raleigh, C B; Sieh, K; Sykes, L R; Anderson, D L

    1982-09-17

    Since 1978 and 1979, California has had a significantly higher frequency of moderate to large earthquakes than in the preceding 25 years. In the past such periods have also been associated with major destructive earthquakes, of magnitude 7 or greater, and the annual probability of occurrence of such an event is now 13 percent in California. The increase in seismicity is associated with a marked deviation in the pattern of strain accumulation, a correlation that is physically plausible. Although great earthquakes (magnitude greater than 7.5) are too infrequent to have clear associations with any pattern of seismicity that is now observed, the San Andreas fault in southern California has accumulated sufficient potential displacement since the last rupture in 1857 to generate a great earthquake along part or all of its length. PMID:17740956
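The quoted 13 percent annual probability compounds over time. Assuming independent years (a simplification, since real seismicity is clustered), the probability of at least one such event over a longer horizon follows directly:

```python
def cumulative_probability(annual_p, years):
    """Probability of at least one event in `years` years, assuming
    each year is independent with probability `annual_p` (a
    simplification; real earthquake occurrence is clustered)."""
    return 1.0 - (1.0 - annual_p) ** years

# With the paper's 13% annual probability of a magnitude >= 7
# California earthquake:
print(round(cumulative_probability(0.13, 10), 3))  # → 0.752
```

So under this simple independence assumption, a 13% annual probability implies roughly a three-in-four chance of at least one magnitude 7 or greater event within a decade.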

  9. To capture an earthquake

    SciTech Connect

    Ellsworth, W.L. )

    1990-11-01

    An earthquake model based on the theory of plate tectonics is presented. It is assumed that the plates behave elastically in response to slow, steady motions and the strains concentrate within the boundary zone between the plates. When the accumulated stresses exceed the bearing capacity of the rocks, the rocks break, producing an earthquake and releasing the accumulated stresses. As the steady movement of the plates continues, strain begins to reaccumulate. The cycle of strain accumulation and release is modeled using the motion of a block, pulled across a rough surface by a spring. A model earthquake can be predicted by taking into account a precursory event or the peak spring force prior to slip as measured in previous cycles. The model can be applied to faults, e.g., the San Andreas fault, if the past earthquake history of the fault and the rate of strain accumulation are known.
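The block-and-spring cycle described above can be sketched in a few lines of code. This is a minimal illustrative stick-slip simulation, not Ellsworth's actual model; all parameter values (spring stiffness, plate velocity, friction levels) are invented for demonstration.

```python
# Spring-block sketch of the earthquake cycle: a block dragged across a
# rough surface by a spring whose far end moves at constant plate velocity.
# All parameter values are illustrative, not taken from the cited report.

def spring_block(k=1.0, v_plate=1.0, static_friction=10.0,
                 dynamic_friction=4.0, dt=0.01, t_max=100.0):
    """Return (times, spring_forces, event_times) for a stick-slip cycle."""
    times, forces, events = [], [], []
    x_load, x_block = 0.0, 0.0      # load-point and block positions
    t = 0.0
    while t < t_max:
        x_load += v_plate * dt      # steady plate motion
        force = k * (x_load - x_block)  # elastic strain accumulates
        if force >= static_friction:    # bearing capacity exceeded:
            # "earthquake": block slips until force drops to dynamic level
            x_block += (force - dynamic_friction) / k
            events.append(t)
        times.append(t)
        forces.append(k * (x_load - x_block))
        t += dt
    return times, forces, events

times, forces, events = spring_block()
# Events recur quasi-periodically: each slip releases the accumulated
# stress down to the dynamic friction level, then reloading restarts,
# which is why the peak spring force of past cycles predicts the next slip.
```

Because loading here is perfectly steady and friction is constant, recurrence is exactly periodic; real faults add heterogeneity to both, which is what makes prediction hard.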

  10. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  11. Building losses assessment for Lushan earthquake utilization multisource remote sensing data and GIS

    NASA Astrophysics Data System (ADS)

    Nie, Juan; Yang, Siquan; Fan, Yida; Wen, Qi; Xu, Feng; Li, Lingling

    2015-12-01

    On 20 April 2013, a catastrophic earthquake of magnitude 7.0 struck Lushan County, northwestern Sichuan Province, China; it is known in China as the Lushan earthquake. The earthquake damaged many buildings, and the extent of building loss is one basis for emergency relief and reconstruction, so the building losses of the Lushan earthquake must be assessed. Remote sensing data and geographic information systems (GIS) can be employed for this purpose. This paper reports the building losses assessment results for the Lushan earthquake disaster using multisource remote sensing data and GIS. The assessment results indicated that 3.2% of buildings in the affected areas completely collapsed, while 12% and 12.5% of buildings were heavily damaged and slightly damaged, respectively. The completely collapsed, heavily damaged, and slightly damaged buildings were mainly located in Danling County, Hongya County, Lushan County, Mingshan County, Qionglai County, Tianquan County, and Yingjing County.

  12. The SCEC-USGS Dynamic Earthquake Rupture Code Comparison Exercise - Simulations of Large Earthquakes and Strong Ground Motions

    NASA Astrophysics Data System (ADS)

    Harris, R.

    2015-12-01

    I summarize the progress of the Southern California Earthquake Center (SCEC) and U.S. Geological Survey (USGS) Dynamic Rupture Code Comparison Group, which examines whether the results produced by multiple researchers' earthquake simulation codes agree with each other when computing benchmark scenarios of dynamically propagating earthquake ruptures. These types of computer simulations have no analytical solutions with which to compare, so we use qualitative and quantitative inter-code comparisons to check that they are operating satisfactorily. To date we have tested the codes against benchmark exercises that incorporate a range of features, including single and multiple planar faults, single rough faults, slip-weakening, rate-state, and thermal-pressurization friction, elastic and visco-plastic off-fault behavior, complete stress drops that lead to extreme ground motion, heterogeneous initial stresses, and heterogeneous material (rock) structure. Our goal is reproducibility, and we focus on the types of earthquake-simulation assumptions that have been or will be used in basic studies of earthquake physics, or in direct applications to specific earthquake hazard problems. Our group's goal is to make sure that when our earthquake-simulation codes simulate these types of earthquake scenarios, along with the resulting simulated strong ground shaking, the codes are operating as expected. For more introductory information about our group and our work, please see our group's overview papers, Harris et al., Seismological Research Letters, 2009, and Harris et al., Seismological Research Letters, 2011, along with our website, scecdata.usc.edu/cvws.

  13. Historical Earthquakes and Active Structure for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashivli, Otar

    2014-05-01

    Long-term seismic history is an important foundation for reliable assessment of seismic hazard and risk. Therefore, completeness of earthquake catalogues in their earliest historical part is very important. Surviving historical sources, as well as dedicated research from institutes, museums, libraries and archives in Georgia, the Caucasus and the Middle East, indicate a high level of seismicity that entailed numerous human casualties and destruction on the territory of Georgia during the historical period. The study and detailed analysis of these original documents and researches have allowed us to create a new catalogue of historical earthquakes of Georgia from 1250 BC to 1900 AD. The method of the study is based on a multidisciplinary approach, i.e. on the joint use of methods of history and paleoseismology, archeoseismology, seismotectonics, geomorphology, etc. We present here a new parametric catalogue of 44 historic earthquakes of Georgia and a full "descriptor" of all the phenomena described in it. The summarized map of the distribution of maximum damage in the historical period (before 1900) on the territory of Georgia, constructed on its basis, clearly shows the main features of the seismic field during this period. In particular, in the axial part and the southern slope of the Greater Caucasus there is a seismic gap, which was filled in 1991 by the strongest earthquake in Racha and its aftershocks. In addition, it is also obvious that the very high seismic activity in the central and eastern parts of the Javakheti highland is not described in historical materials, and this fact requires further searches for various kinds of sources that contain data about historical earthquakes. We hope that this catalogue will enable the creation of a new joint (instrumental and historical) parametric earthquake catalogue of Georgia and will serve to assess the real seismic hazard and risk in the country.

  14. Dynamic stress changes during earthquake rupture

    USGS Publications Warehouse

    Day, S.M.; Yu, G.; Wald, D.J.

    1998-01-01

    We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.

  15. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second-most-visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them for improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is to engage with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival of eyewitnesses at our website follows the propagation of the generated seismic waves and thus that eyewitnesses can be considered as ground motion sensors.
Flashsourcing discriminates felt

  16. On a Riesz basis of exponentials related to the eigenvalues of an analytic operator and application to a non-selfadjoint problem deduced from a perturbation method for sound radiation

    SciTech Connect

    Ellouz, Hanen; Feki, Ines; Jeribi, Aref

    2013-11-15

    In the present paper, we prove that the family of exponentials associated to the eigenvalues of the perturbed operator T(ε) := T₀ + εT₁ + ε²T₂ + … + εᵏTₖ + … forms a Riesz basis in L²(0, T), T > 0, where ε ∈ ℂ, T₀ is a closed densely defined linear operator on a separable Hilbert space H with domain D(T₀) having isolated eigenvalues with multiplicity one, while T₁, T₂, … are linear operators on H having the same domain D ⊃ D(T₀) and satisfying a specific growth inequality. After that, we generalize this result using an H-Lipschitz function. As an application, we consider a non-selfadjoint problem deduced from a perturbation method for sound radiation.

  17. Injection-induced earthquakes.

    PubMed

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard. PMID:23846903

  18. Seafloor earthquake measurement system, SEMS IV

    SciTech Connect

    Platzbecker, M.R.; Ehasz, J.P.; Franco, R.J.

    1997-07-01

    Staff of the Telemetry Technology Development Department (2664) have, in support of the U.S. Interior Department Minerals Management Service (MMS), developed and deployed the Seafloor Earthquake Measurement System IV (SEMS IV). The result of this development project is a series of three fully operational seafloor seismic monitoring systems located at the offshore platforms Eureka, Grace, and Irene. The instrument probes are embedded from three to seven feet into the seafloor and hardwired to seismic data recorders installed topside at the offshore platforms. The probes and underwater cables were designed to survive the seafloor environment with an operating life of five years. The units have been operational for two years and have produced recordings of several minor earthquakes in that time. Sandia Labs will transfer operation of SEMS IV to MMS contractors in the coming months. 29 figs., 25 tabs.

  19. Practical approaches to earthquake prediction and warning

    NASA Astrophysics Data System (ADS)

    Kisslinger, Carl

    1984-04-01

    The title chosen for this renewal of the U.S.-Japan prediction seminar series reflects optimism, perhaps more widespread in Japan than in the United States, that research on earthquake prediction has progressed to a stage at which it is appropriate to begin testing operational forecast systems. This is not to suggest that American researchers do not recognize very substantial gains in understanding earthquake processes and earthquake recurrence, but rather that we are at the point of initiating pilot prediction experiments rather than asserting that we are prepared to start making earthquake predictions in a routine mode.For the sixth time since 1964, with support from the National Science Foundation and the Japan Society for the Promotion of Science, as well as substantial support from the U.S. Geological Survey (U.S.G.S.) for participation of a good representation of its own scientists, earthquake specialists from the two countries came together on November 7-11, 1983, to review progress of the recent past and share ideas about promising directions for future efforts. If one counts the 1980 Ewing symposium on prediction, sponsored by Lamont-Doherty Geological Observatory, which, though multinational, served the same purpose, one finds a continuity in these interchanges that has made them especially productive and stimulating for both scientific communities. The conveners this time were Chris Scholz, Lamont-Doherty, for the United States and Tsuneji Rikitake, Nihon University, for Japan.

  20. Earthquake swarms on Mount Erebus, Antarctica

    NASA Astrophysics Data System (ADS)

    Kaminuma, Katsutada; Baba, Megumi; Ueki, Sadato

    1986-12-01

    Mount Erebus (3794 m), located on Ross Island in McMurdo Sound, is one of the few active volcanoes in Antarctica. A high-sensitivity seismic network has been operated by Japanese and US parties on and around the volcano since December 1980. The results of these observations show two kinds of seismic activity on Ross Island: activity concentrated near the summit of Mount Erebus associated with Strombolian eruptions, and micro-earthquake activity spread across Mount Erebus and the surrounding area. Seismicity on Mount Erebus has been quite high, usually exceeding 20 volcanic earthquakes per day; they frequently occur in swarms with daily counts exceeding 100 events. Sixteen earthquake swarms with more than 250 events per day were recorded by the seismic network during the three-year period 1982-1984, and three notable earthquake swarms among the sixteen were recognized: in October 1982 (named 82-C), March-April 1984 (84-B) and July 1984 (84-F). Swarms 84-B and 84-F have a large total number of earthquakes and a large Ishimoto-Iida "m"; hence these two swarms are presumed to constitute one of the precursor phenomena to the new eruption, which took place on 13 September 1984 and lasted a few months.

  1. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As 2009 marked the 200th anniversary of Darwin's birth, it was also 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  2. Prospective Tests of Southern California Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.

    2004-12-01

    We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res., 100, 3,943-3,959, 1995].
For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability
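The likelihood scoring this record describes, a binned rate forecast evaluated against observed counts, can be sketched as a joint Poisson log-likelihood over bins. This is a simplified illustration; the bin layout and all rate and count values below are invented, not the actual RELM grid.

```python
import math

# Simplified sketch of a RELM-style likelihood score: each forecast is a
# set of expected yearly earthquake rates per space-magnitude bin, and the
# score is the Poisson log-likelihood of the observed counts per bin.

def log_likelihood(rates, counts):
    """Joint Poisson log-likelihood of observed counts given forecast rates."""
    ll = 0.0
    for lam, n in zip(rates, counts):
        # log P(N = n | lam) = -lam + n*log(lam) - log(n!)
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

forecast_a = [0.5, 1.2, 0.1, 0.8]   # expected events per bin (illustrative)
forecast_b = [0.9, 0.9, 0.9, 0.9]
observed   = [1, 1, 0, 1]           # events actually recorded per bin

ll_a = log_likelihood(forecast_a, observed)
ll_b = log_likelihood(forecast_b, observed)
# The forecast with the higher log-likelihood is favored; the alpha and
# beta error probabilities mentioned above are then estimated from the
# distribution of this score difference under each competing forecast.
```

Here forecast A, which concentrates rate where events occurred, scores higher than the uniform forecast B.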

  3. New characteristics of intensity assessment of Sichuan Lushan "4.20" M s7.0 earthquake

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Yan, Peilei; Chen, Xiangzhao

    2014-08-01

    The rapid, accurate post-earthquake assessment of the macroscopic influence of seismic ground motion is of significance for earthquake emergency relief, post-earthquake reconstruction and scientific research. The seismic intensity distribution map released by the Lushan earthquake field team of the China Earthquake Administration (CEA) five days after the strong earthquake (M7.0) that occurred in Lushan County of Ya'an City, Sichuan, at 8:02 on April 20, 2013 provides a scientific basis for emergency relief, economic loss assessment and post-earthquake reconstruction. In this paper, the means for blind estimation of macroscopic intensity, field estimation of macroscopic intensity, and review of intensity, as well as corresponding problems, are discussed in detail, and the intensity distribution characteristics of the Lushan "4.20" M7.0 earthquake and its influential factors are analyzed, providing a reference for future seismic intensity assessments.

  4. Trial application of guidelines for nuclear plant response to an earthquake. Final report

    SciTech Connect

    Schmidt, W.; Oliver, R.; O'Connor, W.

    1993-09-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. These guidelines are published in EPRI report NP-6695, "Guidelines for Nuclear Plant Response to an Earthquake," dated December 1989. This report includes two sets of nuclear plant procedures which were prepared to implement the guidelines of EPRI report NP-6695. The first set was developed by the Toledo Edison Company Davis-Besse plant. Davis-Besse is a pressurized water reactor (PWR) and contains relatively standard seismic monitoring instrumentation typical of many domestic nuclear plants. The second set of procedures was prepared by Yankee Atomic Electric Company for the Vermont Yankee facility. This plant is a boiling water reactor (BWR) with state-of-the-art seismic monitoring and PC-based data processing equipment, including software developed specifically to implement the OBE Exceedance Criterion presented in EPRI report NP-5930, "A Criterion for Determining Exceedance of the Operating Basis Earthquake." The two sets of procedures are intended to demonstrate how two different nuclear utilities have interpreted and applied the EPRI guidance given in report NP-6695.

  5. Triggering of repeated earthquakes

    NASA Astrophysics Data System (ADS)

    Sobolev, G. A.; Zakrzhevskaya, N. A.; Sobolev, D. G.

    2016-03-01

    Based on the analysis of the world's earthquakes with magnitudes M ≥ 6.5 for 1960-2013, it is shown that they cause global-scale coherent seismic oscillations which most distinctly manifest themselves in the period interval of 4-6 min during 1-3 days after the event. After these earthquakes, a repeated shock has an increased probability of occurring in different seismically active regions located as far away as a few thousand km from the previous event, i.e., a remote interaction of seismic events takes place. The number of repeated shocks N(t) decreases with time, which characterizes the memory of the lithosphere of the impact that has occurred. The time decay N(t) can be approximated by linear, exponential, and power-law dependences. No distinct correlation between the spatial locations of the initial and repeated earthquakes is revealed. The probable triggering mechanisms of the remote interaction between the earthquakes are discussed. Surface seismic waves traveling several times around the Earth, coherent oscillations, and a global source are the most likely candidates. This may lead to the accumulation and coalescence of ruptures in the highly stressed or weakened domains of a seismically active region, which increases the probability of a repeated earthquake.
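Fitting the decay laws mentioned above can be sketched for the power-law case with an ordinary least-squares fit in log-log space. The daily counts below are synthetic, made up to follow roughly N(t) ≈ 100/t, and are not the study's data.

```python
import math

# Illustrative power-law fit N(t) = a * t**(-p) to synthetic repeated-shock
# counts, via linear regression of log N against log t.

def fit_power_law(ts, ns):
    """Least-squares fit of log N = log a - p * log t; returns (a, p)."""
    xs = [math.log(t) for t in ts]
    ys = [math.log(n) for n in ns]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return math.exp(mean_y - slope * mean_x), -slope

days = [1, 2, 3, 4, 5]
counts = [100, 50, 33, 25, 20]      # synthetic, roughly N(t) ~ 100 / t
a, p = fit_power_law(days, counts)
# For these synthetic counts the fitted exponent p comes out close to 1.
```

The same regression applied to (t, log N) instead of (log t, log N) would test the exponential law, and a plain fit of (t, N) the linear one; comparing residuals selects among the three dependences.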

  6. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  7. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
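The core aggregation step of an exposure estimate like the one this record describes can be sketched as summing gridded population over shaking-intensity bins. This is a hypothetical illustration of the general approach, not PAGER's actual implementation; the grid cells, population figures, and bin edges are all invented.

```python
# Hypothetical sketch of population-exposure aggregation: sum gridded
# population over shaking-intensity (MMI) bins. Data values are invented.

def exposure_by_intensity(cells, bins=(4, 5, 6, 7, 8, 9)):
    """cells: iterable of (mmi, population). Returns {mmi_bin: people}."""
    exposure = {b: 0 for b in bins}
    for mmi, pop in cells:
        for b in reversed(bins):        # assign to the highest bin reached
            if mmi >= b:
                exposure[b] += pop
                break
    return exposure

# Invented grid: (estimated MMI at cell, population of cell)
grid = [(7.2, 12000), (6.1, 54000), (5.4, 230000), (4.2, 1200000)]
print(exposure_by_intensity(grid))
# {4: 1200000, 5: 230000, 6: 54000, 7: 12000, 8: 0, 9: 0}
```

Quantitative loss models then weight each bin's exposure by a regionally calibrated vulnerability to estimate fatalities and injuries.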

  8. The recent tectonic stress districts and strong earthquakes in China

    NASA Astrophysics Data System (ADS)

    Xie, F.; Zhang, H.

    2010-12-01

    According to the stress state and force source character, the recent tectonic stress field of China is preliminarily divided into four classes. Among them, there are two first-order districts, four second-order districts, five third-order districts and twenty-six fourth-order districts. By analyzing those tectonic stress districts and strong earthquakes, the close relation between them is mainly summarized as follows: (1) The boundaries of stress districts, especially the first- or second-order boundaries controlled by the interaction of tectonic plates, experience strong earthquakes very easily and frequently. (2) Stress districts where the stress direction, regime type and stress value transform are concentration zones of strong earthquakes. (3) Stress districts with local stress differentiation within a homogeneous stress background are the places where strong earthquakes are relatively concentrated. On the basis of this research work, we discuss the present dynamic environment in China in terms of force sources and plate movement.

  9. Estimating surface faulting impacts from the shakeout scenario earthquake

    USGS Publications Warehouse

    Treiman, J.A.; Pontib, D.J.

    2011-01-01

    An earthquake scenario, based on a kinematic rupture model, has been prepared for a Mw 7.8 earthquake on the southern San Andreas Fault. The rupture distribution, in the context of other historic large earthquakes, is judged reasonable for the purposes of this scenario. This model is used as the basis for generating a surface rupture map and for assessing potential direct impacts on lifelines and other infrastructure. Modeling the surface rupture involves identifying fault traces on which to place the rupture, assigning slip values to the fault traces, and characterizing the specific displacements that would occur to each lifeline impacted by the rupture. Different approaches were required to address variable slip distribution in response to a variety of fault patterns. Our results, involving judgment and experience, represent one plausible outcome and are not predictive because of the variable nature of surface rupture. © 2011, Earthquake Engineering Research Institute.

  10. Earthquakes: Megathrusts and mountain building

    NASA Astrophysics Data System (ADS)

    Briggs, Rich

    2016-05-01

    Coastlines above subduction zones slowly emerge from the sea despite repeated drowning by great, shallow earthquakes. Analysis of the Chilean coast suggests that moderate-to-large, deeper earthquakes may be responsible for the net uplift.

  11. Distribution of similar earthquakes in aftershocks of inland earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, M.; Hiramatsu, Y.; Aftershock Observations Of 2007 Noto Hanto, G.

    2010-12-01

    Frictional properties control the slip behavior on a fault surface, such as seismic slip and aseismic slip. An asperity, as a seismic slip area, is characterized by strong coupling in the interseismic period and large coseismic slip. On the other hand, steady slip or afterslip occurs in an aseismic slip area around the asperity. If an afterslip area includes small asperities, the repeated rupture of a single asperity can generate similar earthquakes due to the stress accumulation caused by the afterslip. We here investigate the detailed distribution of similar earthquakes in the aftershocks of the 2007 Noto Hanto earthquake (Mjma 6.9) and the 2000 Western Tottori earthquake (Mjma 7.3), large inland earthquakes in Japan. We use the data obtained by the group for the aftershock observations of the 2007 Noto Hanto earthquake and by the group for the aftershock observations of the 2000 Western Tottori earthquake. First, we select pairs of aftershocks whose cross-correlation coefficients in a 10 s time window of band-pass-filtered waveforms of 1-4 Hz are greater than 0.95 at more than 5 stations and divide those into groups by a link of the cross-correlation coefficients. Second, we reexamine the arrival times of P and S waves and the maximum amplitude for the earthquakes of each group and apply the double-difference method (Waldhauser and Ellsworth, 2000) to relocate them. As a result of the analysis, we find 24 groups of similar earthquakes in the aftershocks on the source fault of the 2007 Noto Hanto earthquake and 86 groups of similar earthquakes in the aftershocks on the source fault of the 2000 Western Tottori earthquake. Most of them are distributed around or outside the asperity of the main shock. Geodetic studies reported that postseismic deformation was detected for both earthquakes (Sagiya et al., 2002; Hashimoto et al., 2008). The source areas of the similar earthquakes seem to correspond to the afterslip area.
These features suggest that the similar earthquakes observed
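    The waveform-similarity grouping described in this abstract (pairwise cross-correlation above a 0.95 threshold, then linking correlated pairs into groups) can be sketched as follows. This is an illustrative reconstruction with synthetic single-station waveforms, not the authors' code; the 0.95 threshold is from the abstract, while the waveform length, noise level, and the omission of band-pass filtering and the multi-station check are simplifications.

```python
import numpy as np

def normalized_xcorr(a, b):
    """Zero-lag normalized cross-correlation of two equal-length waveforms."""
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def group_by_link(n_events, pairs):
    """Union-find: events connected by any qualifying pair share a group."""
    parent = list(range(n_events))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i
    for i, j in pairs:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
    groups = {}
    for i in range(n_events):
        groups.setdefault(find(i), []).append(i)
    # Only multi-event groups count as "similar earthquake" families.
    return [g for g in groups.values() if len(g) > 1]

# Synthetic example: events 0 and 1 share a waveform shape, event 2 differs.
rng = np.random.default_rng(0)
template = rng.standard_normal(1000)
waveforms = [template + 0.01 * rng.standard_normal(1000),
             template + 0.01 * rng.standard_normal(1000),
             rng.standard_normal(1000)]

THRESH = 0.95
pairs = [(i, j)
         for i in range(len(waveforms))
         for j in range(i + 1, len(waveforms))
         if normalized_xcorr(waveforms[i], waveforms[j]) > THRESH]
print(group_by_link(len(waveforms), pairs))  # events 0 and 1 form one group
```

In a real workflow the qualifying pairs would come from band-pass filtered, time-windowed station records, and a pair would only count if the threshold were exceeded at several stations; the linking step is the same.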

  12. Large magnitude (M > 7.5) offshore earthquakes in 2012: few examples of absent or little tsunamigenesis, with implications for tsunami early warning

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Tinti, Stefano

    2013-04-01

    We consider some examples of offshore earthquakes that occurred worldwide in 2012 and were characterised by a "large" magnitude (Mw equal to or larger than 7.5) but produced no or little tsunami effects. Here, "little" means "lower than expected on the basis of the parent earthquake magnitude". The examples we analyse include three earthquakes that occurred along the Pacific coast of Central America (20 March, Mw=7.8, Mexico; 5 September, Mw=7.6, Costa Rica; 7 November, Mw=7.5, Mexico), the Mw=7.6 and Mw=7.7 earthquakes that occurred on 31 August offshore the Philippines and on 28 October offshore Alaska, respectively, and the two Indian Ocean earthquakes registered on a single day (11 April) with Mw=8.6 and Mw=8.2. For each event, we approach the problem of its tsunamigenic potential from two different perspectives. The first is purely scientific and coincides with the question: why was the ensuing tsunami so weak? The answer can be related partly to the particular tectonic setting of the source area, partly to the position of the source with respect to the coastline, and finally to the focal mechanism of the earthquake and the slip distribution on the ruptured fault. The first two pieces of information are available soon after the earthquake occurs, while the third requires time periods on the order of tens of minutes. The second perspective is more "operational" and coincides with the tsunami early warning perspective, for which the question is: will the earthquake generate a significant tsunami and, if so, where will it strike? The Indian Ocean events of 11 April 2012 are perfect examples of the fact that information on the earthquake magnitude and position alone may not be sufficient to produce reliable tsunami warnings. We emphasise that it is of utmost importance that the focal mechanism determination be obtained in the future much more quickly than at present and that this

  13. Testing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Luen, Brad; Stark, Philip B.

    2008-01-01

    Statistical tests of earthquake predictions require a null hypothesis to model occasional chance successes. To define and quantify 'chance success' is knotty. Some null hypotheses ascribe chance to the Earth: Seismicity is modeled as random. The null distribution of the number of successful predictions - or any other test statistic - is taken to be its distribution when the fixed set of predictions is applied to random seismicity. Such tests tacitly assume that the predictions do not depend on the observed seismicity. Conditioning on the predictions in this way sets a low hurdle for statistical significance. Consider this scheme: When an earthquake of magnitude 5.5 or greater occurs anywhere in the world, predict that an earthquake at least as large will occur within 21 days and within an epicentral distance of 50 km. We apply this rule to the Harvard centroid-moment-tensor (CMT) catalog for 2000-2004 to generate a set of predictions. The null hypothesis is that earthquake times are exchangeable conditional on their magnitudes and locations and on the predictions - a common "nonparametric" assumption in the literature. We generate random seismicity by permuting the times of events in the CMT catalog. We consider an event successfully predicted only if (i) it is predicted and (ii) there is no larger event within 50 km in the previous 21 days. The P-value for the observed success rate is <0.001: The method successfully predicts about 5% of earthquakes, far better than 'chance' because the predictor exploits the clustering of earthquakes - occasional foreshocks - which the null hypothesis lacks. Rather than condition on the predictions and use a stochastic model for seismicity, it is preferable to treat the observed seismicity as fixed, and to compare the success rate of the predictions to the success rate of simple-minded predictions like those just described. If the proffered predictions do no better than a simple scheme, they have little value.
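    The permutation test the abstract describes (hold predictions and locations fixed, permute event times, and compare the observed success rate to its null distribution) can be sketched on a toy catalog. This is a hedged one-dimensional illustration of the logic only, not the authors' analysis: real tests use the CMT catalog, magnitudes, and great-circle distances, and every parameter below (window lengths, catalog size, clustering strength) is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy catalog: times in days, positions in km along a line.
# 150 background events plus 50 "aftershocks" placed close in time and
# space to randomly chosen parents, mimicking earthquake clustering.
n_bg = 150
t_bg = rng.uniform(0, 3000, n_bg)
x_bg = rng.uniform(0, 2000, n_bg)
parents = rng.integers(0, n_bg, 50)
t_af = t_bg[parents] + rng.uniform(0, 10, 50)
x_af = x_bg[parents] + rng.uniform(-20, 20, 50)
times = np.concatenate([t_bg, t_af])
pos = np.concatenate([x_bg, x_af])

def success_rate(times, pos, dt=21.0, dx=50.0):
    """Fraction of events preceded within dt days and dx km by another
    event, i.e. events the automatic prediction scheme would claim."""
    lag = times[:, None] - times[None, :]
    hit = (lag > 0) & (lag <= dt) & (np.abs(pos[:, None] - pos[None, :]) <= dx)
    return hit.any(axis=1).mean()

observed = success_rate(times, pos)

# Null hypothesis: event times are exchangeable given locations, so
# permute the times, keep positions fixed, and recompute the rate.
perm = np.array([success_rate(rng.permutation(times), pos)
                 for _ in range(499)])
p_value = (1 + (perm >= observed).sum()) / (1 + len(perm))
print(f"observed={observed:.2f}  null mean={perm.mean():.2f}  p={p_value:.3f}")
```

Because the toy catalog is clustered and the permuted catalogs are not, the observed success rate beats essentially every permuted rate, reproducing the abstract's point: a clustering-exploiting rule looks far better than "chance" under this null even though it carries no real predictive insight.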

  14. Slow earthquakes triggered by typhoons.

    PubMed

    Liu, ChiChing; Linde, Alan T; Sacks, I Selwyn

    2009-06-11

    The first reports on a slow earthquake were for an event in the Izu peninsula, Japan, on an intraplate, seismically active fault. Since then, many slow earthquakes have been detected. It has been suggested that slow events may trigger ordinary earthquakes (in a context supported by numerical modelling), but their broader significance in terms of earthquake occurrence remains unclear. Triggering of earthquakes has received much attention: strain diffusion from large regional earthquakes has been shown to influence large earthquake activity, and earthquakes may be triggered during the passage of teleseismic waves, a phenomenon now recognized as being common. Here we show that, in eastern Taiwan, slow earthquakes can be triggered by typhoons. We model the largest of these earthquakes as repeated episodes of slow slip on a reverse fault just under land and dipping to the west; the characteristics of all events are sufficiently similar that they can be modelled with minor variations of the model parameters. Lower atmospheric pressure results in a very small unclamping of the fault, which must be close to the failure condition for the typhoon to act as a trigger. This area experiences very high compressional deformation but has a paucity of large earthquakes; repeating slow events may be segmenting the stressed area and thus inhibiting large earthquakes, which require a long, continuous seismic rupture. PMID:19516339

  15. Turkish Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training about earthquakes received in primary schools is considered…

  16. Seismic survey probes urban earthquake hazards in Pacific Northwest

    USGS Publications Warehouse

    Fisher, M.A.; Brocher, T.M.; Hyndman, R.D.; Trehu, A.M.; Weaver, C.S.; Creager, K.C.; Crosson, R.S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B.C.; Hammer, P.T.; Childs, J. R.; Cochrane, G.R.; Chopra, S.; Walia, R.

    1999-01-01

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  17. East Texas's biggest earthquake may have been induced

    NASA Astrophysics Data System (ADS)

    Schultz, Colin

    2014-05-01

    Aside from a few small events, east Texas has been largely devoid of earthquakes. However, operations began in 2006 to pump waste water from oil and gas production into wells around the region, with some sites being injected with just shy of 43,000 cubic meters of water per month. Within a few years, this seismic quiet zone began to feel some temblors: In 2008 the preshocks started, small events with magnitudes from 0.5 to 2.2. Then, on 10 May 2012, a magnitude 3.9 earthquake hit, chased a week later by a magnitude 4.8 earthquake.

  18. Seismic survey probes urban earthquake hazards in Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Fisher, M. A.; Brocher, T. M.; Hyndman, R. D.; Trehu, A. M.; Weaver, C. S.; Creager, K. C.; Crosson, R. S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B. C.; Hammer, P. T.; ten Brink, U.; Pratt, T. L.; Miller, K. C.; Childs, J. R.; Cochrane, G. R.; Chopra, S.; Walia, R.

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region.The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  19. Reducing the Risks of Nonstructural Earthquake Damage: A Practical Guide. Earthquake Hazards Reduction Series 1.

    ERIC Educational Resources Information Center

    Reitherman, Robert

    The purpose of this booklet is to provide practical information to owners, operators, and occupants of office and commercial buildings on the vulnerabilities posed by earthquake damage to nonstructural items and the means available to deal with these potential problems. Examples of dangerous nonstructural damages that have occurred in past…

  20. Forecasters of earthquakes

    NASA Astrophysics Data System (ADS)

    Maximova, Lyudmila

    1987-07-01

    For the first time, Soviet scientists have set up a bioseismological proving ground that will stage a systematic, extensive experiment in using birds, ants, and mountain rodents, including marmots, which dig burrows to depths of 50 meters, for the purpose of earthquake forecasting. Biologists have accumulated extensive experimental data on the impact of various electromagnetic fields, including fields of weak intensity, on living organisms. As far as mammals are concerned, electromagnetic waves with frequencies close to the brain's biorhythms have the strongest effect. How these observations can be used to forecast earthquakes is discussed.

  1. Earthquakes in New England

    USGS Publications Warehouse

    Fratto, E. S.; Ebel, J.E.; Kadinsky-Cade, K.

    1990-01-01

    New England has a long history of earthquakes. Some of the first explorers were startled when they experienced strong shaking and rumbling of the earth below their feet. They soon learned from the Indians that this was not an uncommon occurrence in the New World. The Plymouth Pilgrims felt their first earthquake in 1638. That first shock rattled dishes, doors, and buildings. The shaking so frightened those working in the fields that they threw down their tools and ran panic-stricken through the countryside. 

  2. California earthquake history

    USGS Publications Warehouse

    Toppozada, T.; Branum, D.

    2004-01-01

    This paper presents an overview of the advancement in our knowledge of California's earthquake history since ~1800, and especially during the last 30 years. We first review the basic statewide research on earthquake occurrences that was published from 1928 through 2002, to show how the current catalogs and their levels of completeness have evolved with time. Then we review some of the significant new results in specific regions of California, and some of what remains to be done. Since 1850, 167 potentially damaging earthquakes of M ~6 or larger have been identified in California and its border regions, indicating an average rate of 1.1 such events per year. Table I lists the earthquakes of M ~6 to 6.5 that were also destructive since 1812 in California and its border regions, indicating an average rate of one such event every ~5 years. Many of these occurred before 1932, when epicenters and magnitudes started to be determined routinely using seismographs in California. The record of these early earthquakes is probably incomplete in sparsely populated remote parts of California before ~1870. For example, 6 of the 7 pre-1873 events in table I are of M ~7, suggesting that other earthquakes of M 6.5 to 6.9 occurred but were not properly identified, or were not destructive. The epicenters and magnitudes (M) of the pre-instrumental earthquakes were determined from isoseismal maps that were based on the Modified Mercalli Intensity of shaking (MMI) at the communities that reported feeling the earthquakes. The epicenters were estimated to be in the regions of most intense shaking, and values of M were estimated from the extent of the areas shaken at various MMI levels. MMI VII or greater shaking is the threshold of damage to weak buildings. Certain areas in the regions of Los Angeles, San Francisco, and Eureka have each been shaken at MMI VII or greater at least six times since ~1812, as depicted by Toppozada and Branum (2002, fig. 19).

  3. Headaches prior to earthquakes

    NASA Astrophysics Data System (ADS)

    Morton, L. L.

    1988-06-01

    In two surveys of headaches it was noted that their incidence had increased significantly within 48 h prior to earthquakes from an incidence of 17% to 58% in the first survey using correlated samples and from 20.4% to 44% in the second survey using independent samples. It is suggested that an increase in positive air ions from rock compression may trigger head pain via a decrease in brain levels of the neurotransmitter serotonin. The findings are presented as preliminary, with the hope of generating further research efforts in areas more prone to earthquakes.

  4. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    NASA Astrophysics Data System (ADS)

    Di Giacomo, Domenico; Engdahl, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product for use in seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require earthquake catalogues that are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue, both for earthquakes that occurred after 2009 and for earthquakes listed in the International Seismological Summary (ISS) that fell below the cut-off magnitude of 6.25. This extension is part of a four-year programme that aims to include in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as the basis for cross-checking the consistency in location and magnitude of earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.

  5. WEST Physics Basis

    NASA Astrophysics Data System (ADS)

    Bourdelle, C.; Artaud, J. F.; Basiuk, V.; Bécoulet, M.; Brémond, S.; Bucalossi, J.; Bufferand, H.; Ciraolo, G.; Colas, L.; Corre, Y.; Courtois, X.; Decker, J.; Delpech, L.; Devynck, P.; Dif-Pradalier, G.; Doerner, R. P.; Douai, D.; Dumont, R.; Ekedahl, A.; Fedorczak, N.; Fenzi, C.; Firdaouss, M.; Garcia, J.; Ghendrih, P.; Gil, C.; Giruzzi, G.; Goniche, M.; Grisolia, C.; Grosman, A.; Guilhem, D.; Guirlet, R.; Gunn, J.; Hennequin, P.; Hillairet, J.; Hoang, T.; Imbeaux, F.; Ivanova-Stanik, I.; Joffrin, E.; Kallenbach, A.; Linke, J.; Loarer, T.; Lotte, P.; Maget, P.; Marandet, Y.; Mayoral, M. L.; Meyer, O.; Missirlian, M.; Mollard, P.; Monier-Garbet, P.; Moreau, P.; Nardon, E.; Pégourié, B.; Peysson, Y.; Sabot, R.; Saint-Laurent, F.; Schneider, M.; Travère, J. M.; Tsitrone, E.; Vartanian, S.; Vermare, L.; Yoshida, M.; Zagorski, R.; Contributors, JET

    2015-06-01

    With WEST (Tungsten Environment in Steady State Tokamak) (Bucalossi et al 2014 Fusion Eng. Des. 89 907-12), the Tore Supra facility and team expertise (Dumont et al 2014 Plasma Phys. Control. Fusion 56 075020) are used to pave the way towards ITER divertor procurement and operation. The project consists of implementing a divertor configuration and installing ITER-like actively cooled tungsten monoblocks in the Tore Supra tokamak, taking full advantage of its unique long-pulse capability. WEST is a user facility platform, open to all ITER partners. This paper describes the physics basis of WEST: the estimated heat flux on the divertor target, the planned heating schemes, the expected behaviour of the L-H threshold and of the pedestal, and the potential W sources. A series of operating scenarios has been modelled, showing that ITER-relevant heat fluxes on the divertor can be achieved in WEST long-pulse H-mode plasmas.

  6. Earthquake prediction comes of age

    SciTech Connect

    Lindh, A. (Office of Earthquakes, Volcanoes, and Engineering)

    1990-02-01

    In the last decade, scientists have begun to estimate the long-term probability of major earthquakes along the San Andreas fault. In 1985, the U.S. Geological Survey (USGS) issued the first official U.S. government earthquake prediction, based on research along a heavily instrumented 25-kilometer section of the fault in sparsely populated central California. Known as the Parkfield segment, this section of the San Andreas had experienced its last big earthquake, a magnitude 6, in 1966. Estimated probabilities of major quakes along the entire San Andreas by a working group of California earthquake experts, using new geologic data and careful analysis of past earthquakes, are reported.

  7. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better

  8. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  9. Fractal dynamics of earthquakes

    SciTech Connect

    Bak, P.; Chen, K.

    1995-05-01

    Many objects in nature, from mountain landscapes to electrical breakdown and turbulence, have a self-similar fractal spatial structure. It seems obvious that to understand the origin of self-similar structures, one must understand the nature of the dynamical processes that created them: temporal and spatial properties must necessarily be completely interwoven. This is particularly true for earthquakes, which have a variety of fractal aspects. The distribution of energy released during earthquakes is given by the Gutenberg-Richter power law. The distribution of epicenters appears to be fractal with dimension D ≈ 1-1.3. The number of aftershocks decays with time according to the Omori power law. There have been several attempts to explain the Gutenberg-Richter law by starting from a fractal distribution of faults or stresses. But this is a hen-and-egg approach: to explain the Gutenberg-Richter law, one assumes the existence of another power law, the fractal distribution. The authors present results of a simple stick-slip model of earthquakes, which evolves to a self-organized critical state. Emphasis is on demonstrating that empirical power laws for earthquakes indicate that the Earth's crust is at the critical state, with no typical time, space, or energy scale. Of course the model is tremendously oversimplified; however, in analogy with equilibrium phenomena, the authors do not expect criticality to depend on details of the model (universality).
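    The abstract's stick-slip model is not reproduced here; as an illustration of self-organized criticality, the sketch below drives the closely related Bak-Tang-Wiesenfeld sandpile to its critical state, where event ("avalanche") sizes span many scales with no characteristic size, in loose analogy to the Gutenberg-Richter law. Grid size, grain count, and the toppling threshold of 4 follow the standard sandpile convention; this is a textbook model, not the authors' simulation.

```python
import numpy as np

def sandpile_avalanches(size=16, grains=5000, seed=1):
    """Drive a BTW sandpile and record avalanche sizes (topplings per grain)."""
    rng = np.random.default_rng(seed)
    z = np.zeros((size, size), dtype=int)
    sizes = []
    for _ in range(grains):
        i, j = rng.integers(0, size, 2)  # drop one grain at a random site
        z[i, j] += 1
        topplings = 0
        while (z >= 4).any():
            # Topple every currently unstable site once; repeat until stable.
            for a, b in np.argwhere(z >= 4):
                z[a, b] -= 4
                topplings += 1
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    na, nb = a + da, b + db
                    if 0 <= na < size and 0 <= nb < size:
                        z[na, nb] += 1  # grains falling off the edge are lost
        sizes.append(topplings)
    return np.array(sizes)

sizes = sandpile_avalanches()
nonzero = sizes[sizes > 0]
# In the critical state, avalanche sizes range from single topplings to
# system-spanning events, with no typical scale.
print(nonzero.min(), nonzero.max())
```

A histogram of `nonzero` on log-log axes shows the power-law tail that motivates the analogy with earthquake statistics.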

  10. HOMOGENEOUS CATALOGS OF EARTHQUAKES*

    PubMed Central

    Knopoff, Leon; Gardner, J. K.

    1969-01-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967. PMID:16578700
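    The paper's randomness test is not reproduced here; as a hedged illustration of the same problem (finding the magnitude above which a catalog is unbiased), the sketch below applies the commonly used maximum-curvature estimate of the completeness magnitude to a synthetic Gutenberg-Richter catalog with simulated detection loss below M 3.5. The b-value, the detection ramp, and all other parameters are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "true" catalog: Gutenberg-Richter magnitudes with b = 1
# above M 2.0 (exponential with rate b * ln(10)).
mags = 2.0 + rng.exponential(scale=1.0 / np.log(10), size=50_000)

# Simulated detection loss: the probability of recording an event ramps
# up steeply from 0 at M 2.0 to 1 at M 3.5 (purely illustrative).
p_detect = np.clip((mags - 2.0) / 1.5, 0.0, 1.0) ** 4
recorded = mags[rng.random(mags.size) < p_detect]

# Maximum-curvature estimate: take the completeness magnitude as the
# most populated 0.1-unit magnitude bin of the recorded catalog.
counts, edges = np.histogram(recorded, bins=np.arange(2.0, 7.0, 0.1))
mc = edges[np.argmax(counts)]
print(f"estimated completeness magnitude ~ M {mc:.1f}")
```

Analyses like the subcatalog comparison in the abstract would then restrict attention to events at or above the estimated cutoff.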

  11. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  12. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion, due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  13. Subcrustal earthquakes in the plate boundary zone of New Zealand's South Island

    NASA Astrophysics Data System (ADS)

    Boese, C. M.; Stern, T. A.; Townend, J.; Sheehan, A. F.; Molnar, P. H.; Collins, J. A.; Karalliyadda, S.; Bourguignon, S.; Bannister, S. C.

    2012-12-01

    Sporadic, intermediate-depth earthquakes have been observed for ~40 years in the vicinity of the Alpine Fault, a 460 km-long transpressive fault forming the western boundary of the Southern Alps. The Alpine Fault represents the plate boundary between the Australian and Pacific Plates in New Zealand and links two subduction zones of opposite polarity to the north and south. Several earthquakes at depths of 59-85 km have been recorded by the Southern Alps Microearthquake Borehole Array (SAMBA) since its deployment in November 2008. Owing to the large number of impulsive phase arrivals, focal mechanisms were obtained for these events during routine processing. In 2009 and early 2010, several additional temporary seismometer networks were operating in the central Southern Alps (Alpine Fault Array ALFA, Deep Fault Drilling Project 2010 DFDP10) and in the offshore region west of the South Island (Marine Observations of Anisotropy Near Aotearoa MOANA). To gain more insight into the cause and mechanism of these deep events, a comprehensive analysis has been performed incorporating data from all available instruments. Accurate hypocentres of 22 earthquakes (ML<4) and focal mechanisms of at least 14 events have been obtained. The focal mechanisms reveal that reverse faulting predominates at depth in the continental collision zone between the Pacific and Australian Plates. The intermediate-depth events occur below the Moho discontinuity, which has been mapped in detail using wide-angle reflection/refraction data obtained during the South Island Geophysical Transect (SIGHT) project in 1995/96. Although the cause of these subcrustal earthquakes is not yet clear, they have previously been interpreted to result from intra-continental subduction (Reyners 1987) and from high shear-strain gradients due to depressed geotherms and viscous deformation of mantle lithosphere (Kohler and Eberhart-Phillips 2003). On the basis of the locations and mechanisms obtained using SAMBA, we have argued that

  14. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Mayeda, K.; Walter, W. R.

    2003-04-01

    There is currently a disagreement within the geophysical community on the way earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g. Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set of recent papers finds the apparent stress increases with magnitude (e.g. Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by applying the same methodology to a series of datasets that spans roughly 10 orders of magnitude in seismic moment, M0. We will summarize recent results using the coda envelope methodology of Mayeda et al. (2003), which provides the most stable source spectral estimates to date. This methodology eliminates the complicating effects of lateral path heterogeneity, source radiation pattern, directivity, and site response (e.g., amplification, f-max and kappa). We find that in tectonically active continental crustal areas the total radiated energy per unit moment scales as M0^0.25, whereas in regions of relatively younger oceanic crust, the stress drop is generally lower and the energy exhibits a 1-to-1 scaling with moment. In addition to answering a fundamental question in earthquake source dynamics, this study addresses how one would scale small earthquakes in a particular region up to a future, more damaging earthquake. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.
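    The competing scalings can be made concrete with the definition of apparent stress, sigma_a = mu * E_R / M0: constant apparent stress corresponds to radiated energy E_R growing 1-to-1 with moment, while E_R growing as M0^1.25 gives sigma_a growing as M0^0.25. A minimal numeric sketch follows; the shear modulus value and the constant c are illustrative assumptions, not values from the paper.

```python
# Apparent stress: sigma_a = mu * E_R / M0, with mu the shear modulus,
# E_R the radiated seismic energy (J), and M0 the seismic moment (N*m).
MU = 3.0e10  # Pa, a typical crustal shear modulus (assumed)

def apparent_stress(energy_j, moment_nm, mu=MU):
    """Apparent stress in Pa from radiated energy and seismic moment."""
    return mu * energy_j / moment_nm

# Non-self-similar case: E_R = c * M0**1.25, so sigma_a grows as M0**0.25,
# i.e. each factor of 10 in moment raises sigma_a by 10**0.25 ~ 1.78.
c = 3e-9  # J per (N*m)**1.25, illustrative only
for m0 in (1e15, 1e17, 1e19, 1e21):
    e_r = c * m0**1.25
    print(f"M0 = {m0:.0e} N*m -> sigma_a = {apparent_stress(e_r, m0)/1e6:.2f} MPa")
```

Replacing the exponent 1.25 with 1.0 makes every printed apparent stress identical, which is the constant-apparent-stress hypothesis of the first set of papers.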

  15. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    NASA Astrophysics Data System (ADS)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in the tectonic regions of the world, especially in Japan. Earthquakes often damage crucial lifeline services such as the water, gas and electricity supply systems, and even the sewage system, in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for drinking, cooking and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a mega city like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of a cooperative during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and it is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing a continuous monitoring system for groundwater quantity and quality during non-emergency periods.

  16. Monitoring road losses for Lushan 7.0 earthquake disaster utilization multisource remote sensing images

    NASA Astrophysics Data System (ADS)

    Huang, He; Yang, Siquan; Li, Suju; He, Haixia; Liu, Ming; Xu, Feng; Lin, Yueguan

    2015-12-01

    Earthquakes are among the major natural disasters in the world. At 8:02 on 20 April 2013, a catastrophic earthquake of surface-wave magnitude Ms 7.0 occurred in Sichuan province, China. The epicenter of this earthquake was located in the administrative region of Lushan County, and the event was named the Lushan earthquake. The Lushan earthquake caused heavy casualties and property losses in Sichuan province. After the earthquake, various emergency relief supplies had to be transported to the affected areas. The transportation network is the basis for the transportation and allocation of emergency relief supplies; thus, the road losses of the Lushan earthquake had to be monitored. The road losses monitoring results for the Lushan earthquake disaster, obtained utilizing multisource remote sensing images, are reported in this paper. The monitoring results indicated that 166 meters of national roads, 3707 meters of provincial roads, 3396 meters of county roads, 7254 meters of township roads, and 3943 meters of village roads were damaged during the Lushan earthquake disaster. The damaged roads were mainly located in Lushan County, Baoxing County, Tianquan County, Yucheng County, Mingshan County, and Qionglai County. The results can also be used as a decision-making information source by the disaster management government in China.

  17. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities, including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault, conducted by Bay Area County Offices of Emergency Services; 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  18. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis will consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships to be used can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be made if necessary. ELER Level 2 analysis will include calculation of direct monetary losses as a result of building damage, which will allow for repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons against different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of the post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte-Carlo type simulations and earthquake insurance applications.
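The Level 2 casualty logic described above (buildings per damage state multiplied by casualty rates per building type and damage level, summed over the inventory) can be sketched in a few lines. A minimal Python illustration; the building types, damage states, counts and rates below are hypothetical placeholders, not ELER defaults:

```python
# Sketch of the casualty estimation described in the abstract:
# casualties = sum over (building type, damage state) of count * casualty rate.
# All inventory counts and rates are invented placeholder values.

damaged_buildings = {  # geo-cell inventory: (type, damage state) -> count
    ("masonry", "moderate"): 120,
    ("masonry", "complete"): 15,
    ("rc_frame", "moderate"): 80,
    ("rc_frame", "complete"): 5,
}

casualty_rates = {  # expected casualties per damaged building (placeholder)
    ("masonry", "moderate"): 0.05,
    ("masonry", "complete"): 1.2,
    ("rc_frame", "moderate"): 0.02,
    ("rc_frame", "complete"): 0.8,
}

casualties = sum(count * casualty_rates[key]
                 for key, count in damaged_buildings.items())
print(f"estimated casualties: {casualties:.1f}")
```

The same two-dictionary structure extends naturally to repair-cost estimation by replacing the casualty rates with mean damage ratios times replacement cost per building type.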

  19. Force and pressure characteristics for a series of nose inlets at Mach numbers from 1.59 to 1.99 V : analysis and comparison on basis of ram-jet aircraft range and operational characteristics

    NASA Technical Reports Server (NTRS)

    Howard, E; Luidens, R W; Allen, J L

    1951-01-01

    Performance of four experimentally investigated axially symmetric spike-type nose inlets is compared on the basis of ram-jet-engine aircraft range and operational problems. At design conditions, calculated peak engine efficiencies varied 25 percent from the highest value, which indicates the importance of inlet design. Calculations for a typical supersonic aircraft indicate a possible increase in range if the engine is flown at a moderate angle of attack and the resulting engine lift is utilized. For engines with a fixed exhaust nozzle, propulsive thrust increases with increasing heat addition in the subcritical flow region in spite of increasing additive drag. For the perforated inlet there is a range of increasing total-temperature ratios in the subcritical flow region that does not yield an increase in propulsive thrust. Effects of inlet characteristics on the speed stability of a typical aircraft for three types of fuel control are discussed.

  20. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    NASA Astrophysics Data System (ADS)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system of Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. An unbroken part of the North Anatolian Fault Zone crosses the north of the Armutlu Peninsula in an east-west direction. This branch is also located quite close to Istanbul, a megacity in terms of its population and its economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events; then, on August 3, 2014, a local magnitude 4.1 event occurred, and more than 1000 events followed until August 31, 2014. We therefore tentatively call this a swarm-like activity. Investigation of the micro-earthquake activity of the Armutlu Peninsula has thus become important for understanding the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005, currently equipped with 27 active seismic stations, and operated by the Kocaeli University Earth and Space Sciences Research Center (ESSRC) and Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network able to record even micro-earthquakes in this region. In the 30-day period of August 2 to 31, 2014, the Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, while ARNET provided more than 1000 earthquakes for analysis in the same time period. In this study, earthquakes in the swarm area and neighbouring regions determined by ARNET were investigated. The focal mechanism of the August 3, 2014 22:22:42 (GMT) earthquake of local magnitude (Ml) 4.0 was obtained by moment tensor solution. The solution indicates normal faulting with a dextral component. The obtained focal mechanism solution is

  1. Detailed source process of the 2007 Tocopilla earthquake.

    NASA Astrophysics Data System (ADS)

    Peyrat, S.; Madariaga, R.; Campos, J.; Asch, G.; Favreau, P.; Bernard, P.; Vilotte, J.

    2008-05-01

    We investigated the detailed rupture process of the Tocopilla earthquake (Mw 7.7) of 14 November 2007, and of the main aftershocks that occurred in the southern part of the North Chile seismic gap, using strong-motion data. The earthquake happened in the middle of the permanent broadband and strong-motion network IPOC, newly installed by GFZ and IPGP, and of a digital strong-motion network operated by the University of Chile. The Tocopilla earthquake is the latest large thrust subduction earthquake to occur in this gap since the major 1877 Iquique earthquake, which produced a destructive tsunami. The Arequipa (2001) and Antofagasta (1995) earthquakes already ruptured the northern and southern parts of the gap, and the intraplate intermediate-depth Tarapaca earthquake (2005) may have changed the tectonic loading of this part of the Peru-Chile subduction zone. For large earthquakes, the depth of the seismic rupture is bounded by the depth of the seismogenic zone. What controls the horizontal extent of the rupture for large earthquakes is less clear. Factors that influence the extent of the rupture include fault geometry, variations of material properties, and stress heterogeneities inherited from the previous rupture history. For subduction zones where structures are not well known, what may have stopped the rupture is not obvious. One crucial problem raised by the Tocopilla earthquake is to understand why this earthquake didn't extend further north and, to the south, what role is played by the Mejillones Peninsula, which seems to act as a barrier. The focal mechanism was determined using teleseismic waveform inversion and a geodetic analysis (cf. Campos et al.; Bejarpi et al., in the same session). We studied the detailed source process using the available strong-motion data. This earthquake ruptured the interplate seismic zone over more than 150 km and generated several large aftershocks, mainly located south of the rupture area. The strong-motion data show clearly two S

  2. The health effects of earthquakes in the mid-1990s.

    PubMed

    Alexander, D

    1996-09-01

    This paper gives an overview of the global pattern of casualties in earthquakes which occurred during the 30-month period from 1 September 1993 to 29 February 1996. It also describes some of the behavioural and logistical regularities associated with mortality and morbidity in these events. Of 83 earthquakes studied, there were casualties in 49. Lethal earthquakes occurred in rapid succession in Indonesia, China, Colombia and Iran. In the events studied, a disproportionate number of deaths and injuries occurred during the first six hours of the day and in earthquakes with magnitudes between 6.5 and 7.4. Ratios of death to injury varied markedly (though with some averages close to 1:3), as did the nature and causes of mortality and morbidity and the proportion of serious to slight injuries. As expected on the basis of previous knowledge, few problems were caused by post-earthquake illness and disease. Also, as expected, building collapse was the principal source of casualties: tsunamis, landslides, debris flows and bridge collapses were the main secondary causes. In addition, new findings are presented on the temporal sequence of casualty estimates after seismic disaster. In synthesis, though mortality in earthquakes may have been low in relation to long-term averages, the interval of time studied was probably typical of other periods in which seismic catastrophes were relatively limited in scope. PMID:8854459

  3. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and a growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are, therefore, asked to come up with new solutions that are no longer exclusively aiming at the best possible quantification of present risks, but also keep an eye on their changes with time and allow these to be projected into the future. This does not apply to the vulnerability component of earthquake risk alone, but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  4. Earthquake Hazard Mitigation and Real-Time Warnings of Tsunamis and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2015-09-01

    With better understanding of earthquake physics and the advent of broadband seismology and GPS, seismologists can forecast the future activity of large earthquakes on a sound scientific basis. Such forecasts are critically important for long-term hazard mitigation, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainties, and unexpected events will inevitably occur. Recent developments in real-time seismology help seismologists cope with and prepare for such unexpected events, including tsunamis and earthquakes. For a tsunami warning, the required warning time is fairly long (usually 5 min or longer) and enables use of a rigorous method for this purpose. Significant advances have already been made. In contrast, early warning of earthquakes is far more challenging because the required warning time is very short (as short as three seconds). Despite this difficulty, the methods used for regional warnings have advanced substantially, and several systems have already been developed and implemented. A future strategy for more challenging, rapid (a few seconds) warnings, which are critically important for saving property and lives, is discussed.

  5. Present, Past, Future - What earthquake clusters can tell us about an upcoming Marmara Sea earthquake

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Ground acceleration maps are presented, together with an expectation for the magnitude-dependent aftershock activity, based on a correlation and extrapolation of earthquake clusters around the Marmara Sea. This type of scenario-building approach provides a more detailed basis for risk assessment and management planning, with a more realistic scenario offering better analysis and socioeconomic effect study potential in the next disaster.

  6. Non-characteristic recurrence behavior on the 1942 Niksar-Erbaa earthquake rupture along the North Anatolian fault system, Turkey

    NASA Astrophysics Data System (ADS)

    Kondo, H.; Kürçer, A.; Özalp, S.; Emre, Ö.

    2009-04-01

    The repeatability of surface slip distribution through earthquake cycles is the basis for evaluating the size and timing of future large earthquakes generated by active faults. In order to examine the characteristic slip hypothesis on the North Anatolian fault system (NAFS), we have systematically performed 3D trenching surveys on the 1944 Bolu-Gerede and the 1942 Niksar-Erbaa earthquake ruptures, to simultaneously reconstruct the timing and slip associated with paleoearthquakes. These two earthquake segments are relatively well documented in historical earthquake records indicating the timing and the rupture extent. The results suggest that 1) the NAFS is highly segmented into sections several tens of kilometers long, 2) past large earthquakes have been produced by multi-segment faulting, and 3) each fault segment seems to have its own characteristic recurrence behavior. At the Demir Tepe site on the Gerede segment, which recorded the maximum slip during the 1944 earthquake (M7.4), we revealed a repetition of ca. 5-m slips and a quasi-periodic repeat time of ca. 330 years. The reconstructed slip history supports characteristic slip behavior on the segment, although the segment has ruptured during historical earthquakes with greatly varied rupture lengths. On the other hand, at the Ayvaz site on the Niksar segment, which recorded a 2.5-m slip during the 1942 earthquake (M7.0), the preliminary results of 3D trenching exhibit a 6.0-m slip during the penultimate event, probably corresponding to the 1668 Anatolian earthquake. Since this gigantic M8 earthquake ruptured almost half of the entire NAFS, including both the 1944 and the 1942 earthquake segments, the 1942-type earthquake is not a characteristic earthquake. This non-characteristic behavior implies that large earthquakes of various sizes have occurred on the NAFS. The key to understanding multi-segment ruptures may be the recognition of such macroscopic barrier segments like the 1942 earthquake segment.
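As a quick arithmetic check on the Demir Tepe numbers quoted above, ca. 5 m of slip repeating every ca. 330 years implies a long-term slip rate of roughly 15 mm/yr:

```python
# Back-of-the-envelope slip rate implied by the trenching results above:
# ~5 m of slip per event, repeating quasi-periodically every ~330 years.
slip_per_event_m = 5.0   # from the Demir Tepe results quoted in the abstract
recurrence_yr = 330.0

slip_rate_mm_per_yr = slip_per_event_m * 1000.0 / recurrence_yr
print(f"implied slip rate: {slip_rate_mm_per_yr:.1f} mm/yr")
```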

  7. Microbiological study of pathogenic bacteria isolated from paediatric wound infections following the 2008 Wenchuan earthquake.

    PubMed

    Ran, Ying-Chun; Ao, Xiao-Xiao; Liu, Lan; Fu, Yi-Long; Tuo, Hui; Xu, Feng

    2010-05-01

    On 12 May 2008, the Wenchuan earthquake struck in Sichuan, China. Within 1 month after the earthquake, 98 injured children were admitted to the Children's Hospital of Chongqing Medical University. According to clinical manifestations, 50 children were diagnosed with wound infections. Wound secretions were cultured for bacteria. Pathogen distribution and drug resistance were analyzed. A total of 99 pathogens were isolated; 16 (16%) were Gram-positive bacteria and 81 (82%) were Gram-negative bacteria. The distribution of pathogens isolated within 1 month after the earthquake was different from the distribution of pathogens in 546 general hospitalized cases in the year before the earthquake. The pathogens most frequently isolated 1 month after the earthquake were Acinetobacter baumannii (27%), Enterobacter cloacae (18%) and Pseudomonas aeruginosa (13%). The pathogens most frequently isolated in the year prior to the earthquake were Escherichia coli (27%), Staphylococcus aureus (23%) and coagulase-negative staphylococci (9%). The rate of isolation of drug-resistant bacteria was higher in the earthquake cases than in the general hospitalized cases. In the cases injured in the earthquake, the rates of isolation of methicillin-resistant Staphylococcus aureus and extended-spectrum beta-lactamase-producing E. cloacae, E. coli and Klebsiella pneumoniae were higher than in the cases from before the earthquake. Multidrug-resistant and pandrug-resistant A. baumannii were isolated at a higher rate in cases after the earthquake than in those before the earthquake. These changes in the spectrum of pathogens and in the drug resistance of the pathogens isolated following an earthquake will provide a basis for emergency treatment after earthquakes. PMID:20095936

  8. Improving the RST Approach for Earthquake Prone Areas Monitoring: Results of Correlation Analysis among Significant Sequences of TIR Anomalies and Earthquakes (M>4) occurred in Italy during 2004-2014

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Coviello, I.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2015-12-01

    Looking toward the assessment of a multi-parametric system for dynamically updating seismic hazard estimates and earthquake short-term (from days to weeks) forecasts, a preliminary step is to identify those parameters (chemical, physical, biological, etc.) whose anomalous variations can be, to some extent, associated with the complex process of preparation of a big earthquake. Among the different parameters, fluctuations of the Earth's thermally emitted radiation, as measured by sensors on board satellite systems operating in the Thermal Infra-Red (TIR) spectral range, have long been proposed as potential earthquake precursors. Since 2001, a general approach called Robust Satellite Techniques (RST) has been used to discriminate anomalous thermal signals possibly associated with seismic activity from normal fluctuations of the Earth's thermal emission related to other causes (e.g. meteorological) independent of earthquake occurrence. Thanks to its full exportability to different satellite packages, RST has been implemented on TIR images acquired by polar (e.g. NOAA-AVHRR, EOS-MODIS) and geostationary (e.g. MSG-SEVIRI, NOAA-GOES/W, GMS-5/VISSR) satellite sensors, in order to verify the presence (or absence) of TIR anomalies in the presence (or absence) of earthquakes (with M>4) in different seismogenic areas around the world (e.g. Italy, Turkey, Greece, California, Taiwan, etc.). In this paper, a refined RST data analysis approach and the RETIRA (Robust Estimator of TIR Anomalies) index were used to identify Significant Sequences of TIR Anomalies (SSTAs) during eleven years (from May 2004 to December 2014) of TIR satellite records collected over Italy by the geostationary satellite sensor MSG-SEVIRI. On the basis of specific validation rules (mainly based on physical models and results obtained by applying the RST approach to several earthquakes all around the world), the level of space-time correlation among SSTAs and earthquakes (with M≥4
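The anomaly-detection idea behind RST and the RETIRA index, comparing the current TIR signal at a pixel against a reference mean and standard deviation built from historical imagery for the same location and observation time, can be sketched as follows. This is a simplified illustration with synthetic numbers and an example threshold, not the operational implementation:

```python
import statistics

# Sketch of the normalized-anomaly idea behind RST/RETIRA: compare the
# current TIR signal at a pixel with a reference mean and standard
# deviation built from historical imagery for the same place and time.
# The brightness-temperature values below are synthetic placeholders.

historical_tir = [291.2, 290.8, 291.5, 290.9, 291.1, 291.3, 290.7, 291.0]  # K
current_tir = 293.4  # K, current observation at the same pixel

mu = statistics.mean(historical_tir)
sigma = statistics.stdev(historical_tir)
anomaly_index = (current_tir - mu) / sigma  # RETIRA-like normalized excess

print(f"mean={mu:.2f} K  std={sigma:.2f} K  index={anomaly_index:.1f}")
if anomaly_index > 3.0:  # example threshold; operational rules differ
    print("flag as candidate TIR anomaly")
```

The operational technique additionally removes the scene-average signal before normalizing and applies the space-time persistence rules mentioned above (the SSTAs) before anything is counted as significant.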

  9. Scientific aspects of the Tohoku earthquake and Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Koketsu, Kazuki

    2016-04-01

    We investigated the 2011 Tohoku earthquake, the accident at the Fukushima Daiichi nuclear power plant, and the assessments of earthquake and tsunami potential conducted beforehand for the Pacific offshore region of the Tohoku District. The results of our investigation show that all the assessments failed to foresee the earthquake and its related tsunami, which was the main cause of the accident. Therefore, the disaster caused by the earthquake, and the accident, were scientifically unforeseeable at the time. However, for a zone neighboring the reactors, a 2008 assessment showed tsunamis higher than the plant height. As a lesson learned from the accident, companies operating nuclear power plants should prepare by using even such assessment results for neighboring zones.

  10. The Earthquake That Tweeted

    NASA Astrophysics Data System (ADS)

    Petersen, D.

    2011-12-01

    Advances in mobile technology and social networking are enabling new behaviors that were not possible even a few short years ago. When people experience a tiny earthquake, it's more likely they're going to reach for their phones and tell their friends about it than actually take cover under a desk. With 175 million Twitter accounts, 750 million Facebook users and more than five billion mobile phones in the world today, people are generating terrific amounts of data simply by going about their everyday lives. Given the right tools and guidance these connected individuals can act as the world's largest sensor network, doing everything from reporting on earthquakes to anticipating global crises. Drawing on the author's experience as a user researcher and experience designer, this presentation will discuss these trends in crowdsourcing the collection and analysis of data, and consider their implications for how the public encounters the earth sciences in their everyday lives.

  11. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method for earthquakes and, in turn, warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the direction of EM investigation. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement, either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013. The antennae have mobility and observations were noted for
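The skin-depth argument above (EM energy confined to a thin surface layer at high frequency, while long-wave signals penetrate far deeper) follows from the standard formula delta = sqrt(2*rho / (omega*mu0)). A Python sketch; the resistivity value is a generic placeholder for rock, not a measured property of the samples tested:

```python
import math

# Illustrative skin-depth calculation for EM penetration into rock:
# delta = sqrt(2 * rho / (omega * mu0)). The resistivity below is a
# placeholder typical-order value, not data from the experiments.

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def skin_depth_m(resistivity_ohm_m, freq_hz):
    """Classical EM skin depth for a conductor of the given resistivity."""
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(2.0 * resistivity_ohm_m / (omega * MU0))

# Low-frequency (long-wave) signals penetrate much deeper than high-frequency
# ones, consistent with the long-wave transmission hypothesis above.
for f in (1.0, 1e3, 1e6):
    print(f"f={f:.0e} Hz  delta={skin_depth_m(1e3, f):.0f} m")
```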

  12. Pain after earthquake

    PubMed Central

    2012-01-01

    Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results A third of patients reported pain, with a prevalence of 34.6%. More than half of pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations. PMID:22747796

  13. Fault lubrication during earthquakes.

    PubMed

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s(-1)) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s(-1). The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved. PMID:21430777
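The dramatic weakening with slip reported in these experiments is often summarized with an exponential decay of the friction coefficient from a peak toward a steady-state value. A minimal Python sketch of that common parameterization; the coefficients and weakening distance below are illustrative placeholders, not fits to the ~300 experiments reviewed:

```python
import math

# Common empirical fit for high-velocity friction experiments (hedged: one
# widely used parameterization, not necessarily the one in this review):
# friction decays exponentially with slip from a peak to a steady state.

def friction(slip_m, mu_peak=0.7, mu_ss=0.15, d_weak=1.0):
    """Friction coefficient after slip_m meters of slip.
    mu_peak, mu_ss and d_weak (m) are illustrative placeholder values."""
    return mu_ss + (mu_peak - mu_ss) * math.exp(-slip_m / d_weak)

for d in (0.0, 0.5, 1.0, 3.0):
    print(f"slip={d:.1f} m  mu={friction(d):.3f}")
```

With these placeholder values the coefficient drops from 0.7 toward 0.15 within a few meters of slip, the order-of-magnitude weakening the review terms fault lubrication.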

  14. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  15. Sand Volcano Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed loose water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

  16. Foreshocks of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Guglielmi, A. V.; Sobisevich, L. E.; Sobisevich, A. L.; Lavrov, I. P.

    2014-07-01

    The specific enhancement of ultra-low-frequency (ULF) electromagnetic oscillations a few hours prior to the strong earthquakes, which was previously mentioned in the literature, motivated us to search for the distinctive features of the mechanical (foreshock) activity of the Earth's crust in the epicentral zones of the future earthquakes. Activation of the foreshocks three hours before the main shock is revealed, which is roughly similar to the enhancement of the specific electromagnetic ULF emission. It is hypothesized that the round-the-world seismic echo signals from the earthquakes, which form the peak of energy release 2 h 50 min before the main events, act as the triggers of the main shocks due to the cumulative action of the surface waves converging to the epicenter. It is established that the frequency of the fluctuations in the foreshock activity decreases at the final stages of the preparation of the main shocks, which probably testifies to the so-called mode softening at the approach of the failure point according to the catastrophe theory.

  17. VLF/LF EM emissions as main precursor of earthquakes and their searching possibilities for Georgian s/a region

    NASA Astrophysics Data System (ADS)

    Kachakhidze, Manana; Kachakhidze, Nino

    2016-04-01

    The authors have developed a model, grounded in electrodynamics, for the generation of the Earth's electromagnetic emissions detected during earthquake preparation. The model gives a qualitative explanation of the mechanism by which electromagnetic waves are emitted in the earthquake preparation period. In addition, a scheme for an earthquake forecasting methodology is constructed, based on an avalanche-like unstable model of fault formation and an analogous model of an electromagnetic circuit, whose synthesis is rather harmonious. According to the authors, electromagnetic emission in the radio band is a more universal and reliable precursor than other anomalous variations of various geophysical phenomena in the earthquake preparation period. Moreover, VLF/LF electromagnetic emission might be declared the main precursor of earthquakes, because it may prove very useful for predicting large (M ≥ 5) inland earthquakes and for monitoring processes in the lithosphere-atmosphere-ionosphere coupling (LAIC) system. Since the other geophysical phenomena that may accompany the earthquake preparation process, appearing several months, weeks or days prior to earthquakes, are less informative for forecasting, it is reasonable to consider them earthquake indicators. The physical mechanisms of these phenomena are explained on the basis of the model of generation of electromagnetic emissions detected before earthquakes, in which the earthquake preparation process and its realization are considered taking into account the properties of distributed and conservative systems. Until recently, no electromagnetic emission detection network existed in Georgia. European colleagues (Prof. Dr. P. F. Biagi, Prof. Dr. Aydın Büyüksaraç) helped us and made the installation of a receiver possible. We are going to develop the network and contribute our share to the earthquake prediction problem. Participation in the conference is supported by financial

  18. Do Earthquakes Shake Stock Markets?

    PubMed Central

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

  19. Tien Shan Geohazards Database: Earthquakes and landslides

    NASA Astrophysics Data System (ADS)

    Havenith, H. B.; Strom, A.; Torgoev, I.; Torgoev, A.; Lamair, L.; Ischuk, A.; Abdrakhmatov, K.

    2015-11-01

    In this paper we present new and review already existing landslide and earthquake data for a large part of the Tien Shan, Central Asia. For the same area, only partial databases for sub-regions have been presented previously. They were compiled and new data were added to fill the gaps between the databases. Major new inputs are products of the Central Asia Seismic Risk Initiative (CASRI): a tentative digital map of active faults (even with indication of characteristic or possible maximum magnitude) and the earthquake catalogue of Central Asia until 2009 that was now updated with USGS data (to May 2014). The new compiled landslide inventory contains existing records of 1600 previously mapped mass movements and more than 1800 new landslide data. Considering presently available seismo-tectonic and landslide data, a target region of 1200 km (E-W) by 600 km (N-S) was defined for the production of more or less continuous geohazards information. This target region includes the entire Kyrgyz Tien Shan, the South-Western Tien Shan in Tajikistan, the Fergana Basin (Kyrgyzstan, Tajikistan and Uzbekistan) as well as the Western part in Uzbekistan, the North-Easternmost part in Kazakhstan and a small part of the Eastern Chinese Tien Shan (for the zones outside Kyrgyzstan and Tajikistan, only limited information was available and compiled). On the basis of the new landslide inventory and the updated earthquake catalogue, the link between landslide and earthquake activity is analysed. First, size-frequency relationships are studied for both types of geohazards, in terms of Gutenberg-Richter Law for the earthquakes and in terms of probability density function for the landslides. For several regions and major earthquake events, case histories are presented to outline further the close connection between earthquake and landslide hazards in the Tien Shan. 
From this study, we concluded first that a major hazard component is still insufficiently known for both types of geohazards
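The Gutenberg-Richter size-frequency analysis mentioned above reduces to estimating the b-value of the magnitude distribution. A minimal sketch using the Aki (1965) maximum-likelihood estimator on a synthetic catalog (the completeness magnitude of 4.0 and the catalog size are assumptions for illustration, not values from the Tien Shan database):

```python
import numpy as np

def gr_b_value(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for magnitudes at or above
    the completeness magnitude m_c: b = log10(e) / (mean(M) - m_c)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic catalog drawn from a Gutenberg-Richter law with b = 1.0 above m_c = 4.0
# (magnitude excess above m_c is exponential with scale log10(e)/b)
rng = np.random.default_rng(0)
b_true = 1.0
mags = 4.0 + rng.exponential(scale=np.log10(np.e) / b_true, size=50_000)
print(round(float(gr_b_value(mags, 4.0)), 2))  # recovers b close to 1.0
```

The same size-frequency logic, with a probability density function in place of the cumulative count, is what the abstract applies to the landslide inventory.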

  20. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10^5 m^3), produced major damage to buildings and infrastructures and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km^2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10^5 m^3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km^3 and 12 km^3 produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with a volume less than 1×10^3 m^3. The present work aims to define the relationship between the above described earthquake intensity, size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provided useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  1. Earthquake triggering at Alaskan volcanoes following the 3 November 2002 Denali fault earthquake

    USGS Publications Warehouse

    Moran, S.C.; Power, J.A.; Stihler, S.D.; Sanchez, J.J.; Caplan-Auerbach, J.

    2004-01-01

    The 3 November 2002 Mw 7.9 Denali fault earthquake provided an excellent opportunity to investigate triggered earthquakes at Alaskan volcanoes. The Alaska Volcano Observatory operates short-period seismic networks on 24 historically active volcanoes in Alaska, 247-2159 km distant from the mainshock epicenter. We searched for evidence of triggered seismicity by examining the unfiltered waveforms for all stations in each volcano network for ∼1 hr after the Mw 7.9 arrival time at each network and for significant increases in located earthquakes in the hours after the mainshock. We found compelling evidence for triggering only at the Katmai volcanic cluster (KVC, 720-755 km southwest of the epicenter), where small earthquakes with distinct P and S arrivals appeared within the mainshock coda at one station and a small increase in located earthquakes occurred for several hours after the mainshock. Peak dynamic stresses of ∼0.1 MPa at Augustine Volcano (560 km southwest of the epicenter) are significantly lower than those recorded in Yellowstone and Utah (>3000 km southeast of the epicenter), suggesting that strong directivity effects were at least partly responsible for the lack of triggering at Alaskan volcanoes. We describe other incidents of earthquake-induced triggering in the KVC, and outline a qualitative magnitude/distance-dependent triggering threshold. We argue that triggering results from the perturbation of magmatic-hydrothermal systems in the KVC and suggest that the comparative lack of triggering at other Alaskan volcanoes could be a result of differences in the nature of magmatic-hydrothermal systems.
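The peak dynamic stress quoted above can be estimated from peak ground velocity with the standard plane-wave approximation σ ≈ G·PGV/c. A minimal sketch; the shear modulus and phase velocity below are assumed generic crustal values, not ones from the paper.

```python
# Peak dynamic stress carried by a passing seismic wave, in the plane-wave
# approximation sigma ~ G * PGV / c. G and c are assumed generic values.
G = 3.0e10   # crustal shear modulus, Pa (assumption)
c = 3.5e3    # surface-wave phase velocity, m/s (assumption)

def dynamic_stress(pgv):
    """Peak dynamic stress (Pa) for peak ground velocity pgv (m/s)."""
    return G * pgv / c

# A PGV near 1.2 cm/s corresponds to roughly the ~0.1 MPa cited for Augustine
print(round(dynamic_stress(0.012) / 1e6, 2))  # in MPa
```

Because the stress scales linearly with PGV, the directivity-controlled fall-off in ground velocity with azimuth translates directly into the triggering-threshold argument the abstract makes.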

  2. EQInfo - earthquakes world-wide

    NASA Astrophysics Data System (ADS)

    Weber, Bernd; Herrnkind, Stephan

    2014-05-01

    EQInfo is a free Android app providing recent earthquake information from various earthquake monitoring centers such as GFZ, EMSC, USGS and others. It allows filtering by agency, region and magnitude, as well as control of the update interval, institute priority and alarm types. Used by more than 25k active users and being in the top ten list of Google Play, EQInfo is one of the most popular apps for earthquake information.

  3. Distant, delayed and ancient earthquake-induced landslides

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Torgoev, Almaz; Braun, Anika; Schlögel, Romy; Micu, Mihai

    2016-04-01

    On the basis of a new classification of seismically induced landslides we outline particular effects related to the delayed and distant triggering of landslides. Those cannot be predicted by state-of-the-art methods. First, for about a dozen events the 'predicted' extension of the affected area is clearly underestimated. The most problematic cases are those for which far-distant triggering of landslides had been reported, such as for the 1988 Saguenay earthquake. In Central Asia reports for such cases are known for areas marked by a thick cover of loess. One possible contributing effect could be a low-frequency resonance of the thick soils induced by distant earthquakes, especially those in the Pamir - Hindu Kush seismic region. Such deep-focus and high-magnitude (M ≥ 7) earthquakes are also found in Europe, first of all in the Vrancea region (Romania). For this area and others in Central Asia we computed landslide event sizes related to scenario earthquakes with M > 7.5. The second particular and challenging type of triggering is the one delayed with respect to the main earthquake event: case histories have been reported for the Racha earthquake in 1991, when several larger landslides only started moving 2 or 3 days after the main shock. Similar observations were also made after other earthquake events in the U.S., such as after the 1906 San Francisco, the 1949 Tacoma, the 1959 Hebgen Lake and the 1983 Borah Peak earthquakes. Here, we will present a series of detailed examples of (partly monitored) mass movements in Central Asia that mainly developed after earthquakes, some even several weeks after the main shock: e.g. the Tektonik and Kainama landslides triggered in 1992 and 2004, respectively. We believe that the development of the massive failures is a consequence of the opening of tension cracks during the seismic shaking and their filling with water during the precipitation that followed the earthquakes.
The third particular aspect analysed here is the use of large

  4. [Rehabilitation care for children after trauma in the earthquake disaster].

    PubMed

    Yang, Zhi-Quan; Zhang, Qing-Min

    2013-06-01

    For the children who suffer trauma in earthquake, rehabilitation care aims to promote functional recovery, shorten hospital stay, and reduce the incidence of complications or disability by evidence-based, multidisciplinary, and comprehensive early rehabilitation intervention on the basis of first aid and clinical treatment. Children are likely to suffer traumatic brain injury, spinal cord injury, peripheral nerve injury, limb fracture, and amputation in the earthquake disaster, so the clinical rehabilitation care designed considering the characteristics of children should be provided immediately after acute phase of trauma to promote functional recovery. PMID:23791056

  5. Application of the Titius-Bode law in earthquakes study

    NASA Astrophysics Data System (ADS)

    Hu, H.; Malkin, Z.; Wang, R.

    2015-08-01

    This article introduces the application of the commensurability revealed by the Titius-Bode law to earthquake (EQ) prediction studies. The results show that the occurrence of most of the world's major earthquakes is not accidental: they occur at commensurable points on the time axis. As an example, both the M 7.0 EQ in Lushan, China on 2013-04-20 and the M 8.2 EQ in Iquique, Chile on 2014-04-01 occurred at their commensurable epochs. This provides an important scientific basis for the prediction of major EQs that will occur in the area in the future.

  6. Prehistoric Earthquakes in the Puget Lowland, Washington

    NASA Astrophysics Data System (ADS)

    Sherrod, B. L.

    2005-12-01

    Coastal marsh deposits and lidar topographic data show evidence for past earthquakes on at least seven fault zones in the Puget lowland. Three major fault zones, the Seattle fault zone, Tacoma fault, and the Southern Whidbey Island fault zone (SWIFZ), cut through the heavily populated portions of central Puget Sound. Faults in four other areas, namely the Darrington-Devils Mountain fault zone, Olympia fault, the northern margin of the Olympic Mountains, and the southeastern Olympic Mountains, show that the area of active Holocene faulting extends over the entire Puget Sound lowlands. As recently as 1998, field evidence could confirm only one fault with evidence of past earthquake activity. Uplifted coastlines and surface ruptures are the field evidence for past Seattle fault earthquakes. Raised intertidal platforms along the Seattle fault zone show that regional uplift of as much as 7 meters accompanied a large earthquake about 1100 years ago. This earthquake also caused a tsunami, which inundated low-lying coastal areas north of Seattle. All of the lidar scarps found in the Seattle fault zone are north-side-up, opposite the vergence suggested for the Seattle fault from regional geological studies. Excavations across these scarps reveal north-dipping thrust faults that roughly follow bedding planes in bedrock and disrupt late Holocene soils. Soil stratigraphy and radiocarbon ages suggest as many as three surface-rupturing earthquakes in the past 2500 years. Lidar mapping revealed several en echelon scarps along the trace of the Tacoma fault. Existence of the Tacoma fault was previously hypothesized on the basis of large-amplitude gravity, aeromagnetic, and seismic-velocity anomalies, shallow marine seismic reflection surveys, glaciolacustrine strandlines, and coastal marsh stratigraphy. Coastal marsh deposits and scarp excavations suggest that the scarps formed during an earthquake on the Tacoma fault ~1100 years ago, possibly by folding above a buried reverse fault

  7. Earthquake prediction: Simple methods for complex phenomena

    NASA Astrophysics Data System (ADS)

    Luen, Bradley

    2010-09-01

    Earthquake predictions are often either based on stochastic models, or tested using stochastic models. Tests of predictions often tacitly assume predictions do not depend on past seismicity, which is false. We construct a naive predictor that, following each large earthquake, predicts another large earthquake will occur nearby soon. Because this "automatic alarm" strategy exploits clustering, it succeeds beyond "chance" according to a test that holds the predictions fixed. Some researchers try to remove clustering from earthquake catalogs and model the remaining events. There have been claims that the declustered catalogs are Poisson on the basis of statistical tests we show to be weak. Better tests show that declustered catalogs are not Poisson. In fact, there is evidence that events in declustered catalogs do not have exchangeable times given the locations, a necessary condition for the Poisson. If seismicity followed a stochastic process, an optimal predictor would turn on an alarm when the conditional intensity is high. The Epidemic-Type Aftershock (ETAS) model is a popular point process model that includes clustering. It has many parameters, but is still a simplification of seismicity. Estimating the model is difficult, and estimated parameters often give a non-stationary model. Even if the model is ETAS, temporal predictions based on the ETAS conditional intensity are not much better than those of magnitude-dependent automatic (MDA) alarms, a much simpler strategy with only one parameter instead of five. For a catalog of Southern Californian seismicity, ETAS predictions again offer only slight improvement over MDA alarms.
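The automatic-alarm idea is easy to sketch. In the toy version below, each event switches on an alarm whose duration grows with magnitude as u·10^m; the scaling, the threshold, and the three-event catalog are all illustrative assumptions, and the actual MDA parameterization in the thesis may differ.

```python
import numpy as np

def mda_alarm_hits(times, mags, u, m_big=5.5):
    """After each event, keep an alarm on for u * 10**mag time units (a
    schematic one-parameter magnitude-dependent automatic alarm). Return,
    for each large event (mag >= m_big), whether an alarm from an earlier
    event was still active when it occurred."""
    times = np.asarray(times, dtype=float)
    mags = np.asarray(mags, dtype=float)
    ends = times + u * 10.0 ** mags          # when each event's alarm expires
    hits = []
    for i, (t, m) in enumerate(zip(times, mags)):
        if m < m_big:
            continue
        hits.append(bool(np.any((times[:i] <= t) & (ends[:i] > t))))
    return hits

# Toy catalog: a mainshock at t=0, a large aftershock soon after, a late event
times = [0.0, 0.5, 30.0]
mags  = [6.0, 5.8, 5.7]
print(mda_alarm_hits(times, mags, u=1e-5))   # -> [False, True, False]
```

The mainshock's alarm catches the nearby aftershock, while the late event falls outside every alarm window, which is exactly the clustering-exploiting behavior the abstract describes.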

  8. Evaluation of earthquake and tsunami on JSFR

    SciTech Connect

    Chikazawa, Y.; Enuma, Y.; Kisohara, N.; Yamano, H.; Kubo, S.; Hayafune, H.; Sagawa, H.; Okamura, S.; Shimakawa, Y.

    2012-07-01

    Evaluation of earthquake and tsunami effects on JSFR has been carried out. For seismic design, safety components are confirmed to maintain their functions even against recent strong earthquakes. As for tsunami, some parts of the reactor building might be submerged, including the component cooling water system (CCWS), whose final heat sink is sea water. However, in the JSFR design, safety-grade components are independent of the CCWS. The JSFR emergency power supply adopts an air-cooled gas turbine system, since JSFR basically does not require quick start-up of the emergency power supply thanks to the natural-convection decay heat removal system (DHRS). Even in the case of a long station blackout, the DHRS could be activated by emergency batteries or manually, and operated continuously by natural convection. (authors)

  9. Human casualties in earthquakes: modelling and mitigation

    USGS Publications Warehouse

    Spence, R.J.S.; So, E.K.M.

    2011-01-01

    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  10. Structural performance of the DOE's Idaho National Engineering Laboratory during the 1983 Borah Peak earthquake

    SciTech Connect

    Guenzler, R.C.; Gorman, V.W.

    1985-01-01

    The 1983 Borah Peak Earthquake (7.3 Richter magnitude) was the largest earthquake ever experienced by the DOE's Idaho National Engineering Laboratory (INEL). Reactor and plant facilities are generally located about 90 to 110 km (60 miles) from the epicenter. Several reactors were operating normally at the time of the earthquake. Based on detailed inspections, comparisons of measured accelerations with design levels, and instrumental seismograph information, it was concluded that the 1983 Borah Peak Earthquake created no safety problems for INEL reactors or other facilities. 10 refs., 16 figs., 2 tabs.

  11. Synthetic earthquake catalogs simulating seismic activity in the Corinth Gulf, Greece, fault system

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Carluccio, Roberto; Papadimitriou, Eleftheria; Karakostas, Vassilis

    2015-01-01

    The characteristic earthquake hypothesis is the basis of time-dependent modeling of earthquake recurrence on major faults. However, the characteristic earthquake hypothesis is not strongly supported by observational data. Few fault segments have long historical or paleoseismic records of individually dated ruptures, and when data and parameter uncertainties are allowed for, the form of the recurrence distribution is difficult to establish. This is the case, for instance, of the Corinth Gulf Fault System (CGFS), for which documents about strong earthquakes exist for at least 2000 years, although they can be considered complete for M ≥ 6.0 only for the latest 300 years, during which only few characteristic earthquakes are reported for individual fault segments. The use of a physics-based earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 500,000 events of magnitudes ≥ 4.0. The main features of our simulation algorithm are (1) an average slip rate released by earthquakes for every single segment in the investigated fault system, (2) heuristic procedures for rupture growth and stop, leading to a self-organized earthquake magnitude distribution, (3) the interaction between earthquake sources, and (4) the effect of minor earthquakes in redistributing stress. The application of our simulation algorithm to the CGFS has shown realistic features in time, space, and magnitude behavior of the seismicity. These features include long-term periodicity of strong earthquakes, short-term clustering of both strong and smaller events, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the higher-magnitude range.

  12. Synthetic earthquake catalogs simulating seismic activity in the Corinth Gulf, Greece, fault system

    NASA Astrophysics Data System (ADS)

    Console, R.; Carluccio, R.; Papadimitriou, E. E.; Karakostas, V. G.

    2014-12-01

    The characteristic earthquake hypothesis is the basis of time-dependent modeling of earthquake recurrence on major faults, using the renewal process methodology. However, the characteristic earthquake hypothesis is not strongly supported by observational data. Few fault segments have long historical or paleoseismic records of individually dated ruptures, and when data and parameter uncertainties are allowed for, the form of the recurrence-distribution is difficult to establish. This is the case, for instance, of the Corinth gulf fault system, for which documents about strong earthquakes exist for at least two thousand years, but they can be considered complete for magnitudes > 6.0 only for the latest 300 years, during which only few characteristic earthquakes are reported for single fault segments. The use of a physics-based earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 500,000 events of magnitudes > 4.0. The main features of our simulation algorithm are (1) the imposition of an average slip rate released by earthquakes to every single segment recognized in the investigated fault system, (2) the interaction between earthquake sources, (3) a self-organized earthquake magnitude distribution, and (4) the effect of minor earthquakes in redistributing stress. The application of our simulation algorithm to the Corinth gulf fault system has shown realistic features in time, space and magnitude behavior of the seismicity. These features include long-term periodicity of strong earthquakes, short-term clustering of both strong and smaller events, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the higher magnitude range.

  13. Rupture Zones of Strong Earthquakes In The Corinth Rift

    NASA Astrophysics Data System (ADS)

    Papadopoulos, G. A.; Kouskouna, V.; Plessa, A.

    Rupture zones of the strong (M ≥ 6) earthquakes that occurred in the Corinth rift in the last three hundred years have been determined on the basis of aftershock epicentral distributions, intensity distributions and observations regarding seismogenic ground failures and tsunamis. The space-time distribution of the rupture zones indicates that (1) for time intervals of about 50 yrs the rupture zones do not overlap; overlapping appears, however, in longer time intervals, (2) there is a trend of the seismic activity to decrease westwards, and (3) particular regions constitute potential seismic gaps, like the Kiato-Xylocastro region on the south coast of the Corinth Gulf, where the large 1402 earthquake occurred, and the Livadia-Desfina region where the A.D. 361 and 551 large earthquakes possibly took place.

  14. Karhunen-Loève expansion for random earthquake excitations

    NASA Astrophysics Data System (ADS)

    He, Jun

    2015-03-01

    This paper develops a trigonometric-basis-function based Karhunen-Loève (KL) expansion for simulating random earthquake excitations with known covariance functions. The methods for determining the number of the KL terms and defining the involved random variables are described in detail. The simplified form of the KL expansion is given, whereby the relationship between the KL expansion and the spectral representation method is investigated and revealed. The KL expansion is of high efficiency for simulating long-term earthquake excitations in the sense that it needs a minimum number of random variables, as compared with the spectral representation method. Numerical examples demonstrate the convergence and accuracy of the KL expansion for simulating two commonly-used random earthquake excitation models and estimating linear and nonlinear random responses to the random excitations.

  15. Self-Organized Earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.; Klein, W.

    2011-12-01

    Self-Organized Criticality (SOC) was proposed by Per Bak et al. [1] as a means of explaining scaling laws observed in driven natural systems, usually (slowly) driven threshold systems. The example used by Bak was a simple cellular automaton model of a sandpile, in which grains of sand were slowly dropped (randomly) onto a flat plate. After a period of time, during which the 'critical state' was approached, a series of self-similar avalanches would begin. Scaling exponents for the frequency-area statistics of the sandpile avalanches were found to be approximately 1, a value that characterizes 'flicker noise' in natural systems. SOC is associated with a critical point in the phase diagram of the system, and it was found that the usual two-scaling-field theory applies. A model related to SOC is the Self-Organized Spinodal (SOS), or intermittent criticality, model. Here a slow but persistent driving force leads to a quasi-periodic approach to, and retreat from, the classical limit of stability, or spinodal. Scaling exponents for this model can be related to Gutenberg-Richter and Omori exponents observed in earthquake systems. In contrast to SOC models, nucleation, both classical and non-classical, is possible in SOS systems. Tunneling or nucleation rates can be computed from Langer-Klein-Landau-Ginzburg theories for comparison to observations. Nucleating droplets play a role similar to characteristic earthquake events. Simulation of these systems reveals much of the phenomenology associated with earthquakes and other types of "burst" dynamics. Whereas SOC is characterized by the full scaling spectrum of avalanches, SOS is characterized by both system-size events above the nominal frequency-size scaling curve and scaling of small events. Applications to other systems, including integrate-and-fire neural networks and financial crashes, will be discussed. [1] P. Bak, C. Tang and K. Wiesenfeld, Self-Organized Criticality, Phys. Rev. Lett., 59, 381 (1987).
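Bak's sandpile automaton described above fits in a few lines. This is the standard BTW rule (toppling threshold 4, open boundaries); the grid size and number of drops are chosen arbitrarily for illustration.

```python
import numpy as np

# Minimal Bak-Tang-Wiesenfeld sandpile: drop grains randomly; any cell holding
# 4 or more grains topples, sending one grain to each neighbour (grains leaving
# the grid edge are lost). Avalanche size = total number of topplings.
def drive(grid, rng):
    i, j = rng.integers(0, grid.shape[0]), rng.integers(0, grid.shape[1])
    grid[i, j] += 1
    size = 0
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            return size
        for i, j in unstable:
            grid[i, j] -= 4
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]:
                    grid[ni, nj] += 1

rng = np.random.default_rng(0)
grid = np.zeros((20, 20), dtype=int)
sizes = [drive(grid, rng) for _ in range(5000)]
print(max(sizes) > 0, bool((grid < 4).all()))  # -> True True
```

Histogramming the nonzero entries of `sizes` over long runs is what yields the power-law frequency-area statistics referred to in the abstract.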

  16. The 2011 Tohoku earthquake sequences detected by IMS hydroacoustic array

    NASA Astrophysics Data System (ADS)

    Yun, S.; Lee, W.

    2011-12-01

    A Mw 9.1 thrust-fault earthquake occurred off the Pacific coast of Tohoku, Japan, on March 11, 2011. It is the fourth largest earthquake recorded since modern seismographs were installed, and it was accompanied by hundreds of strong aftershocks (M > 5). We applied a cross-correlation method to the continuous data recorded by the Hawaii hydroacoustic array operated by the International Monitoring System (IMS), and calculated back-azimuths of T-waves generated by the earthquake sequences. The back-azimuth values of the major events show a somewhat scattered pattern, a feature different from that of the Great Sumatra-Andaman Earthquake. This may imply that the rupture did not propagate linearly along the thrust fault line. Several aftershocks, however, clearly show a gradual back-azimuthal change toward the north. These differences might be caused by the complex and diverse source mechanisms of the earthquakes. By combining hydroacoustic data obtained by other IMS hydroacoustic stations, where available, we could better resolve the azimuthal change within the earthquake sequence.
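The cross-correlation step of the back-azimuth method can be sketched for a single sensor pair: estimate the inter-sensor delay from the peak of the cross-correlation. The white-noise signal, noise level, and 37-sample lag below are arbitrary assumptions; real T-phase processing adds band-passing and uses the actual hydrophone geometry.

```python
import numpy as np

# Delay estimation between two array elements by cross-correlation, the basic
# ingredient of a back-azimuth computation (synthetic data, assumed parameters).
rng = np.random.default_rng(2)
n = 4096
sig = rng.standard_normal(n)                  # common signal (white-noise stand-in)
true_lag = 37                                 # samples between the two hydrophones

a = sig + 0.1 * rng.standard_normal(n)        # sensor 1: signal plus noise
b = np.roll(sig, true_lag) + 0.1 * rng.standard_normal(n)   # sensor 2: delayed copy

xc = np.correlate(b, a, mode="full")          # length 2n-1; index n-1 is zero lag
lag = int(np.argmax(xc)) - (n - 1)
print(lag)                                    # recovers the imposed lag
```

Given such delays for several sensor pairs and the known array geometry, the back-azimuth follows from a plane-wave fit to the delay vector.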

  17. Statistical Earthquake Focal Mechanism Forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.

    2013-12-01

    A new whole-Earth focal mechanism forecast, based on the GCMT catalog, has been created. In the present forecast, the sum of normalized seismic moment tensors within a 1000 km radius is calculated, and the P- and T-axes for the focal mechanism are evaluated on the basis of the sum. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms. This average angle reflects the tectonic complexity of a region and indicates the accuracy of the prediction. The method was originally proposed by Kagan and Jackson (1994, JGR). Recent interest by CSEP and GEM has motivated some improvements, particularly to extend the previous forecast to polar and near-polar regions. The major problem in extending the forecast is the focal mechanism calculation on a spherical surface. In the previous forecast, when the average focal mechanism was computed, it was assumed that longitude lines are approximately parallel within a 1000 km radius. This is largely accurate in the equatorial and near-equatorial areas. However, when one approaches 75 degrees latitude, the longitude lines are no longer parallel: the bearing (azimuthal) difference at points separated by 1000 km reaches about 35 degrees. In most situations a forecast point where we calculate an average focal mechanism is surrounded by earthquakes, so the bias should not be strong because the difference effects cancel. But if we move into polar regions, the bearing difference could approach 180 degrees. In the modified program, focal mechanisms are projected onto a plane tangent to the sphere at the forecast point. The new longitude axes, which are parallel in the tangent plane, are corrected for the bearing difference. A comparison with the old 75S-75N forecast shows that in equatorial regions the forecasted focal mechanisms are almost the same, and the difference in the forecasted focal mechanisms rotation angle is close to zero. However, though the forecasted focal mechanisms are similar
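    The bearing (meridian-convergence) effect quantified in the abstract can be checked directly with great-circle bearings. The sketch below is an independent illustration, not the authors' code; under a spherical-Earth assumption it reproduces the quoted roughly 35 degree figure for points 1000 km apart at 75 degrees latitude:

    ```python
    import math

    def bearing(lat1, lon1, lat2, lon2):
        """Initial great-circle bearing from point 1 to point 2, in
        degrees clockwise from north (spherical Earth)."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dl = math.radians(lon2 - lon1)
        y = math.sin(dl) * math.cos(p2)
        x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
        return math.degrees(math.atan2(y, x)) % 360.0

    def meridian_convergence(lat1, lon1, lat2, lon2):
        """How much 'north' at point 2 is rotated relative to 'north' at
        point 1, as seen along the great circle joining them: forward
        bearing compared with the reversed back-bearing, in degrees."""
        fwd = bearing(lat1, lon1, lat2, lon2)
        back = bearing(lat2, lon2, lat1, lon1)
        return ((back + 180.0 - fwd + 180.0) % 360.0) - 180.0
    ```

    At the equator the convergence vanishes, so treating meridians as parallel is safe there; at 75 N, two points separated east-west by about 1000 km (roughly 34.7 degrees of longitude) give a convergence near 34 degrees, consistent with the abstract's figure, which is why the tangent-plane correction matters at high latitude.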

  18. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  19. Earthquake Preparedness Checklist for Schools.

    ERIC Educational Resources Information Center

    1999

    A brochure provides a checklist highlighting the important questions and activities that should be addressed and undertaken as part of a school safety and preparedness program for earthquakes. It reminds administrators and other interested parties on what not to forget in preparing schools for earthquakes, such as staff knowledge needs, evacuation…

  20. Earthquakes Threaten Many American Schools

    ERIC Educational Resources Information Center

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  1. Make an Earthquake: Ground Shaking!

    ERIC Educational Resources Information Center

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  2. Catalog of earthquakes along the San Andreas fault system in Central California, April-June 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period April - June, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). A catalog for the first quarter of 1972 has been prepared by Wesson and others (1972). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 910 earthquakes in Central California. A substantial portion of the earthquakes reported in this catalog represents a continuation of the sequence of earthquakes in the Bear Valley area which began in February, 1972 (Wesson and others, 1972). Arrival times at 126 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 101 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. 
The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement

  3. Catalog of earthquakes along the San Andreas fault system in Central California: January-March, 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Meagher, K.L.

    1973-01-01

    Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period January - March, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1,718 earthquakes in Central California. Of particular interest is a sequence of earthquakes in the Bear Valley area which contained single shocks with local magnitudes of 5.0 and 4.6. Earthquakes from this sequence make up roughly 66% of the total and are currently the subject of an interpretative study. Arrival times at 118 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 94 are telemetered stations operated by NCER. Readings from the remaining 24 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. 
The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the

  4. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    NASA Astrophysics Data System (ADS)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts, but could also be used to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.
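    The order of magnitude of the warning times quoted above follows from the gap between the fast P-wave, used for detection, and the damaging S-wave. A minimal sketch with illustrative crustal velocities and an assumed detection-plus-delivery latency (these are not ShakeAlert's actual parameters):

    ```python
    import math

    def warning_time(epicentral_km, depth_km=8.0, vp=6.1, vs=3.55, latency=4.0):
        """Approximate early-warning time in seconds: S-wave arrival minus
        P-wave arrival minus system latency (detection plus alert
        delivery), assuming straight rays and uniform crustal velocities."""
        d = math.hypot(epicentral_km, depth_km)
        return max(0.0, d / vs - d / vp - latency)
    ```

    A site 100 km from the epicenter would get roughly 8 s of warning under these assumptions, while sites very close to the epicenter fall inside the "blind zone" and get none.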

  5. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan (the western, eastern, and northeastern regions) using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, a high probability value is usually yielded for clustered events, such as events with foreshocks and events that all occur within a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but they also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake, whereas the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.
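    The ROC analysis mentioned above can be sketched for a gridded forecast: sweep an alarm threshold down through the probability map and trace the hit rate against the false-alarm rate. This is an illustrative implementation, not the authors' code:

    ```python
    import numpy as np

    def roc_curve(prob, occurred):
        """Hit rate vs. false-alarm rate as an alarm threshold sweeps down
        through the forecast probabilities. prob: flattened probability
        map; occurred: 0/1 map of where large earthquakes occurred."""
        order = np.argsort(prob)[::-1]  # highest forecast probability first
        hits = np.cumsum(occurred[order]) / occurred.sum()
        falses = np.cumsum(1 - occurred[order]) / (1 - occurred).sum()
        return falses, hits

    def auc(falses, hits):
        """Area under the ROC curve (trapezoidal rule); 0.5 corresponds to
        a random forecast, 1.0 to a perfect one."""
        return float(np.sum(0.5 * (hits[1:] + hits[:-1]) * np.diff(falses)))
    ```

    A curve lying well above the diagonal (area well above 0.5) is what distinguishes the maps from random forecasts.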

  6. Earthquake hazards: a national threat

    USGS Publications Warehouse

    U.S. Geological Survey

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  7. Exaggerated Claims About Earthquake Predictions

    NASA Astrophysics Data System (ADS)

    Kafka, Alan L.; Ebel, John E.

    2007-01-01

    The perennial promise of successful earthquake prediction captures the imagination of a public hungry for certainty in an uncertain world. Yet, given the lack of any reliable method of predicting earthquakes [e.g., Geller, 1997; Kagan and Jackson, 1996; Evans, 1997], seismologists regularly have to explain news stories of a supposedly successful earthquake prediction when it is far from clear just how successful that prediction actually was. When journalists and public relations offices report the latest `great discovery' regarding the prediction of earthquakes, seismologists are left with the much less glamorous task of explaining to the public the gap between the claimed success and the sober reality that there is no scientifically proven method of predicting earthquakes.

  8. Stresses of pipelines during earthquakes

    SciTech Connect

    Kiyomiya, O.

    1983-05-01

    Construction of submarine pipelines plays an important role in offshore development. Japan is well known as an earthquake country, so it is very important to assess the earthquake resistance of submarine pipelines: an oil leak would contaminate the ocean if the pipelines were damaged by an earthquake. Pipe stresses during earthquakes are closely related to the relative displacement of the ground. Field observations were carried out to measure the ground deformation. A steel pipe is assumed to be buried along the observation line, and pipe stresses are calculated from the ground deformation obtained by the field observations. The stresses calculated by the seismic deformation method, which has been used for earthquake-resistant design in Japan, and by dynamic response analysis are compared with those from the observations.

  9. Early Earthquakes of the Americas

    NASA Astrophysics Data System (ADS)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks, Did indigenous native cultures-Indians of the Pacific Northwest, Aztecs, Mayas, and Incas-document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  10. Earthquake Simulator Finds Tremor Triggers

    SciTech Connect

    Johnson, Paul

    2015-03-27

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves-the sounds radiated from earthquakes-can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials-like the type found along certain fault lines across the globe-and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  11. Testing an earthquake prediction algorithm

    USGS Publications Warehouse

    Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

    1997-01-01

    A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate-term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991, the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, a success ratio that would have been achieved in 53% of random trials under the null hypothesis.
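    The null-hypothesis comparison quoted above can be illustrated with a simple binomial model: if alarms cover a fraction f of the monitored space-time volume, the chance of catching at least k of n target earthquakes by luck follows directly. The independence assumption and the choice of f are illustrative simplifications of the paper's random-assignment test:

    ```python
    from math import comb

    def prob_at_least(k, n, f):
        """Probability of predicting at least k of n target earthquakes by
        chance alone, if each earthquake independently falls inside the
        alarmed region with probability f (the fraction of space-time
        covered by alarms)."""
        return sum(comb(n, i) * f**i * (1 - f)**(n - i) for i in range(k, n + 1))
    ```

    With f = 0.5, for example, this gives about 5.5% for eight of ten and exactly 50% for five of nine, figures comparable in spirit to the quoted 2.87% and 53%, which reflect the actual alarm coverage of the test.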

  12. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    NASA Astrophysics Data System (ADS)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award winning ASIGN protocol initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging System (MMS), RICHTER allows access to high definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation) or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are compressed to 20-30KB of data typically for fast transfer and to avoid network overload. Full size images can be requested by the EMSC either fully automatically, or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations notably for the damage assessment of the January 12, 2010 Haiti earthquake where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. 
The EMSC is the second

  13. Analysis of worldwide earthquake mortality using multivariate demographic and seismic data.

    PubMed

    Gutiérrez, E; Taucer, F; De Groeve, T; Al-Khudhairy, D H A; Zaldivar, J M

    2005-06-15

    In this paper, mortality in the immediate aftermath of an earthquake is studied on a worldwide scale using multivariate analysis. A statistical method is presented that analyzes reported earthquake fatalities as a function of a heterogeneous set of parameters selected on the basis of their presumed influence on earthquake mortality. The ensemble was compiled from demographic, seismic, and reported fatality data culled from available records of past earthquakes organized in a geographic information system. The authors consider the statistical relation between earthquake mortality and the available data ensemble, analyze the validity of the results in view of the parametric uncertainties, and propose a multivariate mortality analysis prediction method. The analysis reveals that, although the highest mortality rates are expected in poorly developed rural areas, high fatality counts can result from a wide range of mortality ratios that depend on the effective population size. PMID:15937024

  14. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    USGS Publications Warehouse

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
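    The server-side trigger association step described above can be sketched as a coincidence detector: declare an event when enough distinct hosts trigger within a short time window. The thresholds below are illustrative assumptions, not QCN's actual settings:

    ```python
    def associate(triggers, min_stations=4, window=2.0):
        """Declare an event when at least `min_stations` distinct hosts
        trigger within a `window`-second span. triggers: iterable of
        (time_s, station_id) pairs. Returns onset times of declared
        events, suppressing immediate re-detections of the same event."""
        triggers = sorted(triggers)
        events, i, last = [], 0, None
        for j, (t, _) in enumerate(triggers):
            while t - triggers[i][0] > window:
                i += 1  # slide the window start forward
            stations = {sid for _, sid in triggers[i:j + 1]}
            if len(stations) >= min_stations and (last is None or t - last > window):
                events.append(triggers[i][0])
                last = t
        return events
    ```

    Requiring distinct hosts is what rejects spurious triggers from a single bumped sensor; a real implementation would also use the trigger locations before estimating magnitude.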

  15. Basis selection in LOBPCG

    NASA Astrophysics Data System (ADS)

    Hetmaniuk, U.; Lehoucq, R.

    2006-10-01

    The purpose of our paper is to discuss basis selection for Knyazev's locally optimal block preconditioned conjugate gradient (LOBPCG) method. An inappropriate choice of basis can lead to ill-conditioned Gram matrices in the Rayleigh-Ritz analysis that can delay convergence or produce inaccurate eigenpairs. We demonstrate that the choice of basis is not merely related to computing in finite precision arithmetic. We propose a representation that maintains orthogonality of the basis vectors and so has excellent numerical properties.
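    One simple remedy for the ill-conditioned Gram matrices discussed above is to orthonormalize the basis before the Rayleigh-Ritz step. The sketch below shows that step in isolation on a toy matrix; it is a QR-based fix, not Knyazev's full LOBPCG iteration, and not necessarily the representation the paper proposes:

    ```python
    import numpy as np

    def rayleigh_ritz(A, V):
        """Rayleigh-Ritz on span(V), first replacing V by an orthonormal
        basis via QR so the Gram matrix is exactly the identity."""
        Q, _ = np.linalg.qr(V)
        theta, S = np.linalg.eigh(Q.T @ A @ Q)  # Ritz values, ascending
        return theta, Q @ S                      # Ritz values and vectors

    rng = np.random.default_rng(0)
    A = np.diag(np.arange(1.0, 101.0))      # symmetric test matrix, eigenvalues 1..100
    V = rng.standard_normal((100, 5))
    V[:, 4] = V[:, 3] + 1e-10 * V[:, 0]     # nearly dependent columns: V.T @ V is ill-conditioned
    theta, X = rayleigh_ritz(A, V)
    ```

    Without the QR step, forming and inverting the raw Gram matrix V.T @ V here would lose most of its digits to rounding, which is the delayed-convergence failure mode the abstract describes.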

  16. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction, and fault rupture) database was developed to validate, remove discrepancies from, and greatly expand upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the authors' view, the lack of consistency and the errors in other earthquake loss databases frequently cited and used in analyses were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars), compared with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan, and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  17. Exaggerated Claims About Success Rate of Earthquake Predictions: "Amazing Success" or "Remarkably Unremarkable"?

    NASA Astrophysics Data System (ADS)

    Kafka, A. L.; Ebel, J. E.

    2005-12-01

    any method that uses past seismicity maps as a basis for earthquake forecasting should have a comparable rate of success.

  18. Performance Basis for Airborne Separation

    NASA Technical Reports Server (NTRS)

    Wing, David J.

    2008-01-01

    Emerging applications of Airborne Separation Assistance System (ASAS) technologies make possible new and powerful methods in Air Traffic Management (ATM) that may significantly improve the system-level performance of operations in the future ATM system. These applications typically involve the aircraft managing certain components of its Four Dimensional (4D) trajectory within the degrees of freedom defined by a set of operational constraints negotiated with the Air Navigation Service Provider. It is hypothesized that reliable individual performance by many aircraft will translate into higher total system-level performance. To actually realize this improvement, the new capabilities must be attracted to high demand and complexity regions where high ATM performance is critical. Operational approval for use in such environments will require participating aircraft to be certified to rigorous and appropriate performance standards. Currently, no formal basis exists for defining these standards. This paper provides a context for defining the performance basis for 4D-ASAS operations. The trajectory constraints to be met by the aircraft are defined, categorized, and assessed for performance requirements. A proposed extension of the existing Required Navigation Performance (RNP) construct into a dynamic standard (Dynamic RNP) is outlined. Sample data is presented from an ongoing high-fidelity batch simulation series that is characterizing the performance of an advanced 4D-ASAS application. Data of this type will contribute to the evaluation and validation of the proposed performance basis.

  19. Review of variations in Mw < 7 earthquake motions on position and TEC (Mw = 6.5 Aegean Sea earthquake sample)

    NASA Astrophysics Data System (ADS)

    Yildirim, Omer; Inyurt, Samed; Mekik, Cetin

    2016-02-01

    Turkey is a country located in the middle-latitude zone, where tectonic activity is intense. Recently, an earthquake of magnitude Mw 6.5 occurred offshore in the Aegean Sea on 24 May 2014 at 09:25 UTC and lasted about 40 s. The earthquake was also felt in Greece, Romania, and Bulgaria in addition to Turkey. In recent years, ionospheric anomaly detection studies related to seismicity have been carried out using the total electron content (TEC) computed from global navigation satellite system (GNSS) signal delays, and several interesting findings have been published. In this study, both TEC and positional variations have been examined separately following this moderate-size earthquake in the Aegean Sea. The correlation of the aforementioned ionospheric variation with the positional variation has also been investigated. For this purpose, a total of 15 stations was used, including four continuously operating reference stations in Turkey (CORS-TR) located in the seismic zone (AYVL, CANA, IPSA, and YENC), as well as international GNSS service (IGS) and European reference frame permanent network (EPN) stations. The ionospheric and positional variations of the AYVL, CANA, IPSA, and YENC stations were examined using Bernese v5.0 software. When the precise point positioning TEC (PPP-TEC) values were examined, it was observed that the TEC values were approximately 4 TECU (total electron content units) above the upper-limit TEC value at the four stations located in Turkey, 3 days before the earthquake at 08:00 and 10:00 UTC. At the same stations, on the day before the earthquake at 06:00, 08:00, and 10:00 UTC, the TEC values were approximately 5 TECU below the lower-limit TEC value. The global ionosphere model TEC (GIM-TEC) values published by the Centre for Orbit Determination in Europe (CODE) were also examined. Three days before the earthquake, at all stations, it was observed that the TEC values in the time period between 08:00 and 10:00 UTC were approximately 2 TECU
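    Upper- and lower-limit TEC bounds of the kind used above are commonly built from a sliding window over the preceding days. The sketch below uses median plus or minus k times the interquartile range; the window length and k are illustrative assumptions, not the values used in the study:

    ```python
    import numpy as np

    def tec_anomalies(tec, window=15, k=1.5):
        """Flag samples falling outside median +/- k*IQR bounds computed
        from the preceding `window` samples (e.g. one TEC value per day).
        Returns (boolean anomaly flags, deviation from running median)."""
        flags = np.zeros(len(tec), dtype=bool)
        devs = np.zeros(len(tec))
        for i in range(window, len(tec)):
            past = tec[i - window:i]
            med = np.median(past)
            iqr = np.percentile(past, 75) - np.percentile(past, 25)
            lo, hi = med - k * iqr, med + k * iqr
            flags[i] = tec[i] < lo or tec[i] > hi
            devs[i] = tec[i] - med
        return flags, devs
    ```

    Robust statistics (median and IQR rather than mean and standard deviation) keep one anomalous day from inflating the bounds for the days that follow it.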

  20. The Lusi mud eruption was not triggered by an earthquake

    NASA Astrophysics Data System (ADS)

    Manga, M.; Rudolph, M. L.; Tingay, M. R.; Davies, R.; Wang, C.; Shirzaei, M.; Fukushima, Y.

    2013-12-01

    The Lusi mud eruption in East Java, Indonesia has displaced tens of thousands of people, with economic costs that exceed $4 billion USD to date. Consequently, understanding the cause and future of the eruption is important. There has been considerable debate as to whether the eruption was triggered by the MW 6.3 Yogyakarta earthquake, which struck two days prior to the eruption, or by drilling operations at a gas exploration well (BJP-1) 200 m from the 700 m lineament along which mud first erupted. A recent letter by Lupi et al. (Nature Geoscience, 2013) argues for an earthquake trigger, invoking the presence of a seismically fast structure that amplifies seismic shaking in the mud source region. The absence of an eruption during larger and closer earthquakes reveals that an earthquake trigger is unlikely. Furthermore, the seismic velocities central to the model of Lupi et al. are impossibly high and are primarily artifacts associated with the steel casing installed in the well where the velocities were measured. Finally, the stress changes caused by the drilling operations greatly exceeded those produced by the earthquake. Assuming no major changes in plumbing, we conclude by using satellite InSAR to reveal the evolution of surface deformation caused by the eruption and predict a 10-fold decrease in discharge over the next 5 years.

  1. What to Expect from the Virtual Seismologist: Delay Times and Uncertainties of Initial Earthquake Alerts in California

    NASA Astrophysics Data System (ADS)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Racine, R.; Meier, M.; Cauzzi, C.

    2013-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland, western Greece and Istanbul. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS installations in southern Italy, Romania, and Iceland are planned or underway. The possible use cases for an EEW system will be determined by the speed and reliability of earthquake source parameter estimates. A thorough understanding of both is therefore essential to evaluate the usefulness of VS. For California, we present state-wide theoretical alert times for hypothetical earthquakes by analyzing time delays introduced by the different components in the VS EEW system. Taking advantage of the fully probabilistic formulation of the VS algorithm we further present an improved way to describe the uncertainties of every magnitude estimate by evaluating the width and shape of the probability density function that describes the relationship between waveform envelope amplitudes and magnitude. We evaluate these new uncertainty values for past seismicity in California through off-line playbacks and compare them to the previously defined static definitions of uncertainty based on real-time detections. Our results indicate where VS alerts are most useful in California and also suggest where most effective improvements to the VS EEW system

  2. The physics of an earthquake

    NASA Astrophysics Data System (ADS)

    McCloskey, John

    2008-03-01

    The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.

  3. Paleomagnetic correlation of late Holocene earthquakes among estuaries in Washington and Oregon

    NASA Astrophysics Data System (ADS)

    Hagstrum, Jonathan T.; Atwater, Brian F.; Sherrod, Brian L.

    2004-10-01

    Paleomagnetic directions of estuarine mud provide additional evidence that individual earthquakes, or rapid series of earthquakes, caused widespread coseismic land-level changes during the past 2000 years in western Washington and Oregon. Most of the paleomagnetic measurements were made on mud dating from the first decades after coseismic subsidence from plate-boundary earthquakes at the Cascadia subduction zone. Mud deposited soon after the A.D. 1700 Cascadia earthquake has similar remanent directions among all five sites (k = 171) sampled along 80 km of Pacific coast between Grays Harbor and the mouth of the Columbia River. Likewise, internally consistent directions were obtained along this stretch of coast from mud deposited soon after a plate-boundary earthquake (or earthquake series) in A.D. 340-410 and soon after another such event in A.D. 680-720. Also analyzed were remanent magnetizations of mud deposited shortly before (or shortly after) land-level changes from seismicity in the North America plate beneath Puget Sound. A mean direction for sites on the Snohomish River delta, near Everett, from the time of an earthquake on the Seattle fault in A.D. 900-930 is statistically identical (95% confidence level) to a mean direction in mud that was uplifted in A.D. 800-1000 at potentially correlative sites near Tacoma and Olympia. The paleomagnetic direction from Everett for the upper-plate earthquake of A.D. 900-930 differs substantially from that for a plate-boundary earthquake in A.D. 810-1190. This difference implies that the upper-plate earthquake preceded the plate-boundary earthquake by a century or two on the basis of comparisons of their paleomagnetic poles with a previously reconstructed path of geomagnetic paleosecular variation in western North America.

  4. Probability based earthquake load and resistance factor design criteria for offshore platforms

    SciTech Connect

    Bea, R.G.

    1996-12-31

This paper describes a probability-based reliability formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional, steel, pile-supported, tubular-membered platforms, proposed as a basis for earthquake design criteria and guidelines for offshore platforms intended to have worldwide applicability. The formulation is illustrated with application to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.

  5. Evidence of strong earthquake shaking in the lower wabash valley from prehistoric liquefaction features

    USGS Publications Warehouse

    Obermeier, S.F.; Bleuer, N.R.; Munson, C.A.; Munson, P.J.; Martin, W.S.; Mcwilliams, K.M.; Tabaczynski, D.A.; Odum, J.K.; Rubin, M.; Eggert, D.L.

    1991-01-01

    Earthquake-induced liquefaction features in Holocene sediments provide evidence of strong prehistoric shaking, magnitude mb 6.2 to 6.7, in the Wabash Valley bordering Indiana and Illinois. The source of the one or more earthquakes responsible was almost certainly in or near the Wabash Valley. The largest event is interpreted to have occurred between 7500 and 1500 years ago on the basis of archeological, pedological, and stratigraphic relations.

  6. Fracking, wastewater disposal, and earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, Arthur

    2016-03-01

In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities also can induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories for which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed of by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks in the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains where standards of building construction are generally not designed to resist shaking from large earthquakes.

  7. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    PubMed

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics. PMID:26933682

  8. MyShake: A smartphone seismic network for earthquake early warning and beyond

    PubMed Central

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis; Kwon, Young-Woo

    2016-01-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics. PMID:26933682

  9. Induced Earthquakes Are Not All Alike: Examples from Texas Since 2008 (Invited)

    NASA Astrophysics Data System (ADS)

    Frohlich, C.

    2013-12-01

The EarthScope Transportable Array passed through Texas between 2008 and 2011, providing an opportunity to identify and accurately locate earthquakes near and/or within oil/gas fields and injection waste disposal operations. In five widely separated geographical locations, the results suggest seismic activity may be induced/triggered. However, the different regions exhibit different relationships between injection/production operations and seismic activity: In the Barnett Shale of northeast Texas, small earthquakes occurred only near higher-volume (volume rate > 150,000 BWPM) injection disposal wells. These included widely reported earthquakes occurring near Dallas-Fort Worth and Cleburne in 2008 and 2009. Near Alice in south Texas, M3.9 earthquakes occurred in 1997 and 2010 on the boundary of the Stratton Field, which had been highly productive for both oil and gas since the 1950s. Both earthquakes occurred during an era of net declining production, but their focal depths and location at the field boundary suggest an association with production activity. In the Eagle Ford of south central Texas, earthquakes occurred near wells following significant increases in extraction (water+produced oil) volumes as well as injection. The largest earthquake, the M4.8 Fashing earthquake of 20 October 2011, occurred after significant increases in extraction. In the Cogdell Field near Snyder (west Texas), a sequence of earthquakes beginning in 2006 followed significant increases in the injection of CO2 at nearby wells. The largest, with M4.4, occurred on 11 September 2011. This is the largest known earthquake possibly attributable to CO2 injection. Near Timpson in east Texas, a sequence of earthquakes beginning in 2008, including an M4.8 earthquake on 17 May 2012, occurred within three km of two high-volume injection disposal wells that had begun operation in 2007. These were the first known earthquakes at this location. In summary, the observations find possible induced

  10. Forecasting California’s earthquakes

    USGS Publications Warehouse

    Kerr, R. A.

    1988-01-01

For the first time, researchers have reached a consensus on the threat of large earthquakes to California, and things look no worse for Los Angeles than before. It still has about a 60 percent chance of being shaken by a large earthquake sometime during the next 30 years. But other heavily populated areas of California, such as San Bernardino and the East Bay area of San Francisco, are now getting their fair share of attention. The new consensus also points up the considerable uncertainties involved in earthquake forecasting.

  11. Seismology: dynamic triggering of earthquakes.

    PubMed

    Gomberg, Joan; Johnson, Paul

    2005-10-01

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar. PMID:16208360
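The several-microstrain threshold can be compared with recorded ground motions via the standard plane-wave approximation, in which peak dynamic strain is peak ground velocity divided by the phase velocity of the carrying wave. A minimal sketch (the function name and the 3.5 km/s surface-wave phase velocity are illustrative assumptions, not values from the paper):

```python
def peak_dynamic_strain(pgv, phase_velocity=3500.0):
    """Approximate peak dynamic strain as PGV / c (plane-wave
    approximation). `pgv` and `phase_velocity` are in m/s; the
    result is dimensionless strain."""
    return pgv / phase_velocity

# A surface wave with 1 cm/s peak ground velocity carries roughly
# 3 microstrain, i.e. near the triggering amplitude discussed above.
print(peak_dynamic_strain(0.01))
```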

  12. Earthquake damage to transportation systems

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

Earthquakes represent one of the most destructive natural hazards known to man. A large magnitude earthquake near a populated area can affect residents over thousands of square kilometers and cause billions of dollars in property damage. Such an event can kill or injure thousands of residents and disrupt the socioeconomic environment for months, sometimes years. A serious result of a large-magnitude earthquake is the disruption of transportation systems, which limits post-disaster emergency response. Movement of emergency vehicles, such as police cars, fire trucks and ambulances, is often severely restricted. Damage to transportation systems is categorized below by cause: ground failure, faulting, vibration damage, and tsunamis.

  13. The threat of silent earthquakes

    USGS Publications Warehouse

    Cervelli, Peter

    2004-01-01

Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise be ready to snap.

  14. Statistical analysis of earthquakes after the 1999 MW 7.7 Chi-Chi, Taiwan, earthquake based on a modified Reasenberg-Jones model

    NASA Astrophysics Data System (ADS)

    Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq

    2015-12-01

We investigated the temporal-spatial hazard of the earthquakes after the 1999 September 21 MW = 7.7 Chi-Chi shock in a continental region of Taiwan. The Reasenberg-Jones (RJ) model (Reasenberg and Jones, 1989, 1994), which combines the frequency-magnitude distribution (Gutenberg and Richter, 1944) and time-decaying occurrence rate (Utsu et al., 1995), is conventionally employed for assessing the earthquake hazard after a large shock. However, it is found that the b values in the frequency-magnitude distribution of the earthquakes in the study region dramatically decreased from background values after the Chi-Chi shock, and then gradually increased. The observation of a time-dependent frequency-magnitude distribution motivated us to propose a modified RJ model (MRJ) to assess the earthquake hazard. To see how the models perform in assessing short-term earthquake hazard, the RJ and MRJ models were separately used to sequentially forecast earthquakes in the study region. To depict the potential rupture area for future earthquakes, we further constructed relative hazard (RH) maps based on the two models. The Receiver Operating Characteristics (ROC) curves (Swets, 1988) finally demonstrated that the RH map based on the MRJ model was, in general, superior to the one based on the original RJ model for exploring the spatial hazard of earthquakes in a short time after the Chi-Chi shock.
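For reference, the RJ occurrence rate combines exactly these two ingredients: a Gutenberg-Richter magnitude term and Omori-Utsu time decay. A minimal sketch, using the generic California parameter values quoted by Reasenberg and Jones (1989) purely for illustration:

```python
def rj_rate(t_days, mag, mainshock_mag, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Reasenberg-Jones aftershock rate: expected number per day of
    aftershocks with magnitude >= `mag`, `t_days` after a mainshock
    of magnitude `mainshock_mag`.

    Combines the Gutenberg-Richter magnitude term 10**(a + b*(Mm - M))
    with Omori-Utsu time decay (t + c)**(-p)."""
    return 10 ** (a + b * (mainshock_mag - mag)) * (t_days + c) ** (-p)

# Expected daily rate of M>=5 aftershocks one day after an M 7.7 mainshock:
print(round(rj_rate(1.0, 5.0, 7.7), 2))
```

The MRJ modification described in the abstract amounts to letting the b value in the magnitude term vary with time since the mainshock rather than holding it fixed.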

  15. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1977-01-01

    In a computer simulation study of earthquakes a seismically active strike slip fault is represented by coupled mechanical blocks which are driven by a moving plate and which slide on a friction surface. Elastic forces and time independent friction are used to generate main shock events, while viscoelastic forces and time dependent friction add aftershock features. The study reveals that the size, length, and time and place of event occurrence are strongly influenced by the magnitude and degree of homogeneity in the elastic, viscous, and friction parameters of the fault region. For example, periodically reoccurring similar events are observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps are a common feature of simulations employing large variations in the fault parameters. The study also reveals correlations between strain energy release and fault length and average displacement and between main shock and aftershock displacements.
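The stick-slip cycle underlying such simulations can be reduced to a single driven block; a minimal sketch in dimensionless integer units (all parameter values are illustrative, and the paper's multi-block viscoelastic couplings and time-dependent friction are omitted):

```python
def stick_slip(steps=200, loading=1, strength=100, stress_drop=90):
    """One driven block on a friction surface: the moving plate loads
    the block's spring at a constant rate; when the spring force
    reaches the static friction `strength`, the block slips and the
    force drops by `stress_drop`. Returns the step indices of the
    slip events (the synthetic 'main shocks')."""
    force, events = 0, []
    for step in range(1, steps + 1):
        force += loading
        if force >= strength:
            events.append(step)
            force -= stress_drop
    return events

# Homogeneous parameters give periodically recurring events:
print(stick_slip())  # → [100, 190]
```

Heterogeneity enters when each block along a multi-block fault gets its own strength and coupling; as the abstract notes, that is what turns periodic recurrence into irregular sequences and seismic gaps.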

  16. Nonextensive models for earthquakes

    NASA Astrophysics Data System (ADS)

    Silva, R.; França, G. S.; Vilar, C. S.; Alcaniz, J. S.

    2006-02-01

We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.

  17. Reconsidering earthquake scaling

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Wech, A.; Creager, K.; Obara, K.; Agnew, D.

    2016-06-01

    The relationship (scaling) between scalar moment, M0, and duration, T, potentially provides key constraints on the physics governing fault slip. The prevailing interpretation of M0-T observations proposes different scaling for fast (earthquakes) and slow (mostly aseismic) slip populations and thus fundamentally different driving mechanisms. We show that a single model of slip events within bounded slip zones may explain nearly all fast and slow slip M0-T observations, and both slip populations have a change in scaling, where the slip area growth changes from 2-D when too small to sense the boundaries to 1-D when large enough to be bounded. We present new fast and slow slip M0-T observations that sample the change in scaling in each population, which are consistent with our interpretation. We suggest that a continuous but bimodal distribution of slip modes exists and M0-T observations alone may not imply a fundamental difference between fast and slow slip.
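The two scaling regimes described above follow from a standard dimensional sketch (consistent with the abstract, not the paper's full derivation): with rupture speed $v$, mean slip $\bar{D}$, rigidity $\mu$, and slip-zone width $W$,

```latex
M_0 = \mu A \bar{D}, \qquad
\begin{cases}
A \propto (vT)^2,\ \bar{D} \propto vT & \Rightarrow\ M_0 \propto T^3
  \quad \text{(unbounded, 2-D growth)}\\[4pt]
A \propto WvT,\ \bar{D} \propto W & \Rightarrow\ M_0 \propto T
  \quad \text{(bounded by width } W\text{, 1-D growth)}
\end{cases}
```

Unbounded growth in two dimensions gives the familiar cubic moment-duration scaling; once a rupture saturates the width of its slip zone, growth is one-dimensional and the scaling becomes linear, for fast and slow slip alike.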

  18. Earthquake Breccias (Invited)

    NASA Astrophysics Data System (ADS)

    Rowe, C. D.; Melosh, B. L.; Lamothe, K.; Schnitzer, V.; Bate, C.

    2013-12-01

    Fault breccias are one of the fundamental classes of fault rocks and are observed in many exhumed faults. Some breccias have long been assumed to form co-seismically, but textural or mechanistic evidence for the association with earthquakes has never been documented. For example, at dilational jogs in brittle faults, it is common to find small bodies of chaotic breccia in lenticular or rhombohedral voids bounded by main slip surfaces and linking segments. Sibson interpreted these 'implosion breccias' as evidence of wall rock fracturing during sudden unloading when the dilational jogs open during earthquake slip (Sibson 1985, PAGEOPH v. 124, n. 1, 159-175). However, the role of dynamic fracturing in forming these breccias has not been tested. Moreover, the criteria for identifying implosion breccia have not been defined - do all breccias in dilational jogs or step-overs represent earthquake slip? We are building a database of breccia and microbreccia textures to develop a strictly observational set of criteria for distinction of breccia texture classes. Here, we present observations from the right-lateral Pofadder Shear Zone, South Africa, and use our textural criteria to identify the relative roles of dynamic and quasi-static fracture patterns, comminution/grinding and attrition, hydrothermal alteration, dissolution, and cementation. Nearly 100% exposure in the hyper-arid region south of the Orange River allowed very detailed mapping of frictional fault traces associated with rupture events, containing one or more right-steps in each rupture trace. Fracture patterns characteristic of on- and off-fault damage associated with propagation of dynamic rupture are observed along straight segments of the faults. The wall rock fractures are regularly spaced, begin at the fault trace and propagate at a high angle to the fault, and locally branch into subsidiary fractures before terminating a few cm away. This pattern of fractures has been previously linked to dynamic

  19. Earthquake funding restored

    NASA Astrophysics Data System (ADS)

    Bush, Susan

    Funding levels for the U.S. Geological Survey's part of the National Earthquake Hazards Reduction Program for FY92 have been restored by the House and a Senate subcommittee. The president's budget request for FY92 was only $37.3 million, lower than the $54.5 million authorized by Congress for FY91. Earlier this year the House agreed on restoring $10 million to the program. Some AGU members have been trying to see the full $17.2 million difference restored. It is reported that the Senate will agree to give $15 million to the program.When Congress reconvenes in September the full Senate will vote on the Department of Interior and Related Agencies appropriations bill (HR2686). After that, the bill will go to a joint conference committee, where differences between the House and Senate will be resolved before the bill is passed along to the president.

  20. Nonextensive models for earthquakes.

    PubMed

    Silva, R; França, G S; Vilar, C S; Alcaniz, J S

    2006-02-01

We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude. PMID:16605393

  1. Sichuan Earthquake in China

    NASA Technical Reports Server (NTRS)

    2008-01-01

The Sichuan earthquake in China occurred on May 12, 2008, along faults within the mountains, but near and almost parallel to the mountain front, northwest of the city of Chengdu. This major quake caused immediate and severe damage to many villages and cities in the area. Aftershocks pose a continuing danger, but another continuing hazard is the widespread occurrence of landslides that have formed new natural dams and consequently new lakes. These lakes are submerging roads and flooding previously developed lands. But an even greater concern is the possible rapid release of water as the lakes eventually overflow the new dams. The dams are generally composed of disintegrated rock debris that may easily erode, leading to greater release of water, which may then cause faster erosion and an even greater release of water. This possible 'positive feedback' between increasing erosion and increasing water release could result in catastrophic debris flows and/or flooding. The danger is well known to the Chinese earthquake response teams, which have been building spillways over some of the new natural dams.

This ASTER image, acquired on June 1, 2008, shows two of the new large landslide dams and lakes upstream from the town of Chi-Kua-Kan at 32°12'N latitude and 104°50'E longitude. Vegetation is green, water is blue, and soil is grayish brown in this enhanced color view. New landslides appear bright off-white. The northern (top) lake is upstream from the southern lake. Close inspection shows a series of much smaller lakes in an elongated 'S' pattern along the original stream path. Note especially the large landslides that created the dams. Some other landslides in this area, such as the large one in the northeast corner of the image, occur only on the mountain slopes, so do not block streams, and do not form lakes.

  2. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Mayeda, K.; Ruppert, S.

    2002-12-01

There is currently a disagreement within the geophysical community on the way earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g. Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set of recent papers finds the apparent stress increases with magnitude (e.g. Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by analyzing aftershock sequences in the Western U.S. and Turkey using two different techniques. First we examine the observed regional S-wave spectra by fitting with a parametric model (Walter and Taylor, 2002) with and without variable stress drop scaling. Because the aftershock sequences have common stations and paths we can examine the S-wave spectra of events by size to determine what type of apparent stress scaling, if any, is most consistent with the data. Second we use regional coda envelope techniques (e.g. Mayeda and Walter, 1996; Mayeda et al., 2002) on the same events to directly measure energy and moment. The coda technique corrects for path and site effects using an empirical Green function technique and independent calibration with surface wave derived moments. Our hope is that by carefully analyzing a very large number of events in a consistent manner using two different techniques we can start to resolve this apparent stress scaling issue. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.
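The quantity under debate has a compact definition: apparent stress is rigidity times radiated energy per unit seismic moment. A minimal sketch (the function name and the generic crustal shear modulus are illustrative assumptions):

```python
def apparent_stress(radiated_energy, seismic_moment, rigidity=3.0e10):
    """Apparent stress sigma_a = mu * E_R / M0, in Pa.

    `radiated_energy` (J) and `seismic_moment` (N m); `rigidity` is a
    generic crustal shear modulus in Pa, assumed for illustration."""
    return rigidity * radiated_energy / seismic_moment

# Constant apparent stress means E_R / M0 is independent of event size;
# e.g. E_R / M0 = 5e-5 corresponds to sigma_a ≈ 1.5 MPa:
print(apparent_stress(5.0e13, 1.0e18))
```

The scaling debate in the abstract is then simply whether this ratio stays flat or grows as seismic moment increases.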

  3. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
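The short-term-average/long-term-average trigger described above can be sketched in a few lines; the per-minute series, window lengths, and ratio threshold here are illustrative, not the tuned USGS values:

```python
def sta_lta_detect(counts, sta_len=2, lta_len=20, threshold=5.0):
    """Flag indices where the short-term average of a tweet-frequency
    series exceeds `threshold` times the long-term average.

    `counts` is tweets per time bin; both averages are taken over the
    bins immediately preceding index i."""
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len
        lta = sum(counts[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet chatter about "earthquake", then a sudden burst of tweets:
series = [1, 0, 2, 1, 1, 0, 1, 2, 1, 1, 0, 1, 1, 2, 1, 0, 1, 1, 2, 1, 40, 55]
print(sta_lta_detect(series))  # → [21]
```

Dividing a short recent average by a long background average is what keeps the detector sensitive to sudden bursts while ignoring slow diurnal swings in baseline tweet volume.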

  4. Earthquake Shaking and Damage to Buildings: Recent evidence for severe ground shaking raises questions about the earthquake resistance of structures.

    PubMed

    Page, R A; Joyner, W B; Blume, J A

    1975-08-22

    Ground shaking close to the causative fault of an earthquake is more intense than it was previously believed to be. This raises the possibility that large numbers of buildings and other structures are not sufficiently resistant for the intense levels of shaking that can occur close to the fault. Many structures were built before earthquake codes were adopted; others were built according to codes formulated when less was known about the intensity of near-fault shaking. Although many building types are more resistant than conventional design analyses imply, the margin of safety is difficult to quantify. Many modern structures, such as freeways, have not been subjected to and tested by near-fault shaking in major earthquakes (magnitude 7 or greater). Damage patterns in recent moderate-sized earthquakes occurring in or adjacent to urbanized areas (17), however, indicate that many structures, including some modern ones designed to meet earthquake code requirements, cannot withstand the severe shaking that can occur close to a fault. It is necessary to review the ground motion assumed and the methods utilized in the design of important existing structures and, if necessary, to strengthen or modify the use of structures that are found to be weak. New structures situated close to active faults should be designed on the basis of ground motion estimates greater than those used in the past. The ultimate balance between risk of earthquake losses and cost for both remedial strengthening and improved earthquake-resistant construction must be decided by the public. Scientists and engineers must inform the public about earthquake shaking and its effect on structures. The exposure to damage from seismic shaking is steadily increasing because of continuing urbanization and the increasing complexity of lifeline systems, such as power, water, transportation, and communication systems. In the near future we should expect additional painful examples of the damage potential of moderate

  5. The key role of eyewitnesses in rapid earthquake impact assessment

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

    Uncertainties in rapid earthquake impact models are intrinsically large even when excluding potential indirect losses (fires, landslides, tsunami…). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building type inventory, and vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty are of comparable dimensions, about 10-15 km. When such an earthquake strikes close to an urban area, as in 1999 in Athens (M5.9), earthquake location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in 2002 in Molise, Italy (M5.7), in Bingol, Turkey (M6.4) in 2003, or in Christchurch, New Zealand (M6.3), where respectively 23 out of 30, 84 out of 176, and 115 out of 185 of the casualties perished in a single building failure. In contrast, for major earthquakes (M>7), the point source approximation is no longer valid, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral, bilateral, etc., and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  6. Estimating pore fluid pressures during the Youngstown, Ohio earthquakes

    NASA Astrophysics Data System (ADS)

    Hsieh, P. A.

    2014-12-01

    Several months after fluid injection began in December 2010 at the Northstar 1 well in Youngstown, Ohio, low-magnitude earthquakes were detected in the Youngstown area, where no prior earthquakes had been detected. Concerns that the injection might have triggered the earthquakes led to shutdown of the well in December 2011. Earthquake relocation analysis by Kim (2013, J. Geophy. Res., v 118, p. 3506-3518) showed that, from March 2011 to January 2012, 12 earthquakes with moment magnitudes of 1.8 to 3.9 occurred at depths of 3.5 to 4 km in the Precambrian basement along a previously unmapped vertical fault. The 2.8 km deep Northstar 1 well, which penetrated the top 60 m of the basement, appeared to have been drilled into the same fault. The earthquakes occurred at lateral distances of 0 to 1 km from the well. The present study aims to estimate the fluid pressure increase due to injection. The groundwater flow model MODFLOW is used to simulate fluid pressure propagation from the well injection interval into the basement fault and two permeable sandstone layers above the basement. The basement rock away from the fault is assumed impermeable. Reservoir properties (permeability and compressibility) of the fault and sandstone layers are estimated by calibrating the model to match the injection history and wellhead pressure recorded daily during the operational period. Although the available data are not sufficient to uniquely determine reservoir properties, it is possible to determine reasonable ranges. Simulated fluid pressure increases at the locations and times of the earthquakes range from less than 0.01 MPa to about 1 MPa. Pressure measurements in the well after shut-in might enhance the estimation of reservoir properties. Such data could also improve the estimation of the pore fluid pressure increase due to injection.
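As a rough illustration of the kind of calculation involved (not the study's calibrated MODFLOW model), pressure diffusion from a constant-pressure injection boundary into a 1-D permeable fault can be approximated with a complementary error function. All numbers below are hypothetical:

```python
import math

def pressure_rise(x_m, t_s, dp_well_pa, diffusivity_m2_s):
    """1-D diffusion from a constant-pressure boundary:
    dp(x, t) = dp_well * erfc(x / (2 * sqrt(D * t)))."""
    return dp_well_pa * math.erfc(x_m / (2.0 * math.sqrt(diffusivity_m2_s * t_s)))

# Hypothetical values (not the study's calibrated ones): 1 MPa sustained at the
# injection interval, hydraulic diffusivity 0.05 m^2/s along the fault,
# evaluated 500 m away after ~3 months of injection
dp = pressure_rise(x_m=500.0, t_s=90 * 86400, dp_well_pa=1e6, diffusivity_m2_s=0.05)
print(f"{dp / 1e6:.2f} MPa")  # a few tenths of a MPa
```

Even this toy model lands in the 0.01-1 MPa range the abstract reports, which is why pressure increases of this size are plausible at kilometer-scale distances within months.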

  7. Seismology: Remote-controlled earthquakes

    NASA Astrophysics Data System (ADS)

    Hayes, Gavin

    2016-04-01

    Large earthquakes cause other quakes near and far. Analyses of quakes in Pakistan and Chile suggest that such triggering can occur almost instantaneously, making triggered events hard to detect, and potentially enhancing the associated hazards.

  8. Earthquakes in Stable Continental Crust.

    ERIC Educational Resources Information Center

    Johnston, Arch C.; Kanter, Lisa R.

    1990-01-01

    Discussed are some of the reasons for earthquakes which occur in stable crust away from familiar zones at the ends of tectonic plates. Crust stability and the reactivation of old faults are described using examples from India and Australia. (CW)

  9. The next new Madrid earthquake

    SciTech Connect

    Atkinson, W.

    1988-01-01

    Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.

  10. Electrostatics in sandstorms and earthquakes

    NASA Astrophysics Data System (ADS)

    Shinbrot, Troy; Thyagu, Nirmal; Paehtz, Thomas; Herrmann, Hans

    2010-11-01

    We present new data demonstrating (1) that electrostatic charging in sandstorms is a necessary outcome in a class of rapid collisional flows, and (2) that electrostatic precursors to slip events - long reported in earthquakes - can be reproduced in the laboratory.

  11. Geochemical challenge to earthquake prediction.

    PubMed Central

    Wakita, H

    1996-01-01

    The current status of geochemical and groundwater observations for earthquake prediction in Japan is described. The development of the observations is discussed in relation to the progress of the earthquake prediction program in Japan. Three major findings obtained from our recent studies are outlined. (i) Long-term radon observation data over 18 years at the SKE (Suikoen) well indicate that the anomalous radon change before the 1978 Izu-Oshima-kinkai earthquake can with high probability be attributed to precursory changes. (ii) It is proposed that certain sensitive wells exist which have the potential to detect precursory changes. (iii) The appearance and nonappearance of coseismic radon drops at the KSM (Kashima) well reflect changes in the regional stress state of an observation area. In addition, some preliminary results of chemical changes of groundwater prior to the 1995 Kobe (Hyogo-ken nanbu) earthquake are presented. PMID:11607665

  12. Earthquakes and relative sealevel changes

    NASA Astrophysics Data System (ADS)

    Melini, D.; Piersanti, A.; Spada, G.; Soldati, G.; Casarotti, E.; Boschi, E.

    2004-05-01

    Using a spherical model of postseismic deformation, for the first time we have computed the global contribution of large earthquakes to the relative sealevel variations in the twentieth century. We have found that great earthquakes have the overall tendency to produce a sealevel rise, and that they affect the measurements taken at those tide-gauge sites that are commonly employed to obtain global estimates of sealevel rise. Though on a global scale most of the signal is associated with thrust events, on a regional scale the effects of great transcurrent earthquakes cannot be neglected. Depending on the viscosity of the asthenosphere, the contribution of earthquakes to the long-term sealevel changes amounts to at least 0.1 mm/yr. Thus, the climate-driven long-term sealevel changes deduced by tide-gauge observations may be slightly, but not negligibly, overestimated.

  13. Earthquake Alert System feasibility study

    SciTech Connect

    Harben, P.E.

    1991-12-01

    An Earthquake Alert System (EAS) could give several seconds to several tens of seconds of warning before the strong motion from a large earthquake arrives. Such a system would include a large network of sensors distributed within an earthquake-prone region. The sensors closest to the epicenter of a particular earthquake would transmit data at the speed of light to a central processing center, which would broadcast an area-wide alarm in advance of the spreading elastic wave energy from the earthquake. This is possible because seismic energy travels slowly (3-6 km/s) compared to the speed of light. Utilities, public and private institutions, businesses, and the general public would benefit from an EAS. Although many earthquake protection systems exist that automatically shut down power, gas mains, etc. when ground motion at a facility reaches damaging levels, no EAS -- that is, a system that can provide warning in advance of elastic wave energy arriving at a facility -- has ever been developed in the United States. A recent study by the National Academy of Sciences (NRC, 1991) concludes that an EAS is technically feasible and strongly recommends installing a prototype system that makes use of existing microseismic stations as much as possible. The EAS concept discussed here consists of a distributed network of remote seismic stations that measure weak and strong earth motion and transmit the data in real time to a central facility. This facility processes the data and issues warning broadcasts in the form of information packets containing estimates of earthquake location, zero time (the time the earthquake began), magnitude, and reliability of the predictions. Users of the warning broadcasts have a dedicated receiver that monitors the warning broadcast frequency. Users also have preprogrammed responses that are automatically executed when the warning information packets contain location and magnitude estimates above a facility's tolerance.
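The warning-time arithmetic behind an EAS can be sketched as follows; the S-wave speed and the fixed detection-plus-processing delay are assumed round numbers for illustration, not values from the study:

```python
def warning_time_s(epicentral_dist_km, s_velocity_km_s=3.5, processing_delay_s=5.0):
    """Seconds of warning before S-wave arrival at a facility, assuming the
    alert (traveling effectively at the speed of light) is broadcast after a
    fixed detection-and-processing delay near the epicenter."""
    return epicentral_dist_km / s_velocity_km_s - processing_delay_s

for d_km in (20, 50, 100, 200):
    print(d_km, "km ->", round(warning_time_s(d_km), 1), "s")
```

At 20 km the warning is under a second, while at 200 km it approaches a minute, matching the "several seconds to several tens of seconds" range quoted in the abstract.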

  15. Mitigating earthquakes; the federal role

    USGS Publications Warehouse

    Press, F.

    1977-01-01

    With the rapid approach of a capability to make reliable earthquake forecasts, it is essential that the Federal Government play a strong, positive role in formulating and implementing plans to reduce earthquake hazards. Many steps are being taken in this direction, with the President looking to the Office of Science and Technology Policy (OSTP) in his Executive Office to provide leadership in establishing and coordinating Federal activities.

  16. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    NASA Astrophysics Data System (ADS)

    Stefansson, R.; Bonafede, M.

    2012-04-01

    approach of statistics of universal precursors or stress level. The approach is related more closely to failure physics, through study of the ongoing failure. But it requires watching and relevant modeling for years, even decades. Useful information on the fault process and warnings can be issued along the way, starting when we discover a fault showing signs of preparatory processes and continuing up to the time of the earthquake. Such information and warnings could be issued by government agencies, in cooperation with scientists, to the local Civil Protection committee closest to the fault, with information about how to prepare, including directives about enhanced watching. For such a warning service we need a continuously operating geo-watching system, applying modern computing technology to the multidisciplinary data, and a rule-based schedule for preparing adequate warnings.

  17. Earthquake catalogue for Germany and adjacent areas for the years 800 to 2008

    NASA Astrophysics Data System (ADS)

    Leydecker, Günter

    2010-05-01

    The presented earthquake catalogue for Germany and adjacent areas (47°N - 56°N and 5°E - 16°E) for the years 800 to 2008 contains ca. 12,000 events with ML ≥ 2.0. The earthquake catalogue in digital format was first published by the author in 1986. It contained ca. 2,000 earthquakes covering the period from 1000 to 1981. Since then the spatial area and the time period of the catalogue have been extended, and the catalogue has been updated annually and, in line with current historical earthquake research, corrected and supplemented. The immense growth in the number of events since 1986 must be seen in parallel with the increase in seismic stations in the local and regional networks. The data for the annual update are taken from the Data Catalogues of the German Regional Seismic Network - compiled and edited by the Federal Institute for Geosciences and Natural Resources (BGR), Hannover - and, with priority, from the later results of the hypocenter computations by the operators of the local networks. Added to the earthquake data are the macroseismic observations, as epicentral intensity and isoseismal radii. Each earthquake is linked to its seismo-geographical region. The current version of the digital earthquake catalogue is available for download at www.bgr.de/quakecat. Over the past years an alignment was carried out with the catalogues of the adjacent countries, in particular with the new catalogue of Switzerland. Through cooperation with colleagues, the data of the Vogtland earthquake swarms of the 20th century could be corrected and harmonized with respect to epicenter, magnitude, and epicentral intensity. A re-evaluation of the seismic activity of the Upper Rhine Graben was accomplished using data from the long-lasting monitoring with local seismic networks. Furthermore, the moment magnitude was newly added to the list of earthquake parameters. The appendices to the earthquake catalogue contain detailed references for each earthquake, lists of the

  18. Elastic energy release in great earthquakes and eruptions

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust

    2014-05-01

    The sizes of earthquakes are measured using well-defined, measurable quantities such as seismic moment and released (transformed) elastic energy. No similar measures exist for the sizes of volcanic eruptions, making it difficult to compare the energies released in earthquakes and eruptions. Here I provide a new measure of the elastic energy (the potential mechanical energy) associated with magma chamber rupture and contraction (shrinkage) during an eruption. For earthquakes and eruptions, elastic energy derives from two sources: (1) the strain energy stored in the volcano/fault zone before rupture, and (2) the external applied load (force, pressure, stress, displacement) on the volcano/fault zone. From thermodynamic considerations it follows that the elastic energy released or transformed (dU) during an eruption is directly proportional to the excess pressure (pe) in the magma chamber at the time of rupture multiplied by the volume decrease (-dVc) of the chamber, so that dU = -pe dVc. This formula can be used as a basis for a new eruption magnitude scale, based on elastic energy released, which can be related to the moment-magnitude scale for earthquakes. For very large eruptions (>100 km3), the volume of the feeder-dike is negligible, so that the decrease in chamber volume during an eruption corresponds roughly to the associated volume of erupted materials (Ver), so that the elastic energy is dU ≈ pe Ver. Using a typical excess pressure of 5 MPa, it is shown that the largest known eruptions on Earth, such as the explosive La Garita Caldera eruption (27-28 million years ago) and the largest single (effusive) Columbia River basalt lava flows (15-16 million years ago), both of which have estimated volumes of about 5000 km3, released elastic energy of the order of 10 EJ. For comparison, the seismic moment of the largest earthquake ever recorded, the M9.5 1960 Chile earthquake, is estimated at 100 ZJ and the associated elastic energy release at 10 EJ.
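The stated numbers are easy to check: for a very large eruption the released elastic energy is roughly the excess pressure times the erupted volume. A quick sketch using the abstract's 5 MPa and 5000 km3 figures:

```python
def eruption_elastic_energy_j(excess_pressure_pa, erupted_volume_m3):
    """For very large eruptions, the chamber volume decrease roughly equals
    the erupted volume, so dU ~ pe * Ver."""
    return excess_pressure_pa * erupted_volume_m3

pe = 5e6            # 5 MPa excess pressure at chamber rupture
ver = 5000 * 1e9    # 5000 km^3 expressed in m^3
print(eruption_elastic_energy_j(pe, ver))  # 2.5e19 J, i.e. of order 10 EJ
```

The result, 2.5e19 J, is indeed of the order of 10 EJ, consistent with the comparison drawn to the 1960 Chile earthquake.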

  19. Coseismic ionospheric and geomagnetic disturbances caused by great earthquakes

    NASA Astrophysics Data System (ADS)

    Hao, Yongqiang; Zhang, Donghe; Xiao, Zuo

    2016-04-01

    Despite primary energy disturbances from the Sun, oscillations of the Earth's surface due to a large earthquake will couple with the atmosphere and therefore the ionosphere, so that so-called coseismic ionospheric disturbances (CIDs) can be detected in the ionosphere. Using a combination of techniques (total electron content, HF Doppler, and ground magnetometers), a new time sequence of the propagation of such effects was developed on an observational basis, and ideas on its explanation are provided. In the cases of the 2008 Wenchuan and 2011 Tohoku earthquakes, infrasonic waves accompanying the propagation of seismic Rayleigh waves were observed in the ionosphere by all three techniques. This is the first report to present CIDs recorded by different techniques at co-located sites and profiled with regard to changes of both the ionospheric plasma and current (geomagnetic field) simultaneously. Comparison between the oceanic (2011 Tohoku) and inland (2008 Wenchuan) earthquakes revealed that the main directional lobe of the latter case is more distinct and is perpendicular to the direction of the fault rupture. We argue that the different fault slip (inland or submarine) may affect the coupling of the lithosphere with the atmosphere. References: Zhao, B., and Y. Hao (2015), Ionospheric and geomagnetic disturbances caused by the 2008 Wenchuan earthquake: A revisit, J. Geophys. Res. Space Physics, 120, doi:10.1002/2015JA021035. Hao, Y. Q., Z. Xiao, and D. H. Zhang (2013), Teleseismic magnetic effects (TMDs) of 2011 Tohoku earthquake, J. Geophys. Res. Space Physics, 118, 3914-3923, doi:10.1002/jgra.50326. Hao, Y. Q., Z. Xiao, and D. H. Zhang (2012), Multi-instrument observation on co-seismic ionospheric effects after great Tohoku earthquake, J. Geophys. Res., 117, A02305, doi:10.1029/2011JA017036.

  20. Hydrological signatures of earthquake strain

    SciTech Connect

    Muir-Wood, R.; King, G.C.P. |

    1993-12-01

    The character of the hydrological changes that follow major earthquakes has been investigated and found to be dependent on the style of faulting. The most significant response is found to accompany major normal fault earthquakes. Increases in spring and river discharges peak a few days after the earthquake, and typically, excess flow is sustained for a period of 6-12 months. In contrast, hydrological changes accompanying pure reverse fault earthquakes are either undetected or indicate lowering of well levels and spring flows. Strike-slip and oblique-slip fault movements are associated with a mixture of responses but appear to release no more than 10% of the water volume of the same sized normal fault event. For two major normal fault earthquakes in the western United States (those of Hebgen Lake on August 17, 1959, and Borah Peak on October 28, 1983), there is sufficient river flow information to allow the magnitude and extent of the postseismic discharge to be quantified. The discharge has been converted to a rainfall equivalent, which is found to exceed 100 mm close to the fault and to remain above 10 mm at distances greater than 50 km. Results suggest that water-filled cracks are ubiquitous throughout the brittle continental crust and that these cracks open and close throughout the earthquake cycle. The existence of tectonically induced fluid flows on the scale that we demonstrate has major implications for our understanding of the mechanical and chemical behavior of crustal rocks.
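The rainfall-equivalent conversion used above is simply excess discharge volume divided by the area it drains from. A sketch with hypothetical numbers (not the paper's measured discharges):

```python
def rainfall_equivalent_mm(excess_discharge_m3, area_m2):
    """Express excess post-seismic discharge as an equivalent depth of
    rainfall over a given area: depth = volume / area, converted to mm."""
    return excess_discharge_m3 / area_m2 * 1000.0

# Hypothetical: 0.3 km^3 of excess flow distributed over a 50 km x 50 km region
print(rainfall_equivalent_mm(0.3e9, 50e3 * 50e3))  # ~120 mm
```

Values above 100 mm near the fault, as a depth of "virtual rainfall", give an intuitive sense of how much crustal water a large normal-faulting event can expel.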

  1. Building with Earthquakes in Mind

    NASA Astrophysics Data System (ADS)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  2. Earthquakes - Volcanoes (Causes and Forecast)

    NASA Astrophysics Data System (ADS)

    Tsiapas, E.

    2009-04-01

    Earthquakes are caused by large quantities of liquids (e.g. H2O, H2S, SO2, etc.) moving through the lithosphere and pyrosphere (MOHO discontinuity) until they meet projections (mountains, negative projections, or projections coming from sinking lithosphere). The liquids move from west to east, carried along by the pyrosphere because of the differential speed of rotation of the pyrosphere relative to the lithosphere. Taking as a starting point an earthquake noticed in one area, and using statistical studies, we know when, where, and at what rate an earthquake caused by the same quantity of liquids may occur in the next region to the east. The forecast of an earthquake ceases to be valid if these components meet a crack in the lithosphere (e.g. the limits of lithospheric plates) or a volcano crater. In this case the liquids escape into the atmosphere in the form of gases, carrying small quantities of lava with them (volcanic explosion).

  3. Global earthquake fatalities and population

    USGS Publications Warehouse

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
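A Poisson rate proportional to population implies that the expected number of catastrophic events scales with the time-integrated population. A crude sketch approximating that integral with century-average populations (the averages below are my rough assumptions, not figures from the paper):

```python
def expected_catastrophic_quakes(observed_prev_century, avg_pop_prev_b, avg_pop_next_b):
    """Nonstationary Poisson with rate proportional to population: the expected
    count scales with the time-integrated (here, century-average) population."""
    return observed_prev_century * avg_pop_next_b / avg_pop_prev_b

# Assumed century-average world populations: ~3.3 billion over 1900-2000,
# ~7.7 billion over 2000-2100 if population peaks near 10.1 billion
print(round(expected_catastrophic_quakes(4, 3.3, 7.7), 1))  # ~9.3
```

The rough scaling lands near the paper's 8.7±3.3 prediction for >100,000-fatality events, illustrating why the forecast follows almost directly from population growth.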

  4. Two models for earthquake forerunners

    USGS Publications Warehouse

    Mjachkin, V.I.; Brace, W.F.; Sobolev, G.A.; Dieterich, J.H.

    1975-01-01

    Similar precursory phenomena have been observed before earthquakes in the United States, the Soviet Union, Japan, and China. Two quite different physical models are used to explain these phenomena. According to a model developed by US seismologists, the so-called dilatancy diffusion model, the earthquake occurs near maximum stress, following a period of dilatant crack expansion. Diffusion of water in and out of the dilatant volume is required to explain the recovery of seismic velocity before the earthquake. According to a model developed by Soviet scientists growth of cracks is also involved but diffusion of water in and out of the focal region is not required. With this model, the earthquake is assumed to occur during a period of falling stress and recovery of velocity here is due to crack closure as stress relaxes. In general, the dilatancy diffusion model gives a peaked precursor form, whereas the dry model gives a bay form, in which recovery is well under way before the earthquake. A number of field observations should help to distinguish between the two models: study of post-earthquake recovery, time variation of stress and pore pressure in the focal region, the occurrence of pre-existing faults, and any changes in direction of precursory phenomena during the anomalous period. © 1975 Birkhäuser Verlag.

  5. Modeling of the Coseismic Electromagnetic Field Observed during the 28 September 2004, M 6.0 Parkfield Earthquake

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Harris, J. M.; Wen, J.; Chen, X.; Hu, H.

    2014-12-01

    On 28 September 2004, the M6.0 Parkfield earthquake took place on the San Andreas fault, California. A seismic station named PKD, located near the epicenter, recorded both seismic and electromagnetic (EM) signals during this earthquake. This station is operated by the Berkeley Seismological Laboratory and is equipped with a broadband seismometer and EM sensors that are close to each other. Significant seismic signals as well as clear coseismic EM signals were recorded during this earthquake, providing a good opportunity to study the coseismic EM phenomenon. We modeled the coseismic EM signals from the viewpoint of the electrokinetic effect on the basis of Pride's equations. The earthquake source is taken as a finite fault with a length of 40 km along the strike direction and a width of 15 km along the dip direction. The source parameters that we use for the calculation were inverted by Liu et al. [2006, BSSA] using the seismic data. While in their inversion the Earth's crust is treated as 7 horizontally layered elastic solids, in our calculation these solid layers are regarded as porous media. Each porous layer has the same P-velocity, S-velocity, and density as its counterpart solid layer. The salinity is set to 0.1 mol/L for all the layers so that conductivity is uniformly distributed with the value of 0.036 S/m. To evaluate the electric and magnetic responses during the rupturing of the earthquake, we use the algorithm developed by Hu and Gao [2011, JGR], which calculates both the seismic and EM wavefields simultaneously. Since the inversion of the source parameters was performed in the frequency band 0.16 Hz-1 Hz, we filter both the synthetic seismoelectric wavefields and the real data before comparing them. Our preliminary result shows that in this frequency range, the amplitude of the simulated coseismic electric field is of the order of 1 μV/m, the same order as the real electric data. This supports the electrokinetic effect to be

  6. Lessons learned by the DOE complex from recent earthquakes

    SciTech Connect

    Eli, M.W.

    1993-07-01

Recent earthquake damage investigations at various industrial facilities have provided the DOE complex with reminders of practical lessons for structures, systems, and components (SSCs) involving: confinement of hazardous materials; continuous, safe operations; occupant safety; and protection of DOE investments and mission-dependent items. Recent assessments are summarized, showing examples of damage caused by the 1992 California earthquakes (Cape Mendocino, Landers, and Big Bear) and the 1991 Costa Rica earthquake (Valle de la Estrella). These lessons, if applied along with the new DOE NPH Standards (1020-92 Series), can help ensure that DOE facilities will meet the intent of the seismic requirements in the new DOE NPH Order 5480.28.

  7. Sumatran megathrust earthquakes: from science to saving lives.

    PubMed

    Sieh, Kerry

    2006-08-15

Most of the loss of life, property and well-being stemming from the great Sumatran earthquake and tsunami of 2004 could have been avoided, and losses from similar future events can be largely prevented. However, achieving this goal requires forging a chain linking basic science-the study of why, when and where these events occur-to people's everyday lives. The intermediate links in this chain are emergency response preparedness, warning capability, education and infrastructural changes. In this article, I first describe our research on the Sumatran subduction zone, which has allowed us to understand the basis of the earthquake cycle on the Sumatran megathrust and to reconstruct the sequence of great earthquakes that have occurred there in historic and prehistoric times. On the basis of our findings, one or two more great earthquakes and tsunamis, nearly as devastating as the 2004 event, are to be expected within the next few decades in a region of coastal Sumatra south of the zone affected in 2004. I go on to argue that preventing future tragedies does not necessarily involve hugely expensive or high-tech solutions such as the construction of coastal defences or sensor-based tsunami warning systems. More valuable and practical steps include extending the scientific research, educating the at-risk populations as to what to do in the event of a long-lasting earthquake (i.e. one that might be followed by a tsunami), taking simple measures to strengthen buildings against shaking, providing adequate escape routes and helping the residents of the vulnerable low-lying coastal strips to relocate their homes and businesses to land that is higher or farther from the coast. Such steps could save hundreds of thousands of lives in the coastal cities and offshore islands of western Sumatra, and have general applicability to strategies for helping the developing nations to deal with natural hazards. PMID:16844643

  8. Seismogeodesy and Rapid Earthquake and Tsunami Source Assessment

    NASA Astrophysics Data System (ADS)

    Melgar Moctezuma, Diego

This dissertation presents an optimal combination algorithm for strong motion seismograms and regional high-rate GPS recordings. This seismogeodetic solution produces estimates of ground motion that recover the whole seismic spectrum, from the permanent deformation to the Nyquist frequency of the accelerometer. The algorithm is demonstrated and evaluated through outdoor shake table tests and recordings of large earthquakes, notably the 2010 Mw 7.2 El Mayor-Cucapah and 2011 Mw 9.0 Tohoku-oki events. This dissertation also shows that strong motion velocity and displacement data obtained from the seismogeodetic solution can be instrumental in quickly determining basic parameters of the earthquake source. We show how GPS and seismogeodetic data can produce rapid estimates of centroid moment tensors, static slip inversions, and, most importantly, kinematic slip inversions. Throughout the dissertation special emphasis is placed on how to compute these source models with minimal interaction from a network operator. Finally we show that the incorporation of offshore data such as ocean-bottom pressure and RTK-GPS buoys can better constrain the shallow slip of large subduction events. We demonstrate through numerical simulations of tsunami propagation that the earthquake sources derived from the seismogeodetic and ocean-based sensors are detailed enough to provide a timely and accurate assessment of expected tsunami intensity immediately following a large earthquake.
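The seismogeodetic combination described above is commonly formulated as a Kalman filter in which the accelerometer drives the state prediction and the lower-rate GPS displacement supplies the measurement update. The following is a minimal 1-D sketch of that idea, not the dissertation's actual implementation; the noise parameters `q` and `r`, the synthetic motion, the sensor bias, and the sampling rates are all illustrative assumptions.

```python
import numpy as np

def seismogeodetic_kf(acc, gps, dt, gps_every, q=1e-3, r=1e-4):
    """Minimal 1-D Kalman combination: high-rate acceleration drives the
    prediction; lower-rate GPS displacement corrects drift. State = [disp, vel]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt ** 2, dt])
    H = np.array([1.0, 0.0])
    Q = q * np.array([[dt ** 3 / 3, dt ** 2 / 2], [dt ** 2 / 2, dt]])
    x, P = np.zeros(2), np.eye(2)
    out = np.empty(len(acc))
    for k, a in enumerate(acc):
        x = F @ x + B * a                      # predict with accelerometer input
        P = F @ P @ F.T + Q
        if k % gps_every == 0:                 # a GPS epoch is available
            gain = P @ H / (H @ P @ H + r)
            x = x + gain * (gps[k // gps_every] - H @ x)
            P = P - np.outer(gain, H @ P)
        out[k] = x[0]
    return out

# Synthetic example: 1 m/s^2 push for 1 s then rest, accelerometer with 0.05 m/s^2 bias
dt, n = 0.01, 300
t = np.arange(n) * dt
acc_true = np.where(t < 1.0, 1.0, 0.0)
disp_true = np.where(t < 1.0, 0.5 * t ** 2, 0.5 + (t - 1.0))
disp = seismogeodetic_kf(acc_true + 0.05, disp_true[::10], dt, gps_every=10)
```

Between GPS epochs the filter dead-reckons on the biased accelerometer; each GPS update removes the accumulated drift, which is how the combined series keeps both the high-frequency detail and the permanent offset.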

  9. PRELIMINARY SELECTION OF MGR DESIGN BASIS EVENTS

    SciTech Connect

    J.A. Kappes

    1999-09-16

The purpose of this analysis is to identify the preliminary design basis events (DBEs) for consideration in the design of the Monitored Geologic Repository (MGR). For external events and natural phenomena (e.g., earthquake), the objective is to identify those initiating events that the MGR will be designed to withstand. Design criteria will ensure that radiological release scenarios resulting from these initiating events are beyond design basis (i.e., have a scenario frequency less than once per million years). For internal events (i.e., human-induced and random equipment failures), the objective is to identify credible event sequences that result in bounding radiological releases. These sequences will be used to establish design basis criteria for MGR structures, systems, and components (SSCs) in order to prevent or mitigate radiological releases. The safety strategy presented in this analysis for preventing or mitigating DBEs is based on the preclosure safety strategy outlined in ''Strategy to Mitigate Preclosure Offsite Exposure'' (CRWMS M&O 1998f). DBE analysis is necessary to provide feedback and requirements to the design process, and also to demonstrate compliance with proposed 10 CFR 63 (Dyer 1999b) requirements. DBE analysis is also required to identify and classify the SSCs that are important to safety (ITS).

  10. The Kangding earthquake swarm of November, 2014

    NASA Astrophysics Data System (ADS)

    Yang, Wen; Cheng, Jia; Liu, Jie; Zhang, Xuemei

    2015-06-01

An earthquake swarm with two major events of MS 6.3 and MS 5.8 occurred on the Xianshuihe fault in November, 2014. The two major earthquakes are both strike-slip events with aftershock zones trending NW. We have analyzed the characteristics of this earthquake sequence. The b value and the h value show significant variations in different periods before and after the MS 5.8 earthquake. Based on data from historical earthquakes, we also illustrate the moderate-to-strong seismic activity on the Xianshuihe fault. The Kangding swarm suggests that the Xianshuihe fault may be in the late stage of its seismically active period, and the occurrence of the Kangding earthquake may represent an adjustment among the strong earthquakes on the fault. The Coulomb failure stress changes caused by the historical earthquakes are also given in this article. The results indicate that the swarm was encouraged by the historical earthquakes since 1893, especially by the MS 7.5 Kangding earthquake in 1955. The Coulomb failure stress changes also show that the subsequent MS 5.8 earthquake was triggered by the MS 6.3 earthquake.
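The b value analyzed above comes from the Gutenberg-Richter relation log10 N(>=M) = a - bM. A minimal sketch of the standard Aki (1965) maximum-likelihood estimator follows; the synthetic catalog and the completeness magnitude mc = 2.0 are illustrative assumptions, not this paper's data.

```python
import numpy as np

def b_value_mle(mags, mc):
    """Aki (1965) maximum-likelihood b-value for magnitudes at or above
    the completeness magnitude mc (continuous, unbinned magnitudes)."""
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - mc)

# Synthetic Gutenberg-Richter catalog with true b = 1.0: magnitudes above mc
# are exponentially distributed with scale log10(e)/b.
rng = np.random.default_rng(0)
mags = 2.0 + rng.exponential(scale=np.log10(np.e), size=50000)
b = b_value_mle(mags, mc=2.0)
```

For binned catalogs the Utsu correction (subtracting half the bin width from mc) is usually applied; it is omitted here for brevity.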

  11. Biological Indicators in Studies of Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Sidorin, A. Ya.; Deshcherevskii, A. V.

    2012-04-01

Time series of variations in the electric activity (EA) of four specimens of the weakly electric fish Gnathonemus leopoldianus, and of the moving activity (MA) of two catfishes Hoplosternum thoracatum and two groups of Columbian cockroaches Blaberus craniifer, were analyzed. The observations were carried out in the Garm region of Tajikistan within the framework of experiments aimed at searching for earthquake precursors. An automatic recording system continuously recorded EA and MA over a period of several years, and hourly mean EA and MA values were processed. Approximately 100 different parameters were calculated on the basis of the six initial EA and MA time series, characterizing different variations in the EA and MA structure: amplitude of the signal and fluctuations of activity, parameters of diurnal rhythms, correlated changes in the activity of the various biological indicators, and others. A detailed analysis of the statistical structure of the total array of parametric time series obtained in the experiment showed that the behavior of all the animals has strong temporal variability: all calculated parameters are unstable and subject to frequent changes. A comparison of these data with seismicity allows us to draw the following conclusions: (1) The structure of variations in the studied parameters is represented by flicker noise, or an even more complex process with permanent changes in its characteristics; significant statistics are required to prove a cause-and-effect relationship between specific features of such time series and seismicity. (2) The statistics of rearrangements in the EA and MA series structure show an increase in their frequency in the last hours to few days before an earthquake when the hypocentral distance is comparable to the source size. Sufficiently dramatic anomalies in the behavior of catfishes and cockroaches (changes in the amplitude of activity variation, distortions of diurnal rhythms, increase in the

  12. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning and imagery differencing are important methods for augmenting seismic sensors. During response to recent earthquakes (the 1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor-Cucapah, 2012 Brawley Swarm, and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data: seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into earthquake source properties, which in turn helps in understanding the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site and operated for many hours, and then the data were retrieved, processed, and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations, and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  13. Was the 2015 Hindu-Kush intermediate-depth earthquake a repeat of the previous M~7 earthquakes ?

    NASA Astrophysics Data System (ADS)

    Harada, Tomoya; Satake, Kenji; Ishibashi, Katsuhiko

    2016-04-01

On Oct. 26, 2015, an Mw 7.5 earthquake occurred at intermediate depth (230 km) beneath the Hindu-Kush. This event took place in the source region of six previous M~7 earthquakes that recurred about every nine years: 1956 (mb 6.5), 1965 (mb 7.5), 1974 (mb 7.1), 1983 (Mw 7.4), 1993 (Mw 7.0), and 2002 (Mw 7.3). On the basis of these past events, Harada and Ishibashi (2012, EGU) proposed that the next event might be imminent in this region; however, the recurrence interval between the 2002 and 2015 events is longer than those between earlier events. In this study, in order to examine whether the 2015 earthquake re-ruptured the source region of the repeating M~7 earthquakes, we performed the same analysis as Harada and Ishibashi (2012) for the previous M~7 intermediate-depth earthquakes; namely, simultaneous relocation of the 1956 main shock and the earthquakes from 1964 to 2015, and mechanism determination / slip distribution estimation of the six events by tele-seismic body-wave analysis. As a result, the 2015 main shock is located close to the 1956, 1965, 1974, and 1983 main shocks and to the 1993 foreshock (Mw 6.3), which occurred about 30 minutes before the 1993 main shock. The 2015 mechanism solution is very similar to those of the former six events (ESE-WNW striking, southward-dipping high-angle reverse faulting with down-dip tension). However, the 2015 slip is distributed in the area left unruptured by the five earthquakes from 1965 to 2002. The 1965, 1974, 1983, and 1993 events ruptured the same region repeatedly, and the main slips of the 1993, 2002, and 2015 events do not overlap each other; this was confirmed by re-analysis of the waveforms recorded at the same stations. As for the 1965, 1974, and 1983 earthquakes, the overlap of the slip distributions may be an artifact of the low quality of the waveform data. From the slip distributions, the M~7 earthquakes, at least the 1993, 2002, and 2015 events, may not be considered characteristic earthquakes. However, it is notable that main

  14. The Los Alamos Seismic Network (LASN): Improved Network Instrumentation, Local Earthquake Catalog Updates, and Peculiar Types of Data

    NASA Astrophysics Data System (ADS)

    Roberts, P. M.; Ten Cate, J. A.; House, L. S.; Greene, M. K.; Morton, E.; Kelley, R. E.

    2013-12-01

The Los Alamos Seismic Network (LASN) has operated for 41 years and provided the data to locate more than 2,500 earthquakes in north-central New Mexico. The network was installed for seismic verification research, as well as to monitor and locate earthquakes near Los Alamos National Laboratory (LANL). LASN stations are the only monitoring stations in New Mexico north of Albuquerque. The original network once included 22 stations in northern New Mexico. With limited funding in the early 1980's, the network was downsized to 7 stations within an area of about 15 km (N-S) by 15 km (E-W), centered on Los Alamos. Over the last four years, eight additional stations have been installed, which have considerably expanded the spatial coverage of the network. Currently, 7 stations have broadband, three-component seismometers with digital telemetry, and the remaining 8 have traditional 1 Hz short-period seismometers with either analog telemetry or on-site digital recording. A vertical array of accelerometers was also installed in a wellbore on LANL property; this borehole array has 3-component digital strong-motion sensors. Recently we began upgrading the local strong-motion accelerometer (SMA) network as well, with the addition of high-resolution digitizers and high-sensitivity force-balance accelerometers (FBAs). We will present an updated description of the current LASN station, instrumentation, and telemetry configurations, as well as the data acquisition and event-detection software structure used to record events in Earthworm. Although more than 2,000 earthquakes were detected and located in north-central New Mexico during the first 11 years of LASN's operation (1973 to 1984), currently only 1-2 earthquakes per month are detected and located within about 150 km of Los Alamos. Over 850 of these nearby earthquakes have been located from 1973 to the present. We recently updated the LASN earthquake catalog for north-central New Mexico through 2012 and most of 2013. Locations

  15. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.

  16. An evaluation of coordination relationships during earthquake emergency rescue using entropy theory.

    PubMed

    Rong, Huang; Xuedong, Liang; Guizhi, Zeng; Yulin, Ye; Da, Wang

    2015-05-01

Emergency rescue after an earthquake is complex work that requires the participation of relief and social organizations. Studying earthquake emergency coordination efficiency can not only help rescue organizations define their own rescue missions, but also strengthen inter-organizational communication and collaboration, improve the efficiency of emergency rescue, and reduce losses. In this paper, collaborative entropy is introduced to study earthquake emergency rescue operations. To study the emergency rescue coordination relationship, collaborative matrices and collaborative entropy functions are established between emergency relief work and relief organizations, and the collaborative efficiency of the emergency rescue elements is determined based on this entropy function. Finally, the Lushan earthquake is used as an example to evaluate earthquake emergency rescue coordination efficiency. PMID:26083170

  17. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

To counter future natural-disaster threats, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned; in this way, the attitude of society toward natural disaster can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real time earthquake games competition into the traditional curricula in schools. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans built around the practical operation of seismic monitoring at home or school. We will introduce how 9-year-olds pick P- and S-waves and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., the near-real time earthquake games competition), to make earthquake science fun.

  18. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1976-01-01

Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate, and the blocks slide on a friction surface. The first model employed elastic forces and time-independent friction to simulate main shock events. The size, length, and the time and place of event occurrence were strongly influenced by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically recurring similar events were frequently observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time, and the aftershock region was confined to that which moved in the main event.
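The block-and-driving-plate construction described above can be sketched as a quasi-static spring-block (Burridge-Knopoff-style) simulation. This is not the paper's code; the stiffnesses, friction levels, and heterogeneity below are assumed values chosen only so that events occur and occasionally cascade.

```python
import numpy as np

def forces(x, plate, k_p, k_c):
    """Net force on each block: driver-plate spring plus neighbour springs (free ends)."""
    left = np.concatenate(([x[0]], x[:-1]))
    right = np.concatenate((x[1:], [x[-1]]))
    return k_p * (plate - x) + k_c * (left - 2.0 * x + right)

def simulate(n=64, k_p=0.2, k_c=1.0, f_s=1.0, f_d=0.6, n_events=200, seed=1):
    rng = np.random.default_rng(seed)
    x = np.zeros(n)                              # block positions
    thresh = f_s * (1.0 + 0.2 * rng.random(n))   # heterogeneous static friction
    plate, sizes = 0.0, []
    while len(sizes) < n_events:
        f = forces(x, plate, k_p, k_c)
        plate += (thresh - f).min() / k_p + 1e-12   # load weakest block to failure
        size = 0
        while True:
            f = forces(x, plate, k_p, k_c)
            over = f >= thresh
            if not over.any():
                break
            # slipping blocks relax until their force drops to the dynamic level;
            # the released force loads the neighbours and may cascade
            x[over] += (f[over] - f_d) / (k_p + 2.0 * k_c)
            size += int(over.sum())
        sizes.append(size)
    return sizes

sizes = simulate()
```

With near-homogeneous thresholds the model tends toward repeating system-wide events; large heterogeneity produces a broad distribution of event sizes, mirroring the behavior reported in the abstract.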

  19. Earthquakes with non--double-couple mechanisms.

    PubMed

    Frohlich, C

    1994-05-01

    Seismological observations confirm that the pattern of seismic waves from some earthquakes cannot be produced by slip along a planar fault surface. More than one physical mechanism is required to explain the observed varieties of these non-double-couple earthquakes. The simplest explanation is that some earthquakes are complex, with stress released on two or more suitably oriented, nonparallel fault surfaces. However, some shallow earthquakes in volcanic and geothermal areas require other explanations. Current research focuses on whether fault complexity explains most observed non-double-couple earthquakes and to what extent ordinary earthquakes have non-double-couple components. PMID:17794721
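A standard way to quantify the non-double-couple component discussed above is to decompose the moment tensor into isotropic, double-couple (DC), and CLVD parts. The sketch below uses one common convention (epsilon = -lambda2 / max|lambda|, with lambda the deviatoric eigenvalues; other sign conventions exist in the literature).

```python
import numpy as np

def decompose(mt):
    """Split a symmetric moment tensor into an isotropic part and a
    deviatoric DC/CLVD mix. epsilon = 0 for pure DC, +/-0.5 for pure CLVD."""
    iso = np.trace(mt) / 3.0
    dev = mt - iso * np.eye(3)
    lam = np.sort(np.linalg.eigvalsh(dev))       # ascending deviatoric eigenvalues
    lmax = max(abs(lam[0]), abs(lam[2]))
    eps = -lam[1] / lmax if lmax > 0 else 0.0
    pct_dc = 100.0 * (1.0 - 2.0 * abs(eps))      # percent double-couple
    return iso, eps, pct_dc

# Pure double-couple (vertical strike-slip) example
mt = np.array([[0.0, 1.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 0.0, 0.0]])
iso, eps, pct_dc = decompose(mt)
```

For the pure strike-slip tensor above, the isotropic part and epsilon are both zero and the DC percentage is 100; earthquakes of the kind Frohlich describes would show a nonzero epsilon or isotropic term.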

  20. Seismic gaps and source zones of recent large earthquakes in coastal Peru

    USGS Publications Warehouse

    Dewey, J.W.; Spence, W.

    1979-01-01

    The earthquakes of central coastal Peru occur principally in two distinct zones of shallow earthquake activity that are inland of and parallel to the axis of the Peru Trench. The interface-thrust (IT) zone includes the great thrust-fault earthquakes of 17 October 1966 and 3 October 1974. The coastal-plate interior (CPI) zone includes the great earthquake of 31 May 1970, and is located about 50 km inland of and 30 km deeper than the interface thrust zone. The occurrence of a large earthquake in one zone may not relieve elastic strain in the adjoining zone, thus complicating the application of the seismic gap concept to central coastal Peru. However, recognition of two seismic zones may facilitate detection of seismicity precursory to a large earthquake in a given zone; removal of probable CPI-zone earthquakes from plots of seismicity prior to the 1974 main shock dramatically emphasizes the high seismic activity near the rupture zone of that earthquake in the five years preceding the main shock. Other conclusions on the seismicity of coastal Peru that affect the application of the seismic gap concept to this region are: (1) Aftershocks of the great earthquakes of 1966, 1970, and 1974 occurred in spatially separated clusters. Some clusters may represent distinct small source regions triggered by the main shock rather than delimiting the total extent of main-shock rupture. The uncertainty in the interpretation of aftershock clusters results in corresponding uncertainties in estimates of stress drop and estimates of the dimensions of the seismic gap that has been filled by a major earthquake. (2) Aftershocks of the great thrust-fault earthquakes of 1966 and 1974 generally did not extend seaward as far as the Peru Trench. (3) None of the three great earthquakes produced significant teleseismic activity in the following month in the source regions of the other two earthquakes. The earthquake hypocenters that form the basis of this study were relocated using station

  1. Authorization basis for the 209-E Building

    SciTech Connect

    TIFFANY, M.S.

    1999-02-23

    This Authorization Basis document is one of three documents that constitute the Authorization Basis for the 209-E Building. Per the U.S. Department of Energy, Richland Operations Office (RL) letter 98-WSD-074, this document, the 209-E Building Preliminary Hazards Analysis (WHC-SD-WM-TI-789), and the 209-E Building Safety Evaluation Report (97-WSD-074) constitute the Authorization Basis for the 209-E Building. This Authorization Basis and the associated controls and safety programs will remain in place until safety documentation addressing deactivation of the 209-E Building is developed by the contractor and approved by RL.

  2. Laboratory Generated M -6 Earthquakes

    NASA Astrophysics Data System (ADS)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-10-01

We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick-slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick-slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than by characteristics of the experimental apparatus. The large size of the experimental apparatus, high-fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separate this study from traditional acoustic emission analyses and allow these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3-6 μs) rise times and are well modeled by simple double-couple focal mechanisms consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics, such as stress drop (1-10 MPa), appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
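The quoted stress drops (1-10 MPa) are consistent with mm-scale sources under the standard Eshelby circular-crack relation, stress drop = (7/16) M0 / r^3, together with the Hanks-Kanamori moment-magnitude definition. A quick check; the 3 MPa figure is an assumed mid-range value, not one reported by the authors.

```python
def moment_from_mw(mw):
    """Seismic moment (N*m) from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10 ** (1.5 * mw + 9.1)

def source_radius(m0, stress_drop):
    """Circular-crack radius (m) from Eshelby: dsigma = (7/16) * M0 / r^3."""
    return (7.0 * m0 / (16.0 * stress_drop)) ** (1.0 / 3.0)

m0 = moment_from_mw(-6.0)        # an M -6 laboratory event
r = source_radius(m0, 3e6)       # assumed 3 MPa stress drop -> radius in metres
```

An M -6 event with a 3 MPa stress drop implies a source radius of roughly 6 mm, matching the mm-scale slip patches described in the abstract.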

  3. Earthquake clusters in Corinth Rift

    NASA Astrophysics Data System (ADS)

    Mesimeri, Maria; Papadimitriou, Eleftheria; Karakostas, Vasilios; Tsaklidis, George

    2013-04-01

Clusters commonly occur as main shock-aftershock (MS-AS) sequences but also as earthquake swarms, which are empirically defined as an increase in seismicity rate above the background rate without a clear triggering main shock. Earthquake swarms occur in a variety of different environments, may have a diversity of origins, and are characterized by a high b-value in their magnitude distribution. The Corinth Rift, our target area, appears to be the most recent extensional structure, with a likely fault slip rate of about 1 cm/yr and opening of 7 mm/yr. High seismic activity accommodates the active deformation, with frequent strong (M≥6.0) events and several seismic excitations without a main shock of clearly distinguishable magnitude. We attempt to identify the earthquake clusters that occurred in this area in recent years and to investigate their spatio-temporal distribution, applying known declustering algorithms with the aim of associating cluster occurrence with certain patterns in seismicity behavior. The earthquake catalog of the National Hellenic Seismological Network is used; a certain number of clusters were extracted from the dataset, with the MS-AS sequences distinguished from earthquake swarms. The spatio-temporal properties of each subset were analyzed in detail, after determining the respective completeness magnitude. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non-extensive statistical physics - Application to the geodynamic system of the Hellenic Arc, SEISMO FEAR HELLARC".
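The abstract does not name the declustering algorithms applied; one widely used choice is Gardner-Knopoff windowing, sketched below with the published window formulas. The tiny synthetic catalog is purely illustrative.

```python
import numpy as np

def gk_windows(m):
    """Gardner & Knopoff (1974) distance (km) and time (days) windows."""
    d = 10 ** (0.1238 * m + 0.983)
    t = np.where(m >= 6.5, 10 ** (0.032 * m + 2.7389), 10 ** (0.5409 * m - 0.547))
    return d, t

def decluster(t_days, x_km, y_km, mags):
    """Keep events that do not fall inside the space-time window of a larger
    shock; returns a boolean mask of mainshocks (True = kept)."""
    t_days, x_km, y_km, mags = map(np.asarray, (t_days, x_km, y_km, mags))
    keep = np.ones(mags.size, dtype=bool)
    d_win, t_win = gk_windows(mags)
    for i in np.argsort(mags)[::-1]:            # process largest shocks first
        if not keep[i]:
            continue
        near = np.hypot(x_km - x_km[i], y_km - y_km[i]) <= d_win[i]
        close = np.abs(t_days - t_days[i]) <= t_win[i]
        keep &= ~(near & close & (mags < mags[i]))
    return keep

# M6 mainshock at the origin, two nearby aftershocks, one distant independent M5
keep = decluster([0, 2, 5, 10], [0, 10, 20, 500], [0, 0, 0, 0],
                 [6.0, 3.0, 3.5, 5.0])
```

For swarm detection (the other cluster type discussed above) window methods are known to perform poorly, which is one reason several algorithms are usually compared.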

  4. 47 CFR 13.1 - Basis and purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

47 CFR § 13.1 (2010), Telecommunication, FEDERAL COMMUNICATIONS COMMISSION, COMMERCIAL RADIO OPERATORS, General: Basis and purpose. (a) Basis. The basis for the rules contained in this part is the Communications Act of 1934,...

  5. Catalog of earthquakes along the San Andreas fault system in Central California, July-September 1972

    USGS Publications Warehouse

    Wesson, R.L.; Meagher, K.L.; Lester, F.W.

    1973-01-01

Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period July-September 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970, and 1971 have been prepared by Lee and others (1972b, c, d); catalogs for the first and second quarters of 1972 have been prepared by Wesson and others (1972a, b). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1254 earthquakes in Central California. Arrival times at 129 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 104 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it by describing the seismicity of a portion of central California in much greater detail.

  6. Earthquake induced Landslides in the Sikkim Himalaya - A Consequences of the 18th September 2011 Earthquake

    NASA Astrophysics Data System (ADS)

    Sharma, Ashok Kumar

    2015-04-01

On September 18, 2011 an earthquake of magnitude 6.8 on the Richter scale struck Sikkim at 18:11 hours IST. The epicenter of the quake was at latitude 27.7° North and longitude 88.2° East, about 64 km north-west of Gangtok, at the junction of the Teesta lineament and the Kanchenjunga fault in the North District of Sikkim. The high-intensity tremor triggered various natural calamities in the form of landslides, road blocks, falling boulders, lake bursts, flash floods, falling trees, etc., and caused severe damage to the life and property of the people of Sikkim. As the earthquake occurred during the monsoon season, heavy rain and landslides rendered rescue operations extremely difficult, and almost all road connectivity and communication networks were disrupted. Sikkim experiences landslides year after year, especially during the monsoons and periods of intense rain, and this hazard affects the economy of the State very badly. Due to the earthquake, many new and a few reactivated landslides have occurred in the Sikkim Himalaya.

  7. Earthquake early warning for Romania - most recent improvements

    NASA Astrophysics Data System (ADS)

    Marmureanu, Alexandru; Elia, Luca; Martino, Claudio; Colombelli, Simona; Zollo, Aldo; Cioflan, Carmen; Toader, Victorin; Marmureanu, Gheorghe; Marius Craiu, George; Ionescu, Constantin

    2014-05-01

    The EWS for Vrancea earthquakes uses the time interval (28-32 sec.) between the moment an earthquake is detected by the local seismic network installed in the epicentral area (Vrancea) and the arrival of the seismic waves in the protected area (Bucharest) to send earthquake warnings to users. In recent years, the National Institute for Earth Physics (NIEP) upgraded its seismic network to better cover the seismic zones of Romania. NIEP currently operates a real-time seismic network designed to monitor the seismic activity on Romanian territory, dominated by the Vrancea intermediate-depth (60-200 km) earthquakes. The NIEP real-time network consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Ranger, GS21, Mark L22) and acceleration sensors (Episensor). Recent improvements of the seismic network and real-time communication technologies allow implementation of a nation-wide EEWS for Vrancea and other seismic sources in Romania. We present a regional approach to earthquake early warning for Romanian earthquakes, based on the PRESTo (Probabilistic and Evolutionary early warning SysTem) software platform. PRESTo processes three-channel acceleration data streams in real time: once the P-wave arrivals have been detected, it provides earthquake location and magnitude estimates, and peak ground motion predictions at target sites. PRESTo has been running in real time at the National Institute for Earth Physics, Bucharest, for several months in parallel with a secondary EEWS; the alert notification is issued only when both systems validate each other. Here we present the results obtained using offline earthquakes originating from the Vrancea area together with several real

  8. [Deep vein thrombosis in Noto Peninsula earthquake victims].

    PubMed

    Terakami, Takako; Ohba, Noriko; Morishita, Eriko; Yoshida, Tomotaka; Asakura, Hidesaku; Kimura, Keiichi; Ohtake, Hiroshi; Watanabe, Gou; Fujita, Shinichi; Wada, Takashi

    2009-05-01

    The earthquake occurred in the Noto Peninsula in the northern part of Ishikawa prefecture, Japan, at 9:25 a.m. on March 25, 2007. Medical activities for the prevention, early detection, and early treatment of deep vein thrombosis (DVT) were performed immediately after the earthquake on the basis of a previous report regarding earthquake disasters. This report describes the conditions involved in the development of DVT. General inhabitants in shelters were examined by questionnaires, venous ultrasonography of the lower limbs, and blood tests. The DVT-positive rate was 10.6% (21 cases/198 cases), and the soleal vein was the most common location of DVT, accounting for 71.4% of cases (20 lower limbs/28 lower limbs). Plasma levels of fibrin/fibrinogen degradation products and D-dimer in the DVT-positive group (20 cases) were significantly higher than those in the DVT-negative group (162 cases) (P<0.03). No deaths or cases of serious illness caused by DVT were reported in the earthquake. The medical activities described here were effective due to past experience and the cooperation of many people. PMID:19522245

  9. Ten Years of Real-Time Earthquake Loss Alerts

    NASA Astrophysics Data System (ADS)

    Wyss, M.

    2013-12-01

    The most important parameters of an earthquake disaster, in order of priority, are: number of fatalities, number of injured, mean damage as a function of settlement, and expected intensity of shaking at critical facilities. The requirements to calculate these parameters in real time are: 1) availability of reliable earthquake source parameters within minutes; 2) capability of calculating expected intensities of strong ground shaking; 3) data sets on population distribution and conditions of the building stock as a function of settlement; 4) data on locations of critical facilities; 5) verified methods of calculating damage and losses; and 6) personnel available on a 24/7 basis to perform and review these calculations. There are three services available that distribute information about the likely consequences of earthquakes within about half an hour of the event. Two of these calculate losses; one gives only general information. Although much progress has been made during the last ten years in improving the data sets and the calculation methods, much remains to be done. The data sets are only first-order approximations and the methods bear refinement. Nevertheless, the quantitative loss estimates after damaging earthquakes in real time are generally correct in the sense that they allow distinguishing disastrous from inconsequential events.

  10. Testing Damage Scenarios. From Historical Earthquakes To Silent Active Faults

    NASA Astrophysics Data System (ADS)

    Galli, P.; Orsini, G.; Bosi, V.; di Pasquale, G.; Galadini, F.

    Italy is rich with historical scenarios of destruction and death that have come down to us through the insightful descriptions of hundreds of manuscripts, reports, treatises, letters and epigraphs. All these historical data constitute today one of the most powerful databases of earthquake-induced effects. Moreover, it is now possible to relate many of these earthquakes to geological structures, the seismogenic behavior of which has been investigated by means of paleoseismological studies. On the basis of this information and of data gathered through the national census (performed on population and dwellings by ISTAT, the Italian Institute of Statistics, in 1991), we developed a methodology (FaCES, Fault-Controlled Earthquake Scenario) which reproduces the damage scenario caused by the rupture of a given fault, providing an estimate of the losses in terms of damage to buildings and consequences to the population. The reliability of the scenarios has been tested by comparing the historical damage distribution of an earthquake with that obtained by applying FaCES to the responsible fault. Finally, we hypothesize the scenarios related to three historically silent faults of the central Apennines (Mt. Vettore, Mt. Gorzano and Gran Sasso faults), the Holocene activity of which has recently been ascertained through paleoseismological analyses.

  11. Physical model for earthquakes, 2. Application to southern California

    SciTech Connect

    Rundle, J.B.

    1988-06-10

    The purpose of this paper is to apply ideas developed in a previous paper to the construction of a detailed model for earthquake dynamics in southern California. The basis upon which the approach is formulated is that earthquakes are perturbations on, or more specifically fluctuations about, the long-term motions of the plates. This concept is made mathematically precise by means of a ''fluctuation hypothesis,'' which states that all physical quantities associated with earthquakes can be expressed as integral expansions in a fluctuating quantity called the ''offset phase.'' While in general, the frictional stick-slip properties of the complex, interacting faults should properly come out of the underlying physics, a simplification is made here, and a simple, spatially varying friction law is assumed. Together with the complex geometry of the major active faults, an assumed, spatially varying Earth rheology, the average rates of long-term offsets on all the major faults, and the friction coefficients, one can generate synthetic earthquake histories for comparison to the real data.

  12. Quantitative assessment of earthquake damages: approximate economic loss

    NASA Astrophysics Data System (ADS)

    Badal, J.; Vazquez-Prada, M.; Gonzalez, A.; Samardzhieva, E.

    2003-04-01

    Prognostic estimates of the approximate direct economic cost associated with earthquake damage are made following a suitable methodology of wide-ranging application. For an advance evaluation of the economic cost derived from the damage, we take into account the local social wealth as a function of the gross domestic product of the country. We use a GIS-based tool, taking advantage of the possibilities of such a system for the treatment of spatially distributed data. The work is performed on the basis of the relationship between macroseismic intensity and earthquake economic loss as a percentage of the wealth. We have implemented interactive software permitting the efficient on-screen display of the information and the rapid visual evaluation of the performance of our method. Such an approach to earthquake casualties and damage is carried out for sites near important urban concentrations located in a seismically active zone of Spain, thus contributing to easier decision-making in contemporary earthquake engineering, emergency preparedness planning and seismic risk prevention.
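
    The intensity-to-loss mapping this record describes can be sketched minimally as follows. The loss ratios, the wealth multiple of GDP, and the function names are illustrative assumptions, not the paper's calibration.

```python
# Illustrative loss ratios (fraction of local wealth) per macroseismic
# intensity; placeholder values, not the calibrated ones from the study.
LOSS_RATIO = {6: 0.005, 7: 0.02, 8: 0.08, 9: 0.25, 10: 0.60}

def direct_loss(population, gdp_per_capita, intensity, wealth_factor=3.0):
    """Approximate direct economic loss: local wealth is proxied as a
    multiple of annual GDP (wealth_factor is an assumption), scaled by
    an intensity-dependent damage ratio."""
    wealth = population * gdp_per_capita * wealth_factor
    return wealth * LOSS_RATIO[intensity]

# A town of 50,000 with GDP per capita 25,000, shaken at intensity VIII:
print(direct_loss(50_000, 25_000.0, 8))
```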

  13. Groundwater Ion Content Precursors of Strong Earthquakes in Kamchatka (Russia)

    NASA Astrophysics Data System (ADS)

    Biagi, P. F.; Ermini, A.; Kingsley, S. P.; Khatkevich, Y. M.; Gordeev, E. I.

    The Kamchatka peninsula, located in the far east of Russia, is a geologically active margin where the Pacific plate subducts beneath the North American and Eurasian plates. This area is characterised by frequent and strong seismic activity (magnitudes reaching 8.6), and epicentres are generally distributed offshore along the eastern coast of the peninsula. For many years, groundwater has been sampled at a mean interval of three days to measure the most common ions in five deep wells in the southern area of the Kamchatka peninsula. In the last decade, five earthquakes with M > 6.5 have occurred at distances less than 250 km from these wells. These earthquakes were powerful enough to be considered as potential originators of precursors. In order to reveal possible precursors of these earthquakes, we analysed the groundwater ion contents. The quasi-periodic annual variation was filtered out, together with other slow trends, and then we smoothed out the high-frequency fluctuations that arise from errors in a single measurement. When examining the data, we labelled each signal with an amplitude greater than three times the standard deviation as an irregularity, and we made a first attempt at defining an anomaly as an irregularity occurring simultaneously in more than one parameter at each well. In a second definition we used the existence of an irregularity occurring simultaneously in each ion at more than one well. Then, on the basis of past results worldwide and the time intervals between the earthquakes observed, we chose 158 days as the maximum temporal window between a possible anomaly and the subsequent earthquake. With the first anomaly definition we identified 6 anomalies with 4 possible successes and 2 failures. For the five earthquakes considered capable of producing precursors we obtained precursors in three cases. With the second anomaly definition we identified 10 anomalies with 7 possible successes and 3 failures and we
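
    The two-stage detection rule described in this record (flag residuals exceeding three standard deviations as irregularities, then require simultaneity across parameters) can be sketched on synthetic data. The moving-average detrending, the window length, and the data are assumptions for illustration; the study's actual filtering of the annual cycle is not detailed here.

```python
import numpy as np

def irregularities(series, k=3.0, win=31):
    """Flag samples whose residual, after removing a slow trend with a
    moving average (an assumed stand-in for the paper's filtering),
    exceeds k times the residual standard deviation."""
    trend = np.convolve(series, np.ones(win) / win, mode="same")
    resid = series - trend
    return np.abs(resid) > k * resid.std()

def anomalies(param_flags):
    """First definition in the record: an anomaly is an irregularity
    occurring simultaneously in more than one parameter at a well."""
    flags = np.vstack(param_flags)      # shape (n_params, n_samples)
    return flags.sum(axis=0) >= 2

rng = np.random.default_rng(0)
ions = [rng.normal(size=500) for _ in range(3)]   # three ion time series
for s in ions:
    s[250] += 10.0                      # synthetic simultaneous spike
mask = anomalies([irregularities(s) for s in ions])
print(mask[250])                        # the shared spike is flagged
```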

  14. Authorization basis requirements comparison report

    SciTech Connect

    Brantley, W.M.

    1997-08-18

    The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents, and recommends a disposition for each to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation.

  15. Update NEMC Database using Arcgis Software and Example of Simav-Kutahya earthquake sequences

    NASA Astrophysics Data System (ADS)

    Altuncu Poyraz, S.; Kalafat, D.; Kekovali, K.

    2011-12-01

    In this study, a total of 144,043 earthquakes with magnitudes 2.0 ≤ M ≤ 7.9 that occurred in Turkey in the interval 1900-2011, taken from the Kandilli Observatory and Earthquake Research Institute & National Earthquake Monitoring Center (KOERI-NEMC) seismic catalog, were used. The database includes not only the coordinates, date, magnitude and depth of these earthquakes but also location and installation information, field studies, geology and technical properties of 154 seismic stations. Additionally, 1063 historical earthquakes were included in the database. Source parameters of 738 earthquakes of M ≥ 4.0 that occurred between 1938 and 2008 were added to the database, and source parameters of a further 103 earthquakes (M ≥ 4.5) have been calculated since 2008. In order to test the querying, visualization and analysis of earthquake characteristics, the aftershock sequence of the 19 May 2011 Simav-Kutahya earthquake was selected and added to the database. The Simav earthquake (western Anatolia), with magnitude Ml = 5.9, occurred at 23:15 local time and is investigated in terms of accurate event locations and source properties of the largest events. The aftershock distribution of the Simav earthquake shows the activation of a 17-km-long zone, which extends in depth between 5 and 10 km. In order to contribute to a better understanding of the neotectonics of this region, we analysed the earthquakes using the KOERI (Kandilli Observatory and Earthquake Research Institute) seismic stations along with seismic stations operated by other institutions that successfully recorded the Simav seismic activity in 2011. Source mechanisms of 19 earthquakes with magnitudes 3.8 ≤ Ml < 6.0 were calculated by means of the Regional Moment Tensor Inversion (RMT) technique. The mechanism solutions show the presence of east-west-trending normal faults in the region; as a result, an extensional regime dominates the study area. The aim of this study is to store and compile earthquake

  16. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests.
We show how the Reliability and
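
    The counting idea behind such seismicity-based forecasts can be illustrated with a Weibull hazard in "natural time" (the count of small events since the last large one). The shape parameter and the expected inter-event count below are assumed values for illustration, not the published NTW calibration.

```python
import math

def ntw_probability(n_small, n_expected, beta=1.4):
    """Probability that the next large event has occurred by the time
    n_small small earthquakes have elapsed since the last large one,
    using a Weibull CDF in natural time (event count).
    n_expected: mean small-event count between large events (e.g. from
    Gutenberg-Richter scaling); beta: Weibull shape (assumed)."""
    return 1.0 - math.exp(-((n_small / n_expected) ** beta))

# With roughly one M>=6 event per ~1000 M>=3 events, the conditional
# probability grows as small events accumulate:
for n in (200, 1000, 3000):
    print(n, round(ntw_probability(n, 1000.0), 3))
```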

  17. The music of earthquakes and Earthquake Quartet #1

    USGS Publications Warehouse

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  18. Detection of change points in underlying earthquake rates, with application to global mega-earthquakes

    NASA Astrophysics Data System (ADS)

    Touati, Sarah; Naylor, Mark; Main, Ian

    2016-02-01

    The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that effectiveness varies widely among these tests. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, while the Kolmogorov-Smirnov and runs tests proved relatively ineffective in reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large
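
    A minimal version of the likelihood-based comparison the abstract finds effective can be sketched as follows: a homogeneous Poisson model is scored against two-rate models with a single change point using AIC. The synthetic rates, times, and function names are assumptions for illustration.

```python
import math
import numpy as np

def poisson_loglik(n_events, duration, rate):
    # Log-likelihood of n_events from a homogeneous Poisson process,
    # up to terms that cancel when comparing models on the same data.
    return n_events * math.log(rate) - rate * duration

def best_changepoint_aic(times, t_end):
    """Score a one-rate model against two-rate models with a change
    point at each candidate event time; return (best AIC, change time),
    with change time None if the one-rate model wins."""
    n = len(times)
    aic1 = 2 * 1 - 2 * poisson_loglik(n, t_end, n / t_end)
    best = (aic1, None)
    for i in range(1, n):
        tc = times[i]
        n1, n2 = i, n - i
        d1, d2 = tc, t_end - tc
        if d2 <= 0:
            continue
        ll2 = (poisson_loglik(n1, d1, n1 / d1)
               + poisson_loglik(n2, d2, n2 / d2))
        aic2 = 2 * 3 - 2 * ll2          # two rates plus one change time
        if aic2 < best[0]:
            best = (aic2, tc)
    return best

rng = np.random.default_rng(1)
# Background rate 1 event/unit until t=100, then 5/unit until t=150:
times = np.concatenate([np.sort(rng.uniform(0, 100, 100)),
                        np.sort(rng.uniform(100, 150, 250))])
aic, tc = best_changepoint_aic(times, 150.0)
print(tc)   # recovered change point, near t = 100
```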

  19. Record-Breaking Intervals: Detecting Trends in the Incidence of Self-Similar Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Yoder, Mark R.; Rundle, John B.

    2015-08-01

    We introduce a method of resolving temporal incidence trends in earthquake sequences. We have developed a catalog partitioning method based on canonical earthquake scaling relationships, and have further developed a metric based on record-breaking interval (RBI) statistics to resolve increasing and decreasing seismicity in time series of earthquakes. We calculated the RBI metric over fixed-length sequences of earthquake intervals and showed that the length of those sequences is related to the magnitude of the earthquake to which the method is sensitive—longer sequences resolve large earthquakes, shorter sequences resolve small-magnitude events. This sequence length effectively constitutes a local temporal catalog constraint, and we show that spatial constraints can be defined from rupture length scaling. We have applied the method to several high-profile earthquakes and have shown that it consistently resolves aftershock sequences after a period of accelerating seismicity before the targeted mainshock. The method also suggests a minimum detectable (forecastable) mainshock magnitude on the basis of the catalog's minimum completeness magnitude.
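
    The record-breaking interval counting at the heart of this metric can be sketched as follows. The ratio form of the trend statistic below is an illustrative simplification, not necessarily the exact statistic defined in the paper.

```python
import math

def rbi_counts(intervals):
    """Count record-breaking small and record-breaking large inter-event
    intervals: a record is any interval below all previous minima or
    above all previous maxima (the first interval counts as both)."""
    n_small = n_large = 1
    cur_min = cur_max = intervals[0]
    for dt in intervals[1:]:
        if dt < cur_min:
            n_small, cur_min = n_small + 1, dt
        if dt > cur_max:
            n_large, cur_max = n_large + 1, dt
    return n_small, n_large

def rbi_metric(intervals):
    """Positive when record-short intervals dominate (accelerating
    seismicity), negative when record-long intervals dominate."""
    n_small, n_large = rbi_counts(intervals)
    return math.log(n_small / n_large)

print(rbi_metric([10, 9, 7, 8, 5, 4, 3]))   # shrinking intervals: positive
print(rbi_metric([3, 4, 5, 7, 8, 9, 10]))   # growing intervals: negative
```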

  20. Is Your Class a Natural Disaster? It can be... The Real Time Earthquake Education (RTEE) System

    NASA Astrophysics Data System (ADS)

    Whitlock, J. S.; Furlong, K.

    2003-12-01

    In cooperation with the U.S. Geological Survey (USGS) and its National Earthquake Information Center (NEIC) in Golden, Colorado, we have implemented an autonomous version of the NEIC's real-time earthquake database management and earthquake alert system (Earthworm). This is the same system used professionally by the USGS in its earthquake response operations. Utilizing this system, Penn State University students participating in natural hazard classes receive real-time alerts of worldwide earthquake events on cell phones distributed to the class. The students are then responsible for reacting to actual earthquake events, in real time, with the same data (or lack thereof) as earthquake professionals. The project was first implemented in Spring 2002, and although it had an initial high intrigue and "coolness" factor, the interest of the students waned with time. Through student feedback, we observed that scientific data presented on its own, without an educational context, does not foster student learning. In order to maximize the impact of real-time data and the accompanying e-media, the students need to become personally involved. Therefore, in collaboration with the Incorporated Research Institutions for Seismology (IRIS), we have begun to develop an online infrastructure that will help teachers and faculty effectively use real-time earthquake information. The Real-Time Earthquake Education (RTEE) website promotes student learning by integrating inquiry-based education modules with real-time earthquake data. The first module guides the students through an exploration of real-time and historic earthquake datasets to model the most important criteria for determining the potential impact of an earthquake. Having provided the students with content knowledge in the first module, the second module presents a more authentic, open-ended educational experience by setting up an earthquake role-play situation. Through the Earthworm system, we have the ability to "set off

  1. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred in Indonesia during the last decade. These experiences offer important lessons for people worldwide who live in earthquake- and tsunami-prone countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and the characteristic physical behavior of tsunamis near the coast. We studied two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken during the 2004 Indian Ocean tsunami disaster, but only at a few restricted points, so tsunami behavior elsewhere remained unknown. In this study, we tried to collect extensive information about tsunami behavior not only at many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photographs. To collect detailed information about the evacuation process, we devised an interview method that involves making pictures of the tsunami experience from the scenes of the survivors' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena: because there had been no large tsunamigenic earthquakes in the Sumatra region for a hundred years, the public had no knowledge of tsunamis. The situation was much improved by the time of the 2010 Mentawai event: TV programs and NGO or governmental public education programs about tsunami evacuation are now widespread in Indonesia, and many people know the fundamentals of earthquake and tsunami disasters. We made a drill book based on the survivors' stories, with paintings of impressive scenes from the two events, and used it in a disaster education event for a school committee in West Java. About 80% of students and teachers evaluated the contents of the drill book as useful for correct understanding.

  2. Discriminating between explosions and earthquakes

    NASA Astrophysics Data System (ADS)

    Cho, Kwang-Hyun

    2014-12-01

    Earthquake, explosion, and nuclear test data are compared with forward modeling and band-pass-filtered surface wave amplitude data to explore methodologies for improving earthquake-explosion discrimination. The proposed discrimination method is based on the solutions of a double integral transformation in the wavenumber and frequency domains. Recorded explosion data from June 26, 2001 (39.212°N, 125.383°E) and October 30, 2001 (38.748°N, 125.267°E), a nuclear test on October 9, 2006 (41.275°N, 129.095°E), and two earthquakes on April 14, 2002 (39.207°N, 125.686°E) and June 7, 2002 (38.703°N, 125.638°E), all in North Korea, are used to discriminate between explosions and earthquakes by seismic wave analysis and numerical modeling. The explosion signal is characterized by first P waves with higher energy than that of the S waves. Rg waves are clearly dominant at 0.05-0.5 Hz in the explosion data but not in the earthquake data. This feature is attributed to the dominant P waves in the explosions and their coupling with the SH components.
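
    The P-versus-S energy contrast that underlies this discrimination can be illustrated with a toy peak-amplitude ratio on a synthetic record. The window length, the threshold near 1, and the waveform itself are assumptions for illustration, not the paper's processing.

```python
import numpy as np

def ps_ratio(trace, fs, tp, ts, win=2.0):
    """Peak-amplitude ratio between a P-wave window starting at tp and
    an S-wave window starting at ts (times in seconds, fs in Hz).
    Explosion-like records tend to give ratios above ~1 (P dominant);
    earthquake-like records fall below."""
    def peak(t0):
        i0 = int(t0 * fs)
        return np.abs(trace[i0:i0 + int(win * fs)]).max()
    return peak(tp) / peak(ts)

fs = 100.0
t = np.arange(0.0, 30.0, 1.0 / fs)
# Toy earthquake-like record: weak P burst near 5 s, stronger S near 12 s.
trace = (np.exp(-0.5 * ((t - 5.5) / 0.5) ** 2) * np.sin(2 * np.pi * 5 * t)
         + 3 * np.exp(-0.5 * ((t - 12.5) / 0.5) ** 2) * np.sin(2 * np.pi * 2 * t))
r = ps_ratio(trace, fs, tp=5.0, ts=12.0)
print(r < 1.0)   # S dominates, consistent with an earthquake
```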

  3. Tectonic earthquakes of anthropogenic origin

    NASA Astrophysics Data System (ADS)

    Adushkin, V. V.

    2016-03-01

    The enhancement of seismicity induced by industrial activity in Russia under present-day anthropization is noted. In particular, a growth in the intensity and number of strong tectonic earthquakes with magnitudes M ≥ 3 (seismic energy 10^9 J) due to human activity is revealed. These man-made tectonic earthquakes have started to occur in regions of the East European Platform that were previously aseismic. The development of such seismicity is noted in areas of intense long-term mineral extraction, due to increasing production depths and expanding mining operations. The mechanisms and generation conditions of man-made tectonic earthquakes in an anthropogenically disturbed medium with a changed geodynamic and fluid regime are discussed. The source zones of these shallow-focus tectonic earthquakes of anthropogenic origin form as the stress state is rearranged under anthropogenic loading, both near these zones and at a significant distance from them. This distance is determined by the tectonic structure of the rock mass and the character of its energy saturation, in particular by the level of the formation pressure or pore pressure. These earthquakes occur at any time of day, have a triggered character, and are frequently accompanied by catastrophic phenomena in underground mines and on the surface owing to the closeness of the source zones.

  4. Artificial Neural Networks for Earthquake Early-Warning

    NASA Astrophysics Data System (ADS)

    Boese, M.; Erdik, M.; Wenzel, F.

    2003-12-01

    The rapid urbanization and industrial development in areas of high seismic hazard increase the threat to human life and the vulnerability of industrial facilities to earthquakes. As earthquake prediction is elusive and, most likely, will not be achievable in the near future, early-warning systems play a key role in earthquake loss reduction. Seismic waves propagate with significantly lower velocity than information about these waves can be passed along to a vulnerable area or facility using modern telemetry systems. Within the shortest possible time, an earthquake early-warning system estimates the ground motion that will be caused by the oscillating seismic waves in the endangered area. Depending on the predicted damage, appropriate automatic actions for loss reduction (such as the stoppage of trains or the interruption of gas pipelines) are triggered and executed some seconds to minutes before the devastating waves actually arrive. The Turkish megacity Istanbul faces a seismic hazard of particular severity due to its proximity to the complex fault system in the Marmara region. The likelihood of a seismic event of moment magnitude above 7.2 occurring within the next 30 years is estimated to be 70%. The Istanbul Earthquake Rapid Response and Early-Warning System (IERREWS) is an important contribution to preparedness for future earthquakes in the region. The system is operated by the Kandilli Observatory and the Earthquake Research Institute of Bogazici University in cooperation with other agencies. The early-warning part of IERREWS consists of ten strong-motion stations with 24-bit resolution, communication links and processing facilities. The accelerometers are installed on the shoreline of the Marmara Sea and are operated in on-line mode for continuous and near-real-time transfer of data. Using the example of the IERREWS station configuration and the seismic background of the Marmara region, we present an approach that treats the problem of earthquake early-warning as a pattern

  5. Reevaluation of the macroseismic effects of the 23 January 1838 Vrancea earthquake

    NASA Astrophysics Data System (ADS)

    Rogozea, M.; Marmureanu, Gh.; Radulian, M.

    2012-04-01

    The aim of this paper is to analyze the great event that occurred on 23 January 1838 (magnitude 7.5 in the Romanian catalogue). Valuable information has been collected from original or compiled historical sources, such as chronicles and manuscripts from that time, and related books and reports. The historical data are critically analyzed and, on the basis of our investigation, we assess the significance of the earthquake parameters as derived from the distribution of effects. The pattern of the intensity data points, as reevaluated for this historical earthquake, is compared with that of the instrumentally recorded major earthquake of 4 March 1977, the two events being assumed similar in hypocenter location, source parameters and rupture propagation. We also make a comparative investigation of the attenuation relationship for Vrancea earthquakes using historical versus instrumental data. Implications for seismic hazard assessment are finally discussed.

  6. Reduction of earthquake risk in the united states: Bridging the gap between research and practice

    USGS Publications Warehouse

    Hays, W.W.

    1998-01-01

    Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho, the New Madrid and Wabash Valley seismic zones in the central United States, the southeastern and northeastern United States, Puerto Rico, Virgin Islands, Guam, and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. ?? 1998 IEEE.

  7. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; Seward, P.H.; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest-sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  8. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    NASA Astrophysics Data System (ADS)

    Ingebritsen, S. E.; Shelly, D. R.; Hsieh, P. A.; Clor, L. E.; Seward, P. H.; Evans, W. C.

    2015-11-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest-sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  9. Irregular recurrence of large earthquakes along the san andreas fault: evidence from trees.

    PubMed

    Jacoby, G C; Sheppard, P R; Sieh, K E

    1988-07-01

    Old trees growing along the San Andreas fault near Wrightwood, California, record in their annual ring-width patterns the effects of a major earthquake in the fall or winter of 1812 to 1813. Paleoseismic data and historical information indicate that this event was the "San Juan Capistrano" earthquake of 8 December 1812, with a magnitude of 7.5. The discovery that at least 12 kilometers of the Mojave segment of the San Andreas fault ruptured in 1812, only 44 years before the great January 1857 rupture, demonstrates that intervals between large earthquakes on this part of the fault are highly variable. This variability increases the uncertainty of forecasting destructive earthquakes on the basis of past behavior and accentuates the need for a more fundamental knowledge of San Andreas fault dynamics. PMID:17841050

  10. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench

    PubMed Central

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-01-01

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields to study this phenomenon, since various slow earthquakes and tsunamis have occurred; yet the fault structure and seismic activity there are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find LFEs occur at 15–18 km depths along the plate interface and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at all depths and lacks a typical locked zone. The plate interface is overlaid by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating fluids exist at various depths along the plate interface. PMID:27447546

  11. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench.

    PubMed

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-01-01

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields to study this phenomenon, since various slow earthquakes and tsunamis have occurred; yet the fault structure and seismic activity there are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find LFEs occur at 15-18 km depths along the plate interface and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at all depths and lacks a typical locked zone. The plate interface is overlaid by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating fluids exist at various depths along the plate interface. PMID:27447546

  12. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench

    NASA Astrophysics Data System (ADS)

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-07-01

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields to study this phenomenon, since various slow earthquakes and tsunamis have occurred; yet the fault structure and seismic activity there are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find LFEs occur at 15-18 km depths along the plate interface and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at all depths and lacks a typical locked zone. The plate interface is overlaid by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating fluids exist at various depths along the plate interface.

  13. Earthquakes and the urban environment. Volume I

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 1 contains chapters on earthquake parameters and hazards.

  14. Earthquakes and the urban environment. Volume II

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 2 contains chapters on earthquake prediction, control, building design and building response.

  15. The October 12, 1992, Dahshur, Egypt, Earthquake

    USGS Publications Warehouse

    Thenhaus, P.C.; Celebi, M.; Sharp, R.V.

    1993-01-01

    We were part of an international reconnaissance team that investigated the Dahshur earthquake. This article summarizes our findings and points out how even a relatively moderate-sized earthquake can cause widespread damage and a large number of casualties.

  16. Earthquakes & Volcanoes, Volume 23, Number 6, 1992

    USGS Publications Warehouse

    U.S. Geological Survey; Gordon, David W., (Edited By)

    1993-01-01

    Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers.

  17. The effects of the Yogyakarta earthquake at LUSI mud volcano, Indonesia

    NASA Astrophysics Data System (ADS)

    Lupi, M.; Saenger, E. H.; Fuchs, F.; Miller, S. A.

    2013-12-01

    The M6.3 Yogyakarta earthquake shook Central Java on May 27th, 2006. Forty-seven hours later, hot mud burst out at the surface near Sidoarjo, approximately 250 km from the earthquake epicentre. The mud eruption continued and gave rise to LUSI, the youngest mud volcanic system on Earth. Since the beginning of the eruption, approximately 30,000 people have lost their homes and 13 people have died due to the mud flooding. The causes that initiated the eruption are still debated and are based on different geological observations. The earthquake-triggering hypothesis is supported by the evidence that at the time of the earthquake ongoing drilling operations experienced a loss of drilling mud downhole. In addition, the eruption of the mud began only 47 hours after the Yogyakarta earthquake, and the mud reached the surface at different locations aligned along the Watukosek fault, a strike-slip fault upon which LUSI resides. Moreover, the Yogyakarta earthquake also affected the volcanic activity of Mt. Semeru, located as far from the epicentre of the earthquake as LUSI. However, the drilling-triggering hypothesis points out that the earthquake was too far from LUSI to induce significant stress changes at depth, and highlights that the upwelling fluids that reached the surface first emerged only 200 m from the drilling rig that was operating at the time. Hence, was LUSI triggered by the earthquake or by drilling operations? We conducted a seismic wave propagation study on a geological model based on vp, vs, and density values for the different lithologies and seismic profiles of the crust beneath LUSI. Our analysis shows compelling evidence for the effects produced by the passage of seismic waves through the geological formations and highlights the importance of the overall geological structure that focused and reflected incoming seismic energy.

  18. 340 waste handling facility interim safety basis

    SciTech Connect

    VAIL, T.S.

    1999-04-01

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people.

  19. 340 Waste handling facility interim safety basis

    SciTech Connect

    Stordeur, R.T.

    1996-10-04

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people.

  20. Characterisation of Liquefaction Effects for Beyond-Design Basis Safety Assessment of Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Bán, Zoltán; Győri, Erzsébet; Katona, Tamás János; Tóth, László

    2015-04-01

    Preparedness of nuclear power plants for beyond-design-basis external effects has become highly important since the Great Tohoku Earthquake of 11 March 2011. For some nuclear power plants constructed at soft soil sites, liquefaction should be considered as a beyond-design-basis hazard. The consequences of liquefaction have to be analysed with the aim of defining the post-event plant condition, identifying plant vulnerabilities, and planning the necessary measures for accident management. In this paper, the methodology of the analysis of liquefaction effects for nuclear power plants is outlined. The Nuclear Power Plant at Paks, Hungary is used as an example to demonstrate the practical importance of the presented results and considerations. In contrast to design, the conservatism of the methodology for evaluating beyond-design-basis liquefaction effects for an operating plant has to be limited to a reasonable level. Consequently, the applicability of all existing methods has to be considered for the best estimate. The adequacy and conclusiveness of the results is mainly limited by the epistemic uncertainty of the methods used for defining the liquefaction hazard and the engineering parameters characterizing the consequences of liquefaction. The methods have to comply with conflicting requirements. They have to be consistent and widely accepted and used in practice. They have to be based on a comprehensive database. They have to provide a basis for evaluating the dominating engineering parameters that control the post-liquefaction response of the plant structures. Experience from the Kashiwazaki-Kariwa plant, hit by the Niigata-ken Chuetsu-oki earthquake of 16 July 2007, and analysis of site conditions and plant layout at the Paks plant have shown that differential settlement is the dominating effect in the case considered. They have to be based on the probabilistic seismic hazard assessment and allow the integration into logic

  1. Earthquakes - Volcanoes (Causes - Forecast - Counteraction)

    NASA Astrophysics Data System (ADS)

    Tsiapas, Elias

    2013-04-01

    Earthquakes and volcanoes are caused by: 1) Various liquid elements (e.g. H2O, H2S, SO2) which emerge from the pyrosphere and are trapped in the space between the solid crust and the pyrosphere (Moho discontinuity). 2) Protrusions of the solid crust at the Moho discontinuity (mountain range roots, sinking of the lithosphere's plates). 3) The differential movement of crust and pyrosphere. The crust misses one full rotation for approximately every 100 pyrosphere rotations, mostly because of the lunar pull. The above-mentioned elements can be found in small quantities all over the Moho discontinuity, and they are constantly causing minor earthquakes and small volcanic eruptions. When large quantities of these elements (H2O, H2S, SO2, etc.) concentrate, they are carried away by the pyrosphere, moving from west to east under the crust. When this movement takes place under flat surfaces of the solid crust, it does not cause earthquakes. But when these elements come along a protrusion (a mountain root) they concentrate on its western side, displacing the pyrosphere until they fill the space created. Due to the differential movement of pyrosphere and solid crust, a vacuum is created on the eastern side of these protrusions and when the aforementioned liquids overfill this space, they explode, escaping to the east. At the point of their escape, these liquids are vaporized and compressed, their flow accelerates, their temperature rises due to fluid friction and they are ionized. On the Earth's surface, a powerful rumbling sound and electrical discharges in the atmosphere, caused by the movement of the gases, are noticeable. When these elements escape, the space on the west side of the protrusion is violently taken up by the pyrosphere, which collides with the protrusion, causing a major earthquake, attenuation of the protrusions, cracks in the solid crust and damage to structures on the Earth's surface. It is easy to foresee when an earthquake will occur and how big it is

  2. Earthquakes - Volcanoes (Causes - Forecast - Counteraction)

    NASA Astrophysics Data System (ADS)

    Tsiapas, Elias

    2015-04-01

    Earthquakes and volcanoes are caused by: 1) Various liquid elements (e.g. H2O, H2S, SO2) which emerge from the pyrosphere and are trapped in the space between the solid crust and the pyrosphere (Moho discontinuity). 2) Protrusions of the solid crust at the Moho discontinuity (mountain range roots, sinking of the lithosphere's plates). 3) The differential movement of crust and pyrosphere. The crust misses one full rotation for approximately every 100 pyrosphere rotations, mostly because of the lunar pull. The above-mentioned elements can be found in small quantities all over the Moho discontinuity, and they are constantly causing minor earthquakes and small volcanic eruptions. When large quantities of these elements (H2O, H2S, SO2, etc.) concentrate, they are carried away by the pyrosphere, moving from west to east under the crust. When this movement takes place under flat surfaces of the solid crust, it does not cause earthquakes. But when these elements come along a protrusion (a mountain root) they concentrate on its western side, displacing the pyrosphere until they fill the space created. Due to the differential movement of pyrosphere and solid crust, a vacuum is created on the eastern side of these protrusions and when the aforementioned liquids overfill this space, they explode, escaping to the east. At the point of their escape, these liquids are vaporized and compressed, their flow accelerates, their temperature rises due to fluid friction and they are ionized. On the Earth's surface, a powerful rumbling sound and electrical discharges in the atmosphere, caused by the movement of the gases, are noticeable. When these elements escape, the space on the west side of the protrusion is violently taken up by the pyrosphere, which collides with the protrusion, causing a major earthquake, attenuation of the protrusions, cracks in the solid crust and damage to structures on the Earth's surface. It is easy to foresee when an earthquake will occur and how big it is

  3. EARTHQUAKES - VOLCANOES (Causes - Forecast - Counteraction)

    NASA Astrophysics Data System (ADS)

    Tsiapas, Elias

    2014-05-01

    Earthquakes and volcanoes are caused by: 1) Various liquid elements (e.g. H2O, H2S, SO2) which emerge from the pyrosphere and are trapped in the space between the solid crust and the pyrosphere (Moho discontinuity). 2) Protrusions of the solid crust at the Moho discontinuity (mountain range roots, sinking of the lithosphere's plates). 3) The differential movement of crust and pyrosphere. The crust misses one full rotation for approximately every 100 pyrosphere rotations, mostly because of the lunar pull. The above-mentioned elements can be found in small quantities all over the Moho discontinuity, and they are constantly causing minor earthquakes and small volcanic eruptions. When large quantities of these elements (H2O, H2S, SO2, etc.) concentrate, they are carried away by the pyrosphere, moving from west to east under the crust. When this movement takes place under flat surfaces of the solid crust, it does not cause earthquakes. But when these elements come along a protrusion (a mountain root) they concentrate on its western side, displacing the pyrosphere until they fill the space created. Due to the differential movement of pyrosphere and solid crust, a vacuum is created on the eastern side of these protrusions and when the aforementioned liquids overfill this space, they explode, escaping to the east. At the point of their escape, these liquids are vaporized and compressed, their flow accelerates, their temperature rises due to fluid friction and they are ionized. On the Earth's surface, a powerful rumbling sound and electrical discharges in the atmosphere, caused by the movement of the gases, are noticeable. When these elements escape, the space on the west side of the protrusion is violently taken up by the pyrosphere, which collides with the protrusion, causing a major earthquake, attenuation of the protrusions, cracks in the solid crust and damage to structures on the Earth's surface. It is easy to foresee when an earthquake will occur and how big it is

  4. Incubation of Chile's 1960 Earthquake

    NASA Astrophysics Data System (ADS)

    Atwater, B. F.; Cisternas, M.; Salgado, I.; Machuca, G.; Lagos, M.; Eipert, A.; Shishikura, M.

    2003-12-01

    Infrequent occurrence of giant events may help explain how the 1960 Chile earthquake attained M 9.5. Although old documents imply that this earthquake followed great earthquakes of 1575, 1737 and 1837, only three earthquakes of the past 1000 years produced geologic records like those for 1960. These earlier earthquakes include the 1575 event but not 1737 or 1837. Because the 1960 earthquake had nearly twice the seismic slip expected from plate convergence since 1837, much of the strain released in 1960 may have been accumulating since 1575. Geologic evidence for such incubation comes from new paleoseismic findings at the Río Maullín estuary, which indents the Pacific coast at 41.5° S midway along the 1960 rupture. The 1960 earthquake lowered the area by 1.5 m, and the ensuing tsunami spread sand across lowland soils. The subsidence killed forests and changed pastures into sandy tidal flats. Guided by these 1960 analogs, we inferred tsunami and earthquake history from sand sheets, tree rings, and old maps. At Chuyaquen, 10 km upriver from the sea, we studied sand sheets in 31 backhoe pits on a geologic transect 1 km long. Each sheet overlies the buried soil of a former marsh or meadow. The sand sheet from 1960 extends the entire length of the transect. Three earlier sheets can be correlated at least half that far. The oldest one, probably a tsunami deposit, surrounds herbaceous plants that date to AD 990-1160. Next comes a sandy tidal-flat deposit dated by stratigraphic position to about 1000-1500. The penultimate sheet is a tsunami deposit younger than twigs from 1410-1630. It probably represents the 1575 earthquake, whose accounts of shaking, tsunami, and landslides rival those of 1960. In that case, the record excludes the 1737 and 1837 events. The 1737 and 1837 events also appear missing in tree-ring evidence from islands of Misquihue, 30 km upriver from the sea. Here the subsidence in 1960 admitted brackish tidal water that defoliated tens of thousands of
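
    The slip-budget reasoning in this abstract can be sketched numerically. The convergence rate (~7 cm/yr) and average 1960 slip (~17 m) below are round illustrative values of our own, not figures from the study:

```python
# Illustrative slip-budget sketch (assumed round numbers, not the paper's data):
# steady plate convergence accumulates a slip deficit, and the 1960 rupture
# released more slip than could have accumulated since 1837 alone.
CONVERGENCE_M_PER_YR = 0.07  # assumed Nazca-South America convergence, ~7 cm/yr

def accumulated_slip(start_year, end_year, rate=CONVERGENCE_M_PER_YR):
    """Slip deficit (m) built up between two dates at a steady rate."""
    return rate * (end_year - start_year)

since_1837 = accumulated_slip(1837, 1960)  # ~8.6 m
since_1575 = accumulated_slip(1575, 1960)  # ~27 m
observed_1960_slip = 17.0                  # assumed average slip in 1960, m

# An observed slip near twice the post-1837 budget suggests strain carried
# over from before 1837, consistent with incubation since 1575.
```

    Changing the assumed rate shifts the numbers but not the shape of the argument: the post-1837 budget alone falls well short of the observed slip.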

  5. Major earthquake shakes northern Pakistan

    NASA Astrophysics Data System (ADS)

    Kumar, Mohi

    A magnitude 7.6 earthquake that shook the western Himalayas on 8 October killed at least 23,000 in Pakistan and 1,400 in India, injured more than 50,000 people, and left more than 2.5 million people homeless across the Kashmir region. The official death toll could exceed 30,000, placing this among the deadliest earthquakes ever to have occurred on the Indian subcontinent. Scientists warn that, given the lack of development and poor construction in the area, future earthquakes in more densely populated areas could be devastating. David Simpson, president of the Incorporated Research Institutions for Seismology, said the 8 October quake "was a terrible disaster, but not to the level of what could happen in the future. This is yet again another warning message of things to come."

  6. Earthquake-dammed lakes in New Zealand

    SciTech Connect

    Adams, J.

    1981-05-01

    Eleven small lakes were formed by landslides caused by the 1929 Buller earthquake; four others were formed by other historic earthquakes in New Zealand. At least nine other New Zealand lakes are also dammed by landslides and were probably formed by prehistoric earthquakes. When recognized by morphology, synchronous age, and areal distribution, earthquake-dammed lakes could provide an estimate of paleoseismicity for the past few hundred or thousand years.

  7. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  8. Monitoring the ionosphere during the earthquake on GPS data

    NASA Astrophysics Data System (ADS)

    Smirnov, V. M.; Smirnova, E. V.

    The problem of estimating the stability of the physical state of the atmosphere attracts the rapt attention of the world community, but it is still far from being solved. A number of global atmospheric processes that have a direct influence upon all forms of life on Earth have been detected. The comprehension of the cause-effect relations governing their origin and development is possible only on the basis of long-term sequences of observational data on the time-space variations of atmospheric characteristics, which should be received on a global scale and over as broad an interval of altitudes as possible. Such data can be obtained only with the application of satellite systems. The latest research has shown that satellite systems can be successfully used for global and continuous monitoring of the Earth's ionosphere. In turn, the ionosphere can serve as a reliable indicator of different kinds of effects on the environment, of both natural and anthropogenic origin. Nowadays the problem of the short-term forecasting of earthquakes has reached a new level of understanding. Indisputable factors have been revealed which show that the ionospheric anomalies observed during the preparation of seismic events contain information allowing them to be detected and interpreted as earthquake precursors. A partial solution of the earthquake forecasting problem from ionospheric variations requires processing data received simultaneously from extensive territories. Such requirements can be met only on the basis of a ground-space system of ionosphere monitoring. The navigating systems
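
    As a concrete illustration of the GPS-based ionosphere monitoring the abstract describes, the standard dual-frequency (geometry-free) combination converts the differential delay between the two GPS carriers into slant total electron content. The sketch below uses the published L1/L2 frequencies; the single-constant first-order ionosphere model is the usual approximation:

```python
# Hedged sketch: slant TEC from the L2-L1 pseudorange difference via the
# standard geometry-free combination. F1 and F2 are the published GPS
# carrier frequencies; higher-order ionospheric terms are neglected.
F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz
K = 40.3        # first-order ionospheric refraction constant (SI units)

def slant_tec(p2_minus_p1_m):
    """Slant TEC (electrons/m^2) from the L2-L1 code delay in metres."""
    return (F1**2 * F2**2) / (K * (F1**2 - F2**2)) * p2_minus_p1_m

# 1 m of differential delay is roughly 9.5 TECU (1 TECU = 1e16 electrons/m^2)
tecu_per_metre = slant_tec(1.0) / 1e16
```

    Mapping many such slant measurements from a receiver network into regional TEC maps is what makes the "extensive territories" monitoring mentioned above possible.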

  9. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should
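
    The "low-probability environment" point can be made concrete with a two-line calculation. The baseline probability below is our own illustrative number, not one from the paper:

```python
# Illustrative only: a short-term forecast probability is the long-term
# baseline scaled by the clustering-derived probability gain, capped at 1.
def short_term_probability(baseline, gain):
    """Forecast probability from a long-term baseline and a probability gain."""
    return min(1.0, baseline * gain)

weekly_baseline = 1e-4  # assumed long-term weekly probability of a large quake

# Even a thousandfold gain leaves only a 10% weekly probability,
# and a hundredfold gain only 1%.
p_1000 = short_term_probability(weekly_baseline, 1000)
p_100 = short_term_probability(weekly_baseline, 100)
```

    This is why large probability gains and small absolute probabilities coexist: the gain multiplies a baseline that is tiny to begin with.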

  10. Izmit, Turkey 1999 Earthquake Interferogram

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul.
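
    The fringe arithmetic behind those contour values can be reproduced directly. One color cycle equals half the radar wavelength of line-of-sight motion; the ERS C-band wavelength is 5.66 cm, and the ~23° incidence angle used to convert to horizontal motion is a typical ERS value assumed here, not stated in the text:

```python
from math import sin, radians

WAVELENGTH_MM = 56.6   # ERS C-band radar wavelength
INCIDENCE_DEG = 23.0   # assumed ERS look angle from vertical

def los_motion_mm(n_fringes):
    """Line-of-sight displacement for a count of interferogram fringes."""
    return n_fringes * WAVELENGTH_MM / 2.0

def horizontal_motion_mm(n_fringes, incidence_deg=INCIDENCE_DEG):
    """Equivalent horizontal motion, assuming purely horizontal fault slip."""
    return los_motion_mm(n_fringes) / sin(radians(incidence_deg))

# One fringe: ~28 mm toward the satellite, ~70 mm horizontal, as in the caption.
```

    Counting fringes across the image and multiplying out in this way is how the multi-meter fault offset is read off the interferogram.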

  11. Fault failure with moderate earthquakes

    USGS Publications Warehouse

    Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

    1987-01-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10-8), with borehole dilatometers (resolution 10-10) and a 3-component borehole strainmeter (resolution 10-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987.

  12. Statistical earthquake focal mechanism forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2014-04-01

    Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each gridpoint. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
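
    The quoted ~35° azimuth variation near the poles is a consequence of meridian convergence, and the figure can be checked with a small-angle sketch (spherical Earth assumed; this is an illustration, not the authors' tangent-plane correction):

```python
import math

EARTH_RADIUS_KM = 6371.0

def azimuth_spread_deg(lat_deg, distance_km):
    """Approximate change in grid azimuth across an east-west span of
    `distance_km` centred at latitude `lat_deg` (meridian convergence)."""
    lat = math.radians(lat_deg)
    # Longitude span subtended by the east-west distance at this latitude
    dlon = distance_km / (EARTH_RADIUS_KM * math.cos(lat))
    # Convergence of meridians: d(azimuth) ~= dlon * sin(lat)
    return math.degrees(dlon * math.sin(lat))

spread = azimuth_spread_deg(75.0, 1000.0)   # roughly 34 degrees
```

    At 75° latitude the result is close to the 35° quoted in the abstract, while at 30° the same 1000 km span gives only a few degrees, consistent with the corrections being negligible near the equator.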

  13. GPS Earthquake Early Warning in Cascadia

    NASA Astrophysics Data System (ADS)

    Melbourne, T. I.; Scrivner, C. W.; Santillan, V. M.; Webb, F.

    2011-12-01

    Over 400 GPS receivers of the combined PANGA and PBO networks currently operate along the Cascadia subduction zone, all of which are high-rate and telemetered in real-time. These receivers span the M9 megathrust, M7 crustal faults beneath population centers, several active Cascades volcanoes, and a host of other hazard sources, and together enable many new approaches to hazards mitigation. Data from the majority of the stations are received in real time at CWU and processed into one-second position estimates using 1) relative positioning within several reference frames constrained by 2) absolute point positioning using streamed satellite orbit and clock corrections. While the former produces lower-noise time series, for earthquakes greater than ~M7 and ground displacements exceeding ~20 cm, point positioning alone is shown to provide very rapid and robust estimates of the location and amplitude of both dynamic strong ground motion and permanent deformation. The advantage of point-positioning over relative positioning for earthquake applications lies primarily in the fact that each station's position is estimated independently, without double-differencing, within a reference frame defined by earth's center of mass and the satellite orbits. Point positioning does not require a nearby stable reference station or network whose motion (such as during a seismic event) aliases directly into fictitious displacement of any station in question. Thus, for real-time GPS earthquake characterization, this is of great importance in ensuring a robust measurement. We are now producing real-time point-positions using GIPSY5 and corrections to broadcast satellite clocks and orbits streamed live from the DLR in Germany. We have also developed a stream-editor to flag and fix cycle-slips and other data problems on the fly prior to positioning. We are achieving < 3s latency and RMS scatter of under 4 cm.
For use in earthquake early warning, we have developed estimation routines…
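
    As an illustration of why independent point positions suit earthquake detection, a minimal trigger on a 1 Hz position stream might flag the first epoch whose displacement from a pre-event reference exceeds a multiple of the ~4 cm RMS noise quoted above. This is a hypothetical sketch, not PANGA's actual estimation software:

```python
def detect_offset(positions, noise_rms=0.04, k=5.0):
    """positions: list of (east, north, up) tuples in metres, 1 Hz epochs.
    Returns the index of the first epoch whose 3-D displacement from the
    mean of the first 10 epochs exceeds k * noise_rms, or None."""
    # Pre-event reference: average of the first ten epochs
    ref = [sum(p[i] for p in positions[:10]) / 10.0 for i in range(3)]
    for idx in range(10, len(positions)):
        p = positions[idx]
        d = sum((p[i] - ref[i]) ** 2 for i in range(3)) ** 0.5
        if d > k * noise_rms:
            return idx
    return None

# Demo: 20 quiet epochs, then a 0.5 m eastward coseismic offset
stream = [(0.0, 0.0, 0.0)] * 20 + [(0.5, 0.0, 0.0)] * 5
first = detect_offset(stream)   # flags the epoch with the offset
```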

  14. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

    Advances in understanding the earthquake cycle and in assessing earthquake hazards are topics of great importance. Dynamic earthquake hazard assessments resolved for a range of spatial scales and time scales will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  15. Earthquakes in the New Zealand Region.

    ERIC Educational Resources Information Center

    Wallace, Cleland

    1995-01-01

    Presents a thorough overview of earthquakes in New Zealand, discussing plate tectonics, seismic measurement, and historical occurrences. Includes 10 figures illustrating such aspects as earthquake distribution, intensity, and fissures in the continental crust. Tabular data includes a list of most destructive earthquakes and descriptive effects…

  16. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  17. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  18. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  19. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  20. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  1. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  2. Response facilitation: implications for perceptual theory, psychotherapy, neurophysiology, and earthquake prediction.

    PubMed

    Medici, R G; Frey, A H; Frey, D

    1985-04-01

    There have been numerous naturalistic observations and anecdotal reports of abnormal animal behavior prior to earthquakes. Basic physiological and behavioral data have been brought together with geophysical data to develop a specific explanation to account for how animals could perceive and respond to precursors of impending earthquakes. The behavior predicted provides a reasonable approximation to the reported abnormal behaviors; that is, the behavior appears to be partly reflexive and partly operant. It can best be described as agitated stereotypic behavior. The explanation formulated has substantial implications for perceptual theory, psychotherapy, and neurophysiology, as well as for earthquake prediction. Testable predictions for biology, psychology, and geophysics can be derived from the explanation. PMID:3997385

  3. Detection of crustal deformation from the Landers earthquake sequence using continuous geodetic measurements

    NASA Technical Reports Server (NTRS)

    Bock, Yehuda; Agnew, Duncan C.; Fang, Peng; Genrich, Joachim F.; Hager, Bradford H.; Herring, Thomas A.; Hudnut, Kenneth W.; King, Robert W.; Larsen, Shawn; Minster, J.-B.

    1993-01-01

    The first measurements are reported for a major earthquake by a continuously operating GPS network, the Permanent GPS Geodetic Array (PGGA) in southern California. The Landers and Big Bear earthquakes of June 28, 1992 were monitored by daily observations. Ten weeks of measurements indicate significant coseismic motion at all PGGA sites, significant postseismic motion at one site for two weeks after the earthquakes, and no significant preseismic motion. These measurements demonstrate the potential of GPS monitoring for precise detection of precursory and aftershock seismic deformation in the near and far field.

  4. Earth science: lasting earthquake legacy

    USGS Publications Warehouse

    Parsons, Thomas E.

    2009-01-01

    On 31 August 1886, a magnitude-7 shock struck Charleston, South Carolina; low-level activity continues there today. One view of seismic hazard is that large earthquakes will return to New Madrid and Charleston at intervals of about 500 years. With expected ground motions that would be stronger than average, that prospect produces estimates of earthquake hazard that rival those at the plate boundaries marked by the San Andreas fault and Cascadia subduction zone. The result is two large 'bull's-eyes' on the US National Seismic Hazard Maps — which, for example, influence regional building codes and perceptions of public safety.

  5. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and…

  6. Rheological Limitations on Deep Earthquakes

    NASA Astrophysics Data System (ADS)

    Weidner, D. J.; Raterron, P.; Chen, J.; Li, L.

    2002-05-01

    The occurrence of earthquakes deeper than 50 km requires processes that differ from the friction-mitigated phenomena that control fault slip of shallow events. These earthquakes that extend to depths of about 700 km have been useful in delineating plate tectonic processes and stresses, but remain elusive as to why the stress is released as earthquakes and not dissipated by slow plastic flow. Indeed, our understanding of plate tectonic and fracture processes would not be challenged if these events did not occur at all. While the source of deviatoric stress for these earthquakes results from the dynamics of plate tectonics, the ability to store the stress and catastrophically release the stress in an earthquake is controlled by the properties of the minerals that constitute the subducting slab. Guided by our recent experimental results, we propose that the deep earthquake process is largely controlled by the rheology of the subducted material. The core of the slab for depths shallower than 400 km, at temperatures less than 500 °C, is capable of supporting high levels of shear stress with a mildly temperature dependent strength. This region is not seismogenic as it does not have access to stress instabilities. Above 600 °C, olivine becomes too weak to be seismogenic. Between 500 and 600 °C, olivine strength is highly temperature dependent and the region is ripe for runaway plastic instabilities which give rise to earthquakes. This can account for double seismic zones owing to the distorted temperature profile. If the tectonic stress is insufficient in one of these zones, then there will be only a single seismic plane. Deeper than 400 km, even though temperature continues to increase, the properties of the high pressure phases of olivine, wadsleyite and ringwoodite, control the seismicity. Higher temperatures are required of these phases to access the plastic instability process, thus allowing seismicity to increase in this region.
Our data for perovskite indicate that…
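
    The temperature windows described above can be summarized as a simple classifier. The thresholds are the abstract's round numbers; the function is illustrative only and applies, as the abstract states, to the olivine-dominated slab core shallower than 400 km:

```python
def seismogenic_state(temperature_c, depth_km):
    """Classify slab-core olivine behaviour per the regimes in the abstract:
    cold olivine is strong but stable, a narrow window is unstable
    (seismogenic), and hot olivine is too weak to store stress."""
    if depth_km >= 400:
        raise ValueError("below 400 km, wadsleyite/ringwoodite govern seismicity")
    if temperature_c < 500:
        return "strong, aseismic"            # supports stress, no instability
    if temperature_c <= 600:
        return "seismogenic"                 # strongly T-dependent strength
    return "too weak to be seismogenic"
```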

  7. Earthquake history, 1769-1989

    SciTech Connect

    Ellsworth, W.L.

    1990-01-01

    Motion between the North American and Pacific plates at the latitude of the San Andreas fault produces a broad zone of large-magnitude earthquake activity extending more than 500 km into the continental interior. The San Andreas fault system defines the western limits of plate interaction and dominates the overall pattern of seismic strain release. Few of the M ≥ 6 earthquakes that have occurred in the past 2 centuries were located on the San Andreas fault proper, an observation emphasizing the importance of secondary faults for both seismic-hazard assessment and tectonic processes.

  8. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high…
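
    The JHD end-member described above treats raypath anomalies as constant station corrections to a laterally homogeneous model. A toy grid-search epicentre locator under those assumptions (homogeneous half-space, hypothetical velocity and geometry; real island-arc locations require the 3-D ray tracing the abstract compares against):

```python
import math

def locate(stations, arrivals, corrections, v=6.0, grid=None):
    """Grid-search epicentre in a homogeneous half-space (speed v, km/s).
    stations: {name: (x_km, y_km)}; arrivals: {name: observed time, s};
    corrections: {name: constant station correction, s} (JHD-style term).
    Returns the (x, y) grid point minimising the RMS travel-time residual."""
    if grid is None:
        grid = [(x, y) for x in range(0, 101, 5) for y in range(0, 101, 5)]
    def rms(pt):
        res = []
        for name, t_obs in arrivals.items():
            sx, sy = stations[name]
            t_pred = math.hypot(pt[0] - sx, pt[1] - sy) / v
            t_pred += corrections.get(name, 0.0)
            res.append((t_obs - t_pred) ** 2)
        return (sum(res) / len(res)) ** 0.5
    return min(grid, key=rms)

# Demo: four corner stations, true epicentre at (50, 50), no corrections
stations = {"A": (0, 0), "B": (100, 0), "C": (0, 100), "D": (100, 100)}
t0 = math.hypot(50, 50) / 6.0
arrivals = {name: t0 for name in stations}
best = locate(stations, arrivals, corrections={})
```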

  9. Earthquakes triggered by fluid extraction

    USGS Publications Warehouse

    Segall, P.

    1989-01-01

    Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics. -Author
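
    The sign and size of such stress perturbations can be illustrated with a textbook poroelastic end-member: the uniaxial-strain ("stress path") result for a thin, laterally extensive depleting reservoir. This standard formula is chosen here for illustration; it is not the specific calculation in this paper:

```python
def reservoir_stress_change(dp, alpha=1.0, nu=0.25):
    """Horizontal total-stress change inside a thin, laterally extensive
    reservoir whose pore pressure changes by dp, from linear poroelasticity
    under uniaxial-strain conditions:

        d_sigma_h = alpha * (1 - 2*nu) / (1 - nu) * dp

    alpha: Biot coefficient; nu: drained Poisson's ratio (assumed values).
    """
    return alpha * (1.0 - 2.0 * nu) / (1.0 - nu) * dp

# A few tens of MPa of depletion shifts horizontal stress by a comparable
# fraction, perturbing nearby faults toward or away from failure.
dsig = reservoir_stress_change(-30.0)   # 30 MPa depletion -> -20 MPa
```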

  10. Array Measurements of Earthquake Rupture.

    NASA Astrophysics Data System (ADS)

    Goldstein, Peter

    Accurate measurements of earthquake rupture are an essential step in the development of an understanding of the earthquake source process. In this dissertation new array analysis techniques are developed and used to make the first measurements of two-dimensional earthquake rupture propagation. In order to measure earthquake rupture successfully it is necessary to account for the nonstationary behavior of seismic waves and nonplanar wavefronts due to time delays caused by local heterogeneities. Short time windows are also important because they determine the precision with which it is possible to measure rupture times of earthquake sources. The subarray spatial averaging and seismogram alignment methods were developed for these reasons. The basic algorithm which is used to compute frequency-wavenumber power spectra is the multiple signal characterization (MUSIC) method. Although a variety of methods could be applied with subarray spatial averaging and seismogram alignment, MUSIC is used because it has better resolution of multiple sources than other currently available methods and it provides a unique solution. Power spectra observed at the array are converted into source locations on the fault plane by tracing rays through a layered medium. A dipping layer correction factor is introduced to account for a laterally varying basin structure such as that found beneath the SMART 1 array in Taiwan. A framework is presented that allows for the estimation of precision and resolution of array measurements of source locations and can be used to design an optimum array for a given source. These methods are used to show that the November 14, 1986, ML = 7.0 Hualien, Taiwan earthquake began as a shallow event with unilateral rupture from southwest to northeast. A few seconds later a second, deeper and larger event began rupturing from below the hypocentral region from southwest to northeast slightly down-dip.
Energy density estimates indicate larger energy sources at greater…
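
    A minimal version of the MUSIC frequency-wavenumber estimator named above, for a 1-D array and a single narrowband source on synthetic data. The subarray spatial averaging, seismogram alignment, and dipping-layer corrections of the dissertation are omitted; this sketch only shows the core eigendecomposition step:

```python
import numpy as np

def music_spectrum(X, n_sources, sensor_pos, wavenumbers):
    """MUSIC pseudo-spectrum for plane waves on a 1-D array.
    X: (n_sensors, n_snapshots) complex narrowband snapshots.
    sensor_pos: sensor coordinates (same length unit as 1/wavenumber).
    Returns the pseudo-spectrum evaluated at each trial wavenumber."""
    R = X @ X.conj().T / X.shape[1]          # sample covariance matrix
    w, v = np.linalg.eigh(R)                 # eigenvalues ascending
    En = v[:, : X.shape[0] - n_sources]      # noise-subspace eigenvectors
    spec = []
    for k in wavenumbers:
        a = np.exp(1j * k * np.asarray(sensor_pos))   # steering vector
        # Peak where the steering vector is orthogonal to the noise subspace
        denom = np.linalg.norm(En.conj().T @ a) ** 2
        spec.append(float((np.abs(a) ** 2).sum() / denom))
    return np.array(spec)

# Synthetic test: one plane wave, wavenumber 0.8, 10-sensor linear array
rng = np.random.default_rng(0)
pos = np.arange(10.0)
k_true = 0.8
s = rng.standard_normal(200) + 1j * rng.standard_normal(200)
X = np.exp(1j * k_true * pos)[:, None] * s[None, :]
X += 0.01 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))
ks = np.linspace(0.0, np.pi, 629)
spec = music_spectrum(X, 1, pos, ks)
k_est = ks[int(np.argmax(spec))]             # recovers ~0.8
```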

  11. Analysis of post-earthquake landslide activity and geo-environmental effects

    NASA Astrophysics Data System (ADS)

    Tang, Chenxiao; van Westen, Cees; Jetten, Victor

    2014-05-01

    Large earthquakes can cause huge losses to human society through ground shaking, fault rupture, and the high density of co-seismic landslides that can be triggered in mountainous areas. In areas affected by such large earthquakes, the landslide threat continues after the earthquake, as co-seismic landslides may be reactivated by high-intensity rainfall events. Huge amounts of landslide material remain on the slopes, leading to a high frequency of landslides and debris flows that threaten lives and create great difficulties in post-seismic reconstruction in earthquake-hit regions. Without critical information such as the frequency and magnitude of landslides after a major earthquake, reconstruction planning and hazard mitigation work are difficult. The area hit by the Mw 7.9 Wenchuan earthquake in 2008, Sichuan province, China, shows some typical examples of poor reconstruction planning due to lack of information: huge debris flows destroyed several re-constructed settlements. This research aims to analyze the decay in post-seismic landslide activity in areas that have been hit by a major earthquake, taking the area affected by the 2008 Wenchuan earthquake as the study area. The study will analyze the factors that control post-earthquake landslide activity through quantification of landslide volume changes as well as through numerical simulation of their initiation process, to obtain a better understanding of the potential threat of post-earthquake landslides as a basis for mitigation planning. The research will make use of high-resolution stereo satellite images, UAV surveys, and Terrestrial Laser Scanning (TLS) to obtain multi-temporal DEMs to monitor changes in loose sediments and post-seismic landslide activity.
A debris flow initiation model that incorporates the volume of source materials, vegetation re-growth, and intensity-duration of the triggering precipitation, and that evaluates…

  12. A Revised Earthquake Catalogue for South Iceland

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Zechar, J. Douglas; Vogfjörd, Kristín S.; Eberhard, David A. J.

    2016-01-01

    In 1991, a new seismic monitoring network named SIL was started in Iceland with a digital seismic system and automatic operation. The system is equipped with software that reports the automatic location and magnitude of earthquakes, usually within 1-2 min of their occurrence. Normally, automatic locations are manually checked and re-estimated with corrected phase picks, but locations are subject to random errors and systematic biases. In this article, we consider the quality of the catalogue and produce a revised catalogue for South Iceland, the area with the highest seismic risk in Iceland. We explore the effects of filtering events using some common recommendations based on network geometry and station spacing and, as an alternative, filtering based on a multivariate analysis that identifies outliers in the hypocentre error distribution. We identify and remove quarry blasts, and we re-estimate the magnitude of many events. This revised catalogue, which we consider to be filtered, cleaned, and corrected, should be valuable for building future seismicity models and for assessing seismic hazard and risk. We present a comparative seismicity analysis using the original and revised catalogues: we report characteristics of South Iceland seismicity in terms of b value and magnitude of completeness. Our work demonstrates the importance of carefully checking an earthquake catalogue before proceeding with seismicity analysis.
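
    The b value reported from such a catalogue is commonly estimated with the Aki/Utsu maximum-likelihood formula. A sketch on a synthetic Gutenberg-Richter sample (the catalogue and completeness magnitude below are invented, not the SIL data):

```python
import math
import random

def b_value(mags, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for magnitudes >= mc, with the
    standard half-bin correction for magnitudes binned at width dm:
        b = log10(e) / (mean(M) - (mc - dm/2))
    """
    m = [x for x in mags if x >= mc]
    mean = sum(m) / len(m)
    return math.log10(math.e) / (mean - (mc - dm / 2.0))

# Synthetic sample with b = 1: magnitude excess above mc is exponential
# with rate ln(10) (continuous magnitudes, so dm = 0 here)
random.seed(1)
mc = 2.0
mags = [mc + random.expovariate(math.log(10)) for _ in range(20000)]
b = b_value(mags, mc, dm=0.0)   # close to 1.0
```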

  13. 4 Earthquake: Major offshore earthquakes recall the Aztec myth

    USGS Publications Warehouse

    United States Department of Commerce

    1970-01-01

    Long before the sun clears the eastern mountains on April 29, 1970, the savanna highlands of Chiapas tremble from a magnitude 6.7 earthquake centered off the Pacific coast near Mexico's southern border. Then, for a few hours, the Isthmus of Tehuantepec is quiet.

  14. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    ERIC Educational Resources Information Center

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  15. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been the topic of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study on the temporal statistics of earthquake interoccurrence times of the seismically active Kachchh peninsula (western India) from thirteen probability distributions. Those distributions are exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull distributions. Statistical inferences of the scale and shape parameters of these distributions are discussed from the maximum likelihood estimations and the Fisher information matrices. The latter are used as a surrogate tool to appraise the parametric uncertainty in the estimation process. The results were found on the basis of two goodness-of-fit tests: the maximum likelihood criterion with its modification to Akaike information criterion (AIC) and the Kolmogorov-Smirnov (K-S) minimum distance criterion. These results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit poorly to the earthquake catalog of Kachchh and its adjacent regions. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation…
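
    For the best-fitting exponential model, the maximum-likelihood fit, log-likelihood for AIC, and the conditional (hazard) probability are short calculations. A sketch with a toy catalogue (the interevent times below are invented, not the Kachchh data):

```python
import math

def fit_exponential(times):
    """MLE rate for an exponential interevent-time model: lambda = 1/mean."""
    return len(times) / sum(times)

def conditional_probability(rate, elapsed, horizon):
    """P(event by elapsed+horizon | quiet for `elapsed`). The exponential
    model is memoryless, so `elapsed` drops out: 1 - exp(-rate*horizon)."""
    return 1.0 - math.exp(-rate * horizon)

def aic(loglik, n_params):
    # Akaike information criterion: 2k - 2 ln L (lower is better)
    return 2.0 * n_params - 2.0 * loglik

# Toy catalogue of interevent times (years)
gaps = [2.1, 0.7, 3.4, 1.2, 0.5, 2.8, 1.9, 0.9]
lam = fit_exponential(gaps)
loglik = sum(math.log(lam) - lam * t for t in gaps)
score = aic(loglik, n_params=1)
p10 = conditional_probability(lam, elapsed=5.0, horizon=10.0)
```

    Competing models (gamma, Weibull, and the rest) would each be fit by MLE the same way and ranked by their AIC scores, as the paper does.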

  16. Transuranic storage and assay facility interim safety basis

    SciTech Connect

    Porten, D.R., Fluor Daniel Hanford

    1997-02-12

    The Transuranic Waste Storage and Assay Facility (TRUSAF) Interim Safety Basis document provides the authorization basis for interim operation of, and restrictions on interim operations at, the TRUSAF. The TRUSAF ISB demonstrates that the TRUSAF can be operated safely, protecting the workers, the public, and the environment. The previous safety analysis document, TRUSAF Hazards Identification and Evaluation (WHC 1987), is superseded by this document.

  17. TIR Anomalies Associated with some of the Major Earthquakes in 1999-2003

    NASA Astrophysics Data System (ADS)

    Taylor, P.; Ouzounov, D.; Bryant, N.; Logan, T.; Pulinets, S.

    2006-12-01

    Satellite Thermal Infrared (TIR) imaging data have recorded short-lived anomalies prior to major earthquakes. Others have proposed that these signals originate from electromagnetic phenomena associated with pre-seismic processes, causing enhanced IR emissions, which we are now calling TIR anomalies. The purpose of this study is to determine if TIR anomalies can be found in association with known earthquakes by systematically applying satellite data analysis techniques to imagery recorded prior to and immediately after large earthquakes. Our approach utilizes both a mapping of surface TIR transient fields from polar orbiting satellites (Terra/MODIS, Aqua/MODIS, AVHRR and Landsat) and co-registering geosynchronous weather satellite images (GOES, METEOSAT). These observations were compared with recent strong earthquakes (1999-2003) using the techniques we developed to map the pattern of these TIR anomalies. Our analysis of TIR satellite data recorded before the earthquakes we investigated supports both the hypothesis that these transient TIR anomalies occur prior to some earthquakes and our earlier results. Anomalies originate from the main faults for the Bhuj (India), Kunlun (China), Boumerdes (North Algeria) and Colima (Mexico) earthquakes. Anomalous TIR variations could be seen within a radius of approximately 100 km around the epicenter over both land and sea. Two independent techniques, based on different satellite sources, confirm the existence of TIR anomalies prior to strong earthquakes that occur in different seismotectonic settings. This outcome could be used as a basis for theoretical studies defining the mechanism of these phenomena. Ouzounov, D., N. Bryant, T. Logan, S. Pulinets, and P. Taylor, Satellite thermal IR phenomena associated with some of the major earthquakes in 1999-2003, Physics and Chemistry of the Earth, 31, 154-163, 2006.

  18. ERTS Applications in earthquake research and mineral exploration in California

    NASA Technical Reports Server (NTRS)

    Abdel-Gawad, M.; Silverstein, J.

    1973-01-01

    Examples are presented to show that ERTS imagery can be effectively utilized to identify, locate, and map faults that show geomorphic evidence of geologically recent breakage. Several important faults not previously known have been identified. By plotting epicenters of historic earthquakes in parts of California, Sonora (Mexico), Arizona, and Nevada, we found that areas known for historic seismicity are often characterized by abundant evidence of recent fault and crustal movements. There are also many examples of seismically quiet areas where outstanding evidence of recent fault movements is observed. One application is clear: ERTS-1 imagery could be effectively utilized to delineate areas susceptible to earthquake recurrence which, on the basis of seismic data alone, might misleadingly be considered safe. ERTS data can also be utilized in planning new sites in the geophysical network for monitoring fault movement and for strain and tilt measurements.

  19. Kappa and regional attenuation for Vrancea (Romania) earthquakes

    NASA Astrophysics Data System (ADS)

    Pavel, Florin; Vacareanu, Radu

    2015-07-01

    In this short paper, we investigate ground motion recordings from nine intermediate-depth Vrancea (Romania) earthquakes with Mw ≥ 5.2 which occurred between 1986 and 2013. From these recordings, the high-frequency spectral decay parameter (kappa) is computed for 57 seismic stations in Romania. The relation between kappa and several parameters (event, source-to-site distance, soil class, geographical region) is evaluated through inversion techniques. The results show a very distinct influence of the earthquake magnitude and of the geographical position on the kappa values. Subsequently, a conventional frequency-dependent Q model of the form Q(f) = 100·f^1.20 is derived from the geometric spreading functions. The proposed Q model and the site-specific kappa values represent the basis for future stochastic simulations of ground motions generated by the Vrancea subcrustal seismic source.
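As a rough illustration of how the quantities in this record are used, the frequency-dependent Q model and a site kappa combine into a single high-frequency spectral decay factor. Only Q(f) = 100·f^1.20 comes from the abstract; the kappa value, distance, and shear-wave velocity below are illustrative assumptions:

```python
import math

def q_model(f, q0=100.0, eta=1.20):
    """Frequency-dependent quality factor Q(f) = Q0 * f**eta (Q0 and eta from the abstract)."""
    return q0 * f ** eta

def spectral_decay(f, r_km, kappa, beta_km_s=3.5):
    """High-frequency Fourier amplitude decay combining path attenuation and site kappa:
    A(f) = exp(-pi*f*r / (Q(f)*beta)) * exp(-pi*kappa*f).
    kappa, r_km, and beta_km_s are illustrative, not values from the paper."""
    path = math.exp(-math.pi * f * r_km / (q_model(f) * beta_km_s))
    site = math.exp(-math.pi * kappa * f)
    return path * site

# Example: relative amplitude at 10 Hz, 150 km from the source, kappa = 0.05 s
print(spectral_decay(10.0, 150.0, 0.05))
```

In stochastic ground-motion simulation, a factor of this form multiplies the source spectrum, which is why site-specific kappa values matter for the planned simulations.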

  20. Basis of Articulation.

    ERIC Educational Resources Information Center

    Kelz, Heinrich

    This article intends to shed light on the somewhat nebulous term "basis of articulation," which is used frequently in Eastern European phonetic and linguistic literature but highly neglected in contemporary American literature. In a historical approach, it is shown how the term originated and developed, how it is defined by various authors, and…

  1. Utilizing online monitoring of water wells for detecting earthquake precursors

    NASA Astrophysics Data System (ADS)

    Reuveni, Y.; Anker, Y.; Inbar, N.; Yellin-Dror, A.; Guttman, J.; Flexer, A.

    2015-12-01

    Groundwater reaction to earthquakes is well known and documented, mostly as changes in water levels or spring discharge, but also as changes in groundwater chemistry. During 2004, groundwater level undulations preceded a series of moderate (ML~5) earthquakes that occurred along the Dead Sea Rift System (DSRS). In order to validate these preliminary observations, monitoring of several observation wells was initiated. The monitoring and telemetry infrastructure, as well as the wells, were allocated specifically for the research by the Israeli National Water Company (Mekorot Ltd.). After several earthquake events were missed due to insufficient sampling frequency, and data were lost owing to insufficient storage capacity, it was decided to establish an independent monitoring system. This current stage of the research commenced in 2011 and has just recently become fully operative. At present there are four observation wells located along major faults adjacent to the DSRS. The wells must be inactive and have a confined production layer. The wells are equipped with sensors for groundwater level, water conductivity, and groundwater temperature measurements. Data are acquired and transferred at one-minute resolution, and the dataset is transferred through a GPRS network to a central database server. Since the start of the present research stage, most of the earthquakes recorded in the vicinity of the DSRS were smaller than ML 5, with groundwater response occurring only after the ground movement. Nonetheless, distant earthquakes occurring as far as 300 km away along a fault adjacent to the DSRS (ML~3) were noticed at the observation wells. A recent earthquake precursory recurrence was followed by an ML 5.5 earthquake with an epicenter near the eastern shore of the Red Sea, about 400 km south of the wells that alerted to the quake (see figure).
In both wells, anomalies in water levels and conductivity were found a few hours before the quake, although any single anomaly cannot

  2. A comprehensive framework for post-earthquake rehabilitation plan of lifeline networks

    SciTech Connect

    Liang, J.W.

    1996-12-01

    The post-earthquake rehabilitation process for lifeline networks can be divided into three stages: emergency operation, short-term restoration, and long-term restoration, and different rehabilitation measures should be taken at each stage. This paper outlines the post-earthquake rehabilitation plan for lifeline networks which is being developed for Tianjin City. The objective of this plan is to shorten the time needed to restore lifeline networks and to reduce secondary disasters.

  3. Systematic Underestimation of Earthquake Magnitudes from Large Intracontinental Reverse Faults: Historical Ruptures Break Across Segment Boundaries

    NASA Technical Reports Server (NTRS)

    Rubin, C. M.

    1996-01-01

    Because most large-magnitude earthquakes along reverse faults have such irregular and complicated rupture patterns, reverse-fault segments defined on the basis of geometry alone may not be very useful for estimating sizes of future seismic sources. Most modern large ruptures of historical earthquakes generated by intracontinental reverse faults have involved geometrically complex rupture patterns. Ruptures across surficial discontinuities and complexities such as stepovers and cross-faults are common. Specifically, segment boundaries defined on the basis of discontinuities in surficial fault traces, pronounced changes in the geomorphology along strike, or the intersection of active faults commonly have not proven to be major impediments to rupture. Assuming that the seismic rupture will initiate and terminate at adjacent major geometric irregularities will commonly lead to underestimation of magnitudes of future large earthquakes.

  4. Intraslab Earthquakes: Dehydration of the Cascadia Slab

    USGS Publications Warehouse

    Preston, L.A.; Creager, K.C.; Crosson, R.S.; Brocher, T.M.; Trehu, A.M.

    2003-01-01

    We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation.

  5. Intraslab earthquakes: dehydration of the Cascadia slab.

    PubMed

    Preston, Leiph A; Creager, Kenneth C; Crosson, Robert S; Brocher, Thomas M; Trehu, Anne M

    2003-11-14

    We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation. PMID:14615535

  6. Predictability of population displacement after the 2010 Haiti earthquake

    PubMed Central

    Lu, Xin; Bengtsson, Linus; Holme, Petter

    2012-01-01

    Most severe disasters cause large population movements. These movements make it difficult for relief organizations to efficiently reach people in need. Understanding and predicting the locations of affected people during disasters is key to effective humanitarian relief operations and to long-term societal reconstruction. We collaborated with the largest mobile phone operator in Haiti (Digicel) and analyzed the movements of 1.9 million mobile phone users during the period from 42 d before to 341 d after the devastating Haiti earthquake of January 12, 2010. Nineteen days after the earthquake, population movements had caused the population of the capital Port-au-Prince to decrease by an estimated 23%. Both the travel distances and size of people’s movement trajectories grew after the earthquake. These findings, in combination with the disorder that was present after the disaster, suggest that people’s movements would have become less predictable. Instead, the predictability of people’s trajectories remained high and even increased slightly during the three-month period after the earthquake. Moreover, the destinations of people who left the capital during the first three weeks after the earthquake were highly correlated with their mobility patterns during normal times, and specifically with the locations in which people had significant social bonds. For the people who left Port-au-Prince, the duration of their stay outside the city, as well as the time for their return, all followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought. PMID:22711804

  7. Physical model for earthquakes, 1. Fluctuations and interactions

    SciTech Connect

    Rundle, J.B.

    1988-06-10

    This is the first of a series of papers whose purpose is to develop the apparatus needed to understand the problem of earthquake occurrence in a more physical context than has often been the case. To begin, it is necessary to introduce the idea that earthquakes represent a fluctuation about the long-term motion of the plates. This idea is made mathematically explicit by the introduction of a concept called the fluctuation hypothesis. Under this hypothesis, all physical quantities which pertain to the occurrence of earthquakes are required to depend on a physically meaningful quantity called the offset phase, the difference between the present state of slip on the fault and its long-term average. For the mathematical treatment of the fluctuation problem it is most convenient to introduce a spatial averaging, or "coarse-graining," operation, dividing the fault plane into a lattice of N patches. In this way, integrals are replaced by sums, and differential equations are replaced by algebraic equations. As a result of these operations the physics of earthquake occurrence can be stated in terms of a physically meaningful energy functional: an "external potential" W_E. W_E is a functional potential for the stress on the fault plane acting from the external medium and characterizes the energy contained within the medium external to the fault plane which is available to produce earthquakes. A simple example is discussed which involves the dynamics of a one-dimensional fault model. To gain some understanding, a simple friction law and a failure algorithm are assumed. It is shown that under certain circumstances the model fault dynamics undergo a sudden transition from a spatially ordered, temporally disordered state to a spatially disordered, temporally ordered state.
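The one-dimensional fault model described in this record (a lattice of N patches, a simple friction law, and a failure algorithm) can be sketched as a toy cellular automaton. The thresholds, stress drop, and coupling below are invented for illustration and are not the paper's actual constitutive assumptions:

```python
import random

def run_fault_model(n=50, steps=2000, tau_static=1.0, tau_drop=0.6,
                    coupling=0.3, load_rate=0.001, seed=42):
    """Toy 1-D lattice fault: n patches are loaded slowly and uniformly; a patch
    fails when its stress reaches tau_static, its stress drops by tau_drop, and a
    fraction (coupling) of the released stress is passed to its two neighbours.
    All parameter values are invented for illustration."""
    random.seed(seed)
    stress = [random.uniform(0.0, tau_static) for _ in range(n)]
    events = []  # (time step, number of patch failures in the cascade)
    for t in range(steps):
        stress = [s + load_rate for s in stress]  # slow tectonic loading
        failures = 0
        while True:  # let the cascade run until no patch exceeds the threshold
            failing = [i for i, s in enumerate(stress) if s >= tau_static]
            if not failing:
                break
            failures += len(failing)
            for i in failing:
                stress[i] -= tau_drop
                for j in (i - 1, i + 1):  # nearest-neighbour stress transfer
                    if 0 <= j < n:
                        stress[j] += coupling * tau_drop / 2.0
        if failures:
            events.append((t, failures))
    return events

events = run_fault_model()
print(len(events), max(size for _, size in events))
```

Because each failure dissipates more stress than it transfers, every cascade terminates; varying the coupling illustrates the kind of ordered-to-disordered transition the abstract mentions.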

  8. Predictability of population displacement after the 2010 Haiti earthquake.

    PubMed

    Lu, Xin; Bengtsson, Linus; Holme, Petter

    2012-07-17

    Most severe disasters cause large population movements. These movements make it difficult for relief organizations to efficiently reach people in need. Understanding and predicting the locations of affected people during disasters is key to effective humanitarian relief operations and to long-term societal reconstruction. We collaborated with the largest mobile phone operator in Haiti (Digicel) and analyzed the movements of 1.9 million mobile phone users during the period from 42 d before, to 341 d after the devastating Haiti earthquake of January 12, 2010. Nineteen days after the earthquake, population movements had caused the population of the capital Port-au-Prince to decrease by an estimated 23%. Both the travel distances and size of people's movement trajectories grew after the earthquake. These findings, in combination with the disorder that was present after the disaster, suggest that people's movements would have become less predictable. Instead, the predictability of people's trajectories remained high and even increased slightly during the three-month period after the earthquake. Moreover, the destinations of people who left the capital during the first three weeks after the earthquake was highly correlated with their mobility patterns during normal times, and specifically with the locations in which people had significant social bonds. For the people who left Port-au-Prince, the duration of their stay outside the city, as well as the time for their return, all followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought. PMID:22711804

  9. Pre-earthquake magnetic pulses

    NASA Astrophysics Data System (ADS)

    Scoville, J.; Heraud, J.; Freund, F.

    2015-08-01

    A semiconductor model of rocks is shown to describe unipolar magnetic pulses, a phenomenon that has been observed prior to earthquakes. These pulses are suspected to be generated deep in the Earth's crust, in and around the hypocentral volume, days or even weeks before earthquakes. Their extremely long wavelength allows them to pass through kilometers of rock. Interestingly, when the sources of these pulses are triangulated, the locations coincide with the epicenters of future earthquakes. We couple a drift-diffusion semiconductor model to a magnetic field in order to describe the electromagnetic effects associated with electrical currents flowing within rocks. The resulting system of equations is solved numerically and it is seen that a volume of rock may act as a diode that produces transient currents when it switches bias. These unidirectional currents are expected to produce transient unipolar magnetic pulses similar in form, amplitude, and duration to those observed before earthquakes, and this suggests that the pulses could be the result of geophysical semiconductor processes.

  10. Loss estimation of Membramo earthquake

    NASA Astrophysics Data System (ADS)

    Damanik, R.; Sedayo, H.

    2016-05-01

    Papua tectonics is dominated by the oblique collision of the Pacific plate along the north side of the island. Very high relative plate motion (i.e., 120 mm/year) between the Pacific and Papua-Australian plates gives this region a very high earthquake production rate, about twice that of Sumatra, the western margin of Indonesia. Most of the seismicity beneath the island of New Guinea is clustered near the Huon Peninsula, the Mamberamo region, and the Bird's Neck. At 04:41 local time (GMT+9) on July 28th, 2015, a large earthquake of Mw = 7.0 occurred on the West Mamberamo Fault System. The earthquake focal mechanisms are dominated by northwest-trending thrust mechanisms. A GMPE and ATC vulnerability curves were used to estimate the distribution of damage. The mean loss estimated for this earthquake is IDR 78.6 billion. We estimate that insured loss will be only a small portion of the total, largely due to deductibles.
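The loss-estimation workflow in this record (a GMPE for ground motion, an ATC-style vulnerability curve for damage, and an exposure portfolio) can be sketched as follows. The GMPE coefficients, vulnerability-curve parameters, and portfolio values are all made up for illustration and are not those used in the study:

```python
import math

def toy_gmpe_pga(mw, r_km):
    """Illustrative GMPE: ln(PGA[g]) = a + b*Mw - c*ln(r + 10).
    Coefficients are invented for illustration, not a published model."""
    return math.exp(-3.5 + 0.9 * mw - 1.6 * math.log(r_km + 10.0))

def mean_damage_ratio(pga_g):
    """Toy vulnerability curve mapping PGA (in g) to a 0-1 mean damage ratio."""
    return 1.0 / (1.0 + math.exp(-(math.log(max(pga_g, 1e-6)) + 1.5) / 0.4))

def estimate_loss(mw, exposures):
    """exposures: list of (distance_km, replacement_value). Returns total loss."""
    total = 0.0
    for r_km, value in exposures:
        total += value * mean_damage_ratio(toy_gmpe_pga(mw, r_km))
    return total

portfolio = [(15.0, 1.0e9), (40.0, 2.5e9), (90.0, 5.0e9)]  # values in IDR, illustrative
print(f"{estimate_loss(7.0, portfolio):.3e}")
```

The insured-loss point in the abstract would be a further step: applying deductibles and limits per exposure before summing, which removes most small damage ratios from the insured total.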

  11. Earthquake Safety: Activities for Children.

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Lessons on earthquake safety involving planning, preparation, and practice are presented in this booklet for teachers. Included are classroom activities designed for K-6 students, teaching notes, "Learning Links" that summarize interdisciplinary connections, a set of 15 masters for reproducing transparencies, handouts, and worksheets. Part 1 shows…

  12. Using Smartphones to Detect Earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2012-12-01

    We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation between sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record acceleration in real time. These records can be saved on the local phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer while mounted on controlled shake tables for a variety of tests. The results show that the accelerometer in the smartphone can reproduce the characteristics of the shaking very well, even when the phone is left freely on the shake table. The nature of these datasets is also quite different from traditional networks due to the fact that smartphones move around with their owners. Therefore, we must distinguish earthquake signals from signals generated by everyday use. In addition to the shake table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, driving, etc. An artificial neural network based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
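As a minimal sketch of the classification step, the snippet below trains a tiny logistic classifier (a stand-in for the paper's artificial neural network) on synthetic accelerometer windows. The waveforms, features, and parameters are all invented for illustration:

```python
import math
import random

random.seed(0)

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

def synth_window(kind, n=100):
    """Synthetic one-second accelerometer window (a toy stand-in for real records)."""
    if kind == "quake":  # slow, ramping oscillation plus sensor noise
        return [0.5 * math.sin(4 * math.pi * t / n) * (t / n) + random.gauss(0, 0.02)
                for t in range(n)]
    return [0.8 * abs(math.sin(16 * math.pi * t / n)) + random.gauss(0, 0.05)  # "walking"
            for t in range(n)]

def features(w):
    """Two simple features: variance and zero-crossing rate about the mean."""
    mean = sum(w) / len(w)
    var = sum((x - mean) ** 2 for x in w) / len(w)
    zc = sum(1 for a, b in zip(w, w[1:]) if (a - mean) * (b - mean) < 0)
    return [var, zc / len(w)]

# Train a tiny logistic classifier on 100 windows of each class.
data = ([(features(synth_window("quake")), 1) for _ in range(100)]
        + [(features(synth_window("walk")), 0) for _ in range(100)])
wgt, bias, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(300):
    for x, y in data:
        g = sigmoid(wgt[0] * x[0] + wgt[1] * x[1] + bias) - y  # log-loss gradient
        wgt = [wi - lr * g * xi for wi, xi in zip(wgt, x)]
        bias -= lr * g

accuracy = sum((sigmoid(wgt[0] * x[0] + wgt[1] * x[1] + bias) > 0.5) == bool(y)
               for x, y in data) / len(data)
print(accuracy)
```

The zero-crossing rate alone separates these two synthetic classes well; a real deployment would need richer features and held-out test data, as the 99.7% figure in the abstract was measured on recorded human activities, not synthetic ones.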

  13. Seismicity dynamics and earthquake predictability

    NASA Astrophysics Data System (ADS)

    Sobolev, G. A.

    2011-02-01

    Many factors complicate earthquake sequences, including the heterogeneity and self-similarity of the geological medium, the hierarchical structure of faults and stresses, and small-scale variations in the stresses from different sources. A seismic process is a type of nonlinear dissipative system demonstrating opposing trends towards order and chaos. Transitions from equilibrium to unstable equilibrium and local dynamic instability appear when there is an inflow of energy; reverse transitions appear when energy is dissipating. Several metastable areas of different scales exist in the seismically active region before an earthquake. Some earthquakes are preceded by precursory phenomena of different scales in space and time. These include long-term activation, seismic quiescence, foreshocks in the broad and narrow sense, hidden periodical vibrations, effects of the synchronization of seismic activity, and others. Such phenomena indicate that the dynamic system of the lithosphere is moving to a new state: catastrophe. A number of examples of medium-term and short-term precursors are shown in this paper. However, no precursors identified to date are clear and unambiguous: the percentage of missed targets and false alarms is high. Weak fluctuations from external and internal sources play a great role on the eve of an earthquake, and the occurrence time of the future event depends on the collective behavior of triggers. The main task is to improve the methods of metastable zone detection and probabilistic forecasting.

  14. Detecting Earthquakes--Part 2.

    ERIC Educational Resources Information Center

    Isenberg, C.; And Others

    1983-01-01

    Basic concepts associated with seismic wave propagation through the earth and the location of seismic events were explained in part 1 (appeared in January 1983 issue). This part focuses on the construction of a student seismometer for detecting earthquakes and underground nuclear explosions anywhere on the earth's surface. (Author/JN)

  15. On the Deterministic Description of Earthquakes

    NASA Astrophysics Data System (ADS)

    Bizzarri, Andrea

    2011-08-01

    The quantitative estimate of earthquake damage due to ground shaking is of pivotal importance in geosciences, and its knowledge should hopefully lead to the formulation of improved strategies for seismic hazard assessment. Numerical models of the processes occurring during seismogenic faulting represent a powerful tool to explore realistic scenarios that are often far from being fully reproduced in laboratory experiments because of intrinsic, technical limitations. In this paper we discuss the prominent role of the fault governing model, which describes the behavior of the fault traction during a dynamic slip failure and accounts for the different, and potentially competing, chemical and physical dissipative mechanisms. We show in a comprehensive sketch the large number of constitutive models adopted in dynamic modeling of seismic events, and we emphasize their prominent features, limitations, and specific advantages. In a quantitative comparison, we show through numerical simulations that spontaneous dynamic ruptures obeying the idealized, linear slip-weakening (SW) equation and a more elaborate rate- and state-dependent friction law produce very similar results (in terms of rupture times, peak slip velocity, developed slip, and stress drops), provided that the frictional parameters are adequately comparable and, more importantly, that the fracture energy density is the same. Our numerical experiments also illustrate that the different models predict fault slip velocity time histories characterized by a similar frequency content; a feeble predominance of high frequencies in the SW case emerges in the frequency ranges [0.3, 1] and [11, 50] Hz. These simulations clearly indicate that, even setting aside the frequency band limitation, it would be very difficult (virtually impossible) to discriminate between two different, but energetically identical, constitutive models on the basis of the seismograms recorded after a natural earthquake.
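The linear slip-weakening law and the fracture energy density on which the comparison in this record hinges can be written down directly. The parameter values below (static and dynamic traction, characteristic slip) are illustrative assumptions, not values from the paper:

```python
def sw_traction(slip, tau_s=80e6, tau_d=60e6, d_c=0.4):
    """Linear slip-weakening law: traction falls linearly from the static level
    tau_s to the dynamic level tau_d over the characteristic slip d_c.
    (Values in Pa and m are illustrative, not from the paper.)"""
    if slip >= d_c:
        return tau_d
    return tau_s - (tau_s - tau_d) * slip / d_c

def fracture_energy(tau_s=80e6, tau_d=60e6, d_c=0.4):
    """Fracture energy density G = 0.5 * (tau_s - tau_d) * d_c (J/m^2)
    for the linear slip-weakening law: the area under the weakening curve
    above the dynamic traction level."""
    return 0.5 * (tau_s - tau_d) * d_c

print(sw_traction(0.2), fracture_energy())
```

The paper's point is that a rate- and state-dependent law tuned so that this G matches produces nearly indistinguishable seismograms, so matching the fracture energy is the operative constraint when comparing constitutive models.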

  16. Earthquake precursors: activation or quiescence?

    NASA Astrophysics Data System (ADS)

    Rundle, John B.; Holliday, James R.; Yoder, Mark; Sachs, Michael K.; Donnellan, Andrea; Turcotte, Donald L.; Tiampo, Kristy F.; Klein, William; Kellogg, Louise H.

    2011-10-01

    We discuss the long-standing question of whether the probability for large earthquake occurrence (magnitudes m > 6.0) is highest during time periods of smaller event activation, or highest during time periods of smaller event quiescence. The physics of the activation model is based on an idea from the theory of nucleation, that a small magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model is based on the idea that the occurrence of smaller earthquakes (here considered as magnitudes m > 3.5) may be due to a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, either via aftershock nucleation or triggering, but expresses no choice as to which mechanism is preferred. Both of these models are in fact a means of filtering the seismicity time-series to compute probabilities. Using 25 yr of data from the California-Nevada catalogue of earthquakes, we show that of the two models, activation and quiescence, the latter appears to be the better model, as judged by backtesting (by a slight but not significant margin). We then examine simulation data from a topologically realistic earthquake model for California seismicity, Virtual California. This model includes not only earthquakes produced from increases in stress on the fault system, but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion. Here, the activation forecast model is preferred to the quiescence model, presumably due to the fact that the BASS component of the model is essentially a model for activated seismicity. 
These

  17. Predecessors of the giant 1960 Chile earthquake.

    PubMed

    Cisternas, Marco; Atwater, Brian F; Torrejón, Fernando; Sawai, Yuki; Machuca, Gonzalo; Lagos, Marcelo; Eipert, Annaliese; Youlton, Cristián; Salgado, Ignacio; Kamataki, Takanobu; Shishikura, Masanobu; Rajendran, C P; Malik, Javed K; Rizal, Yan; Husni, Muhammad

    2005-09-15

    It is commonly thought that the longer the time since the last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended. PMID:16163355

  18. Predecessors of the giant 1960 Chile earthquake

    USGS Publications Warehouse

    Cisternas, M.; Atwater, B.F.; Torrejon, F.; Sawai, Y.; Machuca, G.; Lagos, M.; Eipert, A.; Youlton, C.; Salgado, I.; Kamataki, T.; Shishikura, M.; Rajendran, C.P.; Malik, J.K.; Rizal, Y.; Husni, M.

    2005-01-01

    It is commonly thought that the longer the time since the last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended. © 2005 Nature Publishing Group.

  19. Discrimination of Earthquake Precursors using Radio-Tomography of the Ionosphere

    NASA Astrophysics Data System (ADS)

    Rekenthaler, Douglas; Currie, Douglas; Kunitsyn, Vyacheslav; Gribkov, Dmitrii; Andreeva, Elena; Nesterov, Ivan

    2014-05-01

    This program addresses lithospheric-ionospheric coupling during strong earthquakes (EQs). We discuss both the ionospheric implications of EQs and the ionospheric precursors to EQs. The data are analyzed using the methods of satellite radio tomography (RT). Signals from both low-orbiting beacons ("LORT": Transit, Parus, Tsikada, etc.) and high-orbiting global navigation satellite systems ("HORT": the GNSS satellites GPS, GLONASS, BeiDou, ...) are used for tomographic reconstructions. Our resulting 2D and 3D tomographic images and their time flow (4D RT) allow us to map spatio-temporal changes due to ionospheric perturbations induced by EQs and EQ precursors. Low-orbital RT (LORT) provides near-"instantaneous" mappings, with a time span of 5-8 minutes, and 2D graphics of the electron density over the seismically active region of interest. LORT supports 2D imaging of various anomalies, including wave structures such as ionospheric manifestations of acoustic-gravity waves (AGW), wave-like disturbances, and solitary waves, with the gaps between images depending on the number of operating satellites (currently, 30-100 minutes). High-orbital RT (HORT) provides imaging of 4D distributions of ionospheric plasma (resulting in 3D snapshots every 20-30 minutes). Using this approach, one can reconstruct RT images of ionospheric irregularities, wave structures, and perturbations such as solitary waves. In regions with a sufficient number of GNSS receivers (California, Japan), 4D RT images can be generated every 2-4 minutes. The spatial resolution of the LORT and HORT systems is on the order of 20-40 km and 100 km, respectively. We present the results of a long-term study using HORT and LORT techniques for study of the ionosphere over California, Alaska, and Southeast Asia (the Taiwan region). In particular, we established a ground station array extending from Washington to California, which we operated from 2011 to 2013 on a 24/7 basis. 
Reconstructions of the ionosphere

  20. Earthquakes and the urban environment. Volume III

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 3 contains chapters on seismic planning, social aspects and future prospects.