Science.gov

Sample records for operating basis earthquake

  1. Study on Cumulative Absolute Velocity as Operating Basis Earthquake Criteria to Nuclear Power Plants in Korea

    NASA Astrophysics Data System (ADS)

    Park, D.; Yun, K.; Chang, C.; Cho, S.; Choi, W.

    2013-12-01

    In recognition of the need to develop a new criterion for determining when the operating basis earthquake (OBE) has been exceeded at nuclear power plants in Korea, cumulative absolute velocity (CAV) is introduced in this paper. The CAV OBE threshold was determined as the minimum CAV value for modified Mercalli intensity (MMI) VII, based on the relation between CAV and seismic intensity. MMI VII can be defined as the ground-motion level that could cause minor damage to a well-designed structure. Therefore, if this minimum CAV value is used as the threshold of the OBE exceedance criteria, no damage is expected to the more rugged NPP structures, which are reinforced against earthquakes. Deriving the CAV OBE exceedance criteria requires generating a suite of simulated ground motions for a range of earthquake magnitudes and distances to the site. It also requires an instrumental MMI based on the Fourier acceleration spectrum (FAS MMI), because there are no strong ground-motion records or felt-intensity data from damaging earthquakes in Korea. The empirical Green's function method and the stochastic ground-motion simulation method are used to simulate ground motion. Based on the relation between the CAV values for a specific NPP site and the corresponding instrumental intensities (FAS MMI), the CAV OBE value was calculated as 0.16 g·sec. However, since this result is based entirely on simulation, a margin must still be applied to the CAV threshold to account for the characteristics of real strong ground-motion records. In future work, data on the limited earthquake damage reported in Korea will be used to validate the simulated CAV values.
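
    As a concrete illustration of the metric itself (not of the Korean OBE procedure), a minimal sketch of a plain CAV computation follows; the synthetic record, sample rate, and the use of unwindowed CAV rather than EPRI's standardized (0.025 g windowed) variant are all assumptions.

```python
import numpy as np

def cumulative_absolute_velocity(accel_g, dt):
    """Plain CAV: the time integral of |a(t)| over the whole record.

    accel_g : acceleration samples in units of g
    dt      : sample interval in seconds
    Returns CAV in g*sec, the unit of the 0.16 g*sec threshold above.
    Note: EPRI's *standardized* CAV instead sums only those 1-second
    windows whose peak exceeds 0.025 g; that refinement is omitted here.
    """
    return float(np.sum(np.abs(accel_g)) * dt)

# Synthetic decaying-sinusoid accelerogram, checked against the threshold
dt = 0.01
t = np.arange(0.0, 20.0, dt)
accel = 0.05 * np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 2.0 * t)
cav = cumulative_absolute_velocity(accel, dt)
print(f"CAV = {cav:.3f} g*sec; OBE exceeded: {cav > 0.16}")
```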

  2. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  3. [Autism after an earthquake: the experience of L'Aquila (Central Italy) as a basis for an operative guideline].

    PubMed

    Valenti, Marco; Di Giovanni, Chiara; Mariano, Melania; Pino, Maria Chiara; Sconci, Vittorio; Mazza, Monica

    2016-01-01

    People with autism, their families, and their specialised caregivers are a social group at high health risk after a disruptive earthquake. They need emergency assistance and immediate structured support according to definite protocols and quality standards. We recommend establishing national guidelines for the care of people with autism after an earthquake. The adaptive behaviour of participants with autism declined dramatically in the first months after the earthquake in all the dimensions examined (i.e., communication, daily living, socialisation, and motor skills). After relatively stable conditions returned, and with immediate and intensive post-disaster intervention, children and adolescents with autism showed a trend towards partial recovery of adaptive functioning. As to the impact on services, this study indicates the need to support exposed caregivers, who are at high risk of burnout over the first two years after the disaster, and to reorganise person-tailored services immediately. PMID:27291209

  4. Earthquake!

    ERIC Educational Resources Information Center

    Markle, Sandra

    1987-01-01

    A learning unit about earthquakes includes activities for primary grade students, including making inferences and defining operationally. Task cards are included for independent study on earthquake maps and earthquake measuring. (CB)

  5. The potential uses of operational earthquake forecasting

    USGS Publications Warehouse

    Field, Ned; Jordan, Thomas; Jones, Lucille; Michael, Andrew; Blanpied, Michael L.

    2016-01-01

    This article reports on a workshop held to explore the potential uses of operational earthquake forecasting (OEF). We discuss the current status of OEF in the United States and elsewhere, the types of products that could be generated, the various potential users and uses of OEF, and the need for carefully crafted communication protocols. Although operationalization challenges remain, there was clear consensus among the stakeholders at the workshop that OEF could be useful.

  6. Linking earthquakes and hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2013-01-01

    Hydraulic fracturing, also known as fracking, to extract oil and gas from rock, has been a controversial but increasingly common practice; some studies have linked it to groundwater contamination and induced earthquakes. Scientists discussed several studies on the connection between fracking and earthquakes at the AGU Fall Meeting in San Francisco in December.

  7. The Bender-Dunne basis operators as Hilbert space operators

    SciTech Connect

    Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph

    2014-02-15

    The Bender-Dunne basis operators, $T_{-m,n} = 2^{-n}\sum_{k=0}^{n} \binom{n}{k}\, q^{k} p^{-m} q^{n-k}$, where q and p are the position and momentum operators, respectively, are formal integral operators in the position representation on the entire real line $\mathbb{R}$ for positive integers n and m. We show, by explicit construction of a dense domain, that the operators $T_{-m,n}$ are densely defined operators in the Hilbert space $L^{2}(\mathbb{R})$.

  8. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  9. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.
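
    The "smoothed broadened" step can be sketched as below, assuming the common practice of enveloping each spectral ordinate over a ±15% frequency window (NRC Regulatory Guide 1.122-style broadening); the report's exact smoothing rules and the UHS ordinates shown are not taken from it.

```python
import numpy as np

def broaden_spectrum(freqs, sa, ratio=0.15):
    """Envelope each spectral ordinate over a +/-15% frequency window.

    Mimics a generic 'smoothed, broadened' design-spectrum step; the
    actual INEEL smoothing rules may differ (assumption).
    """
    sa_broadened = np.empty_like(sa)
    for i, f in enumerate(freqs):
        window = (freqs >= f * (1.0 - ratio)) & (freqs <= f * (1.0 + ratio))
        sa_broadened[i] = sa[window].max()
    return sa_broadened

# Hypothetical UHS ordinates (frequency in Hz, spectral acceleration in g)
freqs = np.array([0.5, 1.0, 2.0, 3.3, 5.0, 10.0, 20.0, 33.0])
sa    = np.array([0.08, 0.15, 0.28, 0.35, 0.30, 0.22, 0.15, 0.12])
print(broaden_spectrum(freqs, sa))
```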

  10. Minimization of Basis Risk in Parametric Earthquake Cat Bonds

    NASA Astrophysics Data System (ADS)

    Franco, G.

    2009-12-01

    A catastrophe (cat) bond is an instrument used by insurance and reinsurance companies, by governments, or by groups of nations to cede catastrophic risk to the financial markets, which are capable of supplying cover for highly destructive events that surpass the typical capacity of traditional reinsurance contracts. Parametric cat bonds, a specific type of cat bond, use trigger mechanisms or indices that depend on physical event parameters published by respected third parties to determine whether part or all of the bond principal is to be paid for a given event. First-generation cat bonds, or cat-in-a-box bonds, use a trigger mechanism consisting of a set of geographic zones in which certain conditions must be met by an earthquake's magnitude and depth in order to trigger payment of the bond principal. Second-generation cat bonds use an index formulation that typically consists of a sum of products of a set of weights with a polynomial function of the ground-motion variables reported by a geographically distributed seismic network. These instruments are especially appealing to developing countries with incipient insurance industries wishing to cede catastrophic losses to the financial markets, because the payment trigger mechanism is transparent and does not involve the parties ceding or accepting the risk, significantly reducing moral hazard. In order to be successful in the market, however, parametric cat bonds have typically been required to specify relatively simple trigger conditions. The consequence of such simplifications is increased basis risk. This risk represents the possibility that the trigger mechanism fails to accurately capture the actual losses of a catastrophic event: that it does not trigger for a highly destructive event or, vice versa, that a payment of the bond principal is caused by an event that produced insignificant losses. The first case disfavors the sponsor who was seeking cover for its losses while the
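
    A second-generation index of the kind described, a sum over stations of weight times a polynomial of reported ground motion, can be sketched as follows; the station values, weights, polynomial coefficients, and attachment/exhaustion levels are all invented for illustration.

```python
import numpy as np

def parametric_index(pga, weights, coeffs=(0.0, 1.0, 2.0)):
    """Second-generation cat-bond index: sum over stations of
    weight * polynomial(ground motion). The polynomial coefficients
    and weights are placeholders, not any real bond's terms."""
    poly = sum(c * pga**k for k, c in enumerate(coeffs))
    return float(np.dot(weights, poly))

pga = np.array([0.12, 0.30, 0.05])    # reported PGA (g) at three stations
weights = np.array([0.2, 0.5, 0.3])   # zone weights fixed in the bond terms
index = parametric_index(pga, weights)

attachment, exhaustion = 0.5, 1.5     # hypothetical index trigger levels
payout_frac = np.clip((index - attachment) / (exhaustion - attachment), 0.0, 1.0)
print(f"index = {index:.3f}, principal paid = {payout_frac:.0%}")
```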

  11. Retrospective tests of hybrid operational earthquake forecasting models for Canterbury

    NASA Astrophysics Data System (ADS)

    Rhoades, D. A.; Liukis, M.; Christophersen, A.; Gerstenberger, M. C.

    2016-01-01

    The Canterbury, New Zealand, earthquake sequence, which began in September 2010, occurred in a region of low crustal deformation and previously low seismicity. Because the ensuing seismicity in the region is likely to remain above previous levels for many years, a hybrid operational earthquake forecasting model for Canterbury was developed to inform decisions on building standards and urban planning for the rebuilding of Christchurch. The model estimates occurrence probabilities for magnitudes M ≥ 5.0 in the Canterbury region for each of the next 50 yr. It combines two short-term, two medium-term and four long-term forecasting models. The weight accorded to each individual model in the operational hybrid was determined by an expert elicitation process. A retrospective test of the operational hybrid model and of an earlier, informally developed hybrid model over the whole New Zealand region has been carried out. The individual and hybrid models were installed in the New Zealand Earthquake Forecast Testing Centre and used to make retrospective annual forecasts of earthquakes with magnitude M > 4.95 from 1986 on, for time-lags up to 25 yr. All models underpredict the number of earthquakes because of an abnormally large number of earthquakes in the testing period since 2008 compared with the learning period. However, the operational hybrid model is more informative than any of the individual time-varying models for nearly all time-lags. Its information gain relative to a reference model of least information decreases as the time-lag increases, becoming zero at a time-lag of about 20 yr. An optimal hybrid model with the same mathematical form as the operational hybrid model was computed for each time-lag from the 26-yr test period. The time-varying component of the optimal hybrid is dominated by the medium-term models for time-lags up to 12 yr and has hardly any impact on the optimal hybrid model for greater time-lags. The optimal hybrid model is considerably more
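
    The hybrid construction reduces to a weighted mixture of the component models' forecast rates. A minimal sketch, with invented elicitation weights and per-cell rates (the actual model classes and weights are in the paper):

```python
import numpy as np

# Hypothetical expert-elicitation weights for 8 component models
# (2 short-term, 2 medium-term, 4 long-term); they sum to 1.0.
weights = {"ST1": 0.1, "ST2": 0.1, "MT1": 0.15, "MT2": 0.15,
           "LT1": 0.125, "LT2": 0.125, "LT3": 0.125, "LT4": 0.125}

# Invented forecast rates (expected M>=5.0 events/yr in one cell)
rates = {"ST1": 0.8, "ST2": 0.6, "MT1": 0.4, "MT2": 0.5,
         "LT1": 0.1, "LT2": 0.12, "LT3": 0.09, "LT4": 0.11}

hybrid_rate = sum(weights[m] * rates[m] for m in weights)
# Poisson probability of at least one M>=5.0 event in the next year
prob = 1.0 - np.exp(-hybrid_rate)
print(f"hybrid rate = {hybrid_rate:.3f}/yr, P(>=1 event) = {prob:.1%}")
```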

  12. Operational earthquake forecasting in the South Iceland Seismic Zone: improving the earthquake catalogue

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Vogfjörd, Kristin; Zechar, J. Douglas; Eberhard, David

    2014-05-01

    A major earthquake sequence is ongoing in the South Iceland Seismic Zone (SISZ), where experts expect earthquakes of up to MW = 7.1 in the coming years to decades. The historical seismicity in this region is well known, and many major faults here and on the Reykjanes Peninsula (RP) have already been mapped. The faults are predominantly N-S with right-lateral strike-slip motion, while the overall motion in the SISZ is E-W oriented left-lateral motion. The area that we propose for operational earthquake forecasting (OEF) contains both the SISZ and the RP. The earthquake catalogue considered for OEF, called the SIL catalogue, spans the period from 1991 until September 2013 and contains more than 200,000 earthquakes. Some of these events have a large azimuthal gap between stations, and some have large horizontal and vertical uncertainties. We are interested in building seismicity models using high-quality data, so we filter the catalogue using the criteria proposed by Gomberg et al. (1990) and Bondar et al. (2004). The resulting filtered catalogue contains around 130,000 earthquakes. Magnitude estimates in the Iceland catalogue also require special attention. The SIL system uses two methods to estimate magnitude. The first is based on an empirical local magnitude (ML) relationship. The other is a so-called "local moment magnitude" (MLW), originally constructed by Slunga et al. (1984) to agree with local magnitude scales in Sweden. There are two main problems with the magnitude estimates in the SIL catalogue, and consequently it is not immediately possible to convert MLW to moment magnitude (MW): (i) immediate aftershocks of large events are assigned magnitudes that are too high; and (ii) the seismic moment of large earthquakes is underestimated. For this reason the magnitude values in the catalogue must be corrected before developing an OEF system. To obtain a reliable MW estimate, we calibrate a magnitude relationship based on
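
    The quality-filtering step might look like the sketch below; the column names and the cut-off values are illustrative stand-ins, not the actual Gomberg et al. (1990) or Bondar et al. (2004) criteria.

```python
import pandas as pd

# Hypothetical catalogue columns; thresholds are illustrative stand-ins
# for location-quality criteria of the Gomberg/Bondar type.
catalogue = pd.DataFrame({
    "mlw":        [1.2, 3.4, 2.1, 4.0],     # local moment magnitude
    "az_gap_deg": [95.0, 210.0, 140.0, 80.0],  # azimuthal station gap
    "herr_km":    [0.5, 3.2, 1.1, 0.4],     # horizontal uncertainty
    "verr_km":    [1.0, 6.5, 2.0, 0.9],     # vertical uncertainty
})

filtered = catalogue[(catalogue.az_gap_deg <= 180.0)
                     & (catalogue.herr_km <= 2.0)
                     & (catalogue.verr_km <= 4.0)]
print(f"kept {len(filtered)} of {len(catalogue)} events")
```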

  13. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and
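
    The "low-probability environment" point is easy to make concrete. A toy calculation with illustrative numbers (not from any specific forecast):

```python
# Illustrative numbers only: a weekly background probability of a large
# earthquake of 1e-4, amplified by a nominal probability gain of 100
# during an active sequence, still yields only a ~1% weekly probability.
background_weekly_prob = 1e-4
gain = 100
short_term_prob = background_weekly_prob * gain
print(f"short-term weekly probability: {short_term_prob:.1%}")  # 1.0%
```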

  14. The Establishment of an Operational Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Lombardi, Anna Maria; Casarotti, Emanuele

    2014-05-01

    Just after the Mw 6.2 earthquake that hit L'Aquila on April 6, 2009, the Italian Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF) that paved the way for the development of Operational Earthquake Forecasting (OEF), defined as the "procedures for gathering and disseminating authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes". In this paper we introduce the first official OEF system in Italy, developed by the newly established Centro di Pericolosità Sismica at the Istituto Nazionale di Geofisica e Vulcanologia. The system provides a daily update of the weekly probabilities of ground shaking over the whole Italian territory. In this presentation, we describe in detail the philosophy behind the system, the scientific details, and the output format, which has been preliminarily defined in agreement with Civil Protection. To our knowledge, this is the first operational system that fully satisfies the ICEF guidelines. Probably the most sensitive issue is the communication of this kind of message to the population. Acknowledging this inherent difficulty, and in agreement with Civil Protection, we are planning pilot tests to be carried out in a few selected areas in Italy; the purpose of these tests is to check the effectiveness of the message and to receive feedback.

  15. FB Line Basis for Interim Operation

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The safety analysis of the FB-Line Facility indicates that operation of FB-Line to support the current mission does not present undue risk to facility and co-located workers, the general public, or the environment.

  16. Design basis for the NRC Operations Center

    SciTech Connect

    Lindell, M.K.; Wise, J.A.; Griffin, B.N.; Desrosiers, A.E.; Meitzler, W.D.

    1983-05-01

    This report documents the development of a design for a new NRC Operations Center (NRCOC). The project was conducted in two phases: organizational analysis and facility design. In order to control the amount of traffic, congestion and noise within the facility, it is recommended that information flow in the new NRCOC be accomplished by means of an electronic Status Information Management System. Functional requirements and a conceptual design for this system are described. An idealized architectural design and a detailed design program are presented that provide the appropriate amount of space for operations, equipment and circulation within team areas. The overall layout provides controlled access to the facility and, through the use of a zoning concept, provides each team within the NRCOC the appropriate balance of ready access and privacy determined from the organizational analyses conducted during the initial phase of the project.

  17. Geological and seismological survey for new design-basis earthquake ground motion of Kashiwazaki-Kariwa NPS

    NASA Astrophysics Data System (ADS)

    Takao, M.; Mizutani, H.

    2009-05-01

    At about 10:13 on July 16, 2007, a strong earthquake named the 'Niigata-ken Chuetsu-oki Earthquake', of Mj6.8 on the Japan Meteorological Agency's scale, occurred offshore of Niigata prefecture in Japan. Nevertheless, all of the nuclear reactors at Kashiwazaki-Kariwa Nuclear Power Station (KKNPS) in Niigata prefecture, operated by Tokyo Electric Power Company, shut down safely. In other words, the automatic safety functions of shutdown, cooling, and containment worked as designed immediately after the earthquake. During the earthquake, the peak acceleration of the ground motion exceeded the design-basis ground motion (DBGM), but the force applied to safety-significant facilities by the earthquake was about the same as or less than the static seismic force taken into account in the design basis. In order to reassess the safety of the nuclear power plants, we have evaluated a new DBGM after conducting geomorphological, geological, geophysical, and seismological surveys and analyses. [Geomorphological, geological and geophysical survey] On land, aerial photograph interpretation was performed within at least a 30 km radius to extract landforms that could possibly be tectonic reliefs. Geological reconnaissance was then conducted to confirm whether the extracted landforms are tectonic reliefs. In particular, we carefully investigated the Nagaoka Plain Western Boundary Fault Zone (NPWBFZ), which consists of the Kakuda-Yahiko, Kihinomiya, and Katakai faults, because NPWBFZ is one of the active faults in Japan with Mj8-class potential. In addition to the geological survey, seismic reflection prospecting of approximately 120 km in total length was completed to evaluate the geological structure of the faults and to assess the continuity of the component faults of NPWBFZ. As a result of the geomorphological, geological, and geophysical surveys, we evaluated that the three component faults of NPWBFZ are independent of each other from the

  18. Solid waste retrieval. Phase 1, Operational basis

    SciTech Connect

    Johnson, D.M.

    1994-09-30

    This document describes the operational requirements, procedures, and options for retrieving the waste containers placed in buried storage in Burial Ground 218W-4C, Trench 04, as TRU waste or suspect TRU waste under the activity levels defining this waste that were in effect at the time of placement. Trench 04 in Burial Ground 218W-4C is dedicated entirely to storage of retrievable TRU waste containers or retrievable suspect TRU waste containers and has not been used for any other purpose.

  19. Lessons of L'Aquila for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  20. Operational real-time GPS-enhanced earthquake early warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.; Johanson, I. A.; Allen, R. M.

    2014-10-01

    Moment magnitudes for large earthquakes (Mw≥7.0) derived in real time from near-field seismic data can be underestimated due to instrument limitations, ground tilting, and saturation of frequency/amplitude-magnitude relationships. Real-time high-rate GPS resolves the buildup of static surface displacements with the S wave arrival (assuming nonsupershear rupture), thus enabling the estimation of slip on a finite fault and the event's geodetic moment. Recently, a range of high-rate GPS strategies have been demonstrated on off-line data. Here we present the first operational system for real-time GPS-enhanced earthquake early warning as implemented at the Berkeley Seismological Laboratory (BSL) and currently analyzing real-time data for Northern California. The BSL generates real-time position estimates operationally using data from 62 GPS stations in Northern California. A fully triangulated network defines 170+ station pairs processed with the software trackRT. The BSL uses G-larmS, the Geodetic Alarm System, to analyze these positioning time series and determine static offsets and preevent quality parameters. G-larmS derives and broadcasts finite fault and magnitude information through least-squares inversion of the static offsets for slip based on a priori fault orientation and location information. This system tightly integrates seismic alarm systems (CISN-ShakeAlert, ElarmS-2) as it uses their P wave detections to trigger its processing; quality control runs continuously. We use a synthetic Hayward Fault earthquake scenario on real-time streams to demonstrate recovery of slip and magnitude. Reanalysis of the Mw7.2 El Mayor-Cucapah earthquake tests the impact of dynamic motions on offset estimation. Using these test cases, we explore sensitivities to disturbances of a priori constraints (origin time, location, and fault strike/dip).
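
    The core G-larmS step, a least-squares inversion of static offsets for slip on an a priori fault, can be sketched as follows; the Green's functions, offsets, rigidity, and patch geometry are invented, whereas a real system would compute G from the fault model (e.g., with Okada-type elastic solutions).

```python
import numpy as np

# Made-up elastic Green's functions mapping slip on 3 fault patches to
# static offsets at 4 stations (east/north components stacked: 8 rows).
G = np.array([[0.8, 0.2, 0.05], [0.3, 0.6, 0.1], [0.1, 0.5, 0.4],
              [0.05, 0.2, 0.7], [0.4, 0.3, 0.1], [0.1, 0.3, 0.5],
              [0.2, 0.4, 0.3], [0.05, 0.1, 0.6]])
d = np.array([0.45, 0.32, 0.30, 0.25, 0.28, 0.22, 0.27, 0.20])  # offsets (m)

slip, *_ = np.linalg.lstsq(G, d, rcond=None)     # least-squares slip (m)

mu, area = 3.0e10, 15e3 * 10e3                   # rigidity (Pa), patch area (m^2)
M0 = mu * area * np.sum(np.clip(slip, 0, None))  # scalar moment (N*m)
Mw = (2.0 / 3.0) * (np.log10(M0) - 9.05)         # moment magnitude
print(f"slip = {np.round(slip, 2)} m, Mw = {Mw:.2f}")
```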

  1. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a ...

  2. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  3. Is there a basis for preferring characteristic earthquakes over a Gutenberg-Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg-Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg-Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.

  4. Is there a basis for preferring characteristic earthquakes over a Gutenberg–Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, Thomas E.; Geist, Eric L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
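
    The Gutenberg-Richter alternative amounts to extrapolating a zone's magnitude-frequency law to large magnitudes. A minimal sketch with invented a- and b-values:

```python
import numpy as np

# Gutenberg-Richter: log10 N(>=M) = a - b*M, with N in events/yr.
# The a- and b-values below are invented for a hypothetical fault zone.
a, b = 4.0, 1.0

def annual_rate(m):
    return 10.0 ** (a - b * m)

for m in (6.0, 6.5, 7.0):
    rate = annual_rate(m)
    # Poisson probability of at least one such event in 30 years
    p30 = 1.0 - np.exp(-rate * 30.0)
    print(f"M>={m}: rate={rate:.4f}/yr, recurrence={1/rate:.0f} yr, "
          f"P(30 yr)={p30:.1%}")
```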

  5. Scientific and non-scientific challenges for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2015-12-01

    Tracking the time evolution of seismic hazard in time windows shorter than the usual 50 years of long-term hazard models may offer additional opportunities to reduce seismic risk. This is the target of operational earthquake forecasting (OEF). During the development of OEF in Italy we identified several challenges, ranging from pure science to the more practical interface of science with society. From a scientific point of view, although earthquake clustering is the clearest empirical evidence about earthquake occurrence, and OEF clustering models are the most (successfully) tested hazard models in seismology, some seismologists are still reluctant to accept their scientific reliability. After exploring the motivations behind these scientific doubts, we also look into an issue that is often overlooked in this discussion: in any kind of hazard analysis, we do not use a model because it is the true one, but because it is better than anything else we can think of. The non-scientific aspects are mostly related to the fact that OEF usually provides weekly probabilities of large earthquakes smaller than 1%. These probabilities are considered by some seismologists too small to be of interest or use. However, in a recent collaboration with engineers we show that such earthquake probabilities may lead to an intolerable individual risk of death. Interestingly, this debate calls for a better definition of the still-fuzzy boundaries among the different kinds of expertise required for the whole risk mitigation process. The last, and probably most pressing, challenge is the communication to the public. In fact, a wrong message could be useless or even counterproductive. Here we show some progress that we have made in this field, working with communication experts in Italy.

  6. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  7. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  8. Circuit breaker operation and potential failure modes during an earthquake

    SciTech Connect

    Lambert, H.E.; Budnitz, R.J.

    1987-01-01

    This study addresses the effect of a strong-motion earthquake on circuit breaker operation. It focuses on the loss of offsite power (LOSP) transient caused by a strong-motion earthquake at the Zion Nuclear Power Plant. Numerous circuit breakers important to plant safety, such as circuit breakers to diesel generators and engineered safety systems (ESS), must open and/or close during this transient while strong motion is occurring. Potential seismically induced circuit-breaker failure modes were uncovered during the study. These failure modes include: circuit breaker fails to close; circuit breaker trips inadvertently; circuit breaker fails to reclose after trip. Their causes include relay chatter that trips the circuit breaker, relay chatter that causes anti-pumping relays to seal in and thereby prevents automatic closure of circuit breakers, and load sequencer failures. This paper also describes the operator action necessary to prevent core melt if these circuit breaker failure modes occur simultaneously on three 4.16 kV buses. The incorporation of these failure modes, as well as other instrumentation and control failures, into a limited-scope seismic probabilistic risk assessment is also discussed in this paper.

  9. Ground motion following selection of SRS design basis earthquake and associated deterministic approach. Final report: Revision 1

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  10. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  11. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential: the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents.
    [Figure: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate locally elevated hazard.]

  12. 1/f and the Earthquake Problem: Scaling constraints to facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Rundle, J. B.; Glasscoe, M. T.

    2013-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or '1/f', nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this '1/f problem,' it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area), in combination with a metric to quantify rate trends in local seismicity, to the local earthquake magnitude potential: the magnitudes of earthquakes the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents.
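
    One building block of this framework, the Gutenberg-Richter side, can be sketched with Aki's maximum-likelihood b-value estimator; the synthetic catalogue, completeness magnitude, and the crude "magnitude potential" read-off are assumptions for illustration.

```python
import numpy as np

# Aki (1965) maximum-likelihood b-value for a catalogue complete above mc.
# Synthetic catalogue: GR-distributed magnitudes above mc with b = 1.
rng = np.random.default_rng(0)
mc, b_true = 2.0, 1.0
mags = mc + rng.exponential(scale=np.log10(np.e) / b_true, size=5000)

b_hat = np.log10(np.e) / (mags.mean() - mc)  # add Utsu's dm/2 term if binned
a_hat = np.log10(len(mags)) + b_hat * mc     # a-value for this sample

# Crude "magnitude potential": magnitude at which the expected count is 1
m_potential = a_hat / b_hat
print(f"b = {b_hat:.2f}, a = {a_hat:.2f}, M(N=1) ~ {m_potential:.1f}")
```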

  13. Post Test Analysis of a PCCV Model Dynamically Tested Under Simulated Design-Basis Earthquakes

    SciTech Connect

    Cherry, J.; Chokshi, N.; James, R.J.; Rashid, Y.R.; Tsurumaki, S.; Zhang, L.

    1998-11-09

    In a collaborative program between the United States Nuclear Regulatory Commission (USNRC) and the Nuclear Power Engineering Corporation (NUPEC) of Japan, under sponsorship of the Ministry of International Trade and Industry, the seismic behavior of Prestressed Concrete Containment Vessels (PCCV) is being investigated. A 1:10 scale PCCV model has been constructed by NUPEC and subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory. A primary objective of the testing program is to demonstrate the capability of the PCCV to withstand design basis earthquakes with a significant safety margin against major damage or failure. As part of the collaborative program, Sandia National Laboratories (SNL) is conducting research in state-of-the-art analytical methods for predicting the seismic behavior of PCCV structures, with the eventual goal of understanding, validating, and improving calculations related to containment structure performance under design and severe seismic events. With the increased emphasis on risk-informed regulation, more accurate characterization (less uncertainty) of containment structural and functional integrity is desirable. This paper presents results of post-test calculations conducted at ANATECH to simulate the design-level scale-model tests.

  14. Earthquake Response Modeling for a Parked and Operating Megawatt-Scale Wind Turbine

    SciTech Connect

    Prowell, I.; Elgamal, A.; Romanowitz, H.; Duggan, J. E.; Jonkman, J.

    2010-10-01

    Demand parameters for turbines, such as tower moment demand, are primarily driven by wind excitation and the dynamics associated with operation. For that purpose, computational simulation platforms have been developed, such as FAST, maintained by the National Renewable Energy Laboratory (NREL). For seismically active regions, building codes also require the consideration of earthquake loading. Historically, it has been common to use simple building-code approaches to estimate the structural demand from earthquake shaking as an independent loading scenario. Currently, International Electrotechnical Commission (IEC) design requirements include the consideration of earthquake shaking while the turbine is operating. Numerical and analytical tools used to consider earthquake loads for buildings and other static civil structures are not well suited for modeling simultaneous wind and earthquake excitation in conjunction with operational dynamics. Through the addition of seismic loading capabilities to FAST, it is possible to simulate earthquake shaking in the time domain, which allows consideration of nonlinear effects such as structural nonlinearities, aerodynamic hysteresis, control system influence, and transients. This paper presents a FAST model of a modern 900-kW wind turbine, which is calibrated based on field vibration measurements. With this calibrated model, both coupled and uncoupled simulations are conducted to examine the structural demand on the turbine tower. Response is compared under the conditions of normal operation and potential emergency shutdown due to earthquake-induced vibrations. The results highlight the availability of a numerical tool for conducting such studies and provide insights into the combined wind-earthquake loading mechanism.

  15. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  16. M6.0 South Napa Earthquake Forecasting on the basis of jet stream precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.

    2014-12-01

    Earthquake prediction research currently draws on crustal deformation, radon concentration, well water levels, animal behavior, very high frequency (VHF) signals, GPS/TEC ionospheric variations, and thermal infrared radiation (TIR) anomalies. Before major earthquakes (M > 6) occur, the jet stream over the epicentral area interrupts or its velocity flow lines cross; that is, atmospheric pressure at high altitude drops suddenly during a 6-12 hour interval before the earthquake (Wu & Tikhonov, 2014). This technique has been used to predict strong earthquakes in real time, with the predictions pre-registered on the website. Examples include the M6.0 Northern California earthquake of 2014/08/24 (figure 1) and the M6.6 Russia earthquake of 2013/10/12 (figure 2). For the 2014/08/24 earthquake in California, the front end of the 60-knot speed line was at San Francisco at 12:00 on 2014/06/16, and the M6.1 earthquake happened 69 days later. On 2014/07/16 we predicted a magnitude larger than 5.5, but with a validity window of only 30 days. The deviation of the predicted epicenter was about 70 km. A lithosphere-atmosphere-ionosphere (LAI) coupling model may explain this phenomenon: ionization of the air is produced by increased radon emanation at the epicenter; the water molecules in the air react with these ions and release heat; the heat raises the air temperature and is accompanied by large-scale changes in atmospheric pressure and jet stream morphology. We obtain satisfactory accuracy in estimating the epicenter location, and we can define a short alarm period; these are the positive aspects of our forecasts. However, jet-stream-based estimates of magnitude contain large uncertainty. Reference: H.C. Wu, I.N. Tikhonov, 2014, "Jet streams anomalies as possible short-term precursors of earthquakes with M>6.0", Research in Geophysics, DOI: http://dx.doi.org/10.4081/rg.2014.4939 http://www.pagepress.org/journals/index.php/rg/article/view/rg.2014.4939

  17. Earthquake Early Warning using a Seismogeodetic Approach: An operational plan for Cascadia

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Bodin, P.; Vidale, J. E.; Schmidt, D. A.; Melbourne, T. I.; Scrivner, C. W.; Santillan, V. M.; Szeliga, W. M.; Minson, S. E.; Bock, Y.; Melgar, D.

    2013-12-01

    We present an operational plan for implementing combined seismic and geodetic time series in an earthquake early warning system for Cascadia. The Cascadia subduction zone presents one of the greatest risks for a megaquake in the continental United States. Ascertaining the full magnitude and extent of large earthquakes is problematic for earthquake early warning systems due to instability when double-integrating strong-motion records to ground displacement. This problem can be mitigated by augmenting earthquake early warning systems with real-time GPS data, allowing the progression and spatial extent of large earthquakes to be better resolved thanks to GPS's ability to measure both dynamic and permanent displacements. The Pacific Northwest Seismic Network (PNSN) at the University of Washington is implementing an integrated seismogeodetic approach to earthquake early warning. Regional GPS data are provided by the Pacific Northwest Geodetic Array (PANGA) at Central Washington University. Precise Point Positioning (PPP) solutions are sent from PANGA to the PNSN as JSON-formatted streams and processed with a Python-based quality control (QC) module. The QC module also ingests accelerations from PNSN seismic stations through the Earthworm seismic acquisition and processing system, for the purpose of detecting outliers and Kalman filtering where collocated instruments exist. The QC module outputs time-aligned and cleaned displacement waveforms to ActiveMQ, an XML-based messaging broker that is currently used in seismic early warning architecture. Earthquake characterization modules read displacement information from ActiveMQ when triggered by warnings from the ElarmS earthquake early warning algorithm. Peak ground displacement and P-wave scaling relationships from Kalman-filtered waveforms provide initial magnitude estimates. Additional modules perform more complex source modeling, such as centroid moment tensors and slip inversions, that characterize the full size and
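
    The collocated GPS/accelerometer Kalman filter mentioned above can be sketched as a two-state (displacement, velocity) filter in which accelerometer samples drive the prediction and GPS displacement epochs supply the updates; the noise levels, rates, and synthetic signals below are invented.

```python
import numpy as np

dt = 0.01                                # 100 Hz accelerometer stream
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition for [disp, vel]
B = np.array([0.5 * dt**2, dt])          # how acceleration enters the state
H = np.array([[1.0, 0.0]])               # GPS observes displacement only
Q = 1e-6 * np.eye(2)                     # process noise covariance (assumed)
R = np.array([[1e-4]])                   # GPS variance, ~1 cm std (assumed)

x, P = np.zeros(2), np.eye(2)            # state [disp (m), vel (m/s)]
for i in range(200):
    accel = 0.1 if i < 100 else 0.0      # synthetic 1 s acceleration pulse
    # Predict from the accelerometer sample
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Update whenever a GPS displacement epoch arrives (one here, at t=1 s)
    if i == 99:
        z = 0.5 * 0.1 * ((i + 1) * dt) ** 2   # synthetic GPS observation
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P

print(f"filtered displacement after 2 s: {x[0]:.3f} m")  # ~0.15 m expected
```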

  18. PBO Southwest Region: Baja Earthquake Response and Network Operations

    NASA Astrophysics Data System (ADS)

    Walls, C. P.; Basset, A.; Mann, D.; Lawrence, S.; Jarvis, C.; Feaux, K.; Jackson, M. E.

    2011-12-01

    The SW region of the Plate Boundary Observatory consists of 455 continuously operating GPS stations located principally along the transform system of the San Andreas fault and the Eastern California Shear Zone. In the past year network uptime exceeded an average of 97%, with greater than 99% data acquisition. Communications range from CDMA modem (307), radio (92), Vsat (30), and DSL/T1/other (25) to manual downloads (1). Sixty-three stations stream 1 Hz data over the VRS3Net, typically with <0.5 second latency. Over 620 maintenance activities were performed during 316 onsite visits out of approximately 368 engineer field days. Within the past year there have been 7 incidents ranging from minor (attempted theft) to moderate vandalism (solar panel stolen), with one total loss of receiver and communications gear. Security was enhanced at these sites through fencing and more secure station configurations. In the past 12 months, 4 new stations were installed to replace removed stations or to augment the network at strategic locations. Following the M7.2 El Mayor-Cucapah earthquake, CGPS station P796, a deep-drilled braced monument, was constructed in San Luis, AZ, along the border, within 5 weeks of the event. In addition, UNAVCO participated in a successful University of Arizona-led RAPID proposal for the installation of six continuous GPS stations for post-seismic observations. Six stations are installed and telemetered through a UNAM relay at the Sierra San Pedro Martir. Four of these stations have Vaisala WXT520 meteorological sensors. An additional site in the Sierra Cucapah (PTAX), built by CICESE, an Associate UNAVCO Member institution in Mexico, and Caltech, has been integrated into PBO dataflow. The stations will be maintained as part of the PBO network in coordination with CICESE. UNAVCO is working with NOAA to upgrade PBO stations with WXT520 meteorological sensors and communications systems capable of streaming real-time GPS and met data. The real-time GPS and

  19. Operational earthquake forecasting in California: A prototype system combining UCERF3 and CyberShake

    NASA Astrophysics Data System (ADS)

    Milner, K. R.; Jordan, T. H.; Field, E. H.

    2014-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about time-dependent earthquake probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To attain this goal, OEF must provide a complete description of the seismic hazard—ground motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic hazard analysis. We have combined the Third Uniform California Earthquake Rupture Forecast (UCERF3) of the Working Group on California Earthquake Probabilities (Field et al., 2014) with the CyberShake ground-motion model of the Southern California Earthquake Center (Graves et al., 2011; Callaghan et al., this meeting) into a prototype OEF system for generating time-dependent hazard maps. UCERF3 represents future earthquake activity in terms of fault-rupture probabilities, incorporating both Reid-type renewal models and Omori-type clustering models. The current CyberShake model comprises approximately 415,000 earthquake rupture variations to represent the conditional probability of future shaking at 285 geographic sites in the Los Angeles region (~236 million horizontal-component seismograms). This combination provides significant probability gains relative to OEF models based on empirical ground-motion prediction equations (GMPEs), primarily because the physics-based CyberShake simulations account for the rupture directivity, basin effects, and directivity-basin coupling that are not represented by the GMPEs.
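
    In its simplest form, the combination reduces to folding each rupture's time-dependent probability together with a simulation-derived conditional probability of shaking. A sketch with placeholder numbers (not UCERF3 or CyberShake values), assuming independent rupture occurrences:

```python
import numpy as np

# Time-dependent probability of each rupture in the forecast window
# (placeholders standing in for UCERF3 output).
p_rup = np.array([0.02, 0.005, 0.001])

# P(IM > x | rupture): conditional probability that shaking at one site
# exceeds a target level, as a physics-based model like CyberShake would
# tabulate from simulated seismograms (placeholder values).
p_exceed_given_rup = np.array([0.30, 0.60, 0.90])

# Site probability of exceedance in the window: one minus the product
# of per-rupture non-exceedance probabilities.
p_site = 1.0 - np.prod(1.0 - p_rup * p_exceed_given_rup)
print(f"P(exceedance at site) = {p_site:.4f}")
```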

  20. Circuit breaker operation and potential failure modes during an earthquake: a preliminary investigation

    SciTech Connect

    Lambert, H.E.

    1984-04-09

    This study addresses the effect of a strong-motion earthquake on circuit breaker operation. It focuses on the loss of offsite power (LOSP) transient caused by a strong-motion earthquake at the Zion Nuclear Power Plant. This report also describes the operator action necessary to prevent core melt if the identified circuit breaker failure modes occur simultaneously on three 4.16 kV buses. Numerous circuit breakers important to plant safety, such as circuit breakers to diesel generators and engineered safety systems (ESS), must open and/or close during this transient while strong motion is occurring. Nearly 500 electrical drawings were examined to address the effects of earthquakes on circuit breaker operation. Due to the complexity of the problem, this study is not intended to be definitive but serves as a focusing tool for further work. 5 references, 9 figures, 3 tables.

  1. Earthquake.

    PubMed

    Cowen, A R; Denney, J P

    1994-04-01

    On January 25, 1 week after the most devastating earthquake in Los Angeles history, the Southern California Hospital Council released the following status report: 928 patients evacuated from damaged hospitals. 805 beds available (136 critical, 669 noncritical). 7,757 patients treated/released from EDs. 1,496 patients treated/admitted to hospitals. 61 dead. 9,309 casualties. Where do we go from here? We are still waiting for the "big one." We'll do our best to be ready when Mother Nature shakes, rattles and rolls. The efforts of Los Angeles City Fire Chief Donald O. Manning cannot be overstated. He maintained department command of this major disaster and is directly responsible for implementing the fire department's Disaster Preparedness Division in 1987. Through the chief's leadership and ability to forecast consequences, the city of Los Angeles was better prepared than ever to cope with this horrendous earthquake. We also pay tribute to the men and women who are out there each day, where "the rubber meets the road." PMID:10133439

  2. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases; the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and databases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds a very large number of combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure-switch contacts. The number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because no credit is taken for operator recovery. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as those following a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology.
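
    The "likelihood of the order of unity" arises because many minimal cut sets, each individually unlikely, are combined. A toy sketch with invented chatter probabilities and cut sets (not SSMRP values), treating cut sets as independent:

```python
import numpy as np

# Each minimal cut set is a set of relay/pressure-switch chatter events
# that together defeat a safety function; probabilities are invented.
p_chatter = {"R1": 0.3, "R2": 0.25, "R3": 0.4, "PS1": 0.2, "PS2": 0.35}
cut_sets = [("R1", "R2", "R3"), ("R1", "R2", "PS1"),
            ("R2", "R3", "PS2"), ("R1", "R3", "PS1", "PS2")]

# Probability of each cut set occurring (all its events chatter)
p_sets = [np.prod([p_chatter[c] for c in cs]) for cs in cut_sets]

# Rare-event approximation treating cut sets as independent: with
# thousands of real cut sets, this probability approaches unity.
p_top = 1.0 - np.prod([1.0 - p for p in p_sets])
print(f"P(at least one cut set occurs | LOSP) ~ {p_top:.3f}")
```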

  3. Large Earthquakes at the Ibero-Maghrebian Region: Basis for an EEWS

    NASA Astrophysics Data System (ADS)

    Buforn, Elisa; Udías, Agustín; Pro, Carmen

    2015-09-01

    Large earthquakes (Mw > 6, Imax > VIII) occur in the Ibero-Maghrebian region, which extends from a point (12°W) southwest of Cape St. Vincent to Tunisia, with different characteristics depending on their location, and they cause considerable damage and casualties. Seismic activity in this region is associated with the boundary between the lithospheric plates of Eurasia and Africa, which extends from the Azores Islands to Tunisia. The boundary, which has a clear oceanic nature in the westernmost part at Cape St. Vincent, experiences a transition from an oceanic to a continental boundary, with the interaction of the southern border of the Iberian Peninsula, the northern border of Africa, and the Alboran basin between them, corresponding to a wide area of deformation. Further to the east, the plate boundary recovers its oceanic nature following the northern coast of Algeria and Tunisia. The region has been divided into four zones with different seismic characteristics. From west to east, large-earthquake occurrence, focal depth, total seismic moment tensor, and average seismic slip velocities for each zone along the region show the differences in seismic release of deformation. This must be taken into account in developing an EEWS for the region.

  4. Plutonium uranium extraction (PUREX) end state basis for interim operation (BIO) for surveillance and maintenance

    SciTech Connect

    DODD, E.N.

    1999-05-12

    This Basis for Interim Operation (BIO) was developed for the PUREX end state condition following completion of the deactivation project. The deactivation project removed or stabilized the hazardous materials within the facility structure and equipment to reduce the hazards posed by the facility during the surveillance and maintenance (S and M) period, and to reduce the costs associated with the S and M. This document serves as the authorization basis for the PUREX facility, excluding the storage tunnels, railroad cut, and associated tracks, for the deactivated end state condition during the S and M period. The storage tunnels, and associated systems and areas, are addressed in WHC-SD-HS-SAR-001, Rev. 1, PUREX Final Safety Analysis Report. During S and M, the mission of the facility is to maintain the conditions and equipment in a manner that ensures the safety of the workers, the environment, and the public. The S and M phase will continue until the final decontamination and decommissioning (D and D) project and activities are begun. Based on the methodology of DOE-STD-1027-92, Hazards Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports, the final facility hazards category is identified as hazards category 2. This considers the remaining material inventories, the form and distribution of the material, and the energies present to initiate events of concern. Given the current facility configuration, conditions, and authorized S and M activities, no operational events were identified that would result in significant hazard to any of the target receptor groups (e.g., workers, public, environment). The only accident scenarios identified with consequences to the onsite co-located workers were based on external natural phenomena, specifically an earthquake. The dose consequences of these events are within the current risk evaluation guidelines and are consistent with the expectations for a hazards category 2 facility.

  5. The establishment of a standard operation procedure for psychiatric service after an earthquake.

    PubMed

    Su, Chao-Yueh; Chou, Frank Huang-Chih; Tsai, Kuan-Yi; Lin, Wen-Kuo

    2011-07-01

    This study presents information on the design and creation of a standard operation procedure (SOP) for psychiatric service after an earthquake. The strategies employed focused on the detection of survivors who developed persistent psychiatric illness, particularly post-traumatic stress and major depressive disorders. In addition, the study attempted to detect the risk factors for psychiatric illness. A Disaster-Related Psychological Screening Test (DRPST) was designed by five psychiatrists and two public health professionals for rapidly and simply interviewing 4,223 respondents within six months of the September 1999 Chi-Chi earthquake. An SOP was established through a systemic literature review, action research, and two years of data collection. Despite the limited time and resources inherent to a disaster situation, it is necessary to develop an SOP for psychiatric service after an earthquake in order to assist the high number of survivors suffering from subsequent psychiatric impairment. PMID:21410747

  6. Development of Site-Specific Soil Design Basis Earthquake (DBE) Parameters for the Integrated Waste Treatment Unit (IWTU)

    SciTech Connect

    Payne, Suzette

    2008-08-01

    Horizontal and vertical PC 3 (2,500 yr) Soil Design Basis Earthquake (DBE) 5% damped spectra, corresponding time histories, and strain-compatible soil properties were developed for the Integrated Waste Treatment Unit (IWTU). The IWTU is located at the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Laboratory (INL). Mean and 84th percentile horizontal DBE spectra derived from site-specific site response analyses were evaluated for the IWTU. The horizontal and vertical PC 3 (2,500 yr) Soil DBE 5% damped spectra at the 84th percentile were selected for Soil Structure Interaction (SSI) analyses at IWTU. The site response analyses were performed consistent with applicable Department of Energy (DOE) Standards, recommended guidance of the Nuclear Regulatory Commission (NRC), American Society of Civil Engineers (ASCE) Standards, and recommendations of the Blue Ribbon Panel (BRP) and Defense Nuclear Facilities Safety Board (DNFSB).
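
    As an illustration of what a "5% damped spectrum" is, the sketch below computes a pseudo-acceleration response spectrum from an accelerogram by integrating a single-degree-of-freedom oscillator at each period. This is the generic textbook calculation, not the site-response methodology used for the IWTU, and the example input is synthetic:

```python
import numpy as np
from scipy.signal import lsim

def response_spectrum(ag, dt, periods, zeta=0.05):
    """5%-damped pseudo-acceleration spectrum of ground acceleration ag (m/s^2)."""
    t = np.arange(len(ag)) * dt
    Sa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        # SDOF oscillator: u'' + 2*zeta*wn*u' + wn^2*u = -ag(t)
        A = [[0.0, 1.0], [-wn**2, -2.0 * zeta * wn]]
        B = [[0.0], [-1.0]]
        C = [[1.0, 0.0]]   # observe relative displacement u
        D = [[0.0]]
        _, u, _ = lsim((A, B, C, D), ag, t)
        Sa.append(wn**2 * np.max(np.abs(u)))   # pseudo-spectral acceleration
    return np.array(Sa)

# Example: synthetic white-noise accelerogram, periods 0.05-5 s
rng = np.random.default_rng(1)
spec = response_spectrum(rng.normal(0.0, 1.0, 2000), 0.01, np.geomspace(0.05, 5.0, 30))
```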

  7. The G-FAST Geodetic Earthquake Early Warning System: Operational Performance and Synthetic Testing

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Schmidt, D. A.; Bodin, P.; Vidale, J. E.; Melbourne, T. I.; Santillan, V. M.

    2015-12-01

    The G-FAST (Geodetic First Approximation of Size and Timing) earthquake early warning module is part of a joint seismic and geodetic earthquake early warning system currently under development at the Pacific Northwest Seismic Network (PNSN). Our two-stage approach to earthquake early warning includes: (1) initial detection and characterization from PNSN strong-motion and broadband data with the ElarmS package within ShakeAlert, and then (2) modeling of GPS data from the Pacific Northwest Geodetic Array (PANGA). The two geodetic modeling modules are (1) a fast peak-ground-displacement magnitude and depth estimate and (2) a CMT-based finite fault inversion that utilizes coseismic offsets to compute earthquake extent, slip, and magnitude. The seismic and geodetic source estimates are then combined in a decision module currently under development. In this presentation, we first report on the operational performance during the first several months that G-FAST has been live with respect to magnitude estimates, timing information, and stability. Second, we report on the performance of the G-FAST test system using simulated displacements from plausible Cascadia earthquake scenarios. The test system permits us to: (1) replay segments of actual seismic waveform data recorded from the PNSN and neighboring networks to investigate both earthquakes and noise conditions, and (2) broadcast synthetic data into the system to simulate signals we anticipate from earthquakes for which we have no actual ground motion recordings. The test system also lets us simulate various error conditions (latent and/or out-of-sequence data, telemetry drop-outs, etc.) in order to explore how best to mitigate them. For example, we show for a replay of the 2001 M6.8 Nisqually earthquake that telemetry drop-outs create the largest variability and biases in magnitude and depth estimates, whereas latency only causes some variability towards the beginning of the recordings before quickly stabilizing.
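
    The first geodetic module's idea can be sketched by inverting a peak-ground-displacement (PGD) scaling law for magnitude. The coefficients below follow the form published by Crowell et al. (2013); treat them as illustrative assumptions rather than the operational G-FAST values:

```python
import numpy as np

# log10(PGD[cm]) = A + B*M + C*M*log10(R[km]); coefficients assumed for illustration
A, B, C = -5.013, 1.219, -0.178

def magnitude_from_pgd(pgd_cm, hypo_dist_km):
    """Invert the PGD scaling law for moment magnitude at one station."""
    return (np.log10(pgd_cm) - A) / (B + C * np.log10(hypo_dist_km))

print(round(magnitude_from_pgd(10.0, 100.0), 2))   # 10 cm PGD at 100 km -> ~M 6.97
```

    In practice such single-station estimates would be averaged over many stations, which is part of why GPS-based magnitudes saturate far less than seismic ones for very large events.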

  8. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  9. The Earthquake Prediction Experiment on the Basis of the Jet Stream's Precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.; Tikhonov, I. N.

    2014-12-01

    Simultaneous analysis of jet stream maps and data for EQs of M > 6.0 was performed; 58 cases of EQs that occurred in 2006-2010 were studied. It was found that an interruption or crossing of velocity flow lines above the epicenter of an EQ takes place 1-70 days prior to the event, with a duration of 6-12 hours. The assumption is that the jet stream rises or descends near an epicenter. In 45 cases the distance between the epicenter and the jet stream precursor did not exceed 90 km. The success rate of forecasts within the 30 days before the EQ was 66.1% (Wu and Tikhonov, 2014). This technique has been used to predict strong EQs, with the predictions pre-registered on a website (for example, the 23 October 2011, M 7.2 EQ (Turkey); the 20 May 2012, M 6.1 EQ (Italy); the 16 April 2013, M 7.8 EQ (Iran); the 12 November 2013, M 6.6 EQ (Russia); the 03 March 2014, M 6.7 Ryukyu EQ (Japan); the 21 July 2014, M 6.2 Kuril EQ). We obtain satisfactory accuracy in the epicenter location, and the alarm period is short; these are the positive aspects of the forecast. However, the magnitude estimates contain a large uncertainty. Reference: Wu, H.C., Tikhonov, I.N., 2014. Jet streams anomalies as possible short-term precursors of earthquakes with M > 6.0. Research in Geophysics, Special Issue on Earthquake Precursors, Vol. 4, No. 1. doi:10.4081/rg.2014.4939. The precursor of the M9.0 Japan EQ of 2011/03/11 (fig. 1). A. M6.1 Italy EQ (2012/05/20, 44.80 N, 11.19 E, H = 5.1 km). Prediction: 2012/03/20-2012/04/20 (45.6 N, 10.5 E), M > 5.5 (fig. 2) http://ireport.cnn.com/docs/DOC-764800 B. M7.8 Iran EQ (2013/04/16, 28.11 N, 62.05 E, H = 82.0 km). Prediction: 2013/01/14-2013/02/04 (28.0 N, 61.3 E), M > 6.0 (fig. 3) http://ireport.cnn.com/docs/DOC-910919 C. M6.6 Russia EQ (2013/11/12, 54.68 N, 162.29 E, H = 47.2 km). Prediction: 2013/10/27-2013/11/13 (56.0 N, 162.9 E), M > 5.5 http://ireport.cnn.com/docs/DOC-1053599 D. M6.7 Japan EQ (2014/03/03, 27.41 N, 127.34 E, H = 111.2 km). Prediction: 2013/12/02-2014/01/15 (26.7 N, 128.1 E), M > 6.5 (fig. 4) http

  10. Planning a Preliminary program for Earthquake Loss Estimation and Emergency Operation by Three-dimensional Structural Model of Active Faults

    NASA Astrophysics Data System (ADS)

    Ke, M. C.

    2015-12-01

    Large-scale earthquakes often cause serious economic losses and many deaths. Because the magnitude, time, and location of earthquakes still cannot be predicted, pre-disaster risk modeling and post-disaster operation are important means of reducing earthquake damage. To understand the disaster risk of earthquakes, earthquake-simulation technology is commonly used to build earthquake scenarios, with point sources, fault-line sources, and fault-plane sources serving as the seismic-source models. The assessments made from these different models serve earthquake risk assessment and emergency operation well, but their accuracy can still be improved. This program invites experts and scholars from Taiwan University, National Central University, and National Cheng Kung University, and uses historical earthquake records, geological data, and geophysical data to build underground three-dimensional structural planes of active faults, with the purpose of replacing projected fault planes with underground fault planes that are closer to reality. The accuracy of earthquake-prevention analyses can be upgraded with this database. These three-dimensional data will then be applied at different stages of disaster prevention. Before a disaster, the results of earthquake risk analysis obtained from the three-dimensional fault-plane data are closer to the real damage. During a disaster, the three-dimensional fault-plane data can help estimate the aftershock distribution and the seriously damaged areas. The program used 14 geological profiles to build the three-dimensional data of the Hsinchu and HsinCheng faults in 2015. Other active faults will be completed in 2018 and actually applied to earthquake disaster prevention.

  11. Planning Matters: Response Operations following the 30 September 2009 Sumatran Earthquake

    NASA Astrophysics Data System (ADS)

    Comfort, L. K.; Cedillos, V.; Rahayu, H.

    2009-12-01

    Response operations following the 9/30/2009 West Sumatra earthquake tested extensive planning that had been done in Indonesia since the 26 December 2004 Sumatran Earthquake and Tsunami. After massive destruction in Aceh Province in 2004, the Indonesian National Government revised its national disaster management plans. A key component was to select six cities in Indonesia exposed to significant risk and make a focused investment of resources, planning activities, and public education to reduce risk of major disasters. Padang City was selected for this national “showcase” for disaster preparedness, planning, and response. The question is whether planning improved governmental performance and coordination in practice. There is substantial evidence that disaster preparedness planning and training initiated over the past four years had a positive effect on Padang in terms of disaster risk reduction. The National Disaster Management Agency (BNPB, 10/28/09) reported the following casualties: Padang City: deaths, 383; severe injuries, 431, minor injuries, 771. Province of West Sumatra: deaths, 1209; severe injuries, 67; minor injuries, 1179. These figures contrasted markedly with the estimated losses following the 2004 Earthquake and Tsunami when no training had been done: Banda Aceh, deaths, 118,000; Aceh Province, dead/missing, 236,169 (ID Health Ministry 2/22/05). The 2004 events were more severe, yet the comparable scale of loss was significantly lower in the 9/30/09 earthquake. Three factors contributed to reducing disaster risk in Padang and West Sumatra. First, annual training exercises for tsunami warning and evacuation had been organized by national agencies since 2004. In 2008, all exercises and training activities were placed under the newly established BNPB. The exercise held in Padang in February, 2009 served as an organizing framework for response operations in the 9/30/09 earthquake. Public officers with key responsibilities for emergency operations

  12. TECHNICAL BASIS FOR VENTILATION REQUIREMENTS IN TANK FARMS OPERATING SPECIFICATIONS DOCUMENTS

    SciTech Connect

    BERGLIN, E J

    2003-06-23

    This report provides the technical basis for the high-efficiency particulate air (HEPA) filter limits for Hanford tank farm ventilation systems (sometimes known as heating, ventilation, and air conditioning [HVAC] systems) in support of the limits defined in Process Engineering Operating Specification Documents (OSDs). The technical basis includes a review of the older technical basis and provides clarifications, as necessary, along with justification for revisions to the technical basis limits. This document provides an updated technical basis for tank farm ventilation systems related to the Operating Specification Documents (OSDs) for double-shell tanks (DSTs), single-shell tanks (SSTs), double-contained receiver tanks (DCRTs), catch tanks, and various other miscellaneous facilities.

  13. Theoretical basis for operational ensemble forecasting of coronal mass ejections

    NASA Astrophysics Data System (ADS)

    Pizzo, V. J.; Koning, C.; Cash, M.; Millward, G.; Biesecker, D. A.; Puga, L.; Codrescu, M.; Odstrcil, D.

    2015-10-01

    We lay out the theoretical underpinnings for the application of the Wang-Sheeley-Arge-Enlil modeling system to ensemble forecasting of coronal mass ejections (CMEs) in an operational environment. In such models, there is no magnetic cloud component, so our results pertain only to CME front properties, such as transit time to Earth. Within this framework, we find no evidence that the propagation is chaotic, and therefore, CME forecasting calls for different tactics than employed for terrestrial weather or hurricane forecasting. We explore a broad range of CME cone inputs and ambient states to flesh out differing CME evolutionary behavior in the various dynamical domains (e.g., large, fast CMEs launched into a slow ambient, and the converse; plus numerous permutations in between). CME propagation in both uniform and highly structured ambient flows is considered to assess how much the solar wind background affects the CME front properties at 1 AU. Graphical and analytic tools pertinent to an ensemble approach are developed to enable uncertainties in forecasting CME impact at Earth to be realistically estimated. We discuss how uncertainties in CME pointing relative to the Sun-Earth line affects the reliability of a forecast and how glancing blows become an issue for CME off-points greater than about the half width of the estimated input CME. While the basic results appear consistent with established impressions of CME behavior, the next step is to use existing records of well-observed CMEs at both Sun and Earth to verify that real events appear to follow the systematic tendencies presented in this study.
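
    The ensemble idea can be illustrated with a toy kinematic drag-based CME propagation model in place of the full WSA-Enlil MHD system described in the abstract. All distributions and the drag parameterization below are assumptions for illustration only:

```python
import numpy as np

# Toy ensemble of CME Sun-Earth transit times under a drag-based model:
# dv/dt = -gamma * |v - w| * (v - w), integrated explicitly.
rng = np.random.default_rng(0)
AU, n = 1.496e8, 1000                        # km, ensemble size
v = rng.normal(1000.0, 100.0, n)             # initial CME speed [km/s] (assumed)
w = rng.normal(400.0, 50.0, n)               # ambient solar-wind speed [km/s] (assumed)
gamma = 10.0 ** rng.uniform(-8, -7, n)       # drag parameter [1/km] (assumed range)

dt = 600.0                                   # 10-minute time step [s]
r = np.full(n, 20.0 * 7.0e5)                 # start at ~20 solar radii [km]
arrived = np.full(n, np.nan)
for step in range(int(10 * 86400 / dt)):     # integrate up to 10 days
    v -= gamma * np.abs(v - w) * (v - w) * dt
    r += v * dt
    hit = np.isnan(arrived) & (r >= AU)
    arrived[hit] = (step + 1) * dt / 3600.0  # arrival time [hours]

print(f"median transit {np.nanmedian(arrived):.1f} h, "
      f"5-95% spread {np.nanpercentile(arrived, 5):.1f}-{np.nanpercentile(arrived, 95):.1f} h")
```

    The spread of the arrival-time histogram is the kind of uncertainty estimate an operational ensemble forecast would report for the CME front.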

  14. A century of oilfield operations and earthquakes in the greater Los Angeles Basin, southern California

    USGS Publications Warehouse

    Hauksson, Egill; Goebel, Thomas; Ampuero, Jean-Paul; Cochran, Elizabeth S.

    2015-01-01

    Most of the seismicity in the Los Angeles Basin (LA Basin) occurs at depth below the sediments and is caused by transpressional tectonics related to the big bend in the San Andreas fault. However, some of the seismicity could be associated with fluid extraction or injection in oil fields that have been in production for almost a century and cover ∼17% of the basin. In a recent study, first the influence of industry operations was evaluated by analyzing seismicity characteristics, including normalized seismicity rates, focal depths, and b-values; no significant difference was found in seismicity characteristics inside and outside the oil fields. In addition, to identify possible temporal correlations, the seismicity and the available monthly fluid extraction and injection volumes since 1977 were analyzed. Second, the production and deformation history of the Wilmington oil field was used to evaluate whether other oil fields are likely to experience similar surface deformation in the future. Third, the maximum earthquake magnitudes of events within the perimeters of the oil fields were analyzed to see whether they correlate with total net injected volumes, as suggested by previous studies. Similarly, maximum magnitudes were examined to see whether they increase with net extraction volume. Overall, no obvious previously unidentified induced earthquakes were found, and the management of balanced production and injection of fluids appears to reduce the risk of induced-earthquake activity in the oil fields.

  15. Basis for Interim Operation for the K-Reactor in Cold Standby

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The Basis for Interim Operation (BIO) document for K Reactor in Cold Standby and the L- and P-Reactor Disassembly Basins was prepared in accordance with the draft DOE standard for BIO preparation (dated October 26, 1993).

  16. Ground motions associated with the design basis earthquake at the Savannah River Site, South Carolina, based on a deterministic approach

    SciTech Connect

    Youngs, R.R.; Coppersmith, K.J.; Stephenson, D.E.; Silva, W.

    1991-12-31

    Ground motion assessments are presented for evaluation of the seismic safety of K-Reactor at the Savannah River Site. Two earthquake sources are identified as the most significant to seismic hazard at the site: an M 7.5 earthquake occurring in Charleston, South Carolina, and an M 5 event occurring in the site vicinity. These events control the low-frequency and high-frequency portions of the spectrum, respectively. Three major issues were identified in the assessment of ground motions for the Savannah River site: specification of the appropriate stress drop for the Charleston source earthquake, specification of the appropriate levels of soil damping at large depths for site response analyses, and the appropriateness of western US recordings for specification of ground motions in the eastern US.

  17. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults a few tens of kilometers away. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay of up to several years. PMID:25156190
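
    The multi-year delay at tens of kilometers follows from the slowness of pore-pressure diffusion. A minimal sketch, assuming the simple 1-D diffusion-front solution p(r,t) = p0·erfc(r/√(4ct)) and a typical crustal hydraulic diffusivity, rather than the paper's full poroelastic treatment:

```python
import numpy as np
from scipy.special import erfcinv

p0 = 10e6       # injection overpressure at the source [Pa] (assumed)
p_trig = 0.1e6  # triggering overpressure from the abstract [Pa]
c = 1.0         # hydraulic diffusivity [m^2/s] (typical crustal value, assumed)
r = 20e3        # distance to the fault [m]

# Solve p(r, t) = p_trig for t, using the inverse complementary error function
x = erfcinv(p_trig / p0)        # value of r / sqrt(4*c*t) at the triggering level
t = (r / (2.0 * x)) ** 2 / c
print(f"overpressure reaches 0.1 MPa at 20 km after ~{t / 3.15e7:.1f} years")
```

    With these assumed values the 0.1 MPa front needs roughly a year to reach 20 km, consistent with the paper's point that induction can lag injection by years.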

  1. Moving towards the operational seismogeodesy component of earthquake and tsunami early warning

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Bock, Y.; Geng, J.; Melgar, D.; Crowell, B. W.; Squibb, M. B.

    2013-12-01

    deviation; 3) simultaneous solution for ground motion biases to mitigate errors due to accelerometer tilt; 4) real-time integration of accelerometer data to velocity and displacement without baseline corrections, providing the fundamental input for rapid finite fault source inversion; 5) low-frequency estimates of P-wave arrival displacement to support single-station earthquake early warning. The operational real-time GPS analysis was implemented in time to provide waveforms from the August 2012 Brawley, CA, seismic swarm. The full real-time seismogeodetic analysis is now operational for GPS sites we have upgraded with low-cost MEMS accelerometers, meteorological sensors, and in-house geodetic modules for efficient real-time data transmission. The analysis system does not yet incorporate an alert system but is currently available to serve as a complement to seismic-based early warning systems, increasing redundancy and robustness. It is anticipated to be especially useful for large earthquakes (> M7), where rapid determination of the fault parameters is critical for early assessment of the extent of damage in affected areas, or for rapid tsunami modeling.
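
    Combining GPS displacement with accelerometer data is typically done with a Kalman filter; below is a minimal 1-D sketch of that idea. The noise levels, state model, and sampling arrangement are assumptions for illustration, not the operational implementation:

```python
import numpy as np

def seismogeodetic_kf(acc, gps, dt, q=1e-2, r=1e-4):
    """Fuse accelerometer samples (acc) with GPS displacements (gps; NaN when
    no GPS sample is available) into a broadband displacement estimate."""
    x = np.zeros(2)                       # state: [displacement, velocity]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    H = np.array([[1.0, 0.0]])            # GPS observes displacement only
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])
    out = []
    for a, d_gps in zip(acc, gps):
        x = F @ x + B * a                 # predict with the accelerometer
        P = F @ P @ F.T + Q
        if not np.isnan(d_gps):           # correct whenever GPS arrives
            y = d_gps - H @ x
            S = H @ P @ H.T + r
            K = (P @ H.T) / S
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)
```

    The accelerometer supplies the high-frequency content between GPS epochs, while the GPS updates remove the drift that plain double integration of acceleration would accumulate, which is the core benefit of the seismogeodetic combination.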

  2. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    29 CFR 780.314 (2010 ed.), Labor, Wage and Hour Division: "Operations customarily * * * paid on a piece rate basis * * *," under the Fair Labor Standards Act provisions on employment in agriculture that is exempted from the minimum wage and overtime requirements.

  3. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    29 CFR 780.314 (2013 ed.), Labor, Wage and Hour Division: "Operations customarily * * * paid on a piece rate basis * * *," under the Fair Labor Standards Act provisions on employment in agriculture that is exempted from the minimum wage and overtime requirements.

  4. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    29 CFR 780.314 (2012 ed.), Labor, Wage and Hour Division: "Operations customarily * * * paid on a piece rate basis * * *," under the Fair Labor Standards Act provisions on employment in agriculture that is exempted from the minimum wage and overtime requirements.

  5. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    29 CFR 780.314 (2014 ed.), Labor, Wage and Hour Division: "Operations customarily * * * paid on a piece rate basis * * *," under the Fair Labor Standards Act provisions on employment in agriculture that is exempted from the minimum wage and overtime requirements.

  6. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    29 CFR 780.314 (2011 ed.), Labor, Wage and Hour Division: "Operations customarily * * * paid on a piece rate basis * * *," under the Fair Labor Standards Act provisions on employment in agriculture that is exempted from the minimum wage and overtime requirements.

  7. Technology basis for the Liquid Effluent Retention Facility Operating Specifications. Revision 3

    SciTech Connect

    Johnson, P.G.

    1995-05-17

    The Liquid Effluent Retention Facility (LERF) consists of three retention basins, each with a nominal storage capacity of 6.5 million gallons. LERF serves as interim storage of 242-A Evaporator process condensate for treatment in the Effluent Treatment Facility. This document provides the technical basis for the LERF Operating Specifications, OSD-T-151-00029.

  8. On the Physical Basis of Rate Law Formulations for River Evolution, and their Applicability to the Simulation of Evolution after Earthquakes

    NASA Astrophysics Data System (ADS)

    An, C.; Parker, G.; Fu, X.

    2015-12-01

    River morphology evolves in response to trade-offs among a series of environmental forcing factors, and this evolution is disturbed if such environmental factors change. One example of response to such disturbance is the intensive river evolution after earthquakes in the mountain areas of southwest China. When simulating river morphological response to environmental disturbance, an exponential rate law with a specified characteristic response time is often regarded as a practical tool for quantification. As conceptual models, empirical rate law formulations can be used to describe broad-brush morphological response, but their physical basis is not solid in that they do not consider the details of morphodynamic processes. Meanwhile, river evolution can also be simulated with physically based morphodynamic models that conserve sediment via the Exner equation. Here we study the links between the rate law formalism and the Exner equation by solving the Exner equation mathematically and numerically. The results show that, when a very simplified form of a bedload transport relation is implemented, the Exner equation reduces to the diffusion equation, the solution of which is a Gaussian function. This solution coincides with the solution associated with rate laws, thus providing a physical basis for such formulations. However, when the complexities of a natural river are considered, the solution of the Exner equation is no longer a simple Gaussian function. Under such circumstances, the rate law becomes invalid, and a full understanding of the response of rivers to earthquakes requires a complete morphodynamic model.
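
    The reduction mentioned in the abstract can be sketched in one step, assuming a bedload flux linear in the bed slope (the notation here is generic, not the authors'):

```latex
\underbrace{(1-\lambda_p)\,\partial_t \eta = -\partial_x q_s}_{\text{Exner equation}}
\quad\text{with}\quad
q_s = -\nu\,\partial_x \eta
\;\;\Longrightarrow\;\;
\partial_t \eta = \frac{\nu}{1-\lambda_p}\,\partial_x^2 \eta ,
```

    a linear diffusion equation for the bed elevation η (with porosity λ_p and an assumed transport coefficient ν), whose Gaussian solutions decay exponentially in time and thus reproduce the exponential rate-law behavior.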

  9. Real-time operative earthquake forecasting: the case of L'Aquila sequence

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Lombardi, A.

    2009-12-01

    A reliable earthquake forecast is one of the fundamental components required for reducing seismic risk. Despite very recent efforts devoted to testing the validity of available models, the present skill at forecasting the evolution of seismicity is still largely unknown. The recent Mw 6.3 earthquake that struck near the city of L'Aquila, Italy, on April 6, 2009, causing hundreds of deaths and vast damage, offered scientists a unique opportunity to test for the first time the forecasting capability in a real-time application. Here we describe the results of this first prospective experiment. Immediately following the large event, we began producing daily one-day earthquake forecasts for the region, and we provided these forecasts to Civil Protection, the agency responsible for managing the emergency. The forecasts are based on a stochastic model that combines the Gutenberg-Richter distribution of earthquake magnitudes with a power-law decay in space and time of triggered earthquakes. The results from the first month following the L'Aquila earthquake exhibit a good fit between forecasts and observations, indicating that accurate earthquake forecasting is now a realistic goal. Our experience with this experiment demonstrates an urgent need for a connection between probabilistic forecasts and decision-making in order to establish, before crises, quantitative and transparent protocols for decision support.
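
    A minimal sketch of the triggering ingredient of such a model: an Omori-Utsu time decay with magnitude-dependent productivity, summed over past events. The parameter values are assumed for illustration and are not those of the operational model:

```python
# Toy aftershock-rate model: background plus Omori-Utsu decay from past events.
def daily_rate(t, events, mu=0.1, K=0.05, c=0.01, p=1.1, alpha=1.0, M0=3.0):
    """Expected rate of M >= M0 events at time t (days), given past (t_i, M_i)."""
    rate = mu                                   # assumed background rate [events/day]
    for ti, Mi in events:
        if ti < t:
            rate += K * 10.0 ** (alpha * (Mi - M0)) * (t - ti + c) ** (-p)
    return rate

events = [(0.0, 6.3)]                           # the Mw 6.3 mainshock at t = 0
print(daily_rate(1.0, events))                  # expected triggered rate one day later
```

    Integrating such a rate over the next 24 hours, and mapping it through Gutenberg-Richter scaling, yields the daily probabilities of damaging events that were delivered to Civil Protection.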

  10. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper focuses on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure, and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers, and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model the losses would not be indemnified, but would be directly calculated on the basis of indexed ground-motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  11. Lessons Learned from Eight Years' Experience of Actual Operation, and Future Prospects of JMA Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Nishimae, Y.

    2015-12-01

    Since 2007, the Japan Meteorological Agency (JMA) has gained experience from the actual operation of EEW. During this period, we have learned lessons from many M6- and M7-class earthquakes, and from the Mw9.0 Tohoku earthquake. During the Mw9.0 Tohoku earthquake, the JMA system functioned well: it issued a warning message more than 15 s before strong ground shaking in the Tohoku district (at relatively short distance from the epicenter). However, it was not perfect: in addition to the problem of the large extent of the fault rupture, some false warning messages were issued because the system was confused by simultaneous multiple aftershocks occurring over the wide rupture area. To address these problems, JMA will introduce two new methods into the operational system this year to start their tests, aiming at practical operation within a couple of years. One is the Integrated Particle Filter (IPF) method, an integrated algorithm of multiple hypocenter-determination techniques with Bayesian estimation, in which amplitude information is also used for hypocenter determination. The other is the Propagation of Local Undamped Motion (PLUM) method, in which a warning message is issued when strong ground shaking is detected at stations near the target site (e.g., within 30 km); here, neither hypocenter nor magnitude is required. Aiming at application several years from now, we are investigating a new approach in which the current wavefield is estimated in real time and the future wavefield is then predicted time-evolutionally from the current situation using the physics of wave propagation. Here, hypocenter and magnitude are not necessarily required, but real-time observation of ground shaking is necessary. JMA also plans to predict long-period ground motion (up to 8 s) with the EEW system for earthquake damage mitigation in high-rise buildings. Its test will start using the operational system in the near future.
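
    A toy version of the PLUM trigger logic: warn at a target site when real-time intensity at any station within 30 km reaches a threshold. The 30-km radius follows the abstract; the intensity threshold and the planar coordinates are assumptions for illustration:

```python
import numpy as np

def plum_warn(site_xy, station_xys, intensities, radius_km=30.0, threshold=4.5):
    """Return True if any station within radius_km of the site observes
    real-time intensity at or above the (assumed) warning threshold."""
    site = np.asarray(site_xy, dtype=float)
    for xy, inten in zip(station_xys, intensities):
        if np.linalg.norm(np.asarray(xy, dtype=float) - site) <= radius_km \
                and inten >= threshold:
            return True
    return False

# Example: one nearby station already shaking strongly triggers the warning
print(plum_warn((0.0, 0.0), [(12.0, 5.0), (80.0, 10.0)], [5.2, 2.1]))  # True
```

    Because the prediction is purely local, PLUM needs no source estimate at all, which is exactly what makes it robust against the simultaneous-aftershock confusion described above.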

  12. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is necessary also to alert the California Department of Water Resources, California Division of Mines and Geology, U.S. Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  13. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of any logic program is defined as a function (Tp: I→I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single-step operators of logic programming in radial basis function neural networks. To do so, we proposed a new technique to generate the training data sets of single-step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operator). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
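
    For concreteness, here is a toy single-step (immediate-consequence) operator for a propositional normal logic program, iterated to its fixed point. This is the object the networks are trained to compute, not the paper's RBF implementation; the example program is invented:

```python
# Clauses are (head, positive_body, negative_body); an interpretation is the
# set of atoms currently taken to be true.
def tp(program, interpretation):
    """Single-step operator T_P: atoms whose clause bodies are satisfied."""
    return {head for head, pos, neg in program
            if pos <= interpretation and not (neg & interpretation)}

P = [("a", set(), set()),        # a.
     ("b", {"a"}, set()),        # b :- a.
     ("c", {"b"}, {"d"})]        # c :- b, not d.

I = set()
while True:                      # iterate T_P to its least fixed point
    J = tp(P, I)
    if J == I:
        break
    I = J
print(I)   # {'a', 'b', 'c'}
```

    The recurrent network in the paper plays the role of this iteration loop, settling into the steady state that corresponds to the operator's fixed point.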

  14. The Mixed Waste Management Facility. Design basis integrated operations plan (Title I design)

    SciTech Connect

    1994-12-01

    The Mixed Waste Management Facility (MWMF) will be a fully integrated, pilot-scale facility for the demonstration of low-level, organic-matrix mixed waste treatment technologies. It will provide the bridge from bench-scale demonstrated technologies to the deployment and operation of full-scale treatment facilities. The MWMF is a key element in reducing the risk in deployment of effective and environmentally acceptable treatment processes for organic mixed-waste streams. The MWMF will provide the engineering test data, formal evaluation, and operating experience that will be required for these demonstration systems to become accepted by EPA and deployable in waste treatment facilities. The deployment will also demonstrate how to approach the permitting process with the regulatory agencies and how to operate and maintain the processes in a safe manner. This document describes, at a high level, how the facility will be designed and operated to achieve this mission. It frequently refers the reader to additional documentation that provides more detail in specific areas. Effective evaluation of a technology consists of a variety of informal and formal demonstrations involving individual technology systems or subsystems, integrated technology system combinations, or complete integrated treatment trains. Informal demonstrations will typically be used to gather general operating information and to establish a basis for development of formal demonstration plans. Formal demonstrations consist of a specific series of tests that are used to rigorously demonstrate the operation or performance of a specific system configuration.

  15. Power systems after the Northridge earthquake: Emergency operations and changes in seismic equipment specifications, practice, and system configuration

    SciTech Connect

    Schiff, A.J.; Tognazzini, R.; Ostrom, D.

    1995-12-31

    The Northridge earthquake caused extensive damage to high-voltage substation equipment and, for the first time, the failure of transmission towers. Power was lost to much of the earthquake-impacted area, and service was restored to 93% of customers within 24 hours. To restore service, damaged monitoring, communication, and protective equipment, such as current-voltage transformers, wave traps, and lightning arresters, was removed or bypassed and operation restored. To improve performance, some porcelain members are being replaced with composite materials for bushings, current-voltage transformers, and lightning arresters. Interim seismic specifications for equipment have been instituted. Some substations are being re-configured, and rigid bus and conductors are being replaced with flexible conductors. Non-load-carrying conductors, such as those used on lightning arresters, are being reduced in size to reduce potential interaction problems. Better methods of documenting damage and repair costs are being considered.

  16. Representation of discrete Steklov-Poincare operator arising in domain decomposition methods in wavelet basis

    SciTech Connect

    Jemcov, A.; Matovic, M.D.

    1996-12-31

    This paper examines the sparse representation and preconditioning of a discrete Steklov-Poincare operator which arises in domain decomposition methods. A non-overlapping domain decomposition method is applied to a second-order self-adjoint elliptic operator (Poisson equation), with homogeneous boundary conditions, as a model problem. It is shown that the discrete Steklov-Poincare operator allows a sparse representation with a bounded condition number in a wavelet basis if the transformation is followed by thresholding and rescaling. These two steps combined enable the effective use of Krylov subspace methods as an iterative solution procedure for the system of linear equations. Finding the solution of an interface problem in domain decomposition methods, known as a Schur complement problem, has been shown to be equivalent to the discrete form of the Steklov-Poincare operator. A common way to obtain the Schur complement matrix is to order the matrix of the discrete differential operator into subdomain node groups and then block-eliminate the interface nodes. The result is a dense matrix which corresponds to the interface problem. This is equivalent to reducing the original problem to several smaller differential problems and one boundary integral equation problem for the subdomain interface.

  17. Real-time earthquake alert system for the greater San Francisco Bay Area: a prototype design to address operational issues

    SciTech Connect

    Harben, P.E.; Jarpe, S.; Hunter, S.

    1996-12-10

    The purpose of the earthquake alert system (EAS) is to outrun the seismic energy released in a large earthquake using a geographically distributed network of strong motion sensors that telemeter data to a rapid CPU-processing station, which then issues an area-wide warning to a region before strong motion will occur. The warning times involved are short, from 0 to 30 seconds or so; consequently, most responses must be automated. The San Francisco Bay Area is particularly well suited for an EAS because (1) large earthquakes have relatively shallow hypocenters (10- to 20-kilometer depth), giving favorable ray-path geometries for larger warning times than deeper earthquakes would, and (2) the active faults are few in number and well characterized, which means far fewer geographically distributed strong motion sensors are required (about 50 in this region). An EAS prototype is being implemented in the San Francisco Bay Area. The system consists of four distinct subsystems: (1) a distributed strong motion seismic network, (2) a central processing station, (3) a warning communications system, and (4) user receiver and response systems. We have designed a simple, reliable, and inexpensive strong motion monitoring station that consists of a three-component Analog Devices ADXL05 accelerometer sensing unit, a vertical-component weak motion sensor for system testing, a 16-bit digitizer with multiplexing, and communication output ports for RS232 modem or radio telemetry. The unit is battery-powered and will be sited in fire stations. The prototype central computer analysis system consists of a PC data-acquisition platform that pipes the incoming strong motion data via Ethernet to Unix-based workstations for data processing. Simple real-time algorithms, particularly for magnitude estimation, are implemented to give estimates of the time since the earthquake's onset, its hypocenter location, its magnitude, and the reliability of the estimate. These parameters are calculated and transmitted
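
    A back-of-envelope version of the warning-time budget described above; the wave speed, focal depth, and detection/processing overhead are assumed values, not the prototype's measured figures:

```python
import math

def warning_time(epicentral_km, depth_km=15.0, vs_km_s=3.5, t_overhead_s=3.0):
    """Seconds of warning at a site before S-wave arrival, assuming the alert
    itself is effectively instantaneous after a fixed detection/processing/
    broadcast overhead (t_overhead_s, assumed ~3 s)."""
    hypo_dist = math.hypot(epicentral_km, depth_km)  # hypocentral distance [km]
    return hypo_dist / vs_km_s - t_overhead_s

print(round(warning_time(50.0), 1))   # a site 50 km from the epicenter: ~11.9 s
```

    This is why the abstract quotes warning times of "0 to 30 seconds or so": sites close to the epicenter get little or no warning, while the most distant parts of the region get tens of seconds.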

  18. Review of the Technical Basis of the Hydrogen Control Limit for Operations in Hanford Tank Farms

    SciTech Connect

    Mahoney, Lenna A.; Stewart, Charles W.

    2002-11-30

    The waste in Hanford tanks generates a mixture of flammable gases and releases it into the tank headspace. The potential hazard resulting from flammable gas generation requires that controls be established to prevent ignition and halt operations if gas concentrations reach levels of concern. In cases where only hydrogen is monitored, a control limit of 6,250 ppm hydrogen has been in use at Hanford for several years. The hydrogen-based control limit is intended to conservatively represent 25% of the lower flammability limit of a gas mixture, accounting for the presence of flammable gases other than hydrogen, with ammonia being the primary concern. This report reviews the technical basis of the current control limit based on observed and projected concentrations of hydrogen and ammonia representing a range of gas release scenarios. The conclusion supports the continued use of the current 6,250 ppm hydrogen control limit.
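
    A "25% of the lower flammability limit of a gas mixture" criterion is conventionally evaluated with Le Chatelier's mixing rule. The sketch below is illustrative only: the H2:NH3 split is assumed, and the result is not the documented derivation of the 6,250 ppm value:

```python
def lfl_mix(fractions, lfls):
    """Le Chatelier's rule: LFL of a mixture of flammable gases, where
    fractions are normalized among the flammables (must sum to 1)."""
    return 1.0 / sum(f / l for f, l in zip(fractions, lfls))

# Hypothetical headspace release: 70% H2, 30% NH3 among the flammables.
# Pure-gas LFLs: H2 ~4 vol%, NH3 ~15 vol%.
lfl = lfl_mix([0.7, 0.3], [0.040, 0.150])
# Express the 25%-of-LFL criterion as the hydrogen fraction of that mixture:
h2_limit_ppm = 0.25 * lfl * 0.7 * 1e6
print(f"mixture LFL {lfl*100:.2f} vol%, hydrogen at 25% LFL ~{h2_limit_ppm:.0f} ppm")
# (Does not reproduce the documented 6,250 ppm limit; the actual technical
# basis uses site-specific hydrogen:ammonia release ratios.)
```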

  19. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters is arrival-time stacking algorithms. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problems of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at NEIC.
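
    A minimal 1-D illustration of the kernel-stacking idea: score candidate (epicenter, origin-time) hypotheses by stacking Gaussian kernels of arrival-time residuals. Constant velocity, Gaussian pick errors, and all numerical values are assumed; this is not the GLASS 2.0 implementation:

```python
import numpy as np

v, sigma = 6.0, 0.5                              # km/s, s (assumed)
stations = np.array([0.0, 40.0, 90.0, 130.0])    # station positions along a line [km]
arrivals = np.array([7.0, 3.7, 12.0, 18.7])      # picks [s] from a source at x=30, t0=2

xs = np.linspace(0.0, 150.0, 301)                # candidate epicenters [km]
t0s = np.linspace(0.0, 10.0, 201)                # candidate origin times [s]
X, T0 = np.meshgrid(xs, t0s)
score = np.zeros_like(X)
for s, t in zip(stations, arrivals):
    resid = t - (T0 + np.abs(X - s) / v)         # observed minus predicted arrival
    score += np.exp(-0.5 * (resid / sigma) ** 2) # stack a Gaussian kernel per pick

i, j = np.unravel_index(score.argmax(), score.shape)
print(f"best epicenter ~{xs[j]:.0f} km, origin time ~{t0s[i]:.2f} s, stack {score[i, j]:.1f}")
```

    Declaring an event when the stack exceeds a threshold naturally handles sparse networks and, with per-station and per-phase kernel widths, admits the heterogeneous data types listed in the abstract.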

  1. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    SciTech Connect

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and their respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., the estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.

  2. Recent Experiences Operating a Large, International Network of Electromagnetic Earthquake Monitors

    NASA Astrophysics Data System (ADS)

    Bleier, T.; Dunson, J. C.; Lemon, J.

    2014-12-01

    Leading a 5-nation international collaboration, QuakeFinder currently has a network of 168 instruments along with a Data Center that processes 10 GB of data each day, 7 days a week. Each instrument includes 3-axis induction magnetometers, positive and negative ion sensors, and a geophone. These ground instruments are augmented with GOES weather-satellite infrared monitoring of California (and, in the future, other countries). The nature of the signals we are trying to detect and identify, to enable forecasts for significant earthquakes (>M5), involves refining algorithms that both identify quake-related signals at some distance and remove a myriad of natural and anthropogenic noise sources. Maximum detection range was further investigated this year. An initial estimated maximum detection distance of 10 miles (16 km) was challenged with the onset of an M8.2 quake near Iquique, Chile, on April 1, 2014. We will discuss the different strategies used to push the limits of detection for this quake, which occurred 93 miles (149 km) from an instrument that had been installed only 2 months before. Identifying and masking natural and man-made noise to reduce the number of misses and false alarms, and to increase the number of "hits" in a limited earthquake data set, continues to be a top priority. Several novel approaches were tried, and the resulting progress will be discussed.

  3. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides, and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery. In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades

  4. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1995-01-01

    Incineration as a method of treating radioactive or mixed waste is attractive because of volume reduction, but may result in high concentrations of some hazardous components. For safety reasons during operation, and because of the environmental impact of the plant, it is important to know how these materials partition between the furnace slag, the fly ash, and the stack emission. The chemistry of about 50 elements is discussed, and through consideration of high-temperature thermodynamic equilibria, an attempt is made to provide a basis for predicting how various radionuclides and heavy metals behave in a typical incinerator. The chemistry of the individual elements is first considered, and a prediction is made of the most stable chemical species in the typical incinerator atmosphere. The treatment emphasizes volatility, and the parameters considered are temperature, acidity, oxygen, sulfur, and halogen content, and the presence of several other key non-radioactive elements. A computer model is used to calculate equilibrium concentrations of many species in several systems at temperatures ranging from 500 to 1600 K. It is suggested that deliberate addition of various feed chemicals can have a major impact on the fate of many radionuclides and heavy metals. Several problems concerning limitations and application of the data are considered.
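
    As a minimal sketch of this kind of screening calculation, the snippet below evaluates K = exp(-ΔG°/RT) across the stated temperature range; the volatilization reaction and all thermodynamic values are hypothetical placeholders, not the report's data.

        from math import exp

        R = 8.314  # gas constant, J/(mol K)

        def equilibrium_constant(dG, T):
            # K = exp(-dG/RT) for a reaction at absolute temperature T (K).
            return exp(-dG / (R * T))

        # Hypothetical volatilization MCl2(s) -> MCl2(g), with
        # dG(T) ~ dH - T*dS (illustrative values only).
        dH, dS = 2.0e5, 1.2e2  # J/mol and J/(mol K)
        for T in (500, 1000, 1600):
            print(T, equilibrium_constant(dH - T * dS, T))

    Repeating such runs across species is what supports predictions that volatility, and hence partitioning to fly ash or stack emission, rises steeply with temperature.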

  5. The power of simplification: Operator interface with the AP1000® during design-basis and beyond-design-basis events

    SciTech Connect

    Williams, M. G.; Mouser, M. R.; Simon, J. B.

    2012-07-01

    The AP1000® plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without need for operator action, meeting the expectations of the European Utility Requirements and the Utility Requirements Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design-basis accident and finally to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures, including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs) and Alarm Response Procedures (ARPs), are used to mitigate design-basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible using components that have been

  6. Optimizing the Use of Chief Complaint & Diagnosis for Operational Decision Making: An EMR Case Study of the 2010 Haiti Earthquake

    PubMed Central

    Bambrick, Alexandra T.; Passman, Dina B.; Torman, Rachel M.; Livinski, Alicia A.; Olsen, Jennifer M.

    2014-01-01

    Introduction: Data from an electronic medical record (EMR) system can provide valuable insight regarding health consequences in the aftermath of a disaster. In January of 2010, the U.S. Department of Health and Human Services (HHS) deployed medical personnel to Haiti in response to a crippling earthquake. An EMR system was used to record patient encounters in real-time and to provide data for decision support during response activities. Problem: During the Haiti response, HHS monitored the EMR system by recoding diagnoses into seven broad categories. At the conclusion of the response, it was evident that a new diagnosis categorization process was needed to provide a better description of the patient encounters that were seen in the field. After examining the EMRs, researchers determined nearly half of the medical records were missing diagnosis data. The objective of this study was to develop and test a new method of categorization for patient encounters to provide more detailed data for decision making. Methods: A single researcher verified or assigned a new diagnosis for 8,787 EMRs created during the Haiti response. This created a new variable, the Operational Code, which was based on available diagnosis data and chief complaint. Retrospectively, diagnoses recorded in the field and Operational Codes were categorized into eighteen categories based on the ICD-9-CM diagnostic system. Results: Creating an Operational Code variable led to a more robust data set and a clearer depiction emerged of the clinical presentations seen at six HHS clinics set up in the aftermath of Haiti’s earthquake. The number of records with an associated ICD-9 code increased 106% from 4,261 to 8,787. The most frequent Operational Code categories during the response were: General Symptoms, Signs, and Ill-Defined Conditions (34.2%), Injury and Poisoning (18.9%), Other (14.7%), Respiratory (4.8%), and Musculoskeletal and Connective Tissue (4.8%). Conclusion: The Operational Code methodology
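
    The study assigned Operational Codes by manual review of each record. Purely to illustrate the data flow it describes, falling back from the diagnosis field to the chief complaint and then binning into broad categories, here is a hypothetical sketch; the keyword table is invented for illustration.

        from typing import Optional

        CATEGORY_KEYWORDS = {
            "Injury and Poisoning": ("fracture", "laceration", "burn"),
            "Respiratory": ("cough", "asthma", "pneumonia"),
            "Musculoskeletal and Connective Tissue": ("back pain", "joint"),
        }

        def operational_code(diagnosis: Optional[str], chief_complaint: str) -> str:
            # Prefer the recorded diagnosis; fall back to the chief
            # complaint when the diagnosis field is missing.
            text = (diagnosis or chief_complaint).lower()
            for category, keywords in CATEGORY_KEYWORDS.items():
                if any(k in text for k in keywords):
                    return category
            return "General Symptoms, Signs, and Ill-Defined Conditions"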

  7. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  8. Transient Fluid Flow Along Basement Faults and Rupture Mechanics: Can We Expect Injection-Induced Earthquake Behavior to Correspond Directly With Injection Operations?

    NASA Astrophysics Data System (ADS)

    Norbeck, J. H.; Horne, R. N.

    2015-12-01

    We explored injection-induced earthquake behavior in geologic settings where basement faults are connected hydraulically to overlying saline aquifers targeted for wastewater disposal. Understanding how the interaction between natural geology and injection well operations affects the behavior of injection-induced earthquake sequences has important implications for characterizing seismic hazard risk. Numerical experiments were performed to investigate the extent to which seismicity is influenced by the migration of pressure perturbations along fault zones. Two distinct behaviors were observed: a) earthquake ruptures that were confined to the pressurized region of the fault and b) sustained earthquake ruptures that propagated far beyond the pressure front. These two faulting mechanisms have important implications for assessing the manner in which seismicity can be expected to respond to injection well operations. Based upon observations from the numerical experiments, we developed a criterion that can be used to classify the expected faulting behavior near wastewater disposal sites. The faulting criterion depends on the state of stress, the initial fluid pressure, the orientation of the fault, and the dynamic friction coefficient of the fault. If the initial ratio of shear to effective normal stress resolved on the fault (the prestress ratio) is less than the fault's dynamic friction coefficient, then earthquake ruptures will tend to be limited by the distance of the pressure front. In this case, parameters that affect seismic hazard assessment, like the maximum earthquake magnitude or earthquake recurrence interval, could correlate with injection well operational parameters. For example, the maximum earthquake magnitude might be expected to grow over time in a systematic manner as larger patches of the fault are exposed to significant pressure changes. In contrast, if the prestress ratio is greater than dynamic friction, a stress drop can occur outside of the pressurized
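
    The classification criterion is compact enough to state in code. In the sketch below the criterion is paraphrased from the abstract, and the variable names and numbers are illustrative only.

        def faulting_regime(tau, sigma_n, p0, f_d):
            # Compare the prestress ratio (shear stress over initial
            # effective normal stress) with the dynamic friction
            # coefficient f_d; stresses and pressure in MPa.
            prestress_ratio = tau / (sigma_n - p0)
            if prestress_ratio < f_d:
                return "ruptures confined to the pressurized region"
            return "ruptures can propagate beyond the pressure front"

        print(faulting_regime(tau=35.0, sigma_n=100.0, p0=30.0, f_d=0.55))

    In the first regime, hazard parameters can be expected to track injection operations; in the second, they need not.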

  9. LLNL earthquake impact analysis committee report on the Livermore, California, earthquakes of January 24 and 26, 1980

    SciTech Connect

    Not Available

    1980-07-15

    The overall effects of the earthquakes of January 24 and 26, 1980, at the Lawrence Livermore National Laboratory in northern California are outlined. The damage caused by those earthquakes and how employees responded are discussed. The immediate emergency actions taken by management and the subsequent measures to resume operations are summarized. Long-range plans for recovery and repair, the seismic history of the Livermore Valley region, various investigations concerning the design-basis earthquake (DBE), and seismic criteria for structures are reviewed. Following an analysis of the Laboratory's earthquake preparedness, emergency response, and related matters, a series of conclusions and recommendations are presented. Appendixes provide additional information, such as persons interviewed, seismic and site maps, and a summary of the estimated costs incurred from the earthquakes.

  10. DEVELOPMENT OF A MATHEMATICAL BASIS FOR RELATING SLUDGE PROPERTIES TO FGD-SCRUBBER OPERATING VARIABLES

    EPA Science Inventory

    The report gives results of research to investigate prospects for increasing the size of calcium sulfite sludge particles in flue gas desulfurization systems. The approach included four work packages: a literature survey and development of a mathematical basis for predicting calc...

  11. Martin Marietta Energy Systems, Inc. comprehensive earthquake management plan: Emergency Operations Center training manual

    SciTech Connect

    Not Available

    1990-02-28

    The objective of this training is to describe the responsibilities, resources, and goals of the Emergency Operations Center, and to enable staff to evaluate and interpret this information so as to best direct and allocate emergency, plant, and other resources to protect life and the Paducah Gaseous Diffusion Plant.

  12. Everyday Earthquakes.

    ERIC Educational Resources Information Center

    Svec, Michael

    1996-01-01

    Describes methods to access current earthquake information from the National Earthquake Information Center. Enables students to build genuine learning experiences using real data from earthquakes that have recently occurred. (JRH)

  13. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant. PMID:21038753

  14. Duration and predictors of emergency surgical operations - basis for medical management of mass casualty incidents

    PubMed Central

    2009-01-01

    Background Hospitals have a critically important role in the management of mass casualty incidents (MCI), yet there is little information to assist emergency planners. A significantly limiting factor of a hospital's capability to treat those affected is its surgical capacity. We therefore intended to provide data about the duration and predictors of life-saving operations. Methods The data of 20,815 predominantly blunt trauma patients recorded in the Trauma Registry of the German Trauma Society were retrospectively analyzed to calculate the duration of life-saving operations as well as their predictors. Inclusion criteria were an ISS ≥ 16 and the performance of relevant ICPM-coded procedures within 6 h of admission. Results From 1,228 patients fulfilling the inclusion criteria, 1,793 operations could be identified as life-saving operations. Acute injuries to the abdomen accounted for 54.1%, followed by head injuries (26.3%), pelvic injuries (11.5%), thoracic injuries (5.0%) and major amputations (3.1%). The mean cut-to-suture time was 130 min (IQR 65-165 min). Logistic regression revealed 8 variables associated with an emergency operation: AIS of abdomen ≥ 3 (OR 4.00), ISS ≥ 35 (OR 2.94), hemoglobin level ≤ 8 mg/dL (OR 1.40), pulse rate on hospital admission < 40 or > 120/min (OR 1.39), blood pressure on hospital admission < 90 mmHg (OR 1.35), prehospital infusion volume ≥ 2000 ml (OR 1.34), GCS ≤ 8 (OR 1.32) and anisocoria (OR 1.28) on-scene. Conclusions The mean operation time of 130 min calculated for emergency life-saving surgical operations provides a realistic guideline for the prospective treatment capacity, which can be estimated and projected into an actual incident admission capacity. Knowledge of predictive factors for life-saving emergency operations helps to identify those patients that need the most urgent operative treatment in case of a blunt MCI. PMID:20149987
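
    To see how the reported odds ratios might be combined into a screening score, here is a hypothetical sketch. The baseline odds value is an assumption, and multiplying odds ratios treats the predictors as independent on the logit scale, a simplification of the fitted regression.

        ODDS_RATIOS = {
            "AIS_abdomen>=3": 4.00, "ISS>=35": 2.94, "Hb<=8": 1.40,
            "pulse<40_or>120": 1.39, "SBP<90": 1.35,
            "prehospital_volume>=2000ml": 1.34, "GCS<=8": 1.32,
            "anisocoria": 1.28,
        }

        def predicted_probability(findings, baseline_odds=0.05):
            # Multiply baseline odds by the odds ratio of each positive
            # finding, then convert the resulting odds to a probability.
            odds = baseline_odds
            for finding in findings:
                odds *= ODDS_RATIOS[finding]
            return odds / (1 + odds)

        print(predicted_probability({"AIS_abdomen>=3", "ISS>=35", "SBP<90"}))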

  15. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-04-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for concepts of operations (ConOps) in advanced small modular reactor (aSMR) designs. In support of this objective, three important research areas were included: operating principles of multi-modular plants, functional allocation models and strategies that would affect the development of new, non-traditional concepts of operations, and the requirements for human performance, based upon work domain analysis and current regulatory requirements. As part of the approach for this report, we outline potential functions, including the theoretical and operational foundations for the development of a new functional allocation model and the identification of specific regulatory requirements that will influence the development of future concepts of operations. The report also highlights changes in research strategy prompted by confirmation of the importance of applying the work domain analysis methodology to a reference aSMR design. It describes how this methodology will enrich the findings from this phase of the project in the subsequent phases and help in the identification of metrics and focused studies for the determination of human performance criteria that can be used to support the design process.

  16. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  17. Experience in Construction and Operation of the Distributed Information Systems on the Basis of the Z39.50 Protocol

    NASA Astrophysics Data System (ADS)

    Zhizhimov, Oleg; Mazov, Nikolay; Skibin, Sergey

    Questions concerning the construction and operation of distributed information systems based on the ANSI/NISO Z39.50 Information Retrieval Protocol are discussed in the paper. The paper draws on the authors' experience in developing the ZooPARK server. The architecture of distributed information systems, questions of reliability of such systems, minimization of search time, and administration are examined. Problems encountered in developing distributed information systems are also described.

  18. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-08-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for Concepts of Operations in advanced small modular reactor (AdvSMR) designs. AdvSMRs are nuclear power plants (NPPs), but unlike conventional large NPPs that are constructed on site, AdvSMR systems and components will be fabricated in a factory and then assembled on site. AdvSMRs will also use advanced digital instrumentation and control systems, and make greater use of automation. Some AdvSMR designs also propose to be operated in a multi-unit configuration with a single central control room as a way to be more cost-competitive with existing NPPs. These differences from conventional NPPs not only pose technical and operational challenges, but they will undoubtedly also have regulatory compliance implications, especially with respect to staffing requirements and safety standards.

  19. A probabilistic risk assessment of the LLNL Plutonium Facility`s evaluation basis fire operational accident. Revision 1

    SciTech Connect

    Brumburgh, G. P.

    1995-02-27

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous programmatic activities involving plutonium, including device fabrication, development of improved and/or unique fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed in July 1994 to address operational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  20. Light storage in a tripod medium as a basis for logical operations

    NASA Astrophysics Data System (ADS)

    Słowik, K.; Raczyński, A.; Zaremba, J.; Zielińska-Kaniasty, S.

    2012-05-01

    A photon serving as the carrier of a polarization qubit is stored inside an atomic medium in the tripod configuration in the form of atomic excitations. Such stored information can be processed in the atomic memory and carried away by the released photon. An implementation is proposed of single-qubit gates (e.g., phase, NOT, √NOT, and Hadamard) as well as a two-qubit CNOT gate, operating on polarized photons and based on light storage.
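
    For reference, the textbook matrix forms of the single-qubit gates named here are given below (standard definitions, not this paper's specific photonic implementation); note that squaring √NOT yields NOT.

        \mathrm{NOT} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad
        \sqrt{\mathrm{NOT}} = \frac{1}{2}\begin{pmatrix} 1+i & 1-i \\ 1-i & 1+i \end{pmatrix}, \qquad
        H = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad
        S = \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix}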

  1. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1994-09-01

    For waste containing small amounts of radioactivity, rad waste (RW), or mixed waste (MW) containing both radioactive and chemically hazardous components, incineration is a logical management candidate because of inherent safety, waste volume reduction, and low costs. Successful operation requires that the facility be properly designed and operated to protect workers and to limit releases of hazardous materials. The large decrease in waste volume achieved by incineration also results in a higher concentration of most of the radionuclides and nonradioactive heavy metals in the ash products. These concentrations impact subsequent treatment and disposal. The various constituents (chemical elements) are not equal in concentration in the various incinerator feed materials, nor are they equal in their contribution to health risks on subsequent handling or accidental release. Thus, for management of the wastes it is important to be able to predict how the nuclides partition between the primary combustion residue, which may be an ash or a fused slag; the fine particulates or fly ash that is trapped in the burner off-gas by several different techniques; and the airborne fraction that escapes to the atmosphere. The objective of this report is to provide an estimate of how different elements of concern may behave in the chemical environment of the incinerator. The study briefly examines published incinerator operation data, then considers the properties of the elements of concern, and employs thermodynamic calculations to help predict the fate of these RW and MW constituents. Many types and configurations of incinerators have been designed and tested.

  2. Diagnostics of PF-1000 Facility Operation and Plasma Concentration on the Basis of Spectral Measurements

    SciTech Connect

    Skladnik-Sadowska, E.; Malinowski, K.; Sadowski, M. J.; Scholz, M.; Tsarenko, A. V.

    2006-01-15

    The paper concerns the monitoring of high-current pulse discharges and the determination of the plasma concentration within the dense magnetized plasma by means of optical spectroscopy methods. In experiments with the large PF-1000 facility operated at IPPLM in Warsaw, Poland, attention was paid to the determination of the operational mode and electron concentration under different experimental conditions. To measure the visible radiation (VR), use was made of the MECHELLE® 900 spectrometer equipped with a CCD readout. The VR emission, observed at 65° to the z-axis, originated from a part of the electrode surfaces, the collapsing current-sheath layer, and the dense plasma pinch region (40-50 mm from the electrode ends). Considerable differences were found in the optical spectra recorded for so-called 'good shots' and for cases of some failures. Estimates of the electron concentration, performed with different spectroscopic techniques, showed that it ranged from 5.56×10¹⁸ cm⁻³ to 4.8×10¹⁹ cm⁻³, depending on experimental conditions. The correlation of the fusion-neutron yield and the plasma density was demonstrated.

  3. Waste Encapsulation and Storage Facility (WESF) Basis for Interim Operation (BIO)

    SciTech Connect

    COVEY, L.I.

    2000-11-28

    The Waste Encapsulation and Storage Facility (WESF) is located in the 200 East Area adjacent to B Plant on the Hanford Site north of Richland, Washington. The current WESF mission is to receive and store the cesium and strontium capsules that were manufactured at WESF in a safe manner and in compliance with all applicable rules and regulations. The scope of WESF operations is currently limited to receipt, inspection, decontamination, storage, and surveillance of capsules in addition to facility maintenance activities. The capsules are expected to be stored at WESF until the year 2017, at which time they will have been transferred for ultimate disposition. The WESF facility was designed and constructed to process, encapsulate, and store the extracted long-lived radionuclides, ⁹⁰Sr and ¹³⁷Cs, from wastes generated during the chemical processing of defense fuel on the Hanford Site, thus ensuring isolation of hazardous radioisotopes from the environment. The construction of WESF started in 1971 and was completed in 1973. Some of the ¹³⁷Cs capsules were leased by private irradiators or transferred to other programs. All leased capsules have been returned to WESF. Capsules transferred to other programs will not be returned except for the seven powder and pellet Type W overpacks already stored at WESF.

  4. Modeling of the Reactor Core Isolation Cooling Response to Beyond Design Basis Operations - Interim Report

    SciTech Connect

    Ross, Kyle; Cardoni, Jeffrey N.; Wilson, Chisom Shawn; Morrow, Charles; Osborn, Douglas; Gauntt, Randall O.

    2015-12-01

    Efforts are being pursued to develop and qualify a system-level model of a reactor core isolation cooling (RCIC) steam-turbine-driven pump. The model is being developed with the intent of employing it to inform the design of experimental configurations for full-scale RCIC testing. The model is expected to be especially valuable in sizing equipment needed in the testing. An additional intent is to use the model to understand more fully how RCIC apparently managed to operate far removed from its design envelope in the Fukushima Daiichi Unit 2 accident. RCIC modeling is proceeding along two avenues that are expected to complement each other well. The first avenue is the continued development of the system-level RCIC model that will serve in simulating a full reactor system or full experimental configuration of which a RCIC system is part. The model reasonably represents a RCIC system today, especially given design operating conditions, but lacks specifics that are likely important in representing the off-design conditions a RCIC system might experience in an emergency situation such as a loss of all electrical power. A known specific deficiency in the system model, for example, is the efficiency with which a flashing slug of water (as opposed to a concentrated jet of steam) could propel the rotating drive wheel of a RCIC turbine. To address this deficiency, the second avenue is being pursued, wherein computational fluid dynamics (CFD) analyses of such a jet are being carried out. The results of the CFD analyses will thus complement and inform the system modeling. The system modeling will, in turn, complement the CFD analysis by providing the system information needed to impose appropriate boundary conditions on the CFD simulations. The system model will be used to inform the selection of configurations and equipment best suited to support planned RCIC experimental testing. Preliminary investigations with the RCIC model indicate that liquid water ingestion by the turbine

  5. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small events; 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a housing unit (and consequently its insurance premium) is assessed on the basis of the location of the unit (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.
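
    The tariff arithmetic implied by the abstract is simple rate-times-sum-insured; the sketch below back-calculates the sum insured from the quoted US$90 premium and 0.13% rate. The function name and the back-calculation are ours, for illustration only.

        def annual_premium(sum_insured, rate):
            # Premium = tariff rate x sum insured; the rate varies by
            # earthquake zone and structure type (0.13% is the abstract's
            # figure for reinforced concrete in the most hazardous zone).
            return rate * sum_insured

        # US$90 / 0.0013 ~ US$69,000 insured value for the quoted example,
        # below the US$90,000 per-house coverage cap.
        print(annual_premium(sum_insured=69_000, rate=0.0013))  # ~89.7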

  6. Earthquake prediction

    SciTech Connect

    Ma, Z.; Fu, Z.; Zhang, Y.; Wang, C.; Zhang, G.; Liu, D.

    1989-01-01

    Mainland China is situated at the eastern edge of the Eurasian seismic system and is the largest intra-continental region of shallow strong earthquakes in the world. Based on nine earthquakes with magnitudes ranging between 7.0 and 7.9, the book provides observational data and discusses successes and failures of earthquake prediction. Observations of various phenomena and seismic activity occurring before and after these individual earthquakes led to the establishment of some general characteristics valid for earthquake prediction.

  7. Hidden Earthquakes.

    ERIC Educational Resources Information Center

    Stein, Ross S.; Yeats, Robert S.

    1989-01-01

    Points out that large earthquakes can take place not only on faults that cut the earth's surface but also on blind faults under folded terrain. Describes four examples of fold earthquakes. Discusses the fold earthquakes using several diagrams and pictures. (YP)

  8. Debriefing of American Red Cross personnel: pilot study on participants' evaluations and case examples from the 1994 Los Angeles earthquake relief operation.

    PubMed

    Armstrong, K; Zatzick, D; Metzler, T; Weiss, D S; Marmar, C R; Garma, S; Ronfeldt, H; Roepke, L

    1998-01-01

    The Multiple Stressor Debriefing (MSD) model was used to debrief 112 American Red Cross workers individually or in groups after their participation in the 1994 Los Angeles earthquake relief effort. Two composite case examples are presented that illustrate individual and group debriefings using the MSD model. A questionnaire evaluating workers' experience of the debriefing was completed by 95 workers. Results indicated that workers evaluated the debriefings in which they participated positively. In addition, as the participant-to-facilitator ratio increased, workers shared less of their feelings and reactions about the disaster relief operation. These findings, as well as more specific issues about debriefing, are discussed. PMID:9579015

  9. Hidden earthquakes

    SciTech Connect

    Stein, R.S.; Yeats, R.S.

    1989-06-01

    Seismologists generally look for earthquakes to happen along visible fault lines, e.g., the San Andreas fault. The authors maintain that another source of dangerous quakes has been overlooked: the release of stress along a fault that is hidden under a fold in the earth's crust. The paper describes the differences between an earthquake that occurs on a visible fault and one that occurs under an anticline, and warns that Los Angeles' greatest earthquake threat may come from a small quake originating under downtown Los Angeles rather than from a larger earthquake occurring 50 miles away at the San Andreas fault.

  10. Earthquakes: A Teacher's Package for K-6.

    ERIC Educational Resources Information Center

    National Science Teachers Association, Washington, DC.

    Like rain, an earthquake is a natural occurrence which may be mild or catastrophic. Although an earthquake may last only a few seconds, the processes that cause it have operated within the earth for millions of years. Until recently, the cause of earthquakes was a mystery and the subject of fanciful folklore to people all around the world. This…

  11. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  12. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  13. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  14. Deep Earthquakes.

    ERIC Educational Resources Information Center

    Frohlich, Cliff

    1989-01-01

    Summarizes research to find the nature of deep earthquakes occurring hundreds of kilometers down in the earth's mantle. Describes further research problems in this area. Presents several illustrations and four references. (YP)

  15. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  16. Proposed plan/Statement of basis for the Grace Road Site (631-22G) operable unit: Final action

    SciTech Connect

    Palmer, E.

    1997-08-19

    This Statement of Basis/Proposed Plan is being issued by the U.S. Department of Energy (DOE), which functions as the lead agency for Savannah River Site (SRS) remedial activities, with concurrence by the U.S. Environmental Protection Agency (EPA) and the South Carolina Department of Health and Environmental Control (SCDHEC). The purpose of this Statement of Basis/Proposed Plan is to describe the preferred alternative for addressing the Grace Road site (GRS) at SRS in Aiken, South Carolina, and to provide an opportunity for public input into the remedial action selection process.

  17. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable year which... made upon the final determination of the rate of absorption applicable to the taxable year....

  18. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  19. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  20. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  1. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  2. Earthquakes for Kids

    MedlinePlus


  3. Two grave issues concerning the expected Tokai Earthquake

    NASA Astrophysics Data System (ADS)

    Mogi, Kiyoo

    2004-08-01

    The possibility of a great shallow earthquake (M 8) in the Tokai region, central Honshu, in the near future was pointed out by Mogi in 1969 and by the Coordinating Committee for Earthquake Prediction (CCEP), Japan (1970). In 1978, the government enacted the Large-Scale Earthquake Countermeasures Law and began to set up intensified observations in this region for short-term prediction of the expected Tokai earthquake. In this paper, two serious issues are pointed out, which may contribute to catastrophic effects in connection with the Tokai earthquake: 1. The danger of black-and-white predictions: According to the scenario based on the Large-Scale Earthquake Countermeasures Law, if abnormal crustal changes are observed, the Earthquake Assessment Committee (EAC) will determine whether or not there is an imminent danger. The findings are reported to the Prime Minister who decides whether to issue an official warning statement. Administrative policy clearly stipulates the measures to be taken in response to such a warning, and because the law presupposes the ability to predict a large earthquake accurately, there are drastic measures appropriate to the situation. The Tokai region is a densely populated region with high social and economic activity, and it is traversed by several vital transportation arteries. When a warning statement is issued, all transportation is to be halted. The Tokyo capital region would be cut off from the Nagoya and Osaka regions, and there would be a great impact on all of Japan. I (the former chairman of EAC) maintained that in view of the variety and complexity of precursory phenomena, it was inadvisable to attempt a black-and-white judgment as the basis for a "warning statement". I urged that the government adopt a "soft warning" system that acknowledges the uncertainty factor and that countermeasures be designed with that uncertainty in mind. 2. The danger of nuclear power plants in the focal region: Although the possibility of the

  4. A computational method for solving stochastic Itô–Volterra integral equations based on stochastic operational matrix for generalized hat basis functions

    SciTech Connect

    Heydari, M.H.; Hooshmandasl, M.R.; Maalek Ghaini, F.M.; Cattani, C.

    2014-08-01

    In this paper, a new computational method based on the generalized hat basis functions is proposed for solving stochastic Itô–Volterra integral equations. In this way, a new stochastic operational matrix for generalized hat functions on the finite interval [0,T] is obtained. By using these basis functions and their stochastic operational matrix, such problems can be transformed into linear lower triangular systems of algebraic equations which can be directly solved by forward substitution. Also, the rate of convergence of the proposed method is considered, and it is shown to be O(1/n²). Further, in order to show the accuracy and reliability of the proposed method, the new approach is compared with the block pulse functions method by some examples. The obtained results reveal that the proposed method is more accurate and efficient than the block pulse functions method.
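
    For orientation, the standard hat basis functions on [0, T] with step h = T/n are given below; the paper's generalized variant may differ in detail.

        \psi_0(t) = \begin{cases} \dfrac{h - t}{h}, & 0 \le t < h, \\ 0, & \text{otherwise}, \end{cases}
        \qquad
        \psi_i(t) = \begin{cases} \dfrac{t - (i-1)h}{h}, & (i-1)h \le t < ih, \\ \dfrac{(i+1)h - t}{h}, & ih \le t < (i+1)h, \\ 0, & \text{otherwise}, \end{cases}

    for i = 1, ..., n-1, with a mirrored ramp at t = T, so that any f ∈ L²[0, T] is approximated by f(t) ≈ Σ_{i=0}^{n} f(ih) ψ_i(t).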

  5. A Schauder and Riesz basis criterion for non-self-adjoint Schrödinger operators with periodic and antiperiodic boundary conditions

    NASA Astrophysics Data System (ADS)

    Gesztesy, Fritz; Tkachenko, Vadim

    Under the assumption that V ∈ L²([0,π]; dx), we derive necessary and sufficient conditions in terms of spectral data for (non-self-adjoint) Schrödinger operators -d²/dx² + V in L²([0,π]; dx) with periodic and antiperiodic boundary conditions to possess a Riesz basis of root vectors (i.e., eigenvectors and generalized eigenvectors spanning the range of the Riesz projection associated with the corresponding periodic and antiperiodic eigenvalues). We also discuss the case of a Schauder basis for periodic and antiperiodic Schrödinger operators -d²/dx² + V in L^p([0,π]; dx), p ∈ (1,∞).

  6. The parkfield, california, earthquake prediction experiment.

    PubMed

    Bakun, W H; Lindh, A G

    1985-08-16

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment. PMID:17739363
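
    The forecast window follows from simple recurrence arithmetic (our gloss, not stated in the abstract): reading the five events since 1857 as five recurrence intervals ending in 1966 gives

        \bar{\tau} \approx \frac{1966 - 1857}{5} \approx 22\ \text{yr}, \qquad
        t_{\text{next}} \approx 1966 + 22 \approx 1988,

    with the quoted 1993 date corresponding to the upper end of the confidence window.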

  7. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways...

  8. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways...

  9. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated...

  10. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated...

  11. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FAIR LABOR STANDARDS ACT Employment in Agriculture or Irrigation That Is Exempted From the Overtime Pay Requirements Under Section 13(b)(12) The Irrigation Exemption § 780.407 System must be nonprofit or operated on... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways...

  12. Postseismic Transient after the 2002 Denali Fault Earthquake from VLBI Measurements at Fairbanks

    NASA Technical Reports Server (NTRS)

    MacMillan, Daniel; Cohen, Steven

    2004-01-01

    The VLBI antenna (GILCREEK) at Fairbanks, Alaska, observes routinely twice a week in operational networks and on additional days with other networks on a more uneven basis. The Fairbanks antenna position is about 150 km north of the Denali fault and a comparable distance from the earthquake epicenter. We examine the transient behavior of the estimated VLBI position during the year following the earthquake to determine how the rate of postseismic deformation has changed. This is compared with what is seen in the GPS site position series.
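
    A standard way to quantify such a transient is to fit the position series with a secular rate plus a logarithmic relaxation term. The sketch below uses synthetic data; this parameterization is a common geodetic choice, not necessarily the authors' model.

        import numpy as np
        from scipy.optimize import curve_fit

        def postseismic(t, x0, v, c, tau):
            # Position vs. time after the mainshock: offset + secular
            # rate + logarithmic afterslip transient.
            return x0 + v * t + c * np.log1p(t / tau)

        # t: days since the earthquake; x: synthetic daily positions (mm).
        t = np.linspace(1, 365, 120)
        rng = np.random.default_rng(0)
        x = postseismic(t, 0.0, 0.01, 5.0, 30.0) + rng.normal(0, 0.3, t.size)
        params, _ = curve_fit(postseismic, t, x, p0=(0.0, 0.0, 1.0, 20.0))

    Comparing the fitted rate and transient amplitude with the GPS series is then straightforward.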

  13. United States earthquakes, 1984

    SciTech Connect

    Stover, C.W.

    1988-01-01

    The report contains information for earthquakes in the 50 states and Puerto Rico and the area near their shorelines. The data consist of earthquake locations (date, time, geographic coordinates, depth, and magnitudes), intensities, macroseismic information, and isoseismal and seismicity maps. Also included are sections detailing the activity of seismic networks operated by universities and other government agencies, and a list of results from strong-motion seismograph records.

  14. The Effects of Degraded Digital Instrumentation and Control Systems on Human-system Interfaces and Operator Performance: HFE Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.; W. Gunther, G. Martinez-Guridi

    2010-02-26

    New and advanced reactors will use integrated digital instrumentation and control (I&C) systems to support operators in their monitoring and control functions. Even though digital systems are typically highly reliable, their potential for degradation or failure could significantly affect operator performance and, consequently, impact plant safety. The U.S. Nuclear Regulatory Commission (NRC) supported this research project to investigate the effects of degraded I&C systems on human performance and plant operations. The objective was to develop human factors engineering (HFE) review guidance addressing the detection and management of degraded digital I&C conditions by plant operators. We reviewed pertinent standards and guidelines, empirical studies, and plant operating experience. In addition, we conducted an evaluation of the potential effects of selected failure modes of the digital feedwater system on human-system interfaces (HSIs) and operator performance. The results indicated that I&C degradations are prevalent in plants employing digital systems and the overall effects on plant behavior can be significant, such as causing a reactor trip or causing equipment to operate unexpectedly. I&C degradations can impact the HSIs used by operators to monitor and control the plant. For example, sensor degradations can make displays difficult to interpret and can sometimes mislead operators by making it appear that a process disturbance has occurred. We used the information obtained as the technical basis upon which to develop HFE review guidance. The guidance addresses the treatment of degraded I&C conditions as part of the design process and the HSI features and functions that support operators to monitor I&C performance and manage I&C degradations when they occur. In addition, we identified topics for future research.

  15. America's faulty earthquake plans

    SciTech Connect

    Rosen, J

    1989-10-01

    In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States, as well as the level of preparedness of each region of the U.S. for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.

  16. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide. PMID:22410538

  17. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at the following three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  18. Connecting slow earthquakes to huge earthquakes

    NASA Astrophysics Data System (ADS)

    Obara, Kazushige; Kato, Aitaro

    2016-07-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  19. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. PMID:27418504

  20. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy of countermeasures against earthquake disasters, including earthquake hazard evaluations. Before March 11, Japanese earthquake hazard evaluation was based on the history of earthquakes that repeatedly occur and on the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using the renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of damages by a hypothetical Mw 9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using present techniques, given the diversity of earthquake phenomena. These reports created sensations throughout the country, and local governments are struggling to prepare countermeasures. The reports noted the large uncertainty in their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including authors, are involved in

  1. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  2. Development of information-and-control systems as a basis for modernizing the automated process control systems of operating power equipment

    NASA Astrophysics Data System (ADS)

    Shapiro, V. I.; Borisova, E. V.; Chausov, Yu. N.

    2014-03-01

    The main drawbacks inherent in the hardware of outdated control systems of power stations are discussed. It is shown that economically efficient and reliable operation of the process equipment will be impossible if some of these control systems remain in service. It is pointed out that fully retrofitting outdated control systems on operating equipment in one step, with all technical facilities and cable connections replaced by a modern computerized automation system, involves certain difficulties when the work must be carried out with limited financial resources or within a limited period of time. A version of control system modernization is suggested that involves replacement of the most severely worn and outdated equipment (indicating and recording instruments, and local controllers) while retaining the existing cable routes and layout of board facilities. The modernization implies development of information-and-control systems constructed on the basis of a unified computerized automation system. Software and hardware products that have positively proven themselves in thermal power engineering are proposed for developing such an automation system. It is demonstrated that the proposed system has considerable potential for functional development and can become a basis for constructing a fully functional automated process control system.

  3. Stronger direction needed for the National Earthquake Program

    NASA Astrophysics Data System (ADS)

    1983-07-01

    The National Earthquake Hazards Reduction Program was established to mitigate the impact of earthquakes on communities. Emphasis is placed on: (1) the Federal Emergency Management Agency's (FEMA's) efforts to carry out its lead agency responsibilities for the program; (2) assistance provided to State and local governments in mitigating earthquake hazards; and (3) progress toward developing an operational earthquake prediction system.

  4. Earthquake Archaeology: a logical approach?

    NASA Astrophysics Data System (ADS)

    Stewart, I. S.; Buck, V. A.

    2001-12-01

    Ancient earthquakes can leave their mark in the mythical and literary accounts of ancient peoples, the stratigraphy of their site histories, and the structural integrity of their constructions. Within this broad cross-disciplinary tramping ground, earthquake geologists have tended to focus on those aspects of the cultural record that are most familiar to them: the physical effects of seismic deformation on ancient constructions. One of the core difficulties with this 'earthquake archaeology' approach is that recent attempts to isolate structural criteria that are diagnostic or strongly suggestive of a seismic origin are undermined by the recognition that signs of ancient seismicity are generally indistinguishable from those of non-seismic mechanisms (poor construction, adverse geotechnical conditions). We illustrate the difficulties and inconsistencies in currently proposed 'earthquake diagnostic' schemes by reference to two case studies of archaeoseismic damage in central Greece. The first concerns fallen columns at various Classical temple localities in mainland Greece (Nemea, Sounio, Olympia, Bassai) which, on the basis of observed structural criteria, are earthquake-induced but which are alternatively explained by archaeologists as the action of human disturbance. The second re-examines the almost type example of the Kyparissi site in the Atalanti region as a Classical stoa offset across a seismic surface fault, arguing instead for its deformation by ground instability. Finally, in highlighting the inherent ambiguity of archaeoseismic data, we consider the value of a logic-tree approach for qualifying and quantifying our uncertainties for seismic-hazard analysis.

  5. On subduction zone earthquakes and the Pacific Northwest seismicity

    SciTech Connect

    Chung, Dae H.

    1991-12-01

    A short review of subduction zone earthquakes and the seismicity of the Pacific Northwest region of the United States is provided as a basis for assessing issues related to earthquake hazard evaluations for the region. This brief study reviews the seismotectonics of historical subduction zone earthquakes and more recent seismological studies of the rupture processes of subduction zone earthquakes, with specific reference to the Pacific Northwest. Subduction zone earthquakes tend to rupture updip and laterally from the hypocenter. Thus, the rupture surface tends to become more elongated as one considers larger earthquakes (there is a limited updip distance that is strongly coupled, whereas rupture length can be quite large). The great Aleutian-Alaska earthquakes of 1957, 1964, and 1965 had rupture lengths greater than 650 km. The largest earthquake observed instrumentally, the Mw 9.5 1960 Chile earthquake, had a rupture length of over 1000 km. However, earthquakes of this magnitude are very unlikely on Cascadia. The degree of surface shaking depends strongly on the depth and style of rupture. The rupture surface during a great earthquake shows heterogeneous stress drop, displacement, energy release, etc. The high-strength zones are traditionally termed asperities, and these asperities control when and how large an earthquake is generated. Mapping these asperities in specific subduction zones is very difficult before an earthquake; they show up more easily in inversions of dynamic source studies of earthquake ruptures, after an earthquake. Because seismic moment is based on the total radiated energy of an earthquake, the moment-based magnitude Mw is superior to all other magnitude estimates, such as ML, mb, MbLg, Ms, etc. Probably, just to have a common language, non-moment magnitudes should be converted to Mw in any discussion of subduction zone earthquakes.
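
    The closing recommendation, converting all magnitude types to Mw for a common language, can be illustrated with published empirical regressions. Below is a minimal sketch using the global Ms-to-Mw and mb-to-Mw relations of Scordilis (2006); the coefficients and validity ranges come from that paper, not from the record above.

    ```python
    def to_moment_magnitude(value: float, scale: str) -> float:
        """Convert a surface-wave (Ms) or body-wave (mb) magnitude to Mw.

        Coefficients follow the global regressions of Scordilis (2006);
        outside the stated validity ranges the conversion would be an
        extrapolation, so a ValueError is raised instead.
        """
        if scale == "Ms":
            if 3.0 <= value <= 6.1:
                return 0.67 * value + 2.07
            if 6.2 <= value <= 8.2:
                return 0.99 * value + 0.08
            raise ValueError("Ms outside calibrated range 3.0-8.2")
        if scale == "mb":
            if 3.5 <= value <= 6.2:
                return 0.85 * value + 1.03
            raise ValueError("mb outside calibrated range 3.5-6.2")
        raise ValueError(f"unsupported scale: {scale}")

    # Example: a reported Ms 7.5 subduction event expressed as Mw (~7.5)
    print(round(to_moment_magnitude(7.5, "Ms"), 2))
    ```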

  6. Response to “Comment on ‘Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set’” [J. Chem. Phys. 140, 177103 (2014)]

    SciTech Connect

    Reuter, Matthew G.; Harrison, Robert J.

    2014-05-07

    The thesis of Brandbyge's comment [J. Chem. Phys. 140, 177103 (2014)] is that our operator decoupling condition is immaterial to transport theories, and it appeals to discussions of nonorthogonal basis sets in transport calculations in its arguments. We maintain that the operator condition is to be preferred over the usual matrix conditions and subsequently detail problems in the existing approaches. From this operator perspective, we conclude that nonorthogonal projectors cannot be used and that the projectors must be selected to satisfy the operator decoupling condition. Because these conclusions pertain to operators, the choice of basis set is not germane.

  7. EARTHQUAKE HAZARDS IN THE OFFSHORE ENVIRONMENT.

    USGS Publications Warehouse

    Page, Robert A.; Basham, Peter W.

    1985-01-01

    This report discusses earthquake effects and potential hazards in the marine environment, describes and illustrates methods for the evaluation of earthquake hazards, and briefly reviews strategies for mitigating hazards. The report is broadly directed toward engineers, scientists, and others engaged in developing offshore resources. The continental shelves have become a major frontier in the search for new petroleum resources. Much of the current exploration is in areas of moderate to high earthquake activity. If the resources in these areas are to be developed economically and safely, potential earthquake hazards must be identified and mitigated both in planning and regulating activities and in designing, constructing, and operating facilities. Geologic earthquake effects that can be hazardous to marine facilities and operations include surface faulting, tectonic uplift and subsidence, seismic shaking, sea-floor failures, turbidity currents, and tsunamis.

  8. Chern-Simons gravity with (curvature)² and (torsion)² terms and a basis of degree-of-freedom projection operators

    SciTech Connect

    Helayeel-Neto, J. A.; Hernaski, C. A.; Pereira-Dias, B.; Vargas-Paredes, A. A.; Vasquez-Otoya, V. J.

    2010-09-15

    The effects of (curvature)²- and (torsion)²-terms in the Einstein-Hilbert-Chern-Simons Lagrangian are investigated. The purposes are two-fold: (i) to show the efficacy of a recently proposed orthogonal basis of degree-of-freedom projection operators and to ascertain its adequacy for obtaining propagators of general parity-breaking gravity models in three dimensions; (ii) to analyze the role of the topological Chern-Simons term for the unitarity and the particle spectrum of the model with squared-curvature terms in connection with dynamical torsion. Our conclusion is that the Chern-Simons term does not influence the unitarity conditions imposed on the parameters of the Lagrangian but significantly modifies the particle spectrum.

  9. Automated Microwave Complex on the Basis of a Continuous-Wave Gyrotron with an Operating Frequency of 263 GHz and an Output Power of 1 kW

    NASA Astrophysics Data System (ADS)

    Glyavin, M. Yu.; Morozkin, M. V.; Tsvetkov, A. I.; Lubyako, L. V.; Golubiatnikov, G. Yu.; Kuftin, A. N.; Zapevalov, V. E.; Kholoptsev, V. V.; Eremeev, A. G.; Sedov, A. S.; Malygin, V. I.; Chirkov, A. V.; Fokin, A. P.; Sokolov, E. V.; Denisov, G. G.

    2016-02-01

    We study experimentally the automated microwave complex for microwave spectroscopy and diagnostics of various media, which was developed at the Institute of Applied Physics of the Russian Academy of Sciences in cooperation with GYCOM Ltd. on the basis of a gyrotron with a frequency of 263 GHz operated at the first gyrofrequency harmonic. In the experiments, a controllable output power of 0.1-1 kW was achieved with an efficiency of up to 17% in the continuous-wave generation regime. The measured radiation spectrum, with a relative width of about 10⁻⁶, and the frequency values measured at various parameters of the device are presented. The results of measuring the parameters of the wave beam formed by a built-in quasioptical converter, as well as the data obtained by measuring the heat loss in the cavity and the vacuum output window, are analyzed.

  10. Evaluation of near-field earthquake effects

    SciTech Connect

    Shrivastava, H.P.

    1994-11-01

    Structures and equipment that are qualified for the design basis earthquake (DBE) and have anchorage designed for DBE loading do not require an evaluation of near-field earthquake (NFE) effects. However, safety class 1 acceleration-sensitive equipment, such as electrical relays, must be evaluated for both the NFE and the DBE, since such equipment is known to malfunction when excited by high-frequency seismic motions.

  11. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions. PMID:2347628

  12. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information for the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever someone's own location is, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer the need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? What are the earthquakes that really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most for the public (and authorities). They are the ones of societal importance even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected
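
    The collation of tsunamigenic, potentially damaging, and felt threads described above can be sketched as a simple notification predicate. The sketch below is hypothetical: the field names, threshold, and criteria are illustrative assumptions, not EMSC's actual implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class EventReport:
        magnitude: float
        tsunami_message: bool   # e.g., a PTWC alert/information message was issued
        impact_assessed: bool   # flagged as potentially damaging by an impact system
        felt_reports: int       # eyewitness reports collected so far

    def should_notify(ev: EventReport, felt_threshold: int = 50) -> bool:
        """Notify for earthquakes that matter to people, regardless of
        magnitude: tsunamigenic, potentially damaging, or widely felt.
        (Hypothetical criteria, for illustration only.)
        """
        return (ev.tsunami_message
                or ev.impact_assessed
                or ev.felt_reports >= felt_threshold)

    # A small but widely felt event triggers a notification; a remote,
    # unfelt M5.5 does not.
    print(should_notify(EventReport(3.9, False, False, 120)))  # True
    print(should_notify(EventReport(5.5, False, False, 3)))    # False
    ```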

  13. Stress Drops for Potentially Induced Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

    Stress drop, the difference between shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motion. Hough [2014 and 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes and interpreted them as a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects. Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). The effects of both path and linear site response should be cancelled out through the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas. The earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to those of tectonic earthquakes at Parkfield, California (the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
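
    For context, the stress-drop computation that follows from a corner-frequency estimate (such as one obtained from the eGf spectral-ratio analysis above) typically uses the Brune (1970) circular-crack model, with source radius r = 0.37β/fc and Δσ = (7/16)·M0/r³. A minimal sketch, with illustrative input values rather than Guy-Greenbrier results:

    ```python
    def brune_stress_drop(mw: float, fc_hz: float, beta_m_s: float = 3500.0) -> float:
        """Stress drop (Pa) from moment magnitude and corner frequency,
        assuming a Brune (1970) circular source (constant k = 0.37).
        """
        m0 = 10 ** (1.5 * mw + 9.1)        # seismic moment, N*m (Hanks & Kanamori)
        r = 0.37 * beta_m_s / fc_hz        # source radius, m
        return (7.0 / 16.0) * m0 / r ** 3  # stress drop, Pa

    # Illustrative only: an Mw 3.0 event with a 10 Hz corner frequency
    # gives a stress drop of roughly 8 MPa.
    print(brune_stress_drop(3.0, 10.0) / 1e6, "MPa")
    ```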

  14. A Simplified Approach to the Basis Functions of Symmetry Operations and Terms of Metal Complexes in an Octahedral Field with d¹ to d⁹ Configurations

    ERIC Educational Resources Information Center

    Lee, Liangshiu

    2010-01-01

    The basis sets for symmetry operations of d¹ to d⁹ complexes in an octahedral field and the resulting terms are derived for the ground states and spin-allowed excited states. The basis sets are of fundamental importance in group theory. This work addresses such a fundamental issue, and the results are pedagogically…

  15. Material contrast does not predict earthquake rupture propagation direction

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    2005-01-01

    Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

  16. Geodetic measurement of deformation in the Loma Prieta, California earthquake with Very Long Baseline Interferometry (VLBI)

    SciTech Connect

    Clark, T.A.; Ma, C.; Sauber, J.M.; Ryan, J.W.; Gordon, D.; Caprette, D.S.; Shaffer, D.B.; Vandenberg, N.R.

    1990-07-01

    Following the Loma Prieta earthquake, two mobile Very Long Baseline Interferometry (VLBI) systems operated by the NASA Crustal Dynamics Project and the NOAA National Geodetic Survey were deployed at three previously established VLBI sites in the earthquake area: Fort Ord (near Monterey), the Presidio (in San Francisco), and Point Reyes. From repeated VLBI occupations of these sites since 1983, the pre-earthquake rates of deformation have been determined with respect to a North American reference frame with 1σ formal standard errors of ~1 mm/yr. The VLBI measurements immediately following the earthquake showed that the Fort Ord site was displaced 49 ± 4 mm at an azimuth of 11 ± 4° and that the Presidio site was displaced 12 ± 5 mm at an azimuth of 148 ± 13°. No anomalous change was detected at Point Reyes, with a 1σ uncertainty of 4 mm. The estimated displacements at Fort Ord and the Presidio are consistent with the static displacements predicted on the basis of a coseismic slip model in which slip on the southern segment is shallower than slip on the more northern segment of the fault rupture. The authors also give the Cartesian positions at epoch 1990.0 of a set of VLBI fiducial stations and the three mobile sites in the vicinity of the earthquake.

  17. Guidelines for earthquake ground motion definition for the eastern United States

    SciTech Connect

    Gwaltney, R.C.; Aramayo, G.A.; Williams, R.T.

    1985-01-01

    Guidelines for the determination of earthquake ground-motion definition for the eastern United States are established in this paper. Both far-field and near-field guidelines are given. The guidelines were based on an extensive review of the current procedures for specifying ground motion in the United States. Both empirical and theoretical procedures were used in establishing the guidelines because of the low seismicity in the eastern United States. Only a few large to great (M > 7.5) earthquakes have occurred in this region, no evidence of tectonic surface rupture related to historic or Holocene earthquakes has been found, and no currently active plate boundaries of any kind are known in this region. Very little instrumental data have been gathered in the East. Theoretical procedures are proposed so that in regions with almost no data a reasonable level of seismic ground-motion activity can be assumed. The guidelines are to be used to develop the Safe Shutdown Earthquake, SSE. A new procedure for establishing the Operating Basis Earthquake, OBE, is proposed, in particular for the eastern United States. The OBE would be developed using a probabilistic assessment of the geological conditions and the recurrence of seismic events at a site. These guidelines should be useful in the development of seismic design requirements for future reactors. 17 refs., 2 figs., 1 tab.
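
    The probabilistic OBE assessment proposed above rests on standard Poisson recurrence arithmetic. A minimal sketch, assuming Poisson occurrence of exceedances and purely illustrative return-period and plant-life values:

    ```python
    import math

    def exceedance_probability(return_period_yr: float, exposure_yr: float) -> float:
        """Probability of at least one exceedance in `exposure_yr` years,
        assuming Poisson occurrence with mean rate 1/return_period_yr.
        """
        rate = 1.0 / return_period_yr
        return 1.0 - math.exp(-rate * exposure_yr)

    # Illustrative: a ground-motion level with a 500-year return period has
    # roughly an 8% chance of being exceeded over a 40-year plant life.
    print(round(exceedance_probability(500.0, 40.0), 3))
    ```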

  18. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
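
    The 1974 record does not spell out the detection algorithm. The short-term-average/long-term-average (STA/LTA) energy ratio is the classic approach to this kind of automatic triggering, so a sketch is given under that assumption; it is not necessarily the algorithm of the Menlo Park system.

    ```python
    import numpy as np

    def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
        """Classic short-term/long-term average ratio of signal energy.

        trace : 1-D seismogram samples; fs : sampling rate in Hz.
        Returns the ratio per sample (zero where the LTA window is not full).
        """
        nsta, nlta = int(sta_s * fs), int(lta_s * fs)
        energy = np.asarray(trace, dtype=float) ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        ratio = np.zeros(len(energy))
        for i in range(nlta, len(energy)):
            sta = (csum[i + 1] - csum[i + 1 - nsta]) / nsta
            lta = (csum[i + 1] - csum[i + 1 - nlta]) / nlta
            ratio[i] = sta / lta if lta > 0 else 0.0
        return ratio

    # Declare a detection where the ratio crosses a tuned threshold, e.g. 4.0:
    # onsets = np.flatnonzero(sta_lta(seismogram, fs=100.0) > 4.0)
    ```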

  19. EARTHQUAKE CAUSED RELEASES FROM A NUCLEAR FUEL CYCLE FACILITY

    SciTech Connect

    Charles W. Solbrig; Chad Pope; Jason Andrus

    2014-08-01

    The Fuel Cycle Facility (FCF) at the Idaho National Laboratory is a nuclear facility that must be licensed in order to operate, and a safety analysis is required for a license. This paper describes the analysis of the Design Basis Accident for the facility. The analysis involves a model of the transient behavior of the FCF inert-atmosphere hot cell following an earthquake-initiated breach of pipes passing through the cell boundary. The hot cell is used to process spent metallic nuclear fuel. Such breaches allow the introduction of air and the subsequent burning of pyrophoric metals. The model predicts the pressure, temperature, volumetric releases, cell heat transfer, metal fuel combustion, heat generation rates, radiological releases, and other quantities. The results show that releases from the cell are minimal and acceptable from a safety standpoint. This analysis method should be useful for other facilities with potential for earthquake damage and could eliminate the need to backfit facilities with earthquake-proof boundaries or lessen the cost of new facilities.

  20. From Earthquake Prediction Research to Time-Variable Seismic Hazard Assessment Applications

    NASA Astrophysics Data System (ADS)

    Bormann, Peter

    2011-01-01

    The first part of the paper defines the terms and classifications common in earthquake prediction research and applications. This is followed by short reviews of major earthquake prediction programs initiated since World War II in several countries, for example the former USSR, China, Japan, the United States, and several European countries. It outlines the underlying expectations, concepts, and hypotheses, introduces the technologies and methodologies applied and some of the results obtained, which include both partial successes and failures. Emphasis is laid on discussing the scientific reasons why earthquake prediction research is so difficult and demanding and why the prospects are still so vague, at least as far as short-term and imminent predictions are concerned. However, classical probabilistic seismic hazard assessments, widely applied during the last few decades, have also clearly revealed their limitations. In their simple form, they are time-independent earthquake rupture forecasts based on the assumption of stable long-term recurrence of earthquakes in the seismotectonic areas under consideration. Therefore, during the last decade, earthquake prediction research and pilot applications have focused mainly on the development and rigorous testing of long and medium-term rupture forecast models in which event probabilities are conditioned by the occurrence of previous earthquakes, and on their integration into neo-deterministic approaches for improved time-variable seismic hazard assessment. The latter uses stress-renewal models that are calibrated for variations in the earthquake cycle as assessed on the basis of historical, paleoseismic, and other data, often complemented by multi-scale seismicity models, the use of pattern-recognition algorithms, and site-dependent strong-motion scenario modeling. International partnerships and a global infrastructure for comparative testing have recently been developed, for example the Collaboratory for the Study of

  1. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  2. Estimating earthquake potential

    USGS Publications Warehouse

    Page, R.A.

    1980-01-01

    The hazards to life and property from earthquakes can be minimized in three ways. First, structures can be designed and built to resist the effects of earthquakes. Second, the location of structures and human activities can be chosen to avoid or to limit the use of areas known to be subject to serious earthquake hazards. Third, preparations for an earthquake in response to a prediction or warning can reduce the loss of life and damage to property as well as promote a rapid recovery from the disaster. The success of the first two strategies, earthquake engineering and land use planning, depends on being able to reliably estimate the earthquake potential. The key considerations in defining the potential of a region are the location, size, and character of future earthquakes and frequency of their occurrence. Both historic seismicity of the region and the geologic record are considered in evaluating earthquake potential. 

  3. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  4. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  5. POST Earthquake Debris Management - AN Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction, can pose significant challenges to national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, one that takes into account the different criteria related to executing the operation, is proposed by highlighting the key issues concerning the handling of the construction

  6. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  7. Generally Contracted Valence-Core/Valence Basis Sets for Use with Relativistic Effective Core Potentials and Spin-Orbit Coupling Operators

    SciTech Connect

    Ermler, Walter V.; Tilson, Jeffrey L.

    2012-12-15

    A procedure for structuring generally contracted valence-core/valence basis sets of Gaussian-type functions for use with relativistic effective core potentials (gcv-c/v-RECP basis sets) is presented. Large valence basis sets are enhanced using a compact basis set derived for outer-core electrons in the presence of small-core RECPs. When core electrons are represented by relativistic effective core potentials (RECPs) and appropriate levels of theory are used, these basis sets are shown to provide accurate representations of atomic and molecular valence and outer-core electrons. Core/valence polarization and correlation effects can be calculated with these basis sets through standard methods for treating electron correlation. Calculations of energies and spectra for Ru, Os, Ir, In, and Cs are reported. Spectroscopic constants for RuO2+, OsO2+, Cs2, and InH are calculated and compared with experiment.

  8. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment for strong earthquakes by worldwide systems operating in emergency mode. Timely and correct action just after an event can yield significant benefits in saving lives; in this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of the shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not beyond the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for the older buildings that make up the bulk of inhabited buildings. How the population inside the buildings at the time of shaking is affected by the physical damage to the buildings is, by far, not precisely known. The paper analyzes the influence of uncertainties in the determination of strong-event parameters by alert seismological surveys and in the simulation models used at all stages, from estimating shaking intensity

  9. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a type of natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes, received in primary schools, is considered…

  10. School Safety and Earthquakes.

    ERIC Educational Resources Information Center

    Dwelley, Laura; Tucker, Brian; Fernandez, Jeanette

    1997-01-01

    A recent assessment of earthquake risk to Quito, Ecuador, concluded that many of its public schools are vulnerable to collapse during major earthquakes. A subsequent examination of 60 buildings identified 15 high-risk buildings. These schools were retrofitted to meet standards that would prevent injury even during Quito's largest earthquakes. US…

  11. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year-long earthquake expedition. The goal was to monitor the occurrence of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  12. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
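
    The pairwise likelihood comparison described above can be illustrated with the joint Poisson log-likelihood over rate bins that RELM-style tests employ. A minimal sketch with made-up rates and counts, not the project's actual test code:

    ```python
    import math

    def poisson_log_likelihood(forecast_rates, observed_counts):
        """Joint log-likelihood of observed earthquake counts given a
        forecast vector of expected rates per space-magnitude bin,
        assuming independent Poisson bins (rates must be positive).
        """
        logL = 0.0
        for lam, n in zip(forecast_rates, observed_counts):
            logL += -lam + n * math.log(lam) - math.lgamma(n + 1)
        return logL

    # Comparing two forecasts over the same bins: the higher joint
    # log-likelihood fits the observed catalog better.
    rates_a = [0.1, 0.5, 0.2]
    rates_b = [0.3, 0.3, 0.3]
    obs = [0, 1, 0]
    print(poisson_log_likelihood(rates_a, obs) > poisson_log_likelihood(rates_b, obs))
    ```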

  13. A Century of Induced Earthquakes in Oklahoma

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Page, M. T.

    2015-12-01

    Seismicity rates have increased sharply since 2009 in the central and eastern United States, with especially high rates of activity in the state of Oklahoma. A growing body of evidence indicates that many of these events are induced, primarily by injection of wastewater in deep disposal wells. The upsurge in activity has raised the questions: What is the background rate of tectonic earthquakes in Oklahoma, and how much has the rate varied throughout historical and early instrumental times? We first review the historical catalog, including an assessment of the completeness level of felt earthquakes, and show that seismicity rates since 2009 surpass previously observed rates throughout the 20th century. Furthermore, several lines of evidence suggest that most of the significant (Mw > 3.5) earthquakes in Oklahoma during the 20th century were likely induced by wastewater injection and/or enhanced oil recovery operations. We show that there is a statistically significant temporal and spatial correspondence between earthquakes and disposal wells permitted during the 1950s. The intensity distributions of the 1952 Mw 5.7 El Reno earthquake and the 1956 Mw 3.9 Tulsa county earthquake are similar to those from recent induced earthquakes, with significantly lower shaking than predicted by a regional intensity-prediction equation. The rate of tectonic earthquakes is thus inferred to be significantly lower than previously estimated throughout most of the state, but is difficult to estimate given scant incontrovertible evidence for significant tectonic earthquakes during the 20th century. We do find evidence for a low level of tectonic seismicity in southeastern Oklahoma associated with the Ouachita structural belt, and conclude that the 22 October 1882 Choctaw Nation earthquake, for which we estimate Mw 4.8, occurred in this zone.

  14. Safety Basis Report

    SciTech Connect

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  15. Virtual California: studying earthquakes through simulation

    NASA Astrophysics Data System (ADS)

    Sachs, M. K.; Heien, E. M.; Turcotte, D. L.; Yikilmaz, M. B.; Rundle, J. B.; Kellogg, L. H.

    2012-12-01

    Virtual California is a computer simulator that models earthquake fault systems. The design of Virtual California allows for fast execution so many thousands of events can be generated over very long simulated time periods. The result is a rich dataset, including simulated earthquake catalogs, which can be used to study the statistical properties of the seismicity on the modeled fault systems. We describe the details of Virtual California's operation and discuss recent results from Virtual California simulations.
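
    The statistical properties extracted from such simulated catalogs often include the Gutenberg-Richter b-value. A minimal sketch using the Aki (1965) maximum-likelihood estimator with the Utsu half-bin correction; the catalog values are illustrative, not Virtual California output:

    ```python
    import math

    def b_value(magnitudes, completeness_mc, bin_width=0.1):
        """Aki (1965) maximum-likelihood b-value for a catalog complete
        above `completeness_mc`; the half-bin term corrects for binned
        magnitudes (Utsu).
        """
        mags = [m for m in magnitudes if m >= completeness_mc]
        mean_mag = sum(mags) / len(mags)
        return math.log10(math.e) / (mean_mag - (completeness_mc - bin_width / 2.0))

    # Illustrative mini-catalog; b near 1 is typical of tectonic seismicity.
    catalog = [3.1, 3.0, 3.4, 4.0, 3.2, 3.7, 3.0, 3.1, 3.3, 5.1]
    print(round(b_value(catalog, completeness_mc=3.0), 2))
    ```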

  16. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  17. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  18. The loma prieta, california, earthquake: an anticipated event.

    PubMed

    1990-01-19

    The first major earthquake on the San Andreas fault since 1906 fulfilled a long-term forecast for its rupture in the southern Santa Cruz Mountains. Severe damage occurred at distances of up to 100 kilometers from the epicenter in areas underlain by ground known to be hazardous in strong earthquakes. Stronger earthquakes will someday strike closer to urban centers in the United States, most of which also contain hazardous ground. The Loma Prieta earthquake demonstrated that meaningful predictions can be made of potential damage patterns and that, at least in well-studied areas, long-term forecasts can be made of future earthquake locations and magnitudes. Such forecasts can serve as a basis for action to reduce the threat major earthquakes pose to the United States. PMID:17735847

  19. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  20. Application of Seismic Array Processing to Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Meng, L.; Allen, R. M.; Ampuero, J. P.

    2013-12-01

    Earthquake early warning (EEW) systems that can issue warnings prior to the arrival of strong ground shaking during an earthquake are essential in mitigating seismic hazard. Many of the currently operating EEW systems work on the basis of empirical magnitude-amplitude/frequency scaling relations for a point source. This approach is of limited effectiveness for large events, such as the 2011 Tohoku-Oki earthquake, for which ignoring finite-source effects may result in underestimation of the magnitude. Here, we explore the concept of characterizing rupture dimensions in real time for EEW using clusters of dense low-cost accelerometers located near active faults. Back-tracing the waveforms recorded by such arrays allows the estimation of earthquake rupture size, duration, and directivity in real time, which enables EEW for M > 7 earthquakes. The concept is demonstrated with the 2004 Parkfield earthquake, one of the few big events (M > 6) that have been recorded by a local small-scale seismic array (UPSAR array; Fletcher et al., 2006). We first test the approach against synthetic rupture scenarios constructed by superposition of empirical Green's functions. We find it important to correct for the bias in back azimuth induced by dipping structures beneath the array. We applied the proposed methodology to the mainshock in a simulated real-time environment. After calibrating the dipping-layer effect with data from smaller events, we obtained an estimated rupture length of 9 km, consistent with the distance between the two main high-frequency subevents identified by back-projection using all local stations (Allman and Shearer, 2007). We propose deploying small-scale arrays every 30 km along the San Andreas Fault. The array processing is performed in local processing centers at each array. The output is compared with finite fault solutions based on a real-time GPS system and then incorporated into the standard ElarmS system. The optimal aperture and array geometry is
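
    The back-azimuth and slowness estimation underlying the array approach can be sketched with generic plane-wave delay-and-sum beamforming. A minimal, idealized sketch (uniform medium, wrap-around time shifts); the names and the grid search are illustrative, not the back-projection or ElarmS implementation described above.

    ```python
    import numpy as np

    def beam_power(traces, coords_km, fs, slowness_s_km, azimuth_deg):
        """Delay-and-sum beam power for one trial plane wave.

        traces    : (nsta, nsamp) array of waveforms
        coords_km : (nsta, 2) station east/north offsets in km
        """
        az = np.deg2rad(azimuth_deg)
        s = slowness_s_km * np.array([np.sin(az), np.cos(az)])  # slowness vector
        beam = np.zeros(np.asarray(traces).shape[1])
        for trace, xy in zip(np.asarray(traces), np.asarray(coords_km)):
            shift = int(round(float(xy @ s) * fs))  # plane-wave delay in samples
            beam += np.roll(trace, -shift)          # wrap-around: fine for a sketch
        beam /= len(traces)
        return float(np.mean(beam ** 2))

    # Grid-search slowness and azimuth; the maximum-power pair estimates the
    # back azimuth of incoming energy:
    # best = max(((p, az) for p in np.arange(0.1, 0.5, 0.02)
    #             for az in range(0, 360, 5)),
    #            key=lambda pa: beam_power(traces, coords_km, fs, *pa))
    ```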

  1. Earthquake source inversion of tsunami runup prediction

    NASA Astrophysics Data System (ADS)

    Sekar, Anusha

    Our goal is to study two inverse problems: inverting seismic data for earthquake parameters and inverting tide gauge data for the same parameters. We focus on the feasibility of using a combination of these inverse problems to improve tsunami runup prediction. A considerable part of the thesis is devoted to studying the seismic forward operator and its modeling using immersed interface methods. We develop an immersed interface method for solving the variable coefficient advection equation in one dimension with a propagating singularity and prove a convergence result for this method. We also prove a convergence result for the one-dimensional acoustic system of partial differential equations solved using immersed interface methods with internal boundary conditions. Such systems form the building blocks of the numerical model for the earthquake. For a simple earthquake-tsunami model, we observe a variety of possibilities in the recovery of the earthquake parameters and tsunami runup prediction. In some cases the data are insufficient either to invert for the earthquake parameters or to predict the runup. When more data are added, we are able to resolve the earthquake parameters with enough accuracy to predict the runup. We expect that this variety will be true in a real-world three-dimensional geometry as well.
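
    As a baseline for the 1D advection building block mentioned above, a first-order upwind solver for the constant-coefficient advection equation is sketched below; the immersed interface method of the thesis adds jump-condition corrections at the propagating singularity that this sketch omits.

    ```python
    import numpy as np

    def advect_upwind(u0, speed, dx, dt, nsteps):
        """First-order upwind scheme for u_t + a*u_x = 0 with a > 0,
        periodic boundaries. Stable when the CFL number a*dt/dx <= 1.
        """
        cfl = speed * dt / dx
        assert 0.0 < cfl <= 1.0, "CFL condition violated"
        u = np.asarray(u0, dtype=float).copy()
        for _ in range(nsteps):
            u -= cfl * (u - np.roll(u, 1))  # backward difference in x
        return u

    # A Gaussian pulse advected one full period returns (diffusively
    # smeared) to its starting position.
    x = np.linspace(0.0, 1.0, 100, endpoint=False)
    u0 = np.exp(-200.0 * (x - 0.5) ** 2)
    u = advect_upwind(u0, speed=1.0, dx=x[1] - x[0], dt=0.005, nsteps=200)
    ```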

  2. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed Central

    Grant, Rachel A.; Conlan, Hilary

    2013-01-01

    juvenile animals migrating away from their breeding pond, after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre) and were probably coincidental. Statistical analysis of the data indicated that frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised. PMID:26479746

  3. Effects of the 2011 Tohoku Earthquake on VLBI Geodetic Measurements

    NASA Astrophysics Data System (ADS)

    MacMillan, D.; Behrend, D.; Kurihara, S.

    2012-12-01

    The VLBI antenna TSUKUB32 at Tsukuba, Japan observes in 24-hour observing sessions once per week with the R1 operational network and on additional days with other networks on a more irregular basis. Further, the antenna is an endpoint of the single-baseline, 1-hr Intensive Int2 sessions observed on the weekends for the determination of UT1. TSUKUB32 returned to normal operational observing one month after the earthquake. The antenna is 160 km west and 240 km south of the epicenter of the Tohoku earthquake. We looked at the transient behavior of the TSUKUB32 position time series following the earthquake and found that significant deformation is continuing. The eastward rate relative to the long-term rate prior to the earthquake was about 20 cm/yr four months after the earthquake and 9 cm/yr after one year. The VLBI series agrees closely with the corresponding JPL (Jet Propulsion Laboratory) GPS series measured by the co-located GPS antenna TSUK. The co-seismic UEN displacement at Tsukuba as determined by VLBI was (-90 mm, 640 mm, 44 mm). We examined the effect of the variation of the TSUKUB32 position on EOP estimates and then used the GPS data to correct its position for the estimation of UT1 in the Tsukuba-Wettzell Int2 Intensive experiments. For this purpose and to provide operational UT1, the IVS scheduled a series of weekend Intensive sessions observing on the Kokee-Wettzell baseline immediately before each of the two Tsukuba-Wettzell Intensive sessions. Comparisons between the UT1 estimates from these weekend sessions and the USNO (United States Naval Observatory) combination series were used to validate the GPS correction to the TSUKUB32 position.
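
    The decaying post-earthquake rates quoted above (about 20 cm/yr at four months falling to 9 cm/yr at one year) are characteristic of a logarithmic postseismic relaxation. A hedged sketch of fitting such a transient to a position time series, using synthetic data rather than the TSUKUB32 series itself:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def log_relaxation(t_days, offset, amp, tau_days):
        """Common empirical postseismic form: d(t) = offset + amp*ln(1 + t/tau)."""
        return offset + amp * np.log1p(t_days / tau_days)

    # Synthetic east-component series (days since mainshock, displacement in m):
    rng = np.random.default_rng(0)
    t = np.linspace(1.0, 365.0, 60)
    d = log_relaxation(t, 0.0, 0.08, 80.0) + rng.normal(0.0, 0.003, t.size)

    popt, _ = curve_fit(log_relaxation, t, d, p0=(0.0, 0.05, 50.0))
    offset, amp, tau = popt
    rate_m_per_day = amp / (tau + 120.0)   # instantaneous rate 120 days after
    print(rate_m_per_day * 365.0, "m/yr")  # the event, converted to m/yr
    ```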

  4. Comment on “Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set” [J. Chem. Phys. 139, 114104 (2013)

    SciTech Connect

    Brandbyge, Mads

    2014-05-07

    In a recent paper Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an “implicit decoupling assumption,” leading to wrong results for the current, different from what would be obtained by using an orthogonal basis, and dividing surfaces defined in real-space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green's function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.
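
    For reference, the standard Green's-function transmission that the comment appeals to can be written, for a nonorthogonal basis with overlap matrix S, as

        T(E) = \mathrm{Tr}\left[ \Gamma_L(E)\, G^r(E)\, \Gamma_R(E)\, G^a(E) \right],
        \qquad G^r(E) = \left[ (E + i\eta)\,S - H - \Sigma_L(E) - \Sigma_R(E) \right]^{-1},

    with \Gamma_\alpha = i\,(\Sigma_\alpha - \Sigma_\alpha^\dagger) and G^a = (G^r)^\dagger. The overlap matrix S is the only place the nonorthogonality enters, consistent with the comment's claim that the computed transmission does not depend on the choice of basis.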

  5. Evidence for remotely triggered micro-earthquakes during salt cavern collapse

    NASA Astrophysics Data System (ADS)

    Jousset, P.; Rohmer, J.

    2012-04-01

    Micro-seismicity is a good indicator of the spatio-temporal evolution of physical properties of rocks prior to catastrophic events like volcanic eruptions or landslides, and may be triggered by a number of causes, including the dynamic characteristics of the processes in play and/or external forces. Micro-earthquake triggering has been the subject of intense research in recent years, and our work contributes further evidence of the possible triggering of micro-earthquakes by remote large earthquakes. We show evidence of triggered micro-seismicity in the vicinity of an underground salt cavern prone to collapse, by a remote M~7.2 earthquake which occurred ~12000 kilometres away. We demonstrate the near-critical state of the cavern before the collapse by means of 2D axisymmetric elastic finite-element simulations. Pressure was lowered in the cavern by pumping operations of brine out of the cavern. We demonstrate that a very small stress increase would be sufficient to break the overburden. High-dynamic broadband records reveal a remarkable time-correlation between a dramatic increase of the local high-frequency micro-seismicity rate, associated with the break of the stiffest layer stabilizing the overburden, and the passage of low-frequency remote seismic waves, including body, Love and Rayleigh surface waves. Stress oscillations due to the seismic waves exceeded the strength required for the rupture of the complex medium made of brine and rock, triggering micro-earthquakes and leading to damage of the overburden and eventually collapse of the salt cavern. The increment of stress necessary for the failure of a Dolomite layer is of the same order of magnitude as the maximum dynamic stress observed during the passage of the earthquake waves. On this basis, we discuss the possible contribution of the low-frequency Love and Rayleigh surface waves.
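
    A back-of-the-envelope version of the stress comparison (generic values, not the paper's site-specific numbers): the dynamic shear stress carried by a passing wave scales as

        \Delta\sigma \approx \frac{\mu\, v}{c},

    so with a shear modulus \mu = 3 \times 10^{10} Pa, a peak particle velocity v = 1 mm/s, and a surface-wave phase velocity c = 3.5 km/s,

        \Delta\sigma \approx \frac{3\times10^{10} \times 10^{-3}}{3.5\times10^{3}} \approx 8.6\ \mathrm{kPa},

    i.e., kilopascal-level oscillations: negligible against tectonic stresses, but plausibly decisive for an overburden already in a near-critical state, which is the abstract's argument.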

  6. Retrospective Evaluation of Earthquake Forecasts during the 2010-12 Canterbury, New Zealand, Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Marzocchi, W.; Taroni, M.; Zechar, J. D.; Gerstenberger, M.; Liukis, M.; Rhoades, D. A.; Cattania, C.; Christophersen, A.; Hainzl, S.; Helmstetter, A.; Jimenez, A.; Steacy, S.; Jordan, T. H.

    2014-12-01

    The M7.1 Darfield, New Zealand (NZ), earthquake triggered a complex earthquake cascade that provides a wealth of new scientific data to study earthquake triggering and the predictive skill of statistical and physics-based forecasting models. To this end, the Collaboratory for the Study of Earthquake Predictability (CSEP) is conducting a retrospective evaluation of over a dozen short-term forecasting models that were developed by groups in New Zealand, Europe and the US. The statistical model group includes variants of the Epidemic-Type Aftershock Sequence (ETAS) model, non-parametric kernel smoothing models, and the Short-Term Earthquake Probabilities (STEP) model. The physics-based model group includes variants of the Coulomb stress triggering hypothesis, which are embedded either in Dieterich's (1994) rate-state formulation or in statistical Omori-Utsu clustering formulations (hybrid models). The goals of the CSEP evaluation are to improve our understanding of the physical mechanisms governing earthquake triggering, to improve short-term earthquake forecasting models and time-dependent hazard assessment for the Canterbury area, and to understand the influence of poor-quality, real-time data on the skill of operational (real-time) forecasts. To assess the latter, we use the earthquake catalog data that the NZ CSEP Testing Center archived in near real-time during the earthquake sequence and compare the predictive skill of models using the archived data as input with the skill attained using the best available data today. We present results of the retrospective model comparison and discuss implications for operational earthquake forecasting.
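
    As background to the model classes being tested: the ETAS family writes the conditional earthquake rate as a background term plus Omori-Utsu aftershock cascades,

        \lambda(t \mid \mathcal{H}_t) = \mu + \sum_{i:\; t_i < t} \frac{K\, e^{\alpha\,(M_i - M_c)}}{(t - t_i + c)^{p}},

    where \mu is the background rate, M_c the completeness magnitude, and K, \alpha, c, p fitted constants. The variants evaluated by CSEP differ mainly in how these terms are estimated, spatially distributed, and updated with (possibly poor-quality) real-time data.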

  7. Operations

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.

    2013-01-01

    Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this…

  8. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that in the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 Census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 Census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at a high seismic risk level, two districts (Gharb and Wasat) at moderate, and two districts (Al-Gomrok and Burg El-Arab) at low. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, with 73 % of expected damages reported there. The undertaken analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damages) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.
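
    Converting the quoted percentages into absolute figures is simple arithmetic on the census numbers given in the abstract:

        # absolute counts implied by the abstract's percentages
        buildings, population = 12.9e6, 4.1e6      # 2006 Census figures
        print(0.0227 * buildings)                  # ~293,000 affected constructions
        print(0.0019 * population)                 # ~7,800 injuries
        print(0.0001 * population)                 # ~410 deaths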

  9. Earthquakes and the office-based surgeon.

    PubMed Central

    Conover, W A

    1992-01-01

    A major earthquake may strike while a surgeon is performing an operation in an office surgical facility. A sudden major fault disruption will lead to thousands of casualties and widespread destruction. Surgeons who operate in offices can help lessen havoc by careful preparation. These plans should coordinate with other disaster plans for effective triage, evacuation, and the treatment of casualties. PMID:1413756

  10. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the other being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method falls in this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each occurred earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted within a very good accuracy window (±1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a trigger planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, Moon conjunct or opposite North or South Nodes. In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds, and examined the earthquake hit rates (for a window of 3 days, i.e. ±1 day of the target date) and for >6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with
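
    As a hedged reading of the procedure (this abstract does not specify exactly how the Fibonacci, Dual and Lucas series are combined; the series values, offsets, and helper names below are illustrative only), the date generation and hit counting might look like:

        from datetime import date, timedelta

        FIB = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]
        LUC = [1, 3, 4, 7, 11, 18, 29, 47, 76, 123]

        def fdl_dates(seed, horizon_days=365):
            # candidate dates: the seed shifted by Fibonacci/Lucas day counts
            offsets = sorted({n for n in FIB + LUC if n <= horizon_days})
            return [seed + timedelta(days=n) for n in offsets]

        def hit_rate(seed, quake_dates, window_days=1):
            # fraction of candidates within +/- window_days of a real event
            cands = fdl_dates(seed)
            hits = sum(any(abs((c - q).days) <= window_days for q in quake_dates)
                       for c in cands)
            return hits / len(cands)

        # usage with made-up dates (seed = a trigger date in the MFDL variant)
        print(hit_rate(date(2004, 12, 24), [date(2005, 1, 1)]))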

  11. Astronomical tides and earthquakes

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoping; Mao, Wei; Huang, Yong

    2001-03-01

    A review of studies on the correlation between astronomical tides and earthquakes is given in three categories: (1) earthquakes and the relative locations of the Sun, the Moon and the Earth; (2) earthquakes and the periods and phases of tides; and (3) earthquakes and the tidal stress. The first two categories mainly investigate whether there exists any dominant pattern of the relative locations of the Sun, the Moon and the Earth during earthquakes, whether the occurrences of earthquakes are clustered at any special phase of a tidal period, and whether there exists any tidal periodicity in seismic activity. By emphasizing the tidal stress at the seismic focus, the third category investigates the relationship between various seismic faults and the triggering effects of tidal stress, which reaches the crux of the issue. Possible reasons for the inconsistent results obtained with various methods and samples are analyzed, and further investigations are proposed.
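
    A standard tool in the second category of studies (widely used for tidal-phase analyses, though not attributed to specific papers in this abstract) is the Schuster test: each earthquake is assigned a tidal phase angle \theta_i, and the probability that the observed phase alignment arises by chance under the null hypothesis of no tidal influence is

        p = \exp\!\left( -\frac{R^2}{N} \right), \qquad
        R^2 = \left( \sum_{i=1}^{N} \cos\theta_i \right)^{2} + \left( \sum_{i=1}^{N} \sin\theta_i \right)^{2},

    so a small p indicates significant clustering of events at a preferred tidal phase.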

  12. Anthropogenic seismicity rates and operational parameters at the Salton Sea Geothermal Field.

    PubMed

    Brodsky, Emily E; Lajoie, Lia J

    2013-08-01

    Geothermal power is a growing energy source; however, efforts to increase production are tempered by concern over induced earthquakes. Although increased seismicity commonly accompanies geothermal production, induced earthquake rate cannot currently be forecast on the basis of fluid injection volumes or any other operational parameters. We show that at the Salton Sea Geothermal Field, the total volume of fluid extracted or injected tracks the long-term evolution of seismicity. After correcting for the aftershock rate, the net fluid volume (extracted-injected) provides the best correlation with seismicity in recent years. We model the background earthquake rate with a linear combination of injection and net production rates that allows us to track the secular development of the field as the number of earthquakes per fluid volume injected decreases over time. PMID:23845943
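
    The paper's central idea, a background seismicity rate tracked by a linear combination of injection and net production rates, reduces to an ordinary least-squares fit. The sketch below uses synthetic monthly data with invented coefficients, purely to show the shape of the computation:

        import numpy as np

        rng = np.random.default_rng(0)
        months = 120
        inj = rng.uniform(2.0, 4.0, months)        # injected volume per month
        net = rng.uniform(0.0, 1.5, months)        # extracted minus injected
        # synthetic "observed" aftershock-corrected rate, events/month
        rate = 1.2 * inj + 3.5 * net + rng.normal(0.0, 0.5, months)
        A = np.column_stack([inj, net, np.ones(months)])
        coef, *_ = np.linalg.lstsq(A, rate, rcond=None)
        print("fitted (injection, net production, intercept):", coef)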

  13. Earthquake swarms in Greenland

    NASA Astrophysics Data System (ADS)

    Larsen, Tine B.; Voss, Peter H.; Dahl-Jensen, Trine

    2014-05-01

    Earthquake swarms occur primarily near active volcanoes and in areas with frequent tectonic activity. However, intraplate earthquake swarms are not an unknown phenomenon. They are located near zones of weakness, e.g. in regions with geological contrasts, where dynamic processes are active. An earthquake swarm is defined as a period of increased seismicity, in the form of a cluster of earthquakes of similar magnitude, occurring in the same general area during a limited time period. There is no obvious main shock among the earthquakes in a swarm. Earthquake swarms occur in Greenland, which is a tectonically stable, intraplate environment. The first earthquake swarms in Greenland were detected more than 30 years ago in Northern and North-Eastern Greenland. However, detection of these low-magnitude events is challenging due to the enormous distances and the relatively sparse network of seismographs. The seismograph coverage of Greenland has vastly improved since the international GLISN project was initiated in 2008. Greenland is currently covered by an open network of 19 BB seismographs, most of them transmitting data in real time. Additionally, earthquake activity in Greenland is monitored by seismographs in Canada, Iceland, on Jan Mayen, and on Svalbard. The time series of data from the GLISN network is still short, with the latest station added in NW Greenland in 2013. However, the network has already proven useful in detecting several earthquake swarms. In this study we will focus on two swarms: one occurring near/on the East Greenland coast in 2008, and another occurring in the Disko area near the west coast of Greenland in 2010. Both swarms consist of earthquakes with local magnitudes between 1.9 and 3.2. The areas where the swarms are located are regularly active with small earthquakes. The earthquake swarms are analyzed in the context of the general seismicity and the possible relationship to the local geological conditions.

  14. Earthquake at 40 feet

    USGS Publications Warehouse

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  15. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER.

  16. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed

    Grant, Rachel A; Conlan, Hilary

    2013-01-01

    In short-term earthquake risk forecasting, the avoidance of false alarms is of utmost importance to preclude the possibility of unnecessary panic among populations in seismic hazard areas. Unusual animal behaviour prior to earthquakes has been reported for millennia but has rarely been scientifically documented. Recently large migrations or unusual behaviour of amphibians have been linked to large earthquakes, and media reports of large frog and toad migrations in areas of high seismic risk such as Greece and China have led to fears of a subsequent large earthquake. However, at certain times of year large migrations are part of the normal behavioural repertoire of amphibians. News reports of "frog swarms" from 1850 to the present day were examined for evidence that this behaviour is a precursor to large earthquakes. It was found that only two of 28 reported frog swarms preceded large earthquakes (Sichuan province, China in 2008 and 2010). All of the reported mass migrations of amphibians occurred in late spring, summer and autumn and appeared to relate to small juvenile anurans (frogs and toads). It was concluded that most reported "frog swarms" are actually normal behaviour, probably caused by juvenile animals migrating away from their breeding pond, after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre), and were probably coincidence. Statistical analysis of the data indicated frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised.

  17. Earthquake fluctuations in wells in New Jersey

    USGS Publications Warehouse

    Austin, Charles R.

    1960-01-01

    New Jersey is fortunate to be situated in a region that is relatively stable, geologically. For this reason scientists believe, on the basis of the best scientific evidence available, that the chances of New Jersey experiencing a major earthquake are very small. The last major earthquake on the east coast occurred at Charleston, S. C., in 1886. Minor shocks have been felt in New Jersey, however, from time to time. Reports of dishes being rattled or even of plaster in buildings being cracked are not uncommon. These minor disturbances are generally restricted to relatively small areas.

  18. Role of Bioindicators In Earthquake Modelling

    NASA Astrophysics Data System (ADS)

    Zelinsky, I. P.; Melkonyan, D. V.; Astrova, N. G.

    On the basis of experimental research on the influence of sound waves on indicator bacteria, a model of earthquakes is constructed. It is revealed that the growth of the number of bacteria depends on the frequency of the sound wave influencing the bacterium (the lower the frequency of the sound wave, the faster the growth). It is shown that absorption of sound-wave energy by a bacterium increases the concentration of isopotential lines of the biodynamic field in the bacterium. This process leads to the bacterium's braking and heating. From the structure of the deformation of the biodynamic field lines it is possible to predict various geodynamic processes, including earthquakes.

  19. Compiling the 'Global Earthquake History' (1000-1903)

    NASA Astrophysics Data System (ADS)

    Albini, P.; Musson, R.; Locati, M.; Rovida, A.

    2013-12-01

    The study of historical earthquakes from historical sources, or historical seismology, is of wider interest than just the seismic hazard and risk community. In the scope of the two-year project (October 2010-March 2013) "Global Earthquake History", developed in the framework of GEM, a reassessment of world historical seismicity was made from available published studies. The scope of the project is the time window 1000-1903, with magnitudes 7.0 and above. Events with lower magnitudes are included on a case-by-case, or region-by-region, basis. The Global Historical Earthquake Archive (GHEA) provides a complete account of the global situation in historical seismology. From GHEA, the Global Historical Earthquake Catalogue (GHEC, v1, available at http://www.emidius.eu/GEH/, under Creative Commons licence) was derived, i.e. a world catalogue of earthquakes for the period 1000-1903, with magnitude 7 and over, using publicly available materials, as for the Archive. This is intended to be the best global historical catalogue of large earthquakes presently available, with the best parameters selected, duplications and fakes removed, and in some cases, new earthquakes discovered. GHEA and GHEC are conceived as providing a basis for coordinating future research into historical seismology in any part of the world, and hopefully, encouraging new historical earthquake research initiatives that will continue to improve the information available.

  20. Earthquakes and Plate Boundaries

    ERIC Educational Resources Information Center

    Lowman, Paul; And Others

    1978-01-01

    Contains the contents of the Student Investigation booklet of a Crustal Evolution Education Project (CEEP) instructional modules on earthquakes. Includes objectives, procedures, illustrations, worksheets, and summary questions. (MA)

  1. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.
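
    The artifact mechanism argued here can be reproduced in a few lines: draw independent Gutenberg-Richter magnitudes (so the true catalog has no magnitude correlations), then censor events that occur too soon after a clearly larger event, mimicking short-term catalog incompleteness. The surviving catalog acquires a spurious correlation between consecutive magnitudes; all thresholds below are illustrative:

        import numpy as np

        rng = np.random.default_rng(5)
        n, b = 200_000, 1.0
        mags = 2.0 + rng.exponential(1.0 / (b * np.log(10.0)), n)  # GR magnitudes
        times = np.cumsum(rng.exponential(1.0, n))                 # event times
        keep = np.ones(n, dtype=bool)
        last = 0                                   # index of last kept event
        for i in range(1, n):
            # censor events shortly after a clearly larger kept event
            if mags[last] >= mags[i] + 0.5 and times[i] - times[last] < 2.0:
                keep[i] = False
            else:
                last = i
        m = mags[keep]
        # nonzero despite independent draws: the incompleteness artifact
        print("consecutive-magnitude correlation:", np.corrcoef(m[:-1], m[1:])[0, 1])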

  2. Missing Great Earthquakes

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Martin, S.

    2013-12-01

    The occurrence of three earthquakes with Mw greater than 8.8, and six earthquakes larger than Mw 8.5, since 2004 has raised interest in the long-term rate of great earthquakes. Past studies have focused on rates since 1900, which roughly marks the start of the instrumental era. Yet substantial information is available for earthquakes prior to 1900. A re-examination of the catalog of global historical earthquakes reveals a paucity of Mw ≥ 8.5 events during the 18th and 19th centuries compared to the rate during the instrumental era (Hough, 2013, JGR), suggesting that the magnitudes of some documented historical earthquakes have been underestimated, with approximately half of all Mw ≥ 8.5 earthquakes missing or underestimated in the 19th century. Very large (Mw ≥ 8.5) magnitudes have traditionally been estimated for historical earthquakes only from tsunami observations, given a tautological assumption that all such earthquakes generate significant tsunamis. Magnitudes would therefore tend to be underestimated for deep megathrust earthquakes that generated relatively small tsunamis, deep earthquakes within continental collision zones, earthquakes that produced tsunamis that were not documented, outer-rise events, and strike-slip earthquakes such as the 11 April 2012 Sumatra event. We further show that, where magnitudes of historical earthquakes are estimated from earthquake intensities using the Bakun and Wentworth (1997, BSSA) method, magnitudes of great earthquakes can be significantly underestimated. Candidate 'missing' great 19th century earthquakes include the 1843 Lesser Antilles earthquake, which recent studies suggest was significantly larger than initial estimates (Feuillet et al., 2012, JGR; Hough, 2013), and an 1841 Kamchatka event, for which Mw 9 was estimated by Gusev and Shumilina (2004, Izv. Phys. Solid Ear.). We consider cumulative moment release rates during the 19th century compared to that during the 20th and 21st centuries, using both the Hough

  3. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

    Lee, W.H.K.; Steward, S. W.

    1979-01-01

    A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network will consist of several stations or as many as a few hundred. They are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the location of sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be deployed very quickly. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.

  4. Investigations on Real-time GPS for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.; Aranha, M. A.; Melgar, D.; Allen, R. M.

    2015-12-01

    The Geodetic Alarm System (G-larmS) is a software system developed in a collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech (NMT) primarily for real-time Earthquake Early Warning (EEW). It currently uses high rate (1Hz), low latency (< ~5 seconds), accurate positioning (cm level) time series data from a regional GPS network and P-wave event triggers from existing EEW algorithms, e.g. ElarmS, to compute static offsets upon S-wave arrival. G-larmS performs a least squares inversion on these offsets to determine slip on a finite fault, which we use to estimate moment magnitude. These computations are repeated every second for the duration of the event. G-larmS has been in continuous operation at the BSL for over a year using event triggers from the California Integrated Seismic Network (CISN) ShakeAlert system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California. Pairs of stations are processed as baselines using trackRT (MIT software package). G-larmS produced good results in real-time during the South Napa (M 6.0, August 2014) earthquake as well as on several replayed and simulated test cases. We evaluate the performance of G-larmS for EEW by analysing the results using a set of well defined test cases to investigate the following: (1) using multiple fault regimes and concurrent processing with the ultimate goal of achieving model generation (slip and magnitude computations) within each 1 second GPS epoch on very large magnitude earthquakes (up to M 9.0), (2) the use of Precise Point Positioning (PPP) real-time data streams of various operators, accuracies, latencies and formats along with baseline data streams, (3) collaboratively expanding EEW coverage along the U.S. West Coast on a regional network basis for Northern California, Southern California and Cascadia.
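
    A stripped-down version of the processing chain described above (least-squares slip on a finite fault from static offsets, then moment magnitude) might look like the sketch below. The Green's function matrix is random purely for illustration; a real system such as G-larmS computes it from an elastic dislocation model, and all sizes and constants here are assumptions:

        import numpy as np

        rng = np.random.default_rng(1)
        nsta, npatch = 20, 6                       # GPS stations, fault patches
        G = rng.normal(0.0, 1e-2, (3 * nsta, npatch))   # stand-in Green's functions
        true_slip = np.full(npatch, 2.0)                # meters
        d = G @ true_slip + rng.normal(0.0, 1e-3, 3 * nsta)  # noisy static offsets
        slip, *_ = np.linalg.lstsq(G, d, rcond=None)    # invert for slip
        mu, patch_area = 30e9, (10e3) ** 2              # Pa, m^2 per patch
        M0 = mu * patch_area * slip.sum()               # scalar moment, N*m
        Mw = (np.log10(M0) - 9.05) / 1.5                # Hanks-Kanamori form
        print("Mw ~ %.2f" % Mw)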

  5. Gravity drives Great Earthquakes

    NASA Astrophysics Data System (ADS)

    Lister, Gordon; Forster, Marnie

    2010-05-01

    of the over-riding crust and mantle. This is possible because the crust and mantle above major subduction zones are mechanically weakened by the flux of heat and water associated with subduction zone processes. In consequence the lithosphere of the over-riding orogens can act more like a fluid than a rigid plate. Such fluid-like behaviour has been noted for the Himalaya and for the crust of the uplifted adjacent Tibetan Plateau, which appear to be collapsing. Similar conclusions as to the fluid-like behaviour of an orogen can also be reached for the crust and mantle of Myanmar and Indonesia, since here again there is evidence for arc-normal motion adjacent to rolling-back subduction zones. Prior to the Great Sumatran Earthquake of 2004 we had postulated such movements on geological time-scales, describing them as 'surges' driven by the gravitational potential energy of the adjacent orogen. But we considered time-scales that were very different from those that apply in the lead-up to, during, and subsequent to a catastrophic seismic event. The Great Sumatran Earthquake taught us quite differently. Data from satellites support the hypothesis that extension took place in a discrete increment, which we interpret to be the result of a gravitationally driven surge of the Indonesian crust westward over the weakened rupture during and after the earthquake. Mode II megathrusts are tsunamigenic for one very simple reason: the crust has been attenuated as the result of ongoing extension, so they can be overlain by large tracts of water, and they have a long rupture run time, allowing a succession of stress accumulations to be harvested. The after-slip beneath the Andaman Sea was also significant (in terms of moment) although non-seismogenic in its character. Operation of a Mode II megathrust prior to catastrophic failure may involve relatively quiescent motion with a mixture of normal faults and reverse faults, much like south of Java today. Ductile yield may produce steadily

  6. Earthquake activity in Oklahoma

    SciTech Connect

    Luza, K.V.; Lawson, J.E. Jr. )

    1989-08-01

    Oklahoma is one of the most seismically active areas in the southern Mid-Continent. From 1897 to 1988, over 700 earthquakes are known to have occurred in Oklahoma. The earliest documented Oklahoma earthquake took place on December 2, 1897, near Jefferson, in Grant County. The largest known Oklahoma earthquake happened near El Reno on April 9, 1952. This magnitude 5.5 (mb) earthquake was felt from Austin, Texas, to Des Moines, Iowa, and covered a felt area of approximately 362,000 km². Prior to 1962, all earthquakes in Oklahoma (59) were known either from historical accounts or from seismograph stations outside the state. Over half of these events were located in Canadian County. In late 1961, the first seismographs were installed in Oklahoma. From 1962 through 1976, 70 additional earthquakes were added to the earthquake database. In 1977, a statewide network of seven semipermanent and three radio-telemetry seismograph stations was installed. The additional stations have improved earthquake detection and location in the state of Oklahoma. From 1977 to 1988, over 570 additional earthquakes were located in Oklahoma, mostly of magnitudes less than 2.5. Most of these events occurred on the eastern margin of the Anadarko basin along a zone 135 km long by 40 km wide that extends from Canadian County to the southern edge of Garvin County. Another general area of earthquake activity lies along and north of the Ouachita Mountains in the Arkoma basin. A few earthquakes have occurred in the shelves that border the Arkoma and Anadarko basins.

  7. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
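
    The detection idea, a tweet rate rising far above a sub-hourly background, reduces to per-minute counting against a threshold. The function below is an invented sketch of that logic, not the USGS system's actual detector or parameters:

        from collections import Counter

        def detect_spike(tweet_times_s, background_per_min=0.02, factor=50):
            # flag minutes whose "earthquake" tweet count exceeds a large
            # multiple of the background rate (floor of 5 tweets/minute)
            per_min = Counter(int(t // 60) for t in tweet_times_s)
            thresh = max(5, factor * background_per_min)
            return sorted(m for m, c in per_min.items() if c >= thresh)

        # usage: two background tweets, then a burst in minute 3
        times = [10.0, 75.0] + [180.0 + 0.4 * i for i in range(150)]
        print(detect_spike(times))   # -> [3]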

  8. Investigating landslides caused by earthquakes - A historical review

    USGS Publications Warehouse

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  9. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
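
    A toy version of the fingerprint-and-group pipeline can be written in a few lines. FAST itself uses spectral images, wavelet transforms, and locality-sensitive min-hashing; the coarse band-energy fingerprint below only mimics the overall shape of the algorithm (window, reduce, hash, group), with all parameters invented:

        import numpy as np

        def fingerprints(trace, fs, win_s=10.0, hop_s=5.0, nbands=16, top=4):
            # each window -> set of its `top` strongest coarse spectral bands;
            # windows sharing a set land in one hash bucket (candidate pairs)
            win, hop = int(win_s * fs), int(hop_s * fs)
            buckets = {}
            for start in range(0, len(trace) - win, hop):
                spec = np.abs(np.fft.rfft(trace[start:start + win]))
                nb = (len(spec) // nbands) * nbands
                bands = spec[:nb].reshape(nbands, -1).mean(axis=1)
                key = frozenset(np.argsort(bands)[-top:].tolist())
                buckets.setdefault(key, []).append(start / fs)
            return {k: v for k, v in buckets.items() if len(v) > 1}

        # usage: the same multi-tone burst injected twice into noise
        fs = 100.0
        rng = np.random.default_rng(2)
        trace = rng.normal(0.0, 0.1, int(600 * fs))
        t = np.arange(0.0, 3.0, 1.0 / fs)
        burst = sum(np.sin(2 * np.pi * f * t) for f in (5.0, 12.0, 20.0, 35.0))
        trace[1000:1300] += burst
        trace[30000:30300] += burst
        print(fingerprints(trace, fs))   # burst windows share a bucket

    Random noise windows can also collide in a bucket, which parallels the "additional false detections" the abstract reports for the real method.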

  10. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  11. Recent Progress and Development on Multi-parameters Remote Sensing Application in Earthquake Monitoring in China

    NASA Astrophysics Data System (ADS)

    Shen, Xuhui; Zhang, Xuemin; Hong, Shunying; Jing, Feng; Zhao, Shufan

    2014-05-01

    In the last ten years, several national research plans and scientific projects on remote sensing applications in earthquake monitoring have been implemented in China. Focusing on advancing earthquake monitoring capability and searching for a path toward earthquake prediction, satellite electromagnetic, satellite infrared, and D-InSAR technologies were developed systematically, and remarkable progress was achieved through statistical research on historical earthquakes and an initial summary of space precursory characteristics, laying the foundation for gradually promoting practical use. On the basis of this work, the argumentation for the first space-based platform of China's earthquake stereoscopic observation system has been completed, and an integrated earthquake remote sensing application system has been designed comprehensively. Developing a space-based earthquake observation system has become a major trend of technological development in earthquake monitoring and prediction. We shall put more emphasis on the construction of the space segment of the China earthquake stereoscopic observation system and on imminent major scientific projects: an earthquake deformation observation system and application research combining InSAR, satellite gravity, and GNSS, aimed at medium- and long-term earthquake monitoring and forecasting; an infrared observation and technical system and application research, aimed at medium- and short-term earthquake monitoring and forecasting; and a satellite-based electromagnetic observation and technical system and application system, aimed at short-term and imminent earthquake monitoring.

  12. Earthquake history of Texas

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Seventeen earthquakes, intensity V or greater, have centered in Texas since 1882, when the first shock was reported. The strongest earthquake, a maximum intensity VIII, was in western Texas in 1931 and was felt over 1,165,000 km². Three shocks in the Panhandle region in 1925, 1936, and 1943 were widely felt.

  13. Earthquake research in China

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    The prediction of the Haicheng earthquake was an extraordinary achievement by the geophysical workers of the People's Republic of China, whose national program in earthquake research was less than 10 years old at the time. To study the background to this prediction, a delegation of 10 U.S. scientists, which I led, visited China in June 1976.

  14. Can we control earthquakes?

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    In 1966, it was discovered that high pressure injection of industrial waste fluids into the subsurface near Denver, Colo., was triggering earthquakes. While this was disturbing at the time, it was also exciting because there was immediate speculation that here at last was a mechanism to control earthquakes.  

  15. The USGS Earthquake Scenario Project

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Petersen, M. D.; Wald, L. A.; Frankel, A. D.; Quitoriano, V. R.; Lin, K.; Luco, N.; Mathias, S.; Bausch, D.

    2009-12-01

    The U.S. Geological Survey’s (USGS) Earthquake Hazards Program (EHP) is producing a comprehensive suite of earthquake scenarios for planning, mitigation, loss estimation, and scientific investigations. The Earthquake Scenario Project (ESP), though lacking clairvoyance, is a forward-looking project, estimating earthquake hazard and loss outcomes as they may occur one day. For each scenario event, fundamental input includes i) the magnitude and specified fault mechanism and dimensions, ii) regional Vs30 shear velocity values for site amplification, and iii) event metadata. A grid of standard ShakeMap ground motion parameters (PGA, PGV, and three spectral response periods) is then produced using the well-defined, regionally specific approach developed by the USGS National Seismic Hazard Mapping Project (NSHMP), including recent advances in empirical ground motion predictions (e.g., the NGA relations). The framework also allows for numerical (3D) ground motion computations for specific, detailed scenario analyses. Unlike NSHMP ground motions, for ESP scenarios local rock and soil site conditions and commensurate shaking amplifications are applied, based on detailed Vs30 maps where available or on topographic slope as a proxy. The scenario event set is composed primarily of events selected from the NSHMP, though custom events are also allowed based on coordination of the ESP team with regional coordinators, seismic hazard experts, seismic network operators, and response coordinators. The event set will be harmonized with existing and future scenario earthquake events produced regionally or by other researchers. The event list includes approximately 200 earthquakes in CA, 100 in NV, dozens in each of NM, UT, and WY, and a smaller number in other regions. Systematic output will include all standard ShakeMap products, including HAZUS input, GIS, KML, and XML files used for visualization, loss estimation, ShakeCast, PAGER, and other systems. All products will be

  16. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where the parameters are functions of the magnitude of the previous earthquake. We use these two models, alternately, to generate the dynamics of earthquake occurrence, and to estimate the probability of occurrence of several earthquakes within a year or a decade.
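
    The alternating construction can be sketched directly; the functional dependences and constants below are invented placeholders for illustration, not the fitted models from the paper, and the resulting event rates are not calibrated to any real catalog:

        import numpy as np

        rng = np.random.default_rng(3)
        m_min = 4.0
        mags, waits = [6.0], [30.0]                 # seed magnitude, days
        for _ in range(5000):
            # Pareto tail index depending on the previous waiting time
            alpha = 1.2 + 0.4 / (1.0 + waits[-1] / 50.0)
            mags.append(m_min * rng.random() ** (-1.0 / alpha))
            # Gamma waiting time with scale depending on the new magnitude
            scale = 20.0 * 10.0 ** (-0.3 * (mags[-1] - m_min))
            waits.append(rng.gamma(shape=1.5, scale=scale))
        t = np.cumsum(waits)
        print("events in year 1:", int((t <= 365.0).sum()))
        print("M>=7 events in decade 1:",
              int(((np.array(mags) >= 7.0) & (t <= 3650.0)).sum()))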

  17. Earthquake Hazard and Risk Assessment for Turkey

    NASA Astrophysics Data System (ADS)

    Betul Demircioglu, Mine; Sesetyan, Karin; Erdik, Mustafa

    2010-05-01

    Using a GIS environment to present the results, seismic risk analysis is considered a helpful tool to support decision making for planning and prioritizing seismic retrofit intervention programs at large scale. The main ingredients of seismic risk analysis are seismic hazard, a regional inventory of buildings, and vulnerability analysis. In this study, the assessment of the national earthquake hazard based on the NGA ground motion prediction models and comparisons of the results with previous models have been considered. An evaluation of seismic risk based on probabilistic intensity ground motion prediction for Turkey has been investigated. Following the macroseismic approach of Giovinazzi and Lagomarsino (2005), two alternative vulnerability models have been used to estimate building damage. The vulnerability and ductility indices for Turkey have been taken from the study of Giovinazzi (2005). These two vulnerability models have been compared with the observed earthquake damage database, and a good agreement between the curves has been clearly observed. In addition to building damage, casualty estimates based on three different methods, for each return period and for each vulnerability model, are presented to evaluate the earthquake loss. Using three different models of building replacement costs, the average annual loss (AAL) and probable maximum loss ratio (PMLR) due to regional earthquake hazard have been provided to form a basis for the improvement of the parametric insurance model and the determination of premium rates for the compulsory earthquake insurance in Turkey.
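
    For context, the macroseismic method of Giovinazzi and Lagomarsino referenced here expresses the mean damage grade \mu_D (on the EMS-98 scale of 0 to 5) in terms of the macroseismic intensity I, the vulnerability index V, and the ductility index Q:

        \mu_D = 2.5 \left[ 1 + \tanh\!\left( \frac{I + 6.25\,V - 13.1}{Q} \right) \right].

    The two alternative vulnerability models mentioned in the abstract presumably correspond to different choices of V (and its regional modifiers) for the Turkish building stock.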

  18. Maximum Earthquake Magnitude Assessments by Japanese Government Committees (Invited)

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2013-12-01

    The 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and such a gigantic earthquake had not been foreseen around Japan. After the 2011 disaster, various government committees in Japan have discussed and assessed the maximum credible earthquake size around Japan, but their values vary without definite consensus. I will review them with earthquakes along the Nankai Trough as an example. The Central Disaster Management Council (CDMC), under the Cabinet Office, set up a policy for future tsunami disaster mitigation. Possible future tsunamis are classified into two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with a low frequency of occurrence, for which saving people's lives is the first priority, with soft measures such as tsunami hazard maps, evacuation facilities or disaster education. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared. The assessments of L1 and L2 events are left to local governments. The CDMC also assigned M 9.1 as the maximum size of an earthquake along the Nankai Trough, then computed the ground shaking and tsunami inundation for several scenario earthquakes. The estimated loss is about ten times that of the 2011 disaster, with maximum casualties of 320,000 and an economic loss of 2 trillion dollars. The Headquarters for Earthquake Research Promotion (HERP), under MEXT, was set up after the 1995 Kobe earthquake and has made long-term forecasts of large earthquakes and published national seismic hazard maps. The future probability of earthquake occurrence, for example in the next 30 years, was calculated from past data of large earthquakes, on the basis of the characteristic earthquake model. The HERP recently revised the long-term forecast of the Nankai Trough earthquake; while the 30-year probability (60-70 %) is similar to the previous estimate, they noted the size can be M 8 to 9, considering the variability of past

  19. The mass balance of earthquakes and earthquake sequences

    NASA Astrophysics Data System (ADS)

    Marc, O.; Hovius, N.; Meunier, P.

    2016-04-01

    Large, compressional earthquakes cause surface uplift as well as widespread mass wasting. Knowledge of their trade-off is fragmentary. Combining a seismologically consistent model of earthquake-triggered landsliding and an analytical solution of coseismic surface displacement, we assess how the mass balance of single earthquakes and earthquake sequences depends on fault size and other geophysical parameters. We find that intermediate size earthquakes (Mw 6-7.3) may cause more erosion than uplift, controlled primarily by seismic source depth and landscape steepness, and less so by fault dip and rake. Such earthquakes can limit topographic growth, but our model indicates that both smaller and larger earthquakes (Mw < 6, Mw > 7.3) systematically cause mountain building. Earthquake sequences with a Gutenberg-Richter distribution have a greater tendency to lead to predominant erosion, than repeating earthquakes of the same magnitude, unless a fault can produce earthquakes with Mw > 8 or more.

  20. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use today. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985 (Ms = 7.8) earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since, for the 1985 earthquake, accelerograms were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.
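
    The destructiveness potential factor recommended in the closing sentence, introduced by Araya and Saragoni, normalizes the Arias intensity I_A by the squared intensity of zero crossings \nu_0 of the accelerogram:

        P_D = \frac{I_A}{\nu_0^{2}}, \qquad
        I_A = \frac{\pi}{2g} \int_{0}^{t_f} a^{2}(t)\, dt,

    so that long-duration, low-frequency records rate as more destructive than short, high-frequency records of equal Arias intensity.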

  1. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of the earthquake location minus longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves, its authors argue, how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events including the main event faithfully follow the straight-line curve.

  2. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information and warnings. The app has the function to distinguish earthquake shaking from daily human activities based on the different patterns behind the movements. It can also be triggered by the traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movements, it sends the trigger information, containing the time and location of the trigger, back to our server; at the same time, it stores the waveform data locally on the phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals and confirm earthquakes. The app therefore provides the basis to form a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
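
    The on-phone screening step can be caricatured with two window features of the kind reported for this project (interquartile range and zero-crossing rate); the function below is an invented stand-in, not the app's trained artificial-neural-network classifier, and the thresholds are illustrative:

        import numpy as np

        def quake_like(acc, fs, zc_max=15.0, iqr_min=0.01):
            # acc: (n, 3) accelerations in m/s^2, gravity removed.
            # Earthquakes tend to give sustained, lower-frequency shaking;
            # steps and phone handling give spiky, higher-frequency bursts.
            z = acc[:, 2] - acc[:, 2].mean()                       # demeaned vertical
            zc = np.mean(np.abs(np.diff(np.sign(z)))) * fs / 2.0   # crossings/s
            iqr = np.subtract(*np.percentile(z, [75, 25]))         # amplitude spread
            return zc < zc_max and iqr > iqr_min

        # usage: 5 s of 2 Hz shaking passes the screen
        fs = 50.0
        t = np.arange(0.0, 5.0, 1.0 / fs)
        shake = np.zeros((t.size, 3))
        shake[:, 2] = 0.05 * np.sin(2 * np.pi * 2.0 * t)
        print(quake_like(shake, fs))   # -> True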

  3. AGU develops earthquake curriculum

    NASA Astrophysics Data System (ADS)

    Blue, Charles

    AGU, in cooperation with the Federal Emergency Management Agency (FEMA), announces the production of a new curriculum package for grades 7-12 on the engineering and geophysical aspects of earthquakes.According to Frank Ireton, AGU's precollege education manager, “Both AGU and FEMA are working to promote the understanding of earthquake processes and their impact on the built environment. We are designing a program that involves students in learning how science, mathematics, and social studies concepts can be applied to reduce earthquake hazards.”

  4. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated other efforts in Peru in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  5. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is a high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions if there is no scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should also be developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance over a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory-scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal. PMID:27108213
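    As a sketch of what such a dimensionless model can look like, the snippet below fits a generic power-law mass-transfer correlation of the form Sh = a·Re^b·Sc^c to hypothetical laboratory data. The power-law form is a common choice for this kind of correlation, but the variables, data, and fitted coefficients here are illustrative assumptions, not the paper's actual model.

        import numpy as np

        # Hypothetical lab observations: (Reynolds, Schmidt, Sherwood).
        # Values are illustrative placeholders, not data from the paper.
        re = np.array([120.0, 250.0, 400.0, 800.0, 1500.0])
        sc = np.array([600.0, 600.0, 900.0, 900.0, 1200.0])
        sh = np.array([14.0, 21.0, 30.0, 44.0, 68.0])

        # Fit log(Sh) = log(a) + b*log(Re) + c*log(Sc) by least squares.
        A = np.column_stack([np.ones_like(re), np.log(re), np.log(sc)])
        coef, *_ = np.linalg.lstsq(A, np.log(sh), rcond=None)
        a, b, c = np.exp(coef[0]), coef[1], coef[2]
        print(f"Sh ≈ {a:.3g} * Re^{b:.2f} * Sc^{c:.2f}")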

  6. Cooperative earthquake research between the United States and the People's Republic of China

    SciTech Connect

    Russ, D.P.; Johnson, L.E.

    1986-01-01

    This paper describes cooperative research by scientists of the US and the People's Republic of China (PRC) which has resulted in important new findings concerning the fundamental characteristics of earthquakes and new insight into mitigating earthquake hazards. There have been over 35 projects cooperatively sponsored by the Earthquake Studies Protocol in the past 5 years. The projects are organized into seven annexes, including investigations in earthquake prediction, intraplate faults and earthquakes, earthquake engineering and hazards investigation, deep crustal structure, rock mechanics, seismology, and data exchange. Operational earthquake prediction experiments are currently being developed at two primary sites: western Yunnan Province near the town of Xiaguan, where there are several active faults, and the northeast China plain, where the devastating 1976 Tangshan earthquake occurred.

  7. To capture an earthquake

    SciTech Connect

    Ellsworth, W.L. )

    1990-11-01

    An earthquake model based on the theory of plate tectonics is presented. It is assumed that the plates behave elastically in response to slow, steady motions and the strains concentrate within the boundary zone between the plates. When the accumulated stresses exceed the bearing capacity of the rocks, the rocks break, producing an earthquake and releasing the accumulated stresses. As the steady movement of the plates continues, strain begins to reaccumulate. The cycle of strain accumulation and release is modeled using the motion of a block, pulled across a rough surface by a spring. A model earthquake can be predicted by taking into account a precursory event or the peak spring force prior to slip as measured in previous cycles. The model can be applied to faults, e.g., the San Andreas fault, if the past earthquake history of the fault and the rate of strain accumulation are known.
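    The block-and-spring analogy described above maps directly onto the classic one-block slider model; a minimal discretized version follows, with static/dynamic friction standing in for the bearing capacity of the rocks and the residual stress level. All parameter values are illustrative assumptions.

        # Minimal stick-slip sketch of the one-block spring-slider
        # earthquake analogy. All parameter values are illustrative.
        k = 1.0          # spring stiffness
        v_plate = 1.0    # steady plate (loading point) velocity
        f_static = 5.0   # force needed to start sliding ("bearing capacity")
        f_dynamic = 2.0  # residual friction once sliding stops

        x_load, x_block = 0.0, 0.0
        dt, t_end, t = 0.01, 60.0, 0.0
        while t < t_end:
            x_load += v_plate * dt           # plates move slowly, steadily
            force = k * (x_load - x_block)   # elastic strain accumulates
            if force >= f_static:            # rocks "break": model earthquake
                slip = (force - f_dynamic) / k
                x_block += slip              # sudden slip and stress drop
                print(f"t={t:6.2f}  slip={slip:.2f}")
            t += dt

    Under these parameters the model produces a regular cycle: strain accumulates for about three time units, then releases in an event, which is exactly the strain accumulation and release cycle the abstract describes; the model's predictability from the peak spring force breaks down once friction or loading is made heterogeneous.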

  8. Forecasting southern california earthquakes.

    PubMed

    Raleigh, C B; Sieh, K; Sykes, L R; Anderson, D L

    1982-09-17

    Since 1978 and 1979, California has had a significantly higher frequency of moderate to large earthquakes than in the preceding 25 years. In the past such periods have also been associated with major destructive earthquakes, of magnitude 7 or greater, and the annual probability of occurrence of such an event is now 13 percent in California. The increase in seismicity is associated with a marked deviation in the pattern of strain accumulation, a correlation that is physically plausible. Although great earthquakes (magnitude greater than 7.5) are too infrequent to have clear associations with any pattern of seismicity that is now observed, the San Andreas fault in southern California has accumulated sufficient potential displacement since the last rupture in 1857 to generate a great earthquake along part or all of its length. PMID:17740956

  9. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  10. Building losses assessment for Lushan earthquake utilization multisource remote sensing data and GIS

    NASA Astrophysics Data System (ADS)

    Nie, Juan; Yang, Siquan; Fan, Yida; Wen, Qi; Xu, Feng; Li, Lingling

    2015-12-01

    On 20 April 2013, a catastrophic earthquake of magnitude 7.0 struck Lushan County, northwestern Sichuan Province, China; it is known in China as the Lushan earthquake. The Lushan earthquake damaged many buildings, and the extent of building loss is one basis for emergency relief and reconstruction, so the building losses from the earthquake had to be assessed. Remote sensing data and geographic information systems (GIS) can be employed for this assessment. This paper reports the building-loss assessment results for the Lushan earthquake disaster obtained using multisource remote sensing data and GIS. The assessment indicated that 3.2% of buildings in the affected areas completely collapsed, while 12% and 12.5% of buildings were heavily damaged and slightly damaged, respectively. The completely collapsed, heavily damaged, and slightly damaged buildings were mainly located in Danling County, Hongya County, Lushan County, Mingshan County, Qionglai County, Tianquan County, and Yingjing County.

  11. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  12. The SCEC-USGS Dynamic Earthquake Rupture Code Comparison Exercise - Simulations of Large Earthquakes and Strong Ground Motions

    NASA Astrophysics Data System (ADS)

    Harris, R.

    2015-12-01

    I summarize the progress by the Southern California Earthquake Center (SCEC) and U.S. Geological Survey (USGS) Dynamic Rupture Code Comparison Group, which examines whether the results produced by multiple researchers' earthquake simulation codes agree with each other when computing benchmark scenarios of dynamically propagating earthquake ruptures. These types of computer simulations have no analytical solutions with which to compare, so we use qualitative and quantitative inter-code comparisons to check that they are operating satisfactorily. To date we have tested the codes against benchmark exercises that incorporate a range of features, including single and multiple planar faults, single rough faults, slip-weakening, rate-state, and thermal-pressurization friction, elastic and visco-plastic off-fault behavior, complete stress drops that lead to extreme ground motion, heterogeneous initial stresses, and heterogeneous material (rock) structure. Our goal is reproducibility, and we focus on the types of earthquake-simulation assumptions that have been or will be used in basic studies of earthquake physics, or in direct applications to specific earthquake hazard problems. In short, we aim to make sure that when our earthquake-simulation codes simulate these earthquake scenarios and the resulting strong ground shaking, the codes are operating as expected. For more introductory information about our group and our work, please see our group's overview papers, Harris et al., Seismological Research Letters, 2009, and Harris et al., Seismological Research Letters, 2011, along with our website, scecdata.usc.edu/cvws.
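    A minimal sketch of the kind of quantitative inter-code comparison described above: given two codes' synthetic seismograms for the same benchmark, compute a normalized RMS misfit. The metric choice, names, and toy data are illustrative assumptions, not the group's actual scoring procedure.

        import numpy as np

        def rms_misfit(wave_a, wave_b):
            """Normalized RMS difference between two codes' seismograms,
            assumed to share the same time sampling. Illustrative only."""
            num = np.sqrt(np.mean((wave_a - wave_b) ** 2))
            den = np.sqrt(np.mean(wave_b ** 2))
            return num / max(den, 1e-20)

        t = np.linspace(0, 10, 1000)
        code_a = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)
        code_b = code_a + np.random.normal(0, 0.01, t.size)  # small mismatch
        print(f"misfit = {rms_misfit(code_a, code_b):.4f}")  # near 0 = agree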

  13. Historical Earthquakes and Active Structure for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashivli, Otar

    2014-05-01

    Long-term seismic history is an important foundation for reliable assessment of seismic hazard and risk; therefore, the completeness of earthquake catalogues over the longest possible historical span is very important. Surviving historical sources, as well as dedicated research in the institutes, museums, libraries, and archives of Georgia, the Caucasus, and the Middle East, indicate a high level of seismicity that caused numerous human casualties and much destruction on the territory of Georgia during the historical period. The study and detailed analysis of these original documents and studies have allowed us to create a new catalogue of historical earthquakes of Georgia from 1250 BC to 1900 AD. The method of the study is based on a multidisciplinary approach, i.e., on the joint use of methods from history, paleoseismology, archeoseismology, seismotectonics, geomorphology, etc. We present here a new parametric catalogue of 44 historical earthquakes of Georgia and a full "descriptor" of all the phenomena described in it. The summarized map of the distribution of maximum damage in the historical period (before 1900) on the territory of Georgia, constructed on this basis, clearly shows the main features of the seismic field during this period. In particular, in the axial part and on the southern slope of the Greater Caucasus there is a seismic gap, which was filled in 1991 by the strongest earthquake, and its aftershocks, in Racha. In addition, it is also evident that the very high seismic activity in the central and eastern parts of the Javakheti highland is not described in the historical materials, a fact that calls for further searches of various sources containing data about historical earthquakes. We hope that this catalogue will make it possible to create a new joint (instrumental and historical) parametric earthquake catalogue of Georgia and will serve to assess the real seismic hazard and risk in the country.

  14. Dynamic stress changes during earthquake rupture

    USGS Publications Warehouse

    Day, S.M.; Yu, G.; Wald, D.J.

    1998-01-01

    We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.

  15. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimizing this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing the expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves and that eyewitnesses can therefore be considered as ground motion sensors. Flashsourcing discriminates felt
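    The flashsourcing idea lends itself to a compact sketch: bucket website hits by minute and flag a surge relative to a baseline rate. The threshold, names, and toy data below are illustrative assumptions, not EMSC's production logic.

        from collections import Counter

        def detect_surge(hit_minutes, baseline_per_min=5, factor=10):
            """Flag minutes whose hit count exceeds factor x baseline.

            hit_minutes: iterable of integer minute stamps for site hits.
            Illustrative stand-in for flashsourcing, not EMSC's code.
            """
            counts = Counter(hit_minutes)
            return sorted(m for m, n in counts.items()
                          if n > factor * baseline_per_min)

        # Toy usage: quiet traffic, then a burst after a felt earthquake.
        traffic = [0]*5 + [1]*4 + [2]*6 + [3]*80 + [4]*120 + [5]*30
        print(detect_surge(traffic))  # -> [3, 4] under these assumptions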

  16. Injection-induced earthquakes.

    PubMed

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard. PMID:23846903

  17. On a Riesz basis of exponentials related to the eigenvalues of an analytic operator and application to a non-selfadjoint problem deduced from a perturbation method for sound radiation

    SciTech Connect

    Ellouz, Hanen; Feki, Ines; Jeribi, Aref

    2013-11-15

    In the present paper, we prove that the family of exponentials associated to the eigenvalues of the perturbed operator T(ε) := T₀ + εT₁ + ε²T₂ + … + εᵏTₖ + … forms a Riesz basis in L²(0, T), T > 0, where ε ∈ ℂ, T₀ is a closed densely defined linear operator on a separable Hilbert space H with domain D(T₀) having isolated eigenvalues with multiplicity one, while T₁, T₂, … are linear operators on H having the same domain D ⊃ D(T₀) and satisfying a specific growth inequality. After that, we generalize this result using an H-Lipschitz function. As an application, we consider a non-selfadjoint problem deduced from a perturbation method for sound radiation.

  18. Seafloor earthquake measurement system, SEMS IV

    SciTech Connect

    Platzbecker, M.R.; Ehasz, J.P.; Franco, R.J.

    1997-07-01

    Staff of the Telemetry Technology Development Department (2664) have, in support of the U.S. Interior Department Minerals Management Service (MMS), developed and deployed the Seafloor Earthquake Measurement System IV (SEMS IV). The result of this development project is a series of three fully operational seafloor seismic monitor systems located at offshore platforms: Eureka, Grace, and Irene. The instrument probes are embedded from three to seven feet into the seafloor and hardwired to seismic data recorders installed topside at the offshore platforms. The probes and underwater cables were designed to survive the seafloor environment with an operational life of five years. The units have been operational for two years and have produced recordings of several minor earthquakes in that time. Sandia Labs will transfer operation of SEMS IV to MMS contractors in the coming months. 29 figs., 25 tabs.

  19. Practical approaches to earthquake prediction and warning

    NASA Astrophysics Data System (ADS)

    Kisslinger, Carl

    1984-04-01

    The title chosen for this renewal of the U.S.-Japan prediction seminar series reflects optimism, perhaps more widespread in Japan than in the United States, that research on earthquake prediction has progressed to a stage at which it is appropriate to begin testing operational forecast systems. This is not to suggest that American researchers do not recognize very substantial gains in understanding earthquake processes and earthquake recurrence, but rather that we are at the point of initiating pilot prediction experiments rather than asserting that we are prepared to start making earthquake predictions in a routine mode.For the sixth time since 1964, with support from the National Science Foundation and the Japan Society for the Promotion of Science, as well as substantial support from the U.S. Geological Survey (U.S.G.S.) for participation of a good representation of its own scientists, earthquake specialists from the two countries came together on November 7-11, 1983, to review progress of the recent past and share ideas about promising directions for future efforts. If one counts the 1980 Ewing symposium on prediction, sponsored by Lamont-Doherty Geological Observatory, which, though multinational, served the same purpose, one finds a continuity in these interchanges that has made them especially productive and stimulating for both scientific communities. The conveners this time were Chris Scholz, Lamont-Doherty, for the United States and Tsuneji Rikitake, Nihon University, for Japan.

  20. Earthquake swarms on Mount Erebus, Antarctica

    NASA Astrophysics Data System (ADS)

    Kaminuma, Katsutada; Baba, Megumi; Ueki, Sadato

    1986-12-01

    Mount Erebus (3794 m), located on Ross Island in McMurdo Sound, is one of the few active volcanoes in Antarctica. A high-sensitivity seismic network has been operated by Japanese and US parties on and around the volcano since December 1980. The results of these observations show two kinds of seismic activity on Ross Island: activity concentrated near the summit of Mount Erebus associated with Strombolian eruptions, and micro-earthquake activity spread through Mount Erebus and the surrounding area. Seismicity on Mount Erebus has been quite high, usually exceeding 20 volcanic earthquakes per day. They frequently occur in swarms with daily counts exceeding 100 events. Sixteen earthquake swarms with more than 250 events per day were recorded by the seismic network during the three-year period 1982-1984, and three notable earthquake swarms out of the sixteen were recognized: in October 1982 (named 82-C), March-April 1984 (84-B), and July 1984 (84-F). Swarms 84-B and 84-F had a large total number of earthquakes and a large Ishimoto-Iida "m"; hence these two swarms are presumed to constitute one of the precursory phenomena to the new eruption, which took place on 13 September 1984 and lasted a few months.

  1. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As 2009 was the 200th anniversary of Darwin's birth, it also marked 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the plate tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  2. Prospective Tests of Southern California Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.

    2004-12-01

    We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res.,100, 3,943-3,959, 1995]. For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability
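    The likelihood testing described above can be sketched compactly: for a gridded forecast of yearly rates, score the observed earthquake counts with a Poisson log-likelihood and compare two forecasts by their score difference, in the spirit of the cited Kagan and Jackson approach. The bin definitions and numbers below are made-up assumptions for illustration.

        import numpy as np
        from scipy.stats import poisson

        def log_likelihood(rates, counts):
            """Poisson log-likelihood of observed counts given forecast
            rates, summed over space-magnitude bins (generic sketch)."""
            return poisson.logpmf(counts, rates).sum()

        # Two competing forecasts over the same five bins, plus observations.
        forecast_a = np.array([0.5, 1.2, 0.1, 0.8, 0.3])
        forecast_b = np.array([0.9, 0.9, 0.2, 0.5, 0.4])
        observed   = np.array([1,   1,   0,   1,   0  ])

        ll_a = log_likelihood(forecast_a, observed)
        ll_b = log_likelihood(forecast_b, observed)
        print(f"log-likelihood ratio A vs B: {ll_a - ll_b:+.3f}")  # >0 favors A

    In the actual testing framework, the significance of such a ratio is assessed against its distribution under each forecast, which yields the alpha and beta error probabilities mentioned in the abstract.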

  3. New characteristics of intensity assessment of the Sichuan Lushan "4.20" Ms 7.0 earthquake

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Yan, Peilei; Chen, Xiangzhao

    2014-08-01

    The rapid and accurate post-earthquake assessment of the macroscopic influence of seismic ground motion is significant for earthquake emergency relief, post-earthquake reconstruction, and scientific research. The seismic intensity distribution map released by the Lushan earthquake field team of the China Earthquake Administration (CEA) five days after the strong earthquake (M 7.0) that occurred in Lushan County of Sichuan's Ya'an City at 8:02 on April 20, 2013, provides a scientific basis for emergency relief, economic loss assessment, and post-earthquake reconstruction. In this paper, the means for blind estimation of macroscopic intensity, field estimation of macroscopic intensity, and review of intensity, as well as the corresponding problems, are discussed in detail, and the intensity distribution characteristics of the Lushan "4.20" M 7.0 earthquake and its influential factors are analyzed, providing a reference for future seismic intensity assessments.

  4. Trial application of guidelines for nuclear plant response to an earthquake. Final report

    SciTech Connect

    Schmidt, W.; Oliver, R.; O'Connor, W.

    1993-09-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. These guidelines are published in EPRI report NP-6695, "Guidelines for Nuclear Plant Response to an Earthquake," dated December 1989. This report includes two sets of nuclear plant procedures which were prepared to implement the guidelines of EPRI report NP-6695. The first set was developed by the Toledo Edison Company's Davis-Besse plant. Davis-Besse is a pressurized water reactor (PWR) and contains relatively standard seismic monitoring instrumentation typical of many domestic nuclear plants. The second set of procedures was prepared by Yankee Atomic Electric Company for the Vermont Yankee facility. This plant is a boiling water reactor (BWR) with state-of-the-art seismic monitoring and PC-based data processing equipment, including software developed specifically to implement the OBE Exceedance Criterion presented in EPRI report NP-5930, "A Criterion for Determining Exceedance of the Operating Basis Earthquake." The two sets of procedures are intended to demonstrate how two different nuclear utilities have interpreted and applied the EPRI guidance given in report NP-6695.

  5. Triggering of repeated earthquakes

    NASA Astrophysics Data System (ADS)

    Sobolev, G. A.; Zakrzhevskaya, N. A.; Sobolev, D. G.

    2016-03-01

    Based on the analysis of the world's earthquakes with magnitudes M ≥ 6.5 for 1960-2013, it is shown that they cause global-scale coherent seismic oscillations which manifest themselves most distinctly in the period interval of 4-6 min during the 1-3 days after the event. After these earthquakes, a repeated shock has an increased probability of occurring in different seismically active regions located as far away as a few thousand km from the previous event; i.e., a remote interaction of seismic events takes place. The number of repeated shocks N(t) decreases with time, which characterizes the memory of the lithosphere about the impact that has occurred. The time decay N(t) can be approximated by linear, exponential, and power-law dependences. No distinct correlation between the spatial locations of the initial and repeated earthquakes is revealed. The probable triggering mechanisms of the remote interaction between the earthquakes are discussed. Surface seismic waves traveling several times around the Earth, coherent oscillations, and a global source are the most plausible candidates. This may lead to the accumulation and coalescence of ruptures in the highly stressed or weakened domains of a seismically active region, which increases the probability of a repeated earthquake.
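    Comparing the exponential and power-law decays mentioned above is a routine curve-fitting exercise; the sketch below fits both forms to hypothetical daily shock counts with scipy. The data and parameter values are illustrative assumptions, not the paper's observations.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical daily counts of repeated shocks after a main event.
        t = np.arange(1, 11, dtype=float)             # days since main shock
        n = np.array([40., 22., 15., 11., 9., 7., 6., 5., 5., 4.])

        exp_model = lambda t, a, b: a * np.exp(-b * t)  # exponential decay
        pow_model = lambda t, a, p: a * t ** (-p)       # power-law decay

        pe, _ = curve_fit(exp_model, t, n, p0=(40, 0.3))
        pp, _ = curve_fit(pow_model, t, n, p0=(40, 1.0))
        for name, model, pars in [("exp", exp_model, pe),
                                  ("pow", pow_model, pp)]:
            rss = np.sum((n - model(t, *pars)) ** 2)    # residual sum of squares
            print(f"{name}: params={np.round(pars, 2)}  RSS={rss:.1f}")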

  6. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  7. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
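    PAGER's core exposure estimate can be sketched as a simple aggregation: overlay a population grid with a shaking-intensity grid and sum the population in each intensity band. The grids and band edges below are illustrative assumptions, not PAGER's actual data or implementation.

        import numpy as np

        # Illustrative 1-D stand-ins for co-registered grids (real ones
        # are 2-D rasters of the epicentral region).
        population = np.array([1200, 50, 900, 3000, 10, 700])   # people/cell
        intensity  = np.array([ 4.1, 7.2, 5.6,  6.3, 8.0, 3.9]) # MMI/cell

        bands = [(4, 5), (5, 6), (6, 7), (7, 8), (8, 9)]        # MMI bands
        for lo, hi in bands:
            mask = (intensity >= lo) & (intensity < hi)
            print(f"MMI {lo}-{hi}: {population[mask].sum():>6d} people exposed")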

  8. The recent tectonic stress districts and strong earthquakes in China

    NASA Astrophysics Data System (ADS)

    Xie, F.; Zhang, H.

    2010-12-01

    According to the stress state and force-source character, the recent tectonic stress field of China is preliminarily divided into four classes. Among them, there are two first-order districts, four second-order districts, five third-order districts, and twenty-six fourth-order districts. Analyzing these tectonic stress districts and strong earthquakes, the close relation between them can be summarized as follows: (1) the boundaries of stress districts, especially the first- or second-order boundaries controlled by the interaction of tectonic plates, experience strong earthquakes easily and frequently; (2) stress districts with transformations in stress direction, regime type, and stress value are zones where strong earthquakes concentrate; (3) stress districts with local stress differentiation within an otherwise homogeneous stress background are places where strong earthquakes are relatively concentrated. On the basis of this research, we discuss the present dynamic environment of China in terms of force sources and plate movement.

  9. Estimating surface faulting impacts from the shakeout scenario earthquake

    USGS Publications Warehouse

    Treiman, J.A.; Ponti, D.J.

    2011-01-01

    An earthquake scenario, based on a kinematic rupture model, has been prepared for a Mw 7.8 earthquake on the southern San Andreas Fault. The rupture distribution, in the context of other historic large earthquakes, is judged reasonable for the purposes of this scenario. This model is used as the basis for generating a surface rupture map and for assessing potential direct impacts on lifelines and other infrastructure. Modeling the surface rupture involves identifying fault traces on which to place the rupture, assigning slip values to the fault traces, and characterizing the specific displacements that would occur to each lifeline impacted by the rupture. Different approaches were required to address variable slip distribution in response to a variety of fault patterns. Our results, involving judgment and experience, represent one plausible outcome and are not predictive because of the variable nature of surface rupture. © 2011, Earthquake Engineering Research Institute.
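    Assigning slip along a mapped trace and reading off the expected displacement where a lifeline crosses it, as described above, reduces to interpolation along the fault; the sketch below does this in one dimension along-strike. The distances, slip values, and crossing names are illustrative assumptions, not the scenario's actual slip model.

        import numpy as np

        # Slip assigned at control points along the fault trace.
        # Values are illustrative, not the ShakeOut scenario's model.
        along_strike_km = np.array([0.0, 50.0, 120.0, 200.0, 300.0])
        slip_m          = np.array([1.0,  6.5,   4.0,   7.5,   2.0])

        # Hypothetical lifeline crossings at known along-strike positions.
        crossings = {"aqueduct": 75.0, "pipeline": 140.0, "railway": 260.0}

        for name, x in crossings.items():
            d = np.interp(x, along_strike_km, slip_m)  # linear interpolation
            print(f"{name:9s} at km {x:5.1f}: expected offset ≈ {d:.1f} m")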

  10. Earthquakes: Megathrusts and mountain building

    NASA Astrophysics Data System (ADS)

    Briggs, Rich

    2016-05-01

    Coastlines above subduction zones slowly emerge from the sea despite repeated drowning by great, shallow earthquakes. Analysis of the Chilean coast suggests that moderate-to-large, deeper earthquakes may be responsible for the net uplift.