Science.gov

Sample records for assessing earthquake hazards

  1. Numerical earthquake simulations for seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik; Sokolov, Vladimir; Soloviev, Alexander

    2017-04-01

    A comprehensive seismic hazard assessment can contribute to earthquake preparedness and to preventive measures aimed at reducing the impacts of earthquakes, especially in view of growing population and increasing vulnerability and exposure. Realistic earthquake simulations coupled with a seismic hazard analysis can provide better assessments of potential ground shaking due to large earthquakes. We present a model of block-and-fault dynamics, which simulates earthquakes in response to lithosphere movements and allows for studying the influence of fault-network properties on seismic patterns. Using case studies (e.g., the Tibet-Himalayan and Caucasian regions), we analyse the model's performance in reproducing basic features of the observed seismicity, such as the frequency-magnitude relationship, clustering of earthquakes, occurrence of large events, fault slip rates, and earthquake mechanisms. We examine a new approach to probabilistic seismic hazard assessment based on instrumentally recorded, historical, and simulated earthquakes. Based on predicted and observed peak ground acceleration values, we show that the hazard level associated with large events increases significantly if the long record of simulated seismicity is considered in the hazard assessment.
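
    A simulated catalogue such as the one above is commonly checked against the Gutenberg-Richter frequency-magnitude relationship, log10 N = a - b*M, via the maximum-likelihood b-value estimate. A minimal sketch (Aki's estimator with Utsu's binning correction; the catalogue below is synthetic and purely illustrative, not output of the block-and-fault model):

```python
import math
import random

def gr_b_value(magnitudes, m_c, dm=0.0):
    """Maximum-likelihood b-value of the Gutenberg-Richter relation for a
    catalogue complete above magnitude m_c; dm is the magnitude bin width
    (Utsu's correction; use dm=0 for continuous magnitudes)."""
    mags = [m for m in magnitudes if m >= m_c]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# Synthetic catalogue drawn from a GR law with b = 1.0 above m_c = 4.0
random.seed(0)
b_true, m_c = 1.0, 4.0
catalogue = [m_c - math.log10(random.random()) / b_true for _ in range(50000)]
b_hat = gr_b_value(catalogue, m_c)  # should recover a value near 1.0
```

The same estimate applied separately to observed and simulated catalogues gives one quick consistency check between the two.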

  2. Earthquake hazard assessment after Mexico (1985).

    PubMed

    Degg, M R

    1989-09-01

    The 1985 Mexican earthquake ranks foremost amongst the major earthquake disasters of the twentieth century. One of the few positive aspects of the disaster is that it provided massive quantities of data that would otherwise have been unobtainable. Every opportunity should be taken to incorporate the findings from these data in earthquake hazard assessments. The purpose of this paper is to provide a succinct summary of some of the more important lessons from Mexico. It stems from detailed field investigations, and subsequent analyses, conducted by the author on behalf of reinsurance companies.

  3. Probabilistic earthquake hazard assessment for Peninsular India

    NASA Astrophysics Data System (ADS)

    Ashish; Lindholm, C.; Parvez, I. A.; Kühn, D.

    2016-04-01

    In this paper, a new probabilistic seismic hazard assessment (PSHA) is presented for Peninsular India. The PSHA has been performed using three different recurrence models: a classical seismic zonation model, a fault model, and a grid model. The grid model is based on a non-parameterized recurrence model and uses an adaptation of the kernel-based method, which has not been applied to this region before. The results obtained from the three models have been combined in a logic-tree structure in order to investigate the impact of different model weights. Three suitable attenuation relations have been considered in terms of spectral acceleration for the stable continental crust as well as for the active crust within the Gujarat region. While Peninsular India has experienced large earthquakes, e.g., at Latur and Jabalpur, it is in general a stable continental region with little earthquake activity, as also confirmed by our hazard results. On the other hand, our study demonstrates that both the Gujarat and Koyna regions are exposed to a high seismic hazard. The peak ground acceleration for 10% exceedance probability in 50 years is 0.4 g in Koyna and up to 0.3 g in the Kutch region of Gujarat. With respect to spectral acceleration at 1 Hz, estimated ground-motion amplitudes are higher in Gujarat than in the Koyna region due to the higher frequency of occurrence of larger earthquakes. We discuss the higher PGA levels for Koyna compared to Gujarat and do not accept them uncritically.
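
    In its simplest mean-hazard reading, the logic-tree combination of the three recurrence models amounts to a weighted average of the branch hazard curves on a common ground-motion grid. A minimal sketch (the curves and weights below are illustrative placeholders, not values from the study):

```python
def combine_hazard_curves(model_curves, weights):
    """Weighted mean hazard curve over logic-tree branches.
    model_curves: one list of annual exceedance rates per branch, all
    evaluated at the same ground-motion levels; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "branch weights must sum to 1"
    return [sum(w * curve[i] for w, curve in zip(weights, model_curves))
            for i in range(len(model_curves[0]))]

# e.g. zonation, fault, and grid branches with weights 0.4 / 0.3 / 0.3
rates = combine_hazard_curves(
    [[1e-2, 1e-3], [2e-2, 2e-3], [3e-2, 3e-3]], [0.4, 0.3, 0.3])
```

Varying the weight vector and recombining is exactly the sensitivity exercise the abstract describes.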

  5. Assessing the earthquake hazards in urban areas

    USGS Publications Warehouse

    Hays, W.W.; Gori, P.L.; Kockelman, W.J.

    1988-01-01

    Major urban areas in widely scattered geographic locations across the United States are at varying degrees of risk from earthquakes. The locations of these urban areas include Charleston, South Carolina; Memphis, Tennessee; St. Louis, Missouri; Salt Lake City, Utah; Seattle-Tacoma, Washington; Portland, Oregon; and Anchorage, Alaska; even Boston, Massachusetts, and Buffalo, New York, have a history of large earthquakes. Cooperative research during the past decade has focused on assessing the nature and degree of the risk or seismic hazard in the broad geographic regions around each urban area. The strategy since the 1970s has been to bring together local, State, and Federal resources to solve the problem of assessing seismic risk. Successful cooperative programs have been launched in the San Francisco Bay and Los Angeles regions in California and the Wasatch Front region in Utah.

  6. Earthquake Hazard Assessment: Basics of Evaluation

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or, in just a few rare cases, centuries). Evidently, all this complicates reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic within the most popular objectivist view of probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies to the evaluation of SHA, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared with those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management.
These basics of SHA evaluation are exemplified briefly with a few examples, which are analysed in more detail in a poster of
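
    For a single alert map, the Error Diagram evaluation reduces to two numbers: the fraction tau of the space(-time) measure that is alerted, and the fraction nu of target events missed outside the alert; a random guess satisfies nu ≈ 1 - tau on average, which is the Seismic Roulette baseline. A minimal sketch, assuming hazard is expressed per cell (the cell measures and event counts below are illustrative):

```python
def error_diagram_point(cell_measure, is_alerted, events_in_cell):
    """One point (tau, nu) on the Error Diagram (Molchan diagram).
    cell_measure: space(-time) measure of each cell, e.g. its area;
    is_alerted: whether the cell is declared hazardous;
    events_in_cell: number of observed target events in the cell."""
    total = sum(cell_measure)
    tau = sum(w for w, a in zip(cell_measure, is_alerted) if a) / total
    missed = sum(n for n, a in zip(events_in_cell, is_alerted) if not a)
    nu = missed / sum(events_in_cell)
    return tau, nu

# Four equal cells, two alerted; one of the four events falls outside the alert
tau, nu = error_diagram_point([1, 1, 1, 1],
                              [True, True, False, False],
                              [2, 1, 0, 1])
```

Sweeping the alert threshold traces the full diagram, against which nu = 1 - tau serves as the random-guess diagonal.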

  7. Earthquake Hazard Assessment: an Independent Review

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared with those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".

  8. Spatial earthquake hazard assessment of Evansville, Indiana

    USGS Publications Warehouse

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.

    1997-01-01

    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground-motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes, occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground-motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding the groundwater elevations made in previous studies are also considered to have had a significant effect on the results.

  9. USGS Training in Afghanistan: Modern Earthquake Hazards Assessments

    NASA Astrophysics Data System (ADS)

    Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.

    2007-05-01

    Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking but also from liquefaction and extensive landsliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth Science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006 in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and to be actively involved in earthquake hazard assessments themselves. The December 2006 training course was taught by four lecturers, with all lectures and slides presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS, who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments" and the second entitled "Applied Geophysics for Mineral Resource Assessments."

  10. Modern Earthquake Hazard Assessments in Afghanistan: A USGS Training Course

    NASA Astrophysics Data System (ADS)

    Garthwaite, M.; Mooney, W. D.; Medlin, J.; Holzer, T.; McGarr, A.; Bohannon, R.

    2007-12-01

    Afghanistan is located in a tectonically active region at the western extent of the Indo-Asian collision zone, where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can cause damage, not only from strong ground shaking and surface rupture, but also from liquefaction and extensive landsliding. The M=6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghan communities to such hazards, and resulted in at least 1000 fatalities. This training course in modern earthquake hazard assessments is an integral part of the international effort to provide technical assistance to Afghanistan using an "end-to-end" approach. This approach involves providing assistance in all stages of hazard assessment, from identifying earthquakes to disseminating information on mitigation strategies to the public. The purpose of this training course, held December 2-6, 2006 at the Afghan Geological Survey in Kabul, was to provide a solid background in the relevant seismological and geological methods for preparing for future earthquakes. With this information, participants may now be expected to educate other members of the Afghan community. In addition, they are better prepared to conduct earthquake hazard assessments and to build the capabilities of the Afghan Geological Survey. The training course was taught using a series of PowerPoint lectures, with all lectures presented in English and translated into Dari, one of the two main languages of Afghanistan. The majority of lecture slides were also annotated in both English and Dari. Lectures were provided to the students in both hardcopy and digital formats. As part of the ongoing USGS participation in the program, additional training sessions are planned in the subjects of field geology, modern concepts in Earth science, mineral resource assessments, and applied geophysics.

  11. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    NASA Astrophysics Data System (ADS)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results, such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation's (Caltrans') fault-based deterministic seismic hazard (DSH) map. The latter product provides values of earthquake ground motion based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault-rupture displacement hazards, detailed study of the same faults applies. Originally, hospitals, dams, and other critical facilities used seismic design criteria based on deterministic seismic hazard analysis (DSHA). However, probabilistic methods grew and took hold, introducing earthquake design criteria based on time factors and quantifying "uncertainties" by procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach, and some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and its use is now the focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. Caltrans prefers the DSH method because it believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for ensuring public safety. Its time-invariant values help to produce robust design criteria that are soundly based on physical evidence, and it is the method that leaves the least opportunity for unwelcome surprises.

  12. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes any kind of reliable probabilistic statement about narrowly localized seismic hazard premature. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when statistical significance testing is applied. Seismic events, including mega-earthquakes, cluster, displaying behaviours that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even within a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviours to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can mislead into scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses the gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention.
The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a

  13. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

    To assess seismic hazard with temporal change in Taiwan, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model is adopted to describe the rupture recurrence intervals of specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity-rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress-triggering effect. The stress enhanced by the Meinong ML 6.6 earthquake of February 6th, 2016 also significantly increased the rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level for the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published model, and thus offers decision makers and public officials an adequate basis for rapid evaluation of and response to future emergency scenarios such as victim relocation and sheltering.
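
    The long-term part of such a model can be illustrated with the BPT renewal distribution, whose CDF has the standard inverse-Gaussian closed form. A sketch of the conditional rupture probability given the elapsed time (mean recurrence and aperiodicity values below are illustrative, not TEM parameters):

```python
import math

def _phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence interval mu and aperiodicity alpha."""
    lam = mu / alpha ** 2
    a = math.sqrt(lam / t)
    return _phi(a * (t / mu - 1.0)) + math.exp(2.0 * lam / mu) * _phi(-a * (t / mu + 1.0))

def conditional_rupture_prob(t_elapsed, dt, mu, alpha):
    """P(rupture within the next dt years | no rupture for t_elapsed years)."""
    f0 = bpt_cdf(t_elapsed, mu, alpha)
    return (bpt_cdf(t_elapsed + dt, mu, alpha) - f0) / (1.0 - f0)

# e.g. mean recurrence 100 yr, aperiodicity 0.5, 150 yr since the last rupture
p30 = conditional_rupture_prob(150.0, 30.0, 100.0, 0.5)
```

The longer the elapsed time relative to the mean recurrence, the higher this conditional probability, which is the sense in which "long elapsed time" raises the long-term hazard.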

  14. Earthquake Hazard and Risk Assessment Based on Unified Scaling Law for Earthquakes: State of Gujarat, India

    NASA Astrophysics Data System (ADS)

    Parvez, Imtiyaz A.; Nekrasova, Anastasia; Kossobokov, Vladimir

    2017-03-01

    The Gujarat state of India is one of the most seismically active intracontinental regions of the world. Historically, it has experienced many damaging earthquakes, including the devastating 1819 Rann of Kachchh and 2001 Bhuj earthquakes. The effect of the latter is grossly underestimated by the Global Seismic Hazard Assessment Program (GSHAP). To assess a more adequate earthquake hazard for the state of Gujarat, we apply the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter recurrence relation by taking into account the naturally fractal distribution of earthquake loci. USLE has evident implications, since any estimate of seismic hazard depends on the size of the territory considered and, therefore, may differ dramatically from the actual one when scaled down from the enveloping area of investigation to the proportion of the area of interest (e.g. of a city). We cross-compare the seismic hazard maps compiled for the same standard regular grid of 0.2° × 0.2° (1) in terms of design ground acceleration based on the neo-deterministic approach, (2) in terms of probabilistic exceedance of peak ground acceleration by GSHAP, and (3) the one resulting from the USLE application. Finally, we present maps of seismic risk for the state of Gujarat, integrating the obtained seismic hazard, population density based on India's 2011 Census data, and a few model assumptions of vulnerability.
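
    The territory-size dependence follows directly from the USLE form of the recurrence relation, log10 N(M, L) = A + B*(5 - M) + C*log10 L, where N is the expected annual number of earthquakes of magnitude M or larger within an area of linear dimension L and C reflects the fractal dimension of the earthquake loci. A minimal sketch (the coefficients below are placeholders, not the values estimated for Gujarat):

```python
import math

def usle_annual_rate(magnitude, linear_size_km, a, b, c):
    """Expected annual number of earthquakes with magnitude >= `magnitude`
    inside an area of linear dimension `linear_size_km`, per the USLE:
    log10 N = A + B*(5 - M) + C*log10 L."""
    return 10.0 ** (a + b * (5.0 - magnitude) + c * math.log10(linear_size_km))

# Scaling a regional estimate (L = 200 km) down to a city (L = 20 km):
# with C = 1.2 the rate drops by a factor of 10**1.2, not by the area ratio 100.
regional = usle_annual_rate(5.0, 200.0, a=-1.0, b=1.0, c=1.2)
city = usle_annual_rate(5.0, 20.0, a=-1.0, b=1.0, c=1.2)
```

Because C generally differs from 2, downscaling by area ratio (as a uniform-seismicity assumption would) misstates the rate, which is the point the abstract makes about city-scale hazard.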

  15. Assessing earthquake hazard map performance with historical shaking intensity data

    NASA Astrophysics Data System (ADS)

    Stein, S.; Brooks, E. M.; Spencer, B. D.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.; Peresan, A.

    2016-12-01

    The performance of earthquake hazard maps often differs from that expected, for reasons that are unclear. We are exploring what maps do, rather than what they should do. Probabilistic seismic hazard maps assume that, at a point on the map, the probability p that shaking during t years of observation will exceed the value expected once in a T-year return period is described by p = 1 - exp(-t/T). This probability is small for small t/T and grows with t. Maps can be assessed by comparing the fraction of sites where shaking exceeded the mapped threshold to p. However, the short time since hazard maps began to be made poses a challenge. If large earthquakes occurred during the few years after a map was made, they could produce shaking exceeding that predicted at a larger-than-expected fraction of the sites, implying that the map may not be performing well. However, if no higher shaking occurred at these sites in the subsequent years, the map would eventually be recognized to be performing as designed. Thus we use historical intensity data to examine how well maps describe past shaking. Although such assessments are not true tests, in that they compare the maps to data that were available when the maps were made, they give useful insight into the maps' performance. We compare how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard maps, uniform maps, and randomized maps. By the metric implicit in the maps, namely that the predicted ground motion should be exceeded at only a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the actual maps do better than uniform or randomized maps. Probabilistic and deterministic hazard maps for Italy dramatically overpredict the recorded shaking in a 2200-year-long intensity catalog, illustrating problems in the data, models, or both. These
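
    The metric implicit in the maps can be made concrete in a few lines: compute p = 1 - exp(-t/T) and compare it with the fraction of sites whose observed maximum shaking exceeded the mapped value (the site values below are illustrative):

```python
import math

def exceedance_probability(t_years, return_period_years):
    """p = 1 - exp(-t/T): probability that the T-year ground motion is
    exceeded at a site during t years, under the Poisson assumption."""
    return 1.0 - math.exp(-t_years / return_period_years)

def fraction_exceeding(observed, mapped):
    """Fraction of sites where observed shaking exceeded the mapped value."""
    return sum(o > m for o, m in zip(observed, mapped)) / len(observed)

# 50 years of observations against a 475-year return-period map
p = exceedance_probability(50.0, 475.0)              # about 0.10
frac = fraction_exceeding([0.31, 0.12, 0.48, 0.05],  # observed PGA per site (g)
                          [0.30, 0.20, 0.40, 0.10])  # mapped threshold per site
```

A map performs as designed, by this metric, when frac is statistically consistent with p over many sites.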

  16. Harmonized Probabilistic Seismic Hazard Assessment in Europe: Earthquake Geology Applied

    NASA Astrophysics Data System (ADS)

    Woessner, J.; Danciu, L.; Giardini, D.; Share Consortium

    2012-04-01

    Probabilistic seismic hazard assessment (PSHA) aims to characterize the best available knowledge of the seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results from PSHAs form the baseline for informed decision-making and provide essential input to every risk assessment application. SHARE is an EC-FP7-funded project to create a testable, time-independent, community-based hazard model for the Euro-Mediterranean region. SHARE scientists are creating a model framework and infrastructure for a harmonized PSHA. The results will serve as a reference for the application of Eurocode 8 and are envisioned to provide homogeneous input for state-of-the-art seismic safety assessment for critical industry. Hazard is harmonized at the level of the input data and of the model-building procedure across borders and tectonic features of the European-Mediterranean region. An updated earthquake catalogue and a harmonized database of seismogenic sources, together with adjusted ground motion prediction equations (GMPEs), form the basis for a borderless assessment. We require transparent and reproducible strategies to estimate parameter values and their uncertainties within the source-model assessment and the contributions of the GMPEs. The SHARE model accounts for uncertainties via a logic tree. Epistemic uncertainties within the seismic source model are represented by four source-model options, including area sources, fault sources, and kernel-smoothing approaches, and aleatory uncertainties by distributions of activity rates and maximum magnitudes. Epistemic uncertainties in predicted ground motions are considered through multiple GMPEs as a function of tectonic setting and are treated as correlated. For practical implementation, epistemic uncertainties in the source model (i.e. dip and strike angles) are treated as aleatory, and a mean seismicity model is considered. The final results contain the full distribution of ground-motion variability.
This contribution will feature preliminary

  17. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  19. Generating Random Earthquake Events for Probabilistic Tsunami Hazard Assessment

    NASA Astrophysics Data System (ADS)

    LeVeque, Randall J.; Waagan, Knut; González, Frank I.; Rim, Donsub; Lin, Guang

    2016-12-01

    To perform probabilistic tsunami hazard assessment for subduction zone earthquakes, it is necessary to start with a catalog of possible future events along with the annual probability of occurrence, or a probability distribution of such events that can be easily sampled. For near-field events, the distribution of slip on the fault can have a significant effect on the resulting tsunami. We present an approach to defining a probability distribution based on subdividing the fault geometry into many subfaults and prescribing a desired covariance matrix relating slip on one subfault to slip on any other subfault. The eigenvalues and eigenvectors of this matrix are then used to define a Karhunen-Loève expansion for random slip patterns. This is similar to a spectral representation of random slip based on Fourier series but conforms to a general fault geometry. We show that only a few terms in this series are needed to represent the features of the slip distribution that are most important in tsunami generation, first with a simple one-dimensional example where slip varies only in the down-dip direction and then on a portion of the Cascadia Subduction Zone.
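
    The construction described above can be sketched in a few lines: build a covariance matrix over subfaults, take its leading eigenpairs, and sum a handful of weighted random modes. Everything below (the exponential covariance, the 1-D down-dip geometry, the parameter values) is an illustrative assumption, not the paper's actual model:

```python
import numpy as np

def kl_slip_samples(centers, corr_length, mean_slip, n_terms, n_samples, rng):
    """Random slip patterns from a truncated Karhunen-Loeve expansion.
    centers: (n_subfaults, d) subfault centroids; an exponential covariance
    exp(-distance/corr_length) relates slip on any two subfaults."""
    dist = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    cov = np.exp(-dist / corr_length)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_terms]       # keep the n_terms largest
    lam, modes = eigvals[idx], eigvecs[:, idx]
    z = rng.standard_normal((n_terms, n_samples))   # random KL coefficients
    return mean_slip[:, None] + modes @ (np.sqrt(lam)[:, None] * z)

# 1-D example: slip varies only in the down-dip direction over 20 subfaults
rng = np.random.default_rng(1)
centers = np.linspace(0.0, 100.0, 20)[:, None]      # km down-dip
samples = kl_slip_samples(centers, 30.0, np.full(20, 5.0), 5, 1000, rng)
```

Truncating to the leading modes is what makes the representation compact: smooth, long-wavelength slip variations, which dominate tsunami generation, are captured by the first few eigenvectors.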

  20. Seismic hazard assessment for Myanmar: Earthquake model database, ground-motion scenarios, and probabilistic assessments

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Thant, M.; Maung Maung, P.; Sieh, K.

    2015-12-01

    We have constructed an earthquake and fault database, conducted a series of ground-shaking scenarios, and proposed seismic hazard maps for all of Myanmar and hazard curves for selected cities. Our earthquake database integrates the ISC, ISC-GEM, and global ANSS Comprehensive Catalogues, with harmonized magnitude scales and without duplicate events. Our active-fault database includes active-fault data from previous studies. Using the parameters from these updated databases (i.e. the Gutenberg-Richter relationship, slip rate, maximum magnitude, and the elapsed time since the last events), we have determined the earthquake recurrence models of the seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests matching modelled ground motions to the felt intensities of earthquakes. From the case of the 1975 Bagan earthquake, we determined that a scenario using the ground motion prediction equation (GMPE) of Atkinson and Boore (2003) best fits the behaviour of subduction events, while the 2011 Tarlay and 2012 Thabeikkyin events suggested that the GMPE of Akkar and Cagnan (2010) fits crustal earthquakes best. We thus incorporated the best-fitting GMPEs and site conditions based on Vs30 (the average shear-wave velocity down to 30 m depth), derived from analysis of topographic slope and from microtremor array measurements, to assess the seismic hazard. The hazard is highest in regions close to the Sagaing Fault and along the western coast of Myanmar, where seismic sources have short earthquake recurrence intervals and/or long elapsed times since their last events. The hazard curves for the cities of Bago, Mandalay, Sagaing, Taungoo, and Yangon show higher hazard for sites close to an active fault or with a low Vs30, e.g. downtown Sagaing and the Shwemawdaw Pagoda in Bago.
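
    The GMPE-selection step can be sketched as a simple residual comparison: predict intensities for the calibration events with each candidate relation and keep the one with the smallest misfit against the felt intensities. The candidate functions and data below are toy placeholders, not the published GMPE forms cited in the abstract:

```python
import math

def best_fitting_gmpe(gmpes, observations):
    """Return the name of the GMPE with the smallest mean squared residual.
    gmpes: name -> callable(magnitude, distance_km) giving predicted intensity;
    observations: (magnitude, distance_km, felt_intensity) triples."""
    def misfit(predict):
        return sum((predict(m, r) - i) ** 2
                   for m, r, i in observations) / len(observations)
    return min(gmpes, key=lambda name: misfit(gmpes[name]))

# Two toy attenuation models; the felt intensities are consistent with model "a"
gmpes = {"a": lambda m, r: 1.5 * m - 2.0 * math.log10(r),
         "b": lambda m, r: 1.0 * m - 1.0 * math.log10(r)}
obs = [(6.5, 50.0, 1.5 * 6.5 - 2.0 * math.log10(50.0)),
       (7.0, 100.0, 1.5 * 7.0 - 2.0 * math.log10(100.0))]
choice = best_fitting_gmpe(gmpes, obs)
```

In practice the comparison is done per tectonic regime, which is why the study selects one GMPE for subduction events and another for crustal ones.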

  1. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    NASA Astrophysics Data System (ADS)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. This gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study seeks to determine how the lay public interprets earthquake hazard information as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternative ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to the public. Participants from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that comparisons between the interpretations of scientific experts and those of different groups of laypeople will both enhance theoretical understanding of the factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). 
Prospect

  2. Some differences in seismic hazard assessment for natural and fluid-induced earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2014-12-01

    Although there is little doubt that fluid-induced earthquakes contribute significantly to the seismic hazard in some parts of the United States, assessing this contribution in ways consistent with hazard assessment for natural earthquakes is proving challenging. For natural earthquakes, the hazard is considered to be independent of time, whereas fluid-induced seismicity shows considerable time dependence, as evidenced, for instance, by the dramatic increase in Oklahoma seismicity in recent years. Case histories of earthquakes induced by the development of Enhanced Geothermal Systems and by wastewater injection at depth illustrate a few of the problems. Analyses of earthquake sequences induced by these operations indicate that the rate of earthquake occurrence is proportional to the rate of injection, a factor that, on a broad scale, depends on the level of energy-production activity. For natural earthquakes, in contrast, the rate of earthquake occurrence depends on time-independent tectonic factors, including the long-term slip rates across known faults. Maximum magnitude assessments for natural and fluid-induced earthquake sources also contrast in behavior. For a natural earthquake source, maximum magnitude is commonly assessed from empirical relations between magnitude and the area of a potentially active fault. The same procedure applied to fluid-induced earthquakes yields magnitudes systematically higher than what is observed. For instance, the maximum magnitude estimated from the fault area of the Prague, OK, main shock of 6 November 2011 is 6.2, whereas the magnitude measured from seismic data is 5.65 (Sun and Hartzell, 2014). For fluid-induced earthquakes, maximum magnitude instead appears to be limited by the volume of fluid injected before the largest earthquake. This implies that for a given fluid-injection project, the upper limit on magnitude increases as long as injection continues.
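The volume-based limit on induced maximum magnitude can be sketched with the bound M0_max = G * dV proposed by McGarr (2014), where G is the modulus of rigidity and dV the net injected volume; the shear modulus and volume below are illustrative assumptions:

```python
import math

def max_moment_from_volume(delta_v_m3, shear_modulus_pa=3.0e10):
    """Upper bound on seismic moment (N*m) from net injected volume,
    following the McGarr (2014) bound M0_max = G * dV.
    The shear modulus value is a typical crustal assumption."""
    return shear_modulus_pa * delta_v_m3

def moment_magnitude(m0_nm):
    """Hanks-Kanamori moment magnitude from seismic moment in N*m."""
    return (math.log10(m0_nm) - 9.05) / 1.5

# An illustrative 1e7 m3 of net injected fluid bounds M0 at 3e17 N*m,
# i.e. roughly Mw 5.6 -- comparable in scale to the Prague, OK event.
mw_bound = moment_magnitude(max_moment_from_volume(1.0e7))
```

Under this bound the allowable maximum magnitude grows only logarithmically with cumulative injected volume, which is why it keeps increasing as long as injection continues.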

  3. Improving earthquake hazard assessments in Italy: An alternative to “Texas sharpshooting”

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Panza, Giuliano F.

    2012-12-01

    The 20 May 2012 M = 6.1 earthquake that struck the Emilia region of northern Italy illustrates a common problem afflicting earthquake hazard assessment. It occurred in an area classified as "low seismic hazard" on the current national seismic hazard map (Gruppo di Lavoro, Redazione della mappa di pericolosità sismica, rapporto conclusivo, 2004, http://zonesismiche.mi.ingv.it/mappa_ps_apr04/italia.html), adopted in 2006. That revision of the seismic code was motivated by the 2002 M = 5.7 earthquake that struck S. Giuliano di Puglia in central Italy, likewise a previously classified low-hazard area, causing damage and casualties. The previous code, in turn, had been updated in 1981-1984 after earlier maps missed the 1980 M = 6.5 Irpinia earthquake.

  4. International Collaboration for Strengthening Capacity to Assess Earthquake Hazard in Indonesia

    NASA Astrophysics Data System (ADS)

    Cummins, P. R.; Hidayati, S.; Suhardjono, S.; Meilano, I.; Natawidjaja, D.

    2012-12-01

    Indonesia has experienced a dramatic increase in earthquake risk due to rapid population growth in the 20th century, much of it in areas near subduction zone plate boundaries that are prone to earthquakes. While recent seismic hazard assessments have resulted in better building codes that can inform safer building practices, many of the fundamental parameters controlling earthquake occurrence and ground shaking - e.g., fault slip rates, earthquake scaling relations, ground motion prediction equations, and site response - could still be better constrained. In recognition of the need to improve the information on which seismic hazard assessments are based, the Australian Agency for International Development (AusAID) and Indonesia's National Agency for Disaster Management (BNPB), through the Australia-Indonesia Facility for Disaster Reduction, have initiated a 4-year project designed to strengthen the Government of Indonesia's capacity to reliably assess earthquake hazard. This project is a collaboration of Australian institutions, including Geoscience Australia and the Australian National University, with Indonesian government agencies and universities, including the Agency for Meteorology, Climatology and Geophysics, the Geological Agency, the Indonesian Institute of Sciences, and Bandung Institute of Technology. Effective earthquake hazard assessment requires input from many types of research: geological studies of active faults; seismological studies of crustal structure, earthquake sources, and ground motion; PSHA methodology; and geodetic studies of crustal strain rates. The project is large and diverse, spanning all of these components, which will be briefly reviewed in this presentation.

  5. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The goal of earthquake hazard mitigation is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of (1) hazard estimation, (2) vulnerability analysis, and (3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may cover large areas, as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate also provides the data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe and which system components are most susceptible to failure, and allow evaluation of the combined effects of a severe earthquake on a whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. 
Economic collapse may ensue if damaged workplaces, disruption of utilities, and
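The risk-as-product formulation described above can be sketched in a few lines; all of the input numbers are illustrative placeholders, not values from any actual assessment:

```python
def expected_loss(hazard_prob, vulnerability, exposure_value):
    """Risk (expected loss) as the product of hazard (probability of the
    damaging event over the period of interest), vulnerability (expected
    damage fraction given the event), and exposure (value at risk)."""
    return hazard_prob * vulnerability * exposure_value

# Illustration: a 10% chance of damaging shaking over the planning period,
# a 25% mean damage ratio for this construction class, a $2M facility.
loss = expected_loss(0.10, 0.25, 2_000_000)
print(loss)  # → 50000.0
```

Real assessments replace each scalar with a distribution (hazard curves, fragility functions, exposure inventories) and integrate, but the multiplicative structure is the same.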

  6. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    NASA Astrophysics Data System (ADS)

    Babayev, G.; Ismail-Zadeh, A.; Le Mouël, J.-L.

    2010-12-01

    A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides, significant underground water level fluctuations), and a lack of public awareness regarding seismic hazard all increase the vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of peak ground acceleration, PGA), vulnerability (due to building fragility, population features, gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information for identifying the factors that influence the risk. Deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in parts of the downtown, and PGA attains its maximum values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of the urban population, exposure, and the pattern of peak ground acceleration all contribute to the seismic risk, while the vulnerability factors play the most prominent role for all earthquake scenarios. Our results can inform strategic countermeasure plans for earthquake risk mitigation in Baku.

  7. Three Cups of Tea: Building Collaborations to Assess Earthquake Hazard in Pakistan

    NASA Astrophysics Data System (ADS)

    Hough, Susan E.; Yong, Alan

    2009-12-01

    Modern Methods in Seismic Hazard Assessment; Nagarkot, Nepal, 8-12 June 2009; The M7.6 Muzaffarabad, Pakistan, earthquake struck Pakistani Kashmir on 8 October 2005, claiming more than 80,000 lives. The earthquake underscored two points about earthquake hazard in Pakistan: first, that it is high; and second, that it is poorly understood. In Karachi, for example, hazard is generally considered to be low, yet this rapidly growing megacity is as close to a major strike-slip fault system as Los Angeles is to the San Andreas fault. The Pakistani engineering community has sought guidance from seismologists on improved characterization of seismic hazard. This requires both improved hazard assessment methodology and improved constraints on the critical inputs to seismic hazard maps, for example, assessment of fault slip rates and geological site characterization; these inputs are currently unavailable. Efforts to map seismicity and attenuation and to estimate fault slip rates have been hampered by political instability. Yet there is no shortage of intellectual energy: Pakistan boasts an eager community of trained earthquake professionals.

  8. Considering potential seismic sources in earthquake hazard assessment for Northern Iran

    NASA Astrophysics Data System (ADS)

    Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila

    2014-07-01

    Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of the Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historical and instrumental seismicity documents the occurrence of severe earthquakes causing many deaths and large losses in the region. With the growth of seismological and tectonic data, an updated seismic hazard assessment is a worthwhile input to emergency management programs and to long-term development plans for the urban and rural areas of this region. In the present study, armed with the up-to-date information required for seismic hazard assessment - geological data and the active tectonic setting for a thorough investigation of active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue - we carry out a probabilistic seismic hazard assessment for the region using three recent ground motion prediction equations. The logic tree method is used to capture the epistemic uncertainty of the assessment in the delineation of seismic sources and the selection of attenuation relations. The results are compared to the current code-prescribed seismic hazard for the region and are discussed in detail to explore their variation across each branch of the logic tree. Seismic hazard maps of peak ground acceleration at rock sites for 475- and 2,475-year return periods are also provided.

  9. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in 2001. The January 13, 2001 earthquake, Mw 7.7, occurred within the subducting Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1,300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2x10^5 m^3), produced major damage to buildings and infrastructure and 500 fatalities; a neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7x10^5 m^3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km3 and 12 km3 produced hazardous barrier lakes on the Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after the two quakes; most occurred in pyroclastic deposits, with volumes of less than 1x10^3 m^3. The present work aims to define the relationship between earthquake intensity and the size and areal distribution of induced landslides, and to refine earthquake intensity estimates in sparsely populated zones by using landslide effects. The landslides triggered by the 2001 seismic sequences provide useful indications for realistic seismic hazard assessment, and a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  10. Long aftershock sequences within continents and implications for earthquake hazard assessment.

    PubMed

    Stein, Seth; Liu, Mian

    2009-11-05

    One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere.
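The inverse scaling between aftershock-sequence duration and fault loading rate can be sketched directly; the reference values (a decade of aftershocks on a plate boundary loaded at ~30 mm/yr) are illustrative assumptions for the sketch, not figures taken from the paper:

```python
def aftershock_duration_years(loading_rate_mm_per_yr,
                              ref_rate_mm_per_yr=30.0,
                              ref_duration_yr=10.0):
    """Aftershock sequence duration scaling inversely with fault loading
    rate, as in the simple model described above. Calibrated to an
    assumed ~decade-long sequence at a ~30 mm/yr plate boundary."""
    return ref_duration_yr * ref_rate_mm_per_yr / loading_rate_mm_per_yr

# A continental-interior fault loaded 100x more slowly (0.3 mm/yr) would
# host a sequence ~100x longer: on the order of a millennium.
print(aftershock_duration_years(0.3))  # → 1000.0
```

This is why small earthquakes in plate interiors may mark faults still relaxing from centuries-old main shocks rather than sites of future large events.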

  11. Recent destructive earthquakes and international collaboration for seismic hazard assessment in the East Asia region

    NASA Astrophysics Data System (ADS)

    Hao, K.; Fujiwara, H.

    2013-12-01

    Recent destructive earthquakes in East Asia have claimed a third of a million lives. People learn from such lessons but forget them after a few generations, even when the lessons are carved in stone. Probabilistic seismic hazard assessment (SHA) is a scientific way to define earthquake zones and to guide urban planning and construction. NIED has promoted SHA as a national mission of Japan for over 10 years, and as an international cooperation with neighboring countries since the 2008 Wenchuan earthquake. We initiated a China-Japan-Korea SHA strategic cooperative program for the next-generation map, supported by MOST-JST-NRF, in 2010, and a cooperative program with the Taiwan Earthquake Model in 2012, as well as with many other parties worldwide. NIED subsequently joined the Global Earthquake Model (GEM), where its SHA methodologies and technologies were highly valued. As a representative of Japan, NIED will continue to work closely with all members of GEM, not only on the GEM global components but also on its regional programs. Seismic hazard assessment must be carried out with the information that exists, with its epistemic uncertainty. We routinely improve existing models to carefully treat active faults, earthquake records, and magnitudes, using the latest authorized information provided by the Earthquake Research Committee of the Headquarters for Earthquake Research Promotion. Since the 2011 Tohoku earthquake, we have been re-examining the national SHA maps for even long-term, low-probability hazard. We have set up a platform at http://www.j-shis.bosai.go.jp/en to exchange SHA information and to share our experiences, lessons, and knowledge internationally. Probabilistic SHA concepts and seismic risk mitigation issues need to be promoted internationally through constant outreach and media engagement. (Figure: major earthquakes in the East Asian region that claimed a third of a million lives; slab depth shown with contours after Hayes et al., 2011.)

  12. Scaling of intraplate earthquake recurrence interval with fault length and implications for seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Marrett, Randall

    1994-12-01

    Consensus indicates that faults follow power-law scaling, although significant uncertainty remains about the values of important parameters. Combining these scaling relationships with power-law scaling relationships for earthquakes suggests that intraplate earthquake recurrence interval scales with fault length. Regional scaling data may be locally calibrated to yield a site-specific seismic hazard assessment tool. Scaling data from small faults (those that do not span the seismogenic layer) suggest that recurrence interval varies as a negative power of fault length. Due to uncertainties regarding the recently recognized changes in scaling for large earthquakes, it is unclear whether recurrence interval varies as a negative or positive power of fault length for large faults (those that span the seismogenic layer). This question is of critical importance for seismic hazard assessment.

  13. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

    Advancing our understanding of the earthquake cycle and our ability to assess earthquake hazards is of great importance. Dynamic earthquake hazard assessments, resolved over a range of spatial and time scales, will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  14. Field-based assessment of landslide hazards resulting from the 2015 Gorkha, Nepal earthquake sequence

    NASA Astrophysics Data System (ADS)

    Collins, B. D.; Jibson, R.

    2015-12-01

    The M7.8 2015 Gorkha, Nepal earthquake sequence caused thousands of fatalities, destroyed entire villages, and displaced millions of residents. The earthquake sequence also triggered thousands of landslides in the steep Himalayan topography of Nepal and China; these landslides were responsible for hundreds of fatalities and blocked vital roads, trails, and rivers. With the support of USAID's Office of Foreign Disaster Assistance, the U.S. Geological Survey responded to this crisis by providing landslide-hazard expertise to Nepalese agencies and affected villages. Assessments of landslide hazards following earthquakes are essential to identify vulnerable populations and infrastructure and to inform government agencies working on rebuilding and mitigation efforts. However, assessing landslide hazards over an entire earthquake-affected region (in Nepal, estimated at ~30,000 km2) of exceedingly steep, inaccessible topography presents a number of logistical challenges. We focused the scope of our assessment by conducting helicopter- and ground-based landslide assessments in 12 priority areas of central Nepal identified a priori from satellite photo interpretation performed in conjunction with an international consortium of remote-sensing experts. Our reconnaissance covered 3,200 km of helicopter flight path, extending over an area of approximately 8,000 km2. During our field work, we made 17 site-specific assessments and provided landslide hazard information to both villages and in-country agencies. Upon returning from the field, we compiled our observations and further identified and assessed 74 river-blocking landslide dams, 12% of which formed impoundments larger than 1,000 m2 in surface area. These assessments, along with more than 11 hours of helicopter-based video and an overview of hazards expected during the 2015 summer monsoon, have been publicly released (http://dx.doi.org/10.3133/ofr20151142) for use by in-country and international agencies.

  15. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    SciTech Connect

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf

    2016-04-18

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions: the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the fitted distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times across the three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We conclude that the Weibull distribution is more suitable than the other distributions in this region.
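As an illustration of the model-comparison step, the Kolmogorov-Smirnov statistic for a candidate distribution can be computed directly; the inter-event times and fitted parameters below are invented for illustration (the study itself used Easyfit and Matlab):

```python
import math

def weibull_cdf(x, shape, scale):
    """Two-parameter Weibull cumulative distribution function."""
    return 1.0 - math.exp(-((x / scale) ** shape))

def ks_statistic(sample, cdf):
    """Kolmogorov-Smirnov statistic: the maximum gap between the
    empirical CDF of the sample and the candidate model CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

# Illustrative inter-event times (years) and illustrative fitted parameters.
waits = [3.1, 5.4, 2.2, 8.9, 4.0, 6.7, 1.5, 7.3]
d = ks_statistic(waits, lambda x: weibull_cdf(x, shape=1.3, scale=5.0))
# A smaller D indicates a better fit; comparing D across the candidate
# distributions is the essence of the selection procedure in the abstract.
```

The same `ks_statistic` works for any candidate CDF, so the Frechet and three-parameter Weibull alternatives plug in as different `cdf` callables.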

  16. East Meets West: An Earthquake in India Helps Hazard Assessment in the Central United States

    USGS Publications Warehouse

    ,

    2002-01-01

    Although geographically distant, the State of Gujarat in India bears many geological similarities to the Mississippi Valley in the Central United States. The Mississippi Valley contains the New Madrid seismic zone that, during the winter of 1811-1812, produced the three largest historical earthquakes ever in the continental United States and remains the most seismically active region east of the Rocky Mountains. Large damaging earthquakes are rare in ‘intraplate’ settings like New Madrid and Gujarat, far from the boundaries of the world’s great tectonic plates. Long-lasting evidence left by these earthquakes is subtle (fig. 1). Thus, each intraplate earthquake provides unique opportunities to make huge advances in our ability to assess and understand the hazards posed by such events.

  17. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard is assessed for the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state. The likelihood of the MCE is assumed qualitatively, by using late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest, upper-bound potential earthquake in moment magnitude, and it supersedes and automatically subsumes all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), known as the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships of the time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily used by geologists, seismologists, and engineers for many years. Some engineers involved in siting large, important projects, for example dams and nuclear power plants, continued to challenge the maps. 
The second edition map was completed in 1985 incorporating more faults, improving MCE's estimation method, and using new ground motion attenuation relationships from the latest published

  18. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 18. Errors in Probabilistic Seismic Hazard Analysis.

    DTIC Science & Technology

    1982-01-01

    Paper S-73-1. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 18: Errors in Probabilistic Seismic Hazard Analysis. By Daniele Veneziano, Department of Civil Engineering, Massachusetts Institute of Technology, Cambridge, Mass. 02139. January 1982. Report 18 of a series.

  19. Seismic hazard assessment in the Tibet-Himalayan region based on observed and modeled extreme earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Sokolov, V. Y.

    2013-12-01

    Ground shaking due to recent catastrophic earthquakes has been significantly higher than that predicted by probabilistic seismic hazard analysis (PSHA). One reason is that extreme (large-magnitude and rare) seismic events are in most cases not accounted for in PSHA, owing to the lack of information and the unknown recurrence times of such extremes. We present a new approach to the assessment of regional seismic hazard, which incorporates observed (recorded and historic) seismicity together with modeled extreme events. We apply this approach to PSHA of the Tibet-Himalayan region. Large-magnitude events simulated over several thousand years in models of lithospheric block-and-fault dynamics, consistent with the regional geophysical and geodetic data, are employed together with the observed earthquakes in a Monte-Carlo PSHA. Earthquake scenarios are generated stochastically to sample the magnitude and spatial distribution of seismicity (observed and modeled) as well as the distribution of ground motion for each seismic event. The peak ground acceleration (PGA) values (that is, ground shaking at a site) expected to be exceeded at least once in 50 years with a probability of 10% are mapped and compared to PGA values observed and predicted earlier. The results show that the PGA values predicted by our assessment fit the observed ground shaking of the 2008 Wenchuan earthquake much better than those predicted by conventional PSHA. Our approach to seismic hazard assessment provides a better understanding of ground shaking due to possible large-magnitude events and could be useful for risk assessment, earthquake engineering purposes, and emergency planning.
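The Monte-Carlo PSHA logic of sampling event catalogs and ground motions can be sketched as below; the occurrence rate, magnitude sampling, and "attenuation" relation are toy placeholders, not the block-and-fault model or the GMPEs used in the study:

```python
import math
import random

random.seed(0)

def simulate_50yr_peak_pga(annual_rate=0.2, years=50):
    """One synthetic 50-year shaking history at a site. Yearly Bernoulli
    trials approximate a Poisson occurrence process; magnitudes follow a
    truncated Gutenberg-Richter law (b = 1, Mmin = 5, Mmax = 8); the
    attenuation relation is a toy placeholder, not a real GMPE."""
    n = sum(1 for _ in range(years) if random.random() < annual_rate)
    peak = 0.0
    for _ in range(n):
        mag = min(5.0 - math.log10(random.random()), 8.0)
        dist = random.uniform(10.0, 150.0)                   # epicentral distance, km
        log_pga = -1.0 + 0.3 * mag - 1.2 * math.log10(dist)  # toy attenuation, g
        peak = max(peak, 10 ** log_pga)
    return peak

# The PGA exceeded with 10% probability in 50 years is the 90th
# percentile of many simulated 50-year peak values.
peaks = sorted(simulate_50yr_peak_pga() for _ in range(2000))
pga_10pct_50yr = peaks[int(0.9 * len(peaks))]
```

The study's point is captured by the catalog step: if the simulated catalogs include rare extreme events that the short observed record lacks, the resulting 90th-percentile ground motion rises accordingly.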

  20. From Earthquake Prediction Research to Time-Variable Seismic Hazard Assessment Applications

    NASA Astrophysics Data System (ADS)

    Bormann, Peter

    2011-01-01

    The first part of the paper defines the terms and classifications common in earthquake prediction research and applications. This is followed by short reviews of major earthquake prediction programs initiated since World War II in several countries, for example the former USSR, China, Japan, the United States, and several European countries. It outlines the underlying expectations, concepts, and hypotheses, introduces the technologies and methodologies applied and some of the results obtained, which include both partial successes and failures. Emphasis is laid on discussing the scientific reasons why earthquake prediction research is so difficult and demanding and why the prospects are still so vague, at least as far as short-term and imminent predictions are concerned. However, classical probabilistic seismic hazard assessments, widely applied during the last few decades, have also clearly revealed their limitations. In their simple form, they are time-independent earthquake rupture forecasts based on the assumption of stable long-term recurrence of earthquakes in the seismotectonic areas under consideration. Therefore, during the last decade, earthquake prediction research and pilot applications have focused mainly on the development and rigorous testing of long and medium-term rupture forecast models in which event probabilities are conditioned by the occurrence of previous earthquakes, and on their integration into neo-deterministic approaches for improved time-variable seismic hazard assessment. The latter uses stress-renewal models that are calibrated for variations in the earthquake cycle as assessed on the basis of historical, paleoseismic, and other data, often complemented by multi-scale seismicity models, the use of pattern-recognition algorithms, and site-dependent strong-motion scenario modeling. International partnerships and a global infrastructure for comparative testing have recently been developed, for example the Collaboratory for the Study of

  1. Assessing Earthquake Hazard Map Performance Using Historical Shaking Intensity Data and Instrumental Data

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S.; Spencer, B. D.

    2016-12-01

    Assessing the performance of earthquake hazard maps is important but challenging, and various approaches can be taken. In principle, map performance can be assessed for any given observation period of t years and map return period T, because the probability p that shaking at a site will exceed the mapped value should be described by Poissonian probability p = 1 - exp(-t/T). However, because any real earthquake shaking history is one sample of many possible ones, a real history could easily deviate from this ideal behavior for t/T small, even if the hazard map is quite good. The observed shaking should be closer to ideal behavior for larger t/T. Hence the longer the observation period is compared to the map return period, the better the map's performance can be assessed. Given the short time period since hazard maps began to be made, typical maps with return periods of hundreds or thousands of years may be best assessed using historical intensity data. We are using such data for Italy and Japan, comparing historical intensity catalogs to national hazard maps. Although the results show intriguing differences from ideal behavior, the cause of these differences is unclear because these assessments face the limitations involved with possible biases in intensity assignment, biases in space-time sampling, and the fact that they are hindcasts rather than forecasts. Hence we are also exploring how the recent (2016) USGS one-year seismic hazard forecast for the Central and Eastern United States from induced and natural earthquakes predicts observed shaking. This assessment has the advantages of using instrumental data spanning an observation period comparable to the map's return period, but faces the challenge that the map is the first attempt to forecast the behavior of new and only partly understood processes.
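    The map-testing relation in this abstract is simple enough to sketch in a few lines of Python. The return period, observation window, and site counts below are hypothetical illustration values, not data from the study: the idea is just to compare the Poissonian expectation p = 1 - exp(-t/T) with the fraction of sites where mapped shaking was actually exceeded.

```python
import math

def expected_exceedance_fraction(t_years, return_period_years):
    """Poissonian probability that shaking at a site exceeds the mapped
    value at least once during an observation window of t years."""
    return 1.0 - math.exp(-t_years / return_period_years)

def observed_exceedance_fraction(site_exceeded_flags):
    """Fraction of sites where observed shaking exceeded the mapped value."""
    return sum(site_exceeded_flags) / len(site_exceeded_flags)

# Hypothetical example: a 50-year intensity record tested against a
# 475-year return-period map, with exceedances observed at 12 of 100 sites.
p_expected = expected_exceedance_fraction(50.0, 475.0)           # about 0.10
p_observed = observed_exceedance_fraction([True] * 12 + [False] * 88)  # 0.12
```

    With t/T this small a single history can easily deviate from the expectation, which is exactly the sampling caveat the abstract raises.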

  2. Assessing Earthquake Hazard Map Performance Using Historical Shaking Intensity Data and Instrumental Data

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Spencer, Bruce; Peresan, Antonella

    2017-04-01

    Assessing the performance of earthquake hazard maps is important but challenging, and various approaches can be taken. In principle, map performance can be assessed for any given observation period of t years and map return period T, because the probability p that shaking will exceed the mapped value should be described by Poissonian probability p = 1 - exp(-t/T). However, because any real earthquake shaking history is one sample of many possible ones, a real history could easily deviate from this ideal behavior for t/T small, even if the hazard map is quite good. The observed shaking should be closer to ideal behavior for larger t/T. Hence the longer the observation period is compared to the map return period, the better the map's performance can be assessed. Given the short time period since hazard maps began to be made, typical maps with return periods of hundreds or thousands of years may be usefully assessed using historical intensity data. We are using such data for Italy and Japan, comparing historical intensity catalogs to national hazard maps.

  3. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers, and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science has not coped with the challenging changes in Exposure and Vulnerability inflicted by a growing population and its concentration, which result in a steady increase of Losses from Natural Hazards. Science owes Society better knowledge, education, and communication; in fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e., log N(M,L) = A - B•(M-6) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of the USLE are used to estimate, first, the expected maximum magnitude in a time interval at each seismically prone cell of a uniform grid covering the region of interest, and then the corresponding expected ground-shaking parameters, including macroseismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., historically reported macroseismic intensities), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of the exposed population). The methodology of seismic hazard and risk assessment based on the USLE is illustrated by its application to the seismic region of the Greater Caucasus.
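    The USLE relation quoted above translates directly into code. The coefficients in the example below are made up for illustration only; real values of A, B, and C are region-specific and must be fitted to the earthquake catalog of the cell in question.

```python
import math

def usle_annual_rate(magnitude, length_km, A, B, C):
    """Expected annual number of earthquakes N(M, L) from the Unified
    Scaling Law for Earthquakes:
        log10 N(M, L) = A - B*(M - 6) + C*log10(L)
    for events of magnitude M within an area of linear dimension L (km)."""
    log_n = A - B * (magnitude - 6.0) + C * math.log10(length_km)
    return 10.0 ** log_n

# Purely illustrative coefficients (not fitted to any region): for a
# 100 km cell, log10 N = -1 - 1*(6.5 - 6) + 1*2 = 0.5, i.e. ~3.2 events/yr.
rate = usle_annual_rate(6.5, 100.0, A=-1.0, B=1.0, C=1.0)
```

    Note how C captures the fractal scaling with cell size: doubling L multiplies the expected rate by 2**C rather than by the factor of 4 a uniform areal density would give.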

  4. Seismic hazard assessment and pattern recognition of earthquake prone areas in the Po Plain (Italy)

    NASA Astrophysics Data System (ADS)

    Gorshkov, Alexander; Peresan, Antonella; Soloviev, Alexander; Panza, Giuliano F.

    2014-05-01

    A systematic and quantitative assessment, capable of providing first-order consistent information about the sites where large earthquakes may occur, is crucial for knowledgeable seismic hazard evaluation. The methodology for the pattern recognition of areas prone to large earthquakes is based on the morphostructural zoning method (MSZ), which employs topographic data and present-day tectonic structures for the mapping of earthquake-controlling structures (i.e., the nodes formed around lineament intersections) and does not require knowledge of past seismicity. The nodes are assumed to be characterized by a uniform set of topographic, geologic, and geophysical parameters; on the basis of such parameters, the pattern recognition algorithm defines a classification rule to discriminate seismogenic from non-seismogenic nodes. This methodology has been successfully applied since the early 1970s in a number of regions worldwide, including California, where it permitted the identification of areas that have subsequently been struck by strong events and that previously were not considered prone to strong earthquakes. Recent studies on the Iberian Peninsula and the Rhone Valley have demonstrated the applicability of MSZ to basins with relatively flat topography. In this study, the analysis is applied to the Po Plain (Northern Italy), an area characterized by flat topography, to allow for the systematic identification of the nodes prone to earthquakes with magnitude larger than or equal to M=5.0. The MSZ method differs from standard morphostructural analysis, where the term "lineament" is used to define the complex of alignments detectable on topographic maps or satellite images; according to that definition the lineament is locally defined, and its existence does not depend on the surrounding areas. In MSZ, the primary element is the block, a relatively homogeneous area, while the lineament is a secondary element of the morphostructure.

  5. Synergistic use of geospatial and in-situ data for earthquake hazard assessment in Vrancea area

    NASA Astrophysics Data System (ADS)

    Zoran, M. A.; Savastru, R. S.; Savastru, D. M.

    2016-08-01

    Space-time anomalies of the Earth's emitted radiation (thermal infrared measured from satellites months to weeks before the occurrence of earthquakes, radon in underground water and soil, electromagnetic anomalies, etc.) are considered pre-seismic signals. Satellite remote sensing provides spatially continuous information on the tectonic landscape; it also contributes to the understanding of specific faults, of stress transfer between fault systems from depth to the surface, and of the energy released by earthquakes and other modes of deformation. This paper presents observations made using time series of MODIS Terra/Aqua, NOAA-AVHRR, and Landsat satellite data to derive land surface temperature (LST), outgoing long-wave radiation (OLR), and mean air temperature (AT) for some seismic events recorded in the Vrancea active geotectonic region in Romania. For some of the analyzed earthquakes, starting almost one week prior to a moderate or strong event, a transient rise in LST of several degrees Celsius (°C) and OLR values higher than normal (as a function of the magnitude and focal depth) were observed, disappearing after the main shock. Synergy of multisensor and multitemporal satellite data with in-situ and GPS data, together with spatial analysis of magnitude-frequency distributions of Vrancea earthquakes, provides more information on Vrancea-area seismicity. Earthquake hazard assessment for the Vrancea region in Romania must accommodate different degrees of complexity, consisting of monitoring of derived geospatial and in-situ geophysical/geodetic parameters, analysis, predictive modeling, and forecast-oriented as well as decision-making procedures.

  6. No longer so clueless in Seattle: Current assessment of earthquake hazards

    USGS Publications Warehouse

    Weaver, C.S.

    1998-01-01

    The Pacific Northwest is an active subduction zone. Because of this tectonic setting, there are three distinct earthquake source zones in earthquake hazard assessments of the Seattle area. Offshore, the broad sloping interface between the Juan de Fuca and North America plates produces earthquakes as large as magnitude 9; on average these events occur every 400-600 years. The second source zone is within the subducting Juan de Fuca plate as it bends, at depths of 40-60 km, beneath the Puget lowland. Five earthquakes in this zone this century have had magnitudes greater than 6, including a magnitude 7.1 event in 1949. The third zone, the crust of the North America plate, is the least well known. Paleoseismic evidence shows that an event of approximately magnitude 7 occurred on the Seattle fault about 1000 years ago. Large-magnitude crustal events would be potentially very damaging to the heavily urbanized areas of Puget Sound, and their rate of occurrence and the area over which they are to be expected are the subject of considerable research.

  7. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    NASA Astrophysics Data System (ADS)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation, and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus-group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities, and have participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and that the information is widely used by the media and the general public. However, some important communities, particularly county and local emergency management and the business community, do not use these products despite their intended application to those communities' purposes. We have learned that products need to convey the impact of earthquakes more clearly, in everyday terms. Users also want products (e.g., maps, forecasts, etc.) that can be incorporated into the tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  8. Assessment of earthquake hazard by simultaneous use of the statistical method and the method of fuzzy mathematics

    NASA Astrophysics Data System (ADS)

    Feng, De-Yi; Gu, Jing-Ping; Lin, Ming-Zhou; Xu, Shao-Xie; Yu, Xue-Jun

    1984-11-01

    A probabilistic method and a retrieval method of fuzzy information are studied simultaneously for the assessment of earthquake hazard, or earthquake prediction. Statistical indices of regional seismicity in three adjacent time intervals are used to predict an earthquake in the next interval. The indices are earthquake frequency, the maximum magnitude, a parameter related to the average magnitude (or b-value), and their time derivatives. Applying the probabilistic method, we can estimate the probability of a large earthquake with magnitude above a certain threshold occurring in the next time interval in a given region. By using the retrieval method of fuzzy information we can classify time intervals into several classes according to the regional seismic activity in each interval, and then evaluate whether or not the next time interval belongs to a seismically hazardous class containing a large earthquake. Some examples of applying both methods to the northern section of the North-South Seismic Zone in China are shown. The results obtained are in good agreement with the actual earthquake history. A comparison of the probabilistic method with the method of fuzzy mathematics is made, and it is recommended that earthquake hazard be assessed by simultaneous use of both methods.
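    The classification idea behind the fuzzy approach can be sketched generically: grade each time interval's feature vector (frequency, maximum magnitude, b-value) against prototype classes and read off the memberships. This is a minimal inverse-distance illustration of fuzzy membership, not the specific retrieval method of the paper, and the centroids below are invented for the example.

```python
def fuzzy_memberships(x, class_centroids):
    """Grades of membership of a feature vector x in each seismicity
    class, using inverse-distance weights normalized to sum to 1.
    A generic sketch, not the paper's retrieval method."""
    dists = {name: sum((xi - ci) ** 2 for xi, ci in zip(x, c)) ** 0.5
             for name, c in class_centroids.items()}
    if any(d == 0.0 for d in dists.values()):          # exact match to a centroid
        return {name: (1.0 if d == 0.0 else 0.0) for name, d in dists.items()}
    inv = {name: 1.0 / d for name, d in dists.items()}
    total = sum(inv.values())
    return {name: w / total for name, w in inv.items()}

# Hypothetical centroids: (frequency, max magnitude, b-value) of past
# "hazardous" vs "quiet" intervals, and one new interval to classify.
centroids = {"hazardous": (12.0, 5.5, 0.8), "quiet": (4.0, 4.0, 1.1)}
grades = fuzzy_memberships((10.0, 5.2, 0.85), centroids)
```

    An interval is then flagged as hazardous when its membership in the hazardous class dominates, which mirrors the paper's evaluation of whether the next interval belongs to a hazardous class.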

  9. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2017-09-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records dating back more than 1000 years and an updated, homogenized, and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using a maximum likelihood algorithm that accounts for the incompleteness of the catalog. To compute the hazard values, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years assuming bedrock conditions. The resulting PGA and SA maps show significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
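    One of the seismicity parameters mentioned above, the Gutenberg-Richter b value, has a standard closed-form maximum likelihood estimator (the Aki/Utsu formula with a half-bin completeness correction). The sketch below uses that general formula, not necessarily the exact algorithm of this study, and the ten catalog magnitudes are invented for illustration.

```python
import math

def b_value_mle(magnitudes, m_min, bin_width=0.1):
    """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter
    b value from magnitudes at or above the completeness magnitude m_min,
    corrected by half the magnitude bin width."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - bin_width / 2.0))

# Hypothetical homogenized catalog excerpt, complete above Mw 4.0
catalog = [4.1, 4.3, 4.0, 5.2, 4.6, 4.0, 4.4, 4.9, 4.1, 4.2]
b = b_value_mle(catalog, m_min=4.0)   # close to 1, as for typical catalogs
```

    Real applications also propagate the estimator's uncertainty (roughly b/sqrt(n)), which matters when, as here, historical records contribute few but large events.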

  10. The 1843 earthquake: a maximising scenario for tsunami hazard assessment in the Northern Lesser Antilles?

    NASA Astrophysics Data System (ADS)

    Roger, Jean; Zahibo, Narcisse; Dudon, Bernard; Krien, Yann

    2013-04-01

    The French Caribbean Islands are located along the Lesser Antilles active subduction zone, where a handful of earthquakes have historically reached magnitude Mw=6.0 and above. According to available catalogs, these earthquakes have sometimes been able to trigger devastating local or regional tsunamis, either directly by the shaking or indirectly through induced landslides. For example, these islands suffered severely during the Mw~7.5 Virgin Islands earthquake (1867), which triggered waves several meters high throughout the Lesser Antilles Arc, and, more recently, during the Mw=6.3 Les Saintes earthquake (2004), which was followed by a local 1 m high tsunami. However, in 1839 a Mw~7.5 subduction earthquake occurred offshore Martinique, followed a few years later by the more famous 1843 Mw~8.5 megathrust event, with an epicenter located approximately between Guadeloupe and Antigua, yet no catastrophic tsunami was reported for either. In this study we discuss the potential impact of a maximum credible tsunami-generation scenario with such a Mw=8.5 rupture at the subduction interface, using available geological information, numerical modeling of tsunami generation and propagation, and high-resolution bathymetric data, within the framework of tsunami hazard assessment for the French West Indies. Although the lack of historical tsunami data, especially for the 1843 event, remains unexplained, modeling results show that the tsunami impact is not uniformly distributed across the archipelago and could display important heterogeneities in maximum wave heights at specific places. This is readily explained by the bathymetry and the presence of several islands around the mainland leading to resonance phenomena, and by the fringing coral reef partially surrounding those islands.

  11. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated to be of M8 to 9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough, in terms of a probabilistic approach, on the basis of ERC (2013)'s report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSA) that ERC (2013) defined. The characterization rule follows Toyama et al. (2015, JpGU); as a result, we obtained a total of 1441 CEFMs. (2) We calculate the tsunamis due to the CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom-friction terms by a finite-difference method, including run-up computation on land. (3) A time-predictable model puts the recurrence interval of the present seismic cycle at T=88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with T and aperiodicity alpha=0.24. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSA, following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the subgroup. Note that such a re-distribution of the probability is necessarily tentative, because present seismology cannot provide knowledge deep enough to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coasts by integrating the information about 30 years occurrence
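    Step (3) above, a 30-year conditional probability from a BPT (Brownian Passage Time, i.e. inverse Gaussian) renewal model, can be sketched numerically. T = 88.2 yr and alpha = 0.24 are taken from the abstract; the elapsed time since the last event used below is our own illustrative assumption, so the result only roughly approaches the committee's P30 of ~67%.

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean
    recurrence interval mu and aperiodicity alpha."""
    return math.sqrt(mu / (2.0 * math.pi * alpha ** 2 * t ** 3)) * \
           math.exp(-((t - mu) ** 2) / (2.0 * mu * alpha ** 2 * t))

def bpt_cdf(t, mu, alpha, n=4000):
    """CDF by trapezoidal integration of the density (a closed form via
    the normal CDF exists, but this keeps the sketch self-contained)."""
    if t <= 0.0:
        return 0.0
    eps = 1e-6
    h = (t - eps) / n
    total = 0.5 * (bpt_pdf(eps, mu, alpha) + bpt_pdf(t, mu, alpha))
    for i in range(1, n):
        total += bpt_pdf(eps + i * h, mu, alpha)
    return total * h

def conditional_prob(elapsed, window, mu, alpha):
    """P(event in (elapsed, elapsed + window) | no event up to elapsed)."""
    f_e = bpt_cdf(elapsed, mu, alpha)
    return (bpt_cdf(elapsed + window, mu, alpha) - f_e) / (1.0 - f_e)

# Illustrative elapsed time of ~66.5 yr since the previous rupture
p30 = conditional_prob(66.5, 30.0, mu=88.2, alpha=0.24)
```

    The small aperiodicity (0.24) makes the hazard strongly time-dependent: late in the cycle the conditional 30-year probability rises well above the time-independent Poisson value 1 - exp(-30/88.2) of about 29%.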

  12. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model

    NASA Astrophysics Data System (ADS)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza

    2017-06-01

    Producing accurate seismic hazard maps and predicting hazardous areas is necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to their interpretability, fuzzy predictors can capture both the nonlinearity and the chaotic behavior of data even when the number of data points is limited. Here, the earthquake pattern in the Zagros has been assessed for intervals of 10 and 50 years using a fuzzy rule-based model, and the Molchan statistical procedure has been used to show that our forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. Notably, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan), and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.

  13. Earthquake Hazard Assessment Based on Geological Data: An approach from Crystalline Terrain of Peninsular India

    NASA Astrophysics Data System (ADS)

    John, B.

    2009-04-01

    Biju John (National Institute of Rock Mechanics, b_johnp@yahoo.co.in). Peninsular India was long considered seismically stable, but the recent earthquake sequence of Latur (1993), Jabalpur (1997), and Bhuj (2001) suggests this region is among the active Stable Continental Regions (SCRs) of the world, where recurrence intervals are of the order of tens of thousands of years. In such areas, earthquakes may happen at unexpected locations devoid of any previous seismicity or dramatic geomorphic features, and even moderate earthquakes will lead to heavy loss of life and property in the present scenario. It is therefore imperative to map suspected areas to identify active faults and evaluate their activity, a vital input to seismic hazard assessment of SCR areas. The region around Wadakkanchery, Kerala, South India has been experiencing microseismic activity since 1989. Subsequent studies by the author identified a 30 km long, WNW-ESE trending reverse fault, dipping south (45°), that has influenced the drainage system of the area. Macroscopic and microscopic studies of the fault rocks from the exposures near Desamangalam show an episodic nature of faulting. Dislocations of pegmatitic veins across the fault indicate a cumulative dip displacement of 2.1 m in the reverse direction. A minimum of four episodes of faulting were identified on this fault based on the cross-cutting relations of different structural elements and on the mineralogic changes of different generations of gouge zones, suggesting an average displacement of about 52 cm per event. A cyclic nature of faulting is identified in this fault zone, in which the inter-seismic period is characterized by gouge induration and fracture sealing aided by the prevailing fluids. Available empirical relations connecting magnitude with displacement and rupture

  14. The importance of earthquake research in the assessment of seismic hazards in Argentina

    NASA Astrophysics Data System (ADS)

    Giuliano, A.; Alvarado, P.; Beck, S.

    2007-05-01

    The history of Argentina includes repeated occurrences of damaging crustal earthquakes, such as the 1944 (Mw 7.0) San Juan earthquake, considered the country's largest natural disaster. These large earthquakes occur in the continental Andean backarc crust as far as 600 to 800 km east of the trench. Of high significance is the correlation of this large continental seismicity with the horizontal position of the subducted Nazca plate at about 100 km depth. In addition, lateral variations in crustal structure are expected, since several terranes have been accreted to western South America since the Paleozoic. Given the high seismic potential of this region, understanding these seismotectonic processes and the crustal structure is essential for the assessment of seismic hazards and the mitigation of their effects. In this presentation we show our work based on an integrated research effort that combines permanent and temporary seismic networks from the Argentinean National Institute for Seismic Disaster Mitigation (INPRES) and IRIS-PASSCAL arrays. This international collaboration started in 2000 and involves researchers, technicians, and students from the University of Arizona (USA), the National University of San Juan (Argentina), and INPRES (Argentina).

  15. Site-specific Earthquake-generated Tsunami Hazard Assessment along the U.S. Atlantic Coast

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Titov, V. V.; Moore, C. W.; Gica, E.; Arcas, D.; Spillane, M. C.; Zhou, H.

    2009-12-01

    The Indian Ocean tsunami of 24 December 2004 changed the perception of tsunamis as an infrequent, low-risk hazard. The absence of subduction or convergent zones in the Atlantic Ocean makes coastal communities less aware of the potential tsunami hazard on the East Coast of the US, and the continental shelf offshore is believed to act as an additional buffer that may significantly attenuate tsunami impact on the U.S. Atlantic coast. However, substantial uncertainties remain and need to be addressed in a timely manner: (1) the largest tsunami ever recorded in the Atlantic, the 1755 Lisbon event, is understudied; (2) the Hispaniola-Puerto Rico-Lesser Antilles subduction zone in the northeastern Caribbean, a Sumatra-Andaman type of trench, is capable of generating a catastrophic tsunami; (3) the South Sandwich Trench has been mostly overlooked; and (4) most previous studies tackling these issues did not go beyond linear tsunami propagation in the deep ocean to nonlinear tsunami inundation modeling in coastal areas. Using the established NOAA high-resolution tsunami inundation model, the present study explores the above uncertainties and provides a comprehensive modeling assessment of the potential earthquake-generated tsunami hazard for selected coastal communities on the U.S. Atlantic coast, with a highlight on over-shelf tsunami wave dynamics. This study extends the USGS evaluation of earthquake-tsunami impact in the Atlantic (ten Brink et al., 2007; Barkan et al., 2009) in the light of the Nuclear Regulatory Commission (NRC) efforts on tsunami risk assessment for existing and potential nuclear power plants on the U.S. East Coast.

  16. Assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L.; Hays, Walter W.

    2000-01-01

    This report--the second of two volumes--represents an ongoing effort by the U.S. Geological Survey to transfer accurate Earth science information about earthquake hazards along Utah's Wasatch Front to researchers, public officials, design professionals, land-use planners, and emergency managers in an effort to mitigate the effects of these hazards. This volume contains eight chapters on ground-shaking hazards and aspects of loss estimation.

  17. Metrics, Bayes, and BOGSAT: Recognizing and Assessing Uncertainties in Earthquake Hazard Maps

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Brooks, E. M.; Spencer, B. D.

    2015-12-01

    Recent damaging earthquakes in areas predicted to be relatively safe illustrate the need to assess how seismic hazard maps perform. At present, there is no agreed way of assessing how well a map performed. The metric implicit in current maps, that during a time interval predicted shaking will be exceeded only at a specific fraction of sites, is useful but permits maps to be nominally successful although they significantly underpredict or overpredict shaking, or nominally unsuccessful but predict shaking well. We explore metrics that measure the effects of overprediction and underprediction. Although no single metric fully characterizes map behavior, using several metrics can provide useful insight for comparing and improving maps. A related question is whether to regard larger-than-expected shaking as a low-probability event allowed by a map, or to revise the map to show increased hazard. Whether and how much to revise a map is complicated, because a new map that better describes the past may or may not better predict the future. The issue is like deciding after a coin has come up heads a number of times whether to continue assuming that the coin is fair and the run is a low-probability event, or to change to a model in which the coin is assumed to be biased. This decision can be addressed using Bayes' Rule, so that how much to change depends on the degree of one's belief in the prior model. Uncertainties are difficult to assess for hazard maps, which require subjective assessments and choices among many poorly known or unknown parameters. However, even rough uncertainty measures for estimates/predictions from such models, sometimes termed BOGSATs (Bunch Of Guys Sitting Around Table) by risk analysts, can give users useful information to make better decisions. We explore the extent of uncertainty via sensitivity experiments on how the predicted hazard depends on model parameters.
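    The coin analogy in this abstract maps directly onto a two-hypothesis Bayes' Rule update. The sketch below assumes a specific bias (p = 0.9) and prior (95% belief in a fair coin) purely for illustration; the abstract's point is that how much the posterior shifts depends on the strength of that prior.

```python
def posterior_biased(prior_biased, n_heads, n_tosses, p_biased=0.9):
    """Bayes' Rule for the map-revision analogy: posterior probability
    that the coin is biased (heads probability p_biased, an assumed
    illustrative value) rather than fair (p = 0.5), after observing
    n_heads heads in n_tosses tosses."""
    def likelihood(p):
        return (p ** n_heads) * ((1.0 - p) ** (n_tosses - n_heads))
    num = prior_biased * likelihood(p_biased)
    return num / (num + (1.0 - prior_biased) * likelihood(0.5))

# Strong prior belief in a fair coin (95%), then five heads in a row:
# belief in the biased model rises from 5% to roughly even odds.
post = posterior_biased(prior_biased=0.05, n_heads=5, n_tosses=5)
```

    The analogue for hazard maps: larger-than-expected shaking shifts belief toward a "higher hazard" model only in proportion to the prior confidence in the existing map, which is why the revision decision is genuinely Bayesian rather than automatic.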

  18. Earthquake risk reduction in the United States: An assessment of selected user needs and recommendations for the National Earthquake Hazards Reduction Program

    SciTech Connect

    1994-12-31

    This Assessment was conducted to improve the National Earthquake Hazards Reduction Program (NEHRP) by providing NEHRP agencies with information that supports their user-oriented setting of crosscutting priorities in the NEHRP strategic planning process. The primary objective of this Assessment was to take a "snapshot" evaluation of the needs of selected users throughout the major program elements of NEHRP. Secondary objectives were to conduct an assessment of the knowledge that exists (or is being developed by NEHRP) to support earthquake risk reduction, and to begin a process of evaluating how NEHRP is meeting user needs. An identification of NEHRP's strengths also resulted from the effort, since those strengths demonstrate successful methods that may be useful to NEHRP in the future. These strengths are identified in the text, and many of them represent important achievements since the Earthquake Hazards Reduction Act was passed in 1977.

  19. Assessment of the 1988 Saguenay earthquake: Implications on attenuation functions for seismic hazard analysis

    SciTech Connect

    Toro, G.R.; McGuire, R.K.

    1991-09-01

    This study investigates the earthquake records from the 1988 Saguenay earthquake and examines the implications of these records for ground-motion models used in seismic-hazard studies in eastern North America (ENA), specifically, to what extent the ground motions from this earthquake support or reject the various attenuation functions used in the EPRI and LLNL seismic-hazard calculations. Section 2 provides a brief description of the EPRI and LLNL attenuation functions for peak acceleration and for spectral velocities. Section 3 compares these attenuation functions with the ground motions from the Saguenay earthquake and from other relevant earthquakes. Section 4 reviews available seismological studies of the Saguenay earthquake, in order to understand its seismological characteristics and why some observations may differ from predictions. Section 5 examines the assumptions and methodology used in the development of the attenuation functions selected by LLNL ground-motion expert 5. Finally, Section 6 draws conclusions about the validity of the various sets of attenuation functions, in light of the Saguenay data and of other evidence presented here. 50 refs., 37 figs., 7 tabs.

  20. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in February underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  1. Can Apparent Stress be Used to Time-Dependent Seismic Hazard Assessment or Earthquake Forecast? An Ongoing Approach in China

    NASA Astrophysics Data System (ADS)

    Wu, Zhongliang; Jiang, Changsheng; Zhang, Shengfeng

    2017-06-01

    The approach taken in China over the last decade and a half to using apparent stress in time-dependent seismic hazard assessment and earthquake forecasting is summarized. Retrospective case studies show that apparent stress exhibits a short-term increase, on a time scale of several months, before moderate to strong earthquakes in a large area surrounding the `target earthquake'. Apparent stress is also used to estimate the tendency of aftershock activity. The concept relating apparent stress indirectly to stress level is used to understand the properties of some `precursory' anomalies. Meanwhile, differing opinions have been reported, and problems in the calculation existed for some cases. Moreover, retrospective studies are of limited significance compared with forward forecast tests. Nevertheless, this approach, seemingly unique in being carried out on a large scale in mainland China, provides earthquake catalogs for the predictive analysis of seismicity with an additional degree of freedom, and deserves a systematic review and reflection.

  2. Can Apparent Stress be Used to Time-Dependent Seismic Hazard Assessment or Earthquake Forecast? An Ongoing Approach in China

    NASA Astrophysics Data System (ADS)

    Wu, Zhongliang; Jiang, Changsheng; Zhang, Shengfeng

    2016-08-01

    The approach taken in China over the last decade and a half to using apparent stress in time-dependent seismic hazard assessment and earthquake forecasting is summarized. Retrospective case studies show that apparent stress exhibits a short-term increase, on a time scale of several months, before moderate to strong earthquakes in a large area surrounding the `target earthquake'. Apparent stress is also used to estimate the tendency of aftershock activity. The concept relating apparent stress indirectly to stress level is used to understand the properties of some `precursory' anomalies. Meanwhile, differing opinions have been reported, and problems in the calculation existed for some cases. Moreover, retrospective studies are of limited significance compared with forward forecast tests. Nevertheless, this approach, seemingly unique in being carried out on a large scale in mainland China, provides earthquake catalogs for the predictive analysis of seismicity with an additional degree of freedom, and deserves a systematic review and reflection.

  3. Uncertainty in local and regional tsunami earthquake source parameters: Implications for scenario based hazard assessment and forecasting

    NASA Astrophysics Data System (ADS)

    Müller, Christof; Power, William; Burbidge, David; Wang, Xiaoming

    2016-04-01

    Over the last decade, tsunami propagation models have been used extensively for tsunami forecasting and for hazard and risk assessment. However, the effect of uncertainty in the earthquake source parameters, such as the location and distribution of slip, on the results of tsunami models has not always been examined in great detail. We have developed a preliminary combined and continuous Hikurangi-Kermadec subduction zone interface model. The model is defined by a spline surface and is based on a previously published spline model for the Hikurangi interface and a more traditional unit-source model for the Kermadec interface. The model allows the earthquake epicenter to be freely positioned and varied, and non-uniform slip to be considered. Using this model we have investigated the effects of variability in non-uniform slip and epicenter location on the distribution of offshore maximum wave heights for local New Zealand targets. Which scenario out of an ensemble produces the maximum wave height at a given location is a spatially highly variable function of earthquake location and/or the distribution of slip. We use the Coefficient of Variation (CoV) to quantify the variability of offshore wave heights as a function of source location and distribution of slip. CoV increases significantly with closer proximity to the shore, in bays, and in shallow water. The study has implications for tsunami hazard assessment and forecasting. As an example, our results challenge the concept of hazard assessment using a single worst-case scenario, in particular for local tsunamis.
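
    The Coefficient of Variation used above is simply the ensemble standard deviation of maximum wave height divided by the ensemble mean at each offshore point. A minimal sketch of that calculation, using a small hypothetical ensemble of scenario wave heights (the values are illustrative, not model output):

```python
import numpy as np

# Maximum wave heights (m) at 4 offshore target points for 5
# hypothetical source scenarios (rows: scenarios, columns: points).
max_heights = np.array([
    [1.2, 0.8, 2.5, 0.4],
    [1.0, 1.1, 3.1, 0.5],
    [1.5, 0.7, 1.9, 0.6],
    [0.9, 1.3, 2.8, 0.4],
    [1.1, 0.9, 2.2, 0.5],
])

# Coefficient of Variation across the ensemble at each point:
# sample standard deviation divided by the ensemble mean.
cov = max_heights.std(axis=0, ddof=1) / max_heights.mean(axis=0)
print(cov.round(3))
```

A high CoV at a point means the hazard there depends strongly on which source scenario occurs, which is exactly why a single worst-case scenario can misrepresent the local hazard.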

  4. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  5. New seafloor map of the Puerto Rico trench helps assess earthquake and tsunami hazards

    NASA Astrophysics Data System (ADS)

    Brink, Uri ten; Danforth, William; Polloni, Christopher; Andrews, Brian; Llanes, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-09-01

    The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands [McCann et al., 2004]. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918 [Mercado and McCann, 1998]. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico [Mercado et al., 2002; Schwab et al., 1991], although their ages are unknown.

  6. New seafloor map of the Puerto Rico Trench helps assess earthquake and tsunami hazards

    USGS Publications Warehouse

    ten Brink, Uri S.; Danforth, William; Polloni, Christopher; Andrews, Brian D.; Llanes Estrada, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-01-01

    The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands [McCann et al., 2004]. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918 [Mercado and McCann, 1998]. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico [Mercado et al., 2002; Schwab et al., 1991], although their ages are unknown.

  7. Rapid field-based landslide hazard assessment in response to post-earthquake emergency

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Gambini, Stefano; Cancelliere, Giorgio

    2016-04-01

    On April 25, 2015, a Mw 7.8 earthquake occurred 80 km to the northwest of Kathmandu (Nepal). The largest aftershock, which occurred on May 12, 2015, was the Mw 7.3 Nepal earthquake (SE of Zham, China), 80 km to the east of Kathmandu. The earthquakes killed ~9000 people and severely damaged a 10,000 km2 region in Nepal and neighboring countries. Several thousand landslides were triggered during the events, causing widespread damage to mountain villages and the evacuation of thousands of people. Rasuwa was one of the most damaged districts. This contribution describes a landslide hazard analysis of the Saramthali, Yarsa and Bhorle VDCs (122 km2, Rasuwa district). Hazard is expressed in terms of qualitative classes (low, medium, high), through a simple matrix approach that combines frequency classes and magnitude classes. The hazard analysis is based primarily on the experience gained during a field survey conducted in September 2015. During the survey, local knowledge was systematically exploited through interviews with local people who had experienced the earthquake and the coseismic landslides. People helped us to recognize fractures and active deformations, and allowed us to reconstruct a correct chronicle of landslide events, in order to assign each landslide to the first shock, the second shock, or the post-earthquake 2015 monsoon. The field experience was complemented with a standard analysis of the relationship between potential controlling factors and the distribution of landslides reported in Kargel et al. (2016). This analysis allowed the most important controlling factors to be recognized. This information was integrated with the field observations to verify the mapped units and to complete the mapping in areas not accessible for field activity. Finally, the work was completed with the analysis and use of a detailed landslide inventory produced by the University of Milano Bicocca that covers most of the area affected by coseismic landslides in

  8. The Effects on Tsunami Hazard Assessment in Chile of Assuming Earthquake Scenarios with Spatially Uniform Slip

    NASA Astrophysics Data System (ADS)

    Carvajal, Matías; Gubler, Alejandra

    2016-12-01

    We investigated the effect that along-dip slip distribution has on the near-shore tsunami amplitudes and on coastal land-level changes in the region of central Chile (29°-37°S). Here and all along the Chilean megathrust, the seismogenic zone extends beneath dry land, and thus, tsunami generation and propagation is limited to its seaward portion, where the sensitivity of the initial tsunami waveform to dislocation model inputs, such as slip distribution, is greater. We considered four distributions of earthquake slip in the dip direction, including a spatially uniform slip source and three others with typical bell-shaped slip patterns that differ in the depth range of slip concentration. We found that a uniform slip scenario predicts much lower tsunami amplitudes and generally less coastal subsidence than scenarios that assume bell-shaped distributions of slip. Although the finding that uniform slip scenarios underestimate tsunami amplitudes is not new, it has been largely ignored for tsunami hazard assessment in Chile. Our simulation results also suggest that uniform slip scenarios tend to predict later arrival times of the leading wave than bell-shaped sources. The time occurrence of the largest wave at a specific site is also dependent on how the slip is distributed in the dip direction; however, other factors, such as local bathymetric configurations and standing edge waves, are also expected to play a role. Arrival time differences are especially critical in Chile, where tsunamis arrive earlier than elsewhere. We believe that the results of this study will be useful to both public and private organizations for mapping tsunami hazard in coastal areas along the Chilean coast, and, therefore, help reduce the risk of loss and damage caused by future tsunamis.

  9. The 1945 Balochistan earthquake and probabilistic tsunami hazard assessment for the Makran subduction zone

    NASA Astrophysics Data System (ADS)

    Höchner, Andreas; Babeyko, Andrey; Zamora, Natalia

    2014-05-01

    Iran and Pakistan are countries quite frequently affected by destructive earthquakes: for instance, the magnitude 6.6 Bam earthquake in Iran in 2003, with about 30,000 casualties, or the magnitude 7.6 Kashmir earthquake in Pakistan in 2005, with about 80,000 casualties. Both events took place inland, but in terms of magnitude, even significantly larger events can be expected to happen offshore, at the Makran subduction zone. This small subduction zone is seismically rather quiescent, but a tsunami caused by a thrust event in 1945 (the Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Additionally, some recent publications raise the question of the possibility of rare but huge magnitude 9 events at the Makran subduction zone. We first model the historic Balochistan event and its effect in terms of coastal wave heights, and then generate various synthetic earthquake and tsunami catalogs, including the possibility of large events, in order to assess the tsunami hazard at the affected coastal regions. Finally, we show how effective tsunami early warning could be achieved by the use of an array of high-precision real-time GNSS (Global Navigation Satellite System) receivers along the coast.

  10. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    USGS Publications Warehouse

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 per cent (1 S.D.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock, from less than 1 per cent at M = 3 to 6.5 ± 2.5 per cent (1 S.D.) at M = 5. The main shock will most likely occur in the first hour after the foreshock, and the probability that a main shock will occur decreases with elapsed time from the occurrence of the possible foreshock by approximately the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California increases the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.
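
    The roughly inverse-time decay of main-shock probability described above can be illustrated with a toy calculation. The sketch below assumes an Omori-like rate proportional to 1/(t + c), normalized so that the total over the 5-day window equals the 6 per cent foreshock probability quoted in the abstract; the constant c is a hypothetical choice, not a value from the study:

```python
import math

P_FORESHOCK = 0.06   # overall 5-day foreshock probability (from the abstract)
WINDOW_H = 120.0     # 5-day window, in hours
C = 0.1              # hypothetical Omori-like constant (hours)

def p_mainshock_by(t_hours):
    """Probability that a main shock has occurred within t hours of the
    candidate foreshock, assuming a rate decaying as 1/(t + c) and
    normalized so the 5-day total equals P_FORESHOCK."""
    norm = math.log((WINDOW_H + C) / C)
    return P_FORESHOCK * math.log((t_hours + C) / C) / norm

# A large fraction of the 5-day probability falls in the first hour.
print(p_mainshock_by(1.0), p_mainshock_by(24.0), p_mainshock_by(120.0))
```

With these assumed constants, roughly a third of the total 5-day probability is already spent in the first hour, consistent with the abstract's statement that the main shock is most likely in the first hour.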

  11. GPS/GNSS for Earthquake, Tsunami and Volcano Hazards: Examples of Assessments, Response, and Real-time Monitoring in Alaska

    NASA Astrophysics Data System (ADS)

    Freymueller, J. T.; Elliott, J.; Grapenthin, R.

    2016-12-01

    GPS/GNSS observations provide a powerful tool for natural hazard assessment, mitigation, response and monitoring. Solid earth deformation on a wide range of timescales provides important constraints on source processes for tectonic deformation, earthquakes and their related tsunamis, volcanic activity, and many other problems. This talk will focus on experience in Alaska, a region of high tectonic and volcanic activity, and highlight examples of these uses from hazard assessment to response and monitoring. At the University of Alaska Fairbanks, we acquire and analyze daily GPS data from 1200 continuous sites around the world, including 200 in and around Alaska. We analyze a subset of these stations on a next-day basis, and the entire data set with a latency of 1-2 weeks. Additional near real-time to real-time solutions have been generated in test mode or in response to significant events, such as the 2006 eruption of Augustine volcano. Long-term site velocities have been used to provide new input for earthquake hazard maps via strain estimates and block models, or for studying the inflation and eruption of volcanoes. Displacements over shorter intervals are used mainly for studying the time development of distinct events, such as earthquakes and volcanic eruptions. We use a series of moderate earthquakes to assess the capability of real-time GPS displacements to provide critical data for earthquake monitoring and assessment. The 2006 eruption of Augustine volcano illustrates the utility of real-time deformation data. The volcano inflated before and during a series of discrete explosions. This subtle but continuous inflation signal allowed the Alaska Volcano Observatory to infer that pressurization inside the volcano continued during eruptive activity. The volcano deflated only when it was undergoing sustained emissions.

  12. Hazard assessment of long-period ground motions for the Nankai Trough earthquakes

    NASA Astrophysics Data System (ADS)

    Maeda, T.; Morikawa, N.; Aoi, S.; Fujiwara, H.

    2013-12-01

    We evaluate the seismic hazard for long-period ground motions associated with Nankai Trough earthquakes (M8~9) in southwest Japan. Large interplate earthquakes occurring around the Nankai Trough have caused serious damage due to strong ground motions and tsunami; the most recent events were in 1944 and 1946. Such large interplate earthquakes can also damage high-rise and large-scale structures through long-period ground motions (e.g., the 1985 Michoacan earthquake in Mexico, the 2003 Tokachi-oki earthquake in Japan). Long-period ground motions are amplified particularly on basins. Because the major cities along the Nankai Trough have developed on alluvial plains, it is therefore important to evaluate long-period ground motions as well as strong motions and tsunami for the anticipated Nankai Trough earthquakes. The long-period ground motions are evaluated by the finite difference method (FDM) using 'characterized source models' and a 3-D underground structure model. A 'characterized source model' is a source model that includes the source parameters necessary for reproducing the strong ground motions; the parameters are determined based on a 'recipe' for predicting strong ground motion (Earthquake Research Committee (ERC), 2009). We construct various source models (~100 scenarios) covering various cases of source parameters such as source region, asperity configuration, and hypocenter location. Each source region is determined from 'the long-term evaluation of earthquakes in the Nankai Trough' published by the ERC. The asperity configuration and hypocenter location control the rupture directivity effects; these parameters are important because our preliminary simulations are strongly affected by rupture directivity. We simulate the seismic wave propagation with the GMS (Ground Motion Simulator) system, based on a 3-D FDM scheme using discontinuous grids (Aoi and Fujiwara, 1999). The grid spacing for the shallow region is 200 m and

  13. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    SciTech Connect

    Türker, Tuğba

    2016-04-18

    The North Anatolian Fault Zone (NAFZ) is one of the most important strike-slip fault zones in the world and lies in a region of very high seismic activity. The NAFZ has produced very large earthquakes from the past to the present. In this study, the parameters a and b of the Gutenberg-Richter relationship are estimated and, taking these parameters into account, earthquakes between 1900 and 2015 are examined for 10 different seismic source regions in the NAFZ. The occurrence probabilities and return periods of future earthquakes in the fault zone are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. Region 2 experienced its largest earthquakes in the historical period only; no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, MS=7.3 and 1897, MS=7.0) are included for Region 2 (the Marmara Region), where a large earthquake is expected in the coming years. For the 10 seismic source regions, the a and b parameters were estimated from the cumulative number-magnitude relationship log N = a - bM of Gutenberg-Richter. A homogeneous earthquake catalog of MS magnitude equal to or larger than 4.0 is used for the time period between 1900 and 2015. The catalog used in the study was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI): data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 seismic source regions. The highest occurrence probability among the regions is estimated for the Tokat-Erzincan region (Region 9), at 99%

  14. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    NASA Astrophysics Data System (ADS)

    Türker, Tuǧba; Bayrak, Yusuf

    2016-04-01

    The North Anatolian Fault Zone (NAFZ) is one of the most important strike-slip fault zones in the world and lies in a region of very high seismic activity. The NAFZ has produced very large earthquakes from the past to the present. In this study, the parameters a and b of the Gutenberg-Richter relationship are estimated and, taking these parameters into account, earthquakes between 1900 and 2015 are examined for 10 different seismic source regions in the NAFZ. The occurrence probabilities and return periods of future earthquakes in the fault zone are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. Region 2 experienced its largest earthquakes in the historical period only; no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, MS=7.3 and 1897, MS=7.0) are included for Region 2 (the Marmara Region), where a large earthquake is expected in the coming years. For the 10 seismic source regions, the a and b parameters were estimated from the cumulative number-magnitude relationship log N = a - bM of Gutenberg-Richter. A homogeneous earthquake catalog of MS magnitude equal to or larger than 4.0 is used for the time period between 1900 and 2015. The catalog used in the study was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI): data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 seismic source regions. The highest occurrence probability among the regions is estimated for the Tokat-Erzincan region (Region 9), at 99%, with an earthquake
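
    The combination of the Gutenberg-Richter relation and the Poisson model described above can be sketched as follows. The a and b values here are hypothetical placeholders, not the estimates obtained for any of the 10 source regions:

```python
import math

def annual_rate(mag, a, b):
    """Annual rate of events with magnitude >= mag from the
    Gutenberg-Richter relation log10 N = a - b*M."""
    return 10 ** (a - b * mag)

def poisson_probability(mag, a, b, years):
    """Probability of at least one event with magnitude >= mag in the
    given time span, assuming Poisson (memoryless) occurrence."""
    return 1.0 - math.exp(-annual_rate(mag, a, b) * years)

# Hypothetical a and b values for one source region (not the paper's
# estimates), evaluated over some of the time spans used in the study.
a, b = 4.5, 0.9
for years in (10, 50, 100):
    print(years, round(poisson_probability(7.0, a, b, years), 3))
```

The corresponding return period is simply the reciprocal of the annual rate.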

  15. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-06-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagations and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures has unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a "physics-based" approach is ground-motion syntheses derived from physics and an understanding of the earthquake process. This is an overview paper and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site.
Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction.

  16. Real-time Seismicity Evaluation as a Tool for the Earthquake and Tsunami Short-Term Hazard Assessment (Invited)

    NASA Astrophysics Data System (ADS)

    Papadopoulos, G. A.

    2010-12-01

    Seismic activity is a 3-D process varying in the space-time-magnitude domains. When in a target area the short-term activity deviates significantly from the usual (background) seismicity, the modes of activity may include swarms, temporary quiescence, foreshock-mainshock-aftershock sequences, doublets and multiplets. This implies that making decisions for civil protection purposes requires short-term seismic hazard assessment and evaluation. When a sizable earthquake takes place, the critical question is about the nature of the event: is it a mainshock, or a foreshock that foreshadows the occurrence of a bigger one? Also, a seismicity increase or decrease in a target area may signify either precursory changes or just transient seismicity variations (e.g. swarms) which do not conclude with a strong earthquake. Therefore, real-time seismicity evaluation is the backbone of short-term hazard assessment. The algorithm FORMA (Foreshock-Mainshock-Aftershock) is presented, which detects and updates automatically and in near real-time significant variations of the seismicity according to the earthquake data flow from the monitoring center. The detection of seismicity variations is based on an expert system which, for a given target area, indicates the mode of seismicity from the variation of two parameters: the seismicity rate, r, and the b-value of the magnitude-frequency relation. Alert levels are produced according to the significance levels of the changes of r and b. The good performance of FORMA was verified retrospectively in several earthquake cases, e.g. for the L'Aquila, Italy, 2009 earthquake sequence (Mmax 6.3) (Papadopoulos et al., 2010). Real-time testing was executed during January 2010 with the strong earthquake activity (Mmax 5.6) in the Corinth Rift, Central Greece. Evaluation outputs were publicly documented on a nearly daily basis with successful results. Evaluation of coastal and submarine earthquake activity is also of crucial importance for the
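
    The internals of FORMA are not given here, but the two parameters it tracks can be illustrated generically: the seismicity rate r relative to background, and the b-value via the standard Aki-Utsu maximum-likelihood estimator. The catalog, completeness magnitude, and background rate below are hypothetical:

```python
import math

def b_value(mags, mc, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value for magnitudes >= mc,
    with the usual correction of half the magnitude bin width dm."""
    above = [m for m in mags if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

def rate_change(n_events, days, background_rate):
    """Ratio of the observed daily seismicity rate to background."""
    return (n_events / days) / background_rate

# Hypothetical catalog window: magnitudes above completeness mc = 2.0.
mags = [2.1, 2.3, 2.0, 2.8, 2.2, 2.5, 2.1, 3.4, 2.6, 2.2]
print(round(b_value(mags, mc=2.0), 2))
print(rate_change(n_events=len(mags), days=5.0, background_rate=0.8))
```

In a FORMA-style scheme, a rate ratio well above 1 combined with a significant drop in b would raise the alert level for a possible foreshock sequence; the thresholds used for the actual alert levels are not reproduced here.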

  17. Assessment of existing and potential landslide hazards resulting from the April 25, 2015 Gorkha, Nepal earthquake sequence

    USGS Publications Warehouse

    Collins, Brian D.; Jibson, Randall W.

    2015-07-28

    This report provides a detailed account of assessments performed in May and June 2015 and focuses on valley-blocking landslides because they have the potential to pose considerable hazard to many villages in Nepal. First, we provide a seismological background of Nepal and then detail the methods used for both external and in-country data collection and interpretation. Our results consist of an overview of landsliding extent, a characterization of all valley-blocking landslides identified during our work, and a description of video resources that provide high resolution coverage of approximately 1,000 kilometers (km) of river valleys and surrounding terrain affected by the Gorkha earthquake sequence. This is followed by a description of site-specific landslide-hazard assessments conducted while in Nepal and includes detailed descriptions of five noteworthy case studies. Finally, we assess the expectation for additional landslide hazards during the 2015 summer monsoon season.

  18. Scenario-Based Tsunami Hazard Assessment from Earthquake and Landslide Sources for Eastern Sicily, Italy

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Armigliato, A.; Pagnoni, G.; Paparo, M. A.; Zaniboni, F.

    2016-12-01

    Eastern Sicily was the theatre of the most damaging tsunamis that ever struck Italy, such as the 11 January 1693 and the 28 December 1908 tsunamis. Tectonic studies and paleotsunami investigations have extended historical records of tsunami occurrence back several thousand years. Tsunami sources relevant for eastern Sicily are both local and remote, the latter being located in Ionian Greece and in the Western Hellenic Arc, where in 365 A.D. a large earthquake generated a tsunami that was seen in the whole eastern and central Mediterranean, including the Sicilian coasts. The objective of this study is the evaluation of tsunami hazard along the coast of eastern Sicily, central Mediterranean, Italy, via a scenario-based technique, which has been preferred to the PTHA approach because, when dealing with tsunamis induced by landslides, uncertainties are usually so large as to undermine the PTHA results. Tsunamis of earthquake and landslide origin are taken into account for the entire coast of eastern Sicily, from the Messina to the Siracusa provinces. Landslides are essentially local sources and can occur underwater along the unstable flanks of the Messina Straits or along the steep slopes of the Hyblaean-Malta escarpment. The method is based on a two-step procedure: after a preliminary step in which a large number of earthquake and landslide sources are taken into account and tsunamis are computed on a low-resolution grid, the worst-case scenarios are selected and tsunamis are simulated on a finer-resolution grid, allowing for a better calculation of coastal wave height and tsunami penetration. The final result of our study is given in the form of aggregate fields computed from individual scenarios. Also interesting is the contribution of the various tsunami sources in different localities along the coast. It is found that the places with the highest level of hazard are the low lands of La Playa south of Catania and of the Bay of Augusta, which is in agreement also with historical

  19. Seismic noise in the shallow subsurface: Methods for using it in earthquake hazard assessment

    NASA Astrophysics Data System (ADS)

    Scott, James B.

    2007-12-01

The primary focus of this work has been characterization of the shallow subsurface for seismic hazard using naturally occurring seismic noise. Three studies chronicle the further development of the refraction microtremor method for determining shear-wave velocity-depth structure, which is a predictor of earthquake shaking amplification. These studies present results from the first uses of the refraction microtremor method to determine earthquake hazard across entire urban basins. Improved field methods led to speed and efficiency in these deployments. These spatially dense geophysical measurements of shallow shear-wave velocity were conducted to broadly define shaking hazard and to determine the accuracy of earlier methods of prediction. The refraction microtremor method agrees well with borehole and other shear-velocity methods. In Chapter 2, I present results from the first long urban transect, 16 km across the Reno, Nevada basin. In 45 of the 55 (82%) measurements of shear velocity averaged to 30 m depth (Vs30), the result was above 360 m/s. The National Earthquake Hazards Reduction Program (NEHRP) defines Vs30 of 360 m/s as the boundary between site hazard class C and class D, with class C above 360 m/s. Mapped geologic and soil units are not accurate predictors of Vs30 on this transect, and would have predicted most of the transect as NEHRP-D. In Chapter 3, I present Vs30 results along a 13 km-long transect parallel to Las Vegas Blvd. (The Strip), along with borehole and surface-wave measurements at 30 additional sites. Again, our transect measurements correlate poorly with geologic map units, which do not predict Vs30 at any individual site with sufficient accuracy for engineering application. Two models to predict Vs30 were reported in this study. In Chapter 4, I present aggregate results from the Reno and Las Vegas transects and include results from our 60 km-long transect across the Los Angeles basin. Our statistical analyses suggest that the lateral
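The Vs30 statistic used throughout these chapters is the time-averaged shear-wave velocity over the top 30 m, i.e. 30 m divided by the vertical shear-wave travel time through that depth. A minimal sketch of this standard calculation and the NEHRP class boundaries (the layered velocity profile below is hypothetical, not data from the transects):

```python
# Sketch: NEHRP-style Vs30 from a layered shear-wave velocity model.
# Vs30 = 30 m / (total vertical travel time to 30 m depth).

def vs30(layers):
    """layers: list of (thickness_m, vs_m_per_s), top-down.
    Layers below 30 m depth are truncated."""
    depth, travel_time = 0.0, 0.0
    for thickness, vs in layers:
        h = min(thickness, 30.0 - depth)
        travel_time += h / vs
        depth += h
        if depth >= 30.0:
            break
    if depth < 30.0:
        raise ValueError("velocity model shallower than 30 m")
    return 30.0 / travel_time

def nehrp_class(v):
    """Simplified NEHRP site-class boundaries (m/s)."""
    if v > 1500: return "A"
    if v > 760:  return "B"
    if v > 360:  return "C"   # C/D boundary cited in the study
    if v > 180:  return "D"
    return "E"

profile = [(10.0, 250.0), (15.0, 400.0), (20.0, 700.0)]  # hypothetical profile
v = vs30(profile)
print(round(v, 1), nehrp_class(v))
```

Note that because Vs30 averages travel time rather than velocity, a slow near-surface layer dominates the result, which is why mapped surface geology alone can mispredict it.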

  20. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    NASA Astrophysics Data System (ADS)

    Wyss, Max

    2013-04-01

An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven people who had participated in an emergency meeting on 30 March, assessing the probability that a major event would follow the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of employment, loss of pension, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists who were not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings of seismology; they must be carefully prepared by experts. The more significant lesson is that the approach of calming the population, and the standard probabilistic hazard and risk assessment as practiced by GSHAP, are misleading. The latter has been criticized as

  1. Oregon Hazard Explorer for Lifelines Program (OHELP): A web-based geographic information system tool for assessing potential Cascadia earthquake hazard

    NASA Astrophysics Data System (ADS)

    Sharifi Mood, M.; Olsen, M. J.; Gillins, D. T.; Javadnejad, F.

    2016-12-01

The Cascadia Subduction Zone (CSZ) can generate earthquakes as powerful as moment magnitude 9, capable of causing great damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4 and M8.1 scenarios featuring persistent, long-lasting shaking together with other geological threats such as landslides, liquefaction-induced ground deformations, fault-rupture vertical displacement, tsunamis, etc. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines and other lifelines. Lifeline providers in Oregon, including the private and public entities responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built before this extreme seismic risk was fully understood. As recently experienced in Chile and Japan, the three-to-five-minute-long earthquake shaking expected in Oregon necessitates a very different method of risk mitigation for these major lifelines than those developed for shorter shaking from crustal earthquakes. A web-based geographic information system tool was developed to fully assess the potential hazard from the multiple threats posed by Cascadia subduction zone earthquakes in the region. The purpose of this website is to provide easy access over the web to the latest and best available hazard information, including work completed for the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). The tool is designed for engineers, planners, geologists, and others who need this information to help make appropriate decisions, and requires only minimal knowledge of GIS.

  2. New Seafloor Map of the Puerto Rico Trench Helps Assess Earthquake and Tsunami Hazards

    NASA Astrophysics Data System (ADS)

    ten Brink, Uri; Danforth, William; Polloni, Christopher; Andrews, Brian; Llanes, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-09-01

    The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico, although their ages are unknown. The Puerto Rico Trench is atypical of oceanic trenches. Subduction is highly oblique (10°-20°) to the trench axis with a large component of left-lateral strike-slip motion. Similar convergence geometry is observed at the Challenger Deep in the Mariana Trench, the deepest point on Earth. In addition to its extremely deep seafloor, the Puerto Rico Trench is also characterized by the most negative free-air gravity anomaly on Earth, -380 mGal, located 50 km south of the trench, where water depth is 7950 m (Figure 2). A tilted carbonate platform provides evidence for extreme vertical tectonism in the region. This platform was horizontally deposited over Cretaceous to Paleocene arc rocks starting in the Late Oligocene. Then, at 3.5 Ma, the carbonate platform was tilted by 4° toward the trench over a time period of less than 40 kyr, such that its northern edge is at a depth of 4000 m and its reconstructed elevation on land in Puerto Rico is at +1300 m (Figures 1 and 2).

  3. Probabilistic Seismic Hazard Assessment for Iraq Using Complete Earthquake Catalogue Files

    NASA Astrophysics Data System (ADS)

    Ameer, A. S.; Sharma, M. L.; Wason, H. R.; Alsinawi, S. A.

    2005-05-01

Probabilistic seismic hazard analysis (PSHA) has been carried out for Iraq. The earthquake catalogue used in the present study covers the area between latitudes 29° and 38.5° N and longitudes 39° and 50° E, containing more than a thousand events for the period 1905-2000. The entire Iraq region has been divided into thirteen seismogenic sources based on their seismic characteristics, geological setting and tectonic framework. The completeness of the seismicity catalogue has been checked using the method proposed by Stepp (1972). The analysis of completeness shows that the earthquake catalogue is not complete below Ms=4.8 for all of Iraq and for seismic source zones S1, S4, S5, and S8, while it varies for the other seismic zones. A statistical treatment of completeness of the data file was carried out in each of the magnitude classes. The frequency-magnitude distributions (FMD) for the study area, including all seismic source zones, were established and the minimum magnitude of complete reporting (Mc) was then estimated. For the whole of Iraq, Mc was estimated to be about Ms=4.0, while S11 shows the lowest Mc of about Ms=3.5 and the highest Mc of about Ms=4.2 was observed for S4. The earthquake activity parameters (activity rate λ, b value, maximum regional magnitude mmax) as well as the mean return period (R) for earthquakes of magnitude m ≥ mmin, along with their probability of occurrence, have been determined for all thirteen seismic source zones of Iraq. The maximum regional magnitude mmax was estimated as 7.87 ± 0.86 for the whole of Iraq. The return period for magnitude 6.0 is largest for source zone S3, where it is estimated to be 705 years, while the smallest value, 9.9 years, is estimated for Iraq as a whole.
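The activity rate, b value, and return periods reported above all derive from fitting the Gutenberg-Richter relation log10 N(≥m) = a - b·m to the complete part of the catalogue. A minimal sketch using the Aki (1965) maximum-likelihood b-value estimator with a binning correction (the magnitude list, completeness level, and catalogue duration below are synthetic, not the Iraq data):

```python
import math

# Sketch: Gutenberg-Richter parameters from a catalogue complete above Mc,
# and the mean return period of a target magnitude implied by them.

def gr_parameters(mags, mc, years, dm=0.1):
    """Return (a, b) for annual rates log10 N(>=m) = a - b*m.
    Aki (1965) maximum-likelihood b with a bin-width correction."""
    sample = [m for m in mags if m >= mc]
    mean_m = sum(sample) / len(sample)
    b = math.log10(math.e) / (mean_m - (mc - dm / 2.0))
    rate_mc = len(sample) / years          # annual rate of m >= Mc
    a = math.log10(rate_mc) + b * mc
    return a, b

def return_period(a, b, m):
    """Mean return period (years) of events with magnitude >= m."""
    return 1.0 / (10.0 ** (a - b * m))

# Hypothetical 95-year catalogue, complete above Ms 4.0:
mags = [4.0 + 0.1 * (i % 25) for i in range(500)]
a, b = gr_parameters(mags, mc=4.0, years=95.0)
print("b =", round(b, 2), " T(M6) =", round(return_period(a, b, 6.0), 1), "yr")
```

The return period is simply the reciprocal of the annual exceedance rate implied by the fitted a and b values; per-zone values such as the 705 years quoted for S3 come from applying this to each zone's own parameters.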

  4. Will a continuous GPS array for L.A. help earthquake hazard assessment?

    NASA Astrophysics Data System (ADS)

    Prescott, William H.

    The striking landscapes and hospitable climate of Southern California are home to more than 20 million people and vital elements of the nation's economy. Unfortunately, the region is also laced with many active faults that can produce strong earthquakes. Scientists from several institutions are pursuing a new approach to studying earthquake hazards in a high-risk metropolitan area.The Southern California Integrated GPS Network (SCIGN) is currently an array of about 40 Global Positioning System (GPS) stations distributed throughout the greater Los Angeles metropolitan region. There have been informal discussions about expanding the array to 250 stations, and formal proposals have been submitted to begin this expansion. To achieve high precision, the sites will be carefully monumented, and all the GPS receivers will operate continuously. The goals of the array are to provide an accurate and detailed velocity field from which to identify the deformation from known faults, test current models of the geologic structure, and make better estimates of the seismic potential in the populous parts of southern California.

  5. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto

    2016-04-01

The characteristic earthquake hypothesis is not strongly supported by observational data because of the relatively short duration of historical and even paleoseismological records. For instance, for the Calabria (Southern Italy) region, historical information on strong earthquakes exists for at least two thousand years, but it can be considered complete for M > 6.0 only for the latest few centuries. As a consequence, characteristic earthquakes are seldom reported for individual fault segments, and hazard cannot be reliably assessed from only the minor seismicity reported in the historical catalogs. Even if they cannot substitute for the information contained in a good historical catalog, physics-based earthquake simulators have become popular in the recent literature, and their application has been justified for a number of reasons. In particular, earthquake simulators can provide interesting information on which renewal models better describe the recurrence statistics, and on how this is affected by features such as local fault geometry and kinematics. The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 100,000 events of magnitude ≥ 4.5. The algorithm on which this simulator is based is constrained by several physical elements, such as an average slip rate due to tectonic loading for every single segment in the investigated fault system, the process of rupture growth and termination, and interaction between earthquake sources, including small-magnitude events. Events nucleated in one segment are allowed to expand into neighboring segments if they are separated by less than a given maximum distance. The application of our simulation algorithm to the Calabria region reproduces typical features in the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term periodicity of strong earthquakes, short
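The simulation loop described above (constant tectonic loading per segment, rupture when strength is exceeded, and propagation into near-failure neighbouring segments) can be caricatured in a few lines. This toy model is not the authors' simulator; the loading rates, strengths, residual-stress rule, and neighbour range are all invented for illustration:

```python
import random

# Toy sketch of a segment-based earthquake simulator: segments are loaded
# at constant tectonic rates; a segment ruptures when its stress exceeds
# its strength, drops to a residual level, and may trigger close-by
# near-failure neighbours, so events can grow across segment boundaries.

random.seed(1)
N_SEG = 20
rate = [0.01 + 0.005 * random.random() for _ in range(N_SEG)]  # loading/step
strength = [1.0] * N_SEG
stress = [random.uniform(0.0, 1.0) for _ in range(N_SEG)]
MAX_JUMP = 1  # neighbour range allowed for rupture propagation

def propagate(start):
    """Rupture the start segment, then any neighbour within MAX_JUMP whose
    stress is near failure; ruptured stress drops to a small residual."""
    front, ruptured = [start], set()
    while front:
        i = front.pop()
        ruptured.add(i)
        stress[i] = 0.1 * stress[i]
        for j in range(max(0, i - MAX_JUMP), min(N_SEG, i + MAX_JUMP + 1)):
            if j not in ruptured and stress[j] >= 0.9 * strength[j]:
                front.append(j)
    return ruptured

def step(t, catalog):
    for i in range(N_SEG):
        stress[i] += rate[i]
    for i in range(N_SEG):
        if stress[i] >= strength[i]:
            size = len(propagate(i))
            catalog.append((t, i, size))  # time, nucleation segment, length

catalog = []
for t in range(5000):
    step(t, catalog)
print(len(catalog), "events; largest rupture spans",
      max(size for _, _, size in catalog), "segments")
```

Even this caricature produces quasi-periodic recurrence on individual segments and occasional multi-segment events, which is the kind of behaviour the study compares against real observations.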

  6. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario

    NASA Astrophysics Data System (ADS)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.

    2017-09-01

Liquefaction-induced ground failure is among the leading causes of infrastructure damage due to the impact of large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Brahmaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has a moderately shallow groundwater table, especially in the post-monsoon seasons. In view of the burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. With the motivation of assessing the seismic hazard, vulnerability, and risk of the city of Kolkata through concerted federal funding stipulated for all metros and upstart urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading employing modern multivariate techniques, and also to predict a deterministic liquefaction scenario for the city for a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. The stochastically
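Liquefaction screening of the kind described typically compares the earthquake-induced cyclic stress ratio (CSR) with the soil's cyclic resistance. A sketch of the classic Seed-Idriss simplified CSR with the Liao-Whitman depth-reduction factor; the unit weight, depth, water table, and PGA below are hypothetical values for illustration, not Kolkata data:

```python
# Sketch: Seed-Idriss simplified cyclic stress ratio (CSR), one common
# ingredient of liquefaction-potential screening.

G_UNIT = 18.0     # kN/m^3, assumed bulk unit weight of the soil column
GAMMA_W = 9.81    # kN/m^3, unit weight of water

def stress_reduction(z):
    """Liao-Whitman depth reduction factor rd (z in metres, z <= 23)."""
    return 1.0 - 0.00765 * z if z <= 9.15 else 1.174 - 0.0267 * z

def csr(z, amax_g, water_table):
    """Cyclic stress ratio at depth z (m) for peak ground acceleration
    amax (in units of g) and a given water-table depth (m)."""
    sigma_v = G_UNIT * z                              # total stress, kPa
    u = GAMMA_W * max(0.0, z - water_table)           # pore pressure, kPa
    sigma_eff = sigma_v - u                           # effective stress
    return 0.65 * amax_g * (sigma_v / sigma_eff) * stress_reduction(z)

# Saturated silty sand at 6 m depth, water table at 2 m, PGA 0.16 g:
print(round(csr(6.0, 0.16, 2.0), 3))
```

A factor of safety against liquefaction then follows as the ratio of the soil's cyclic resistance (from SPT/CPT or shear-wave data) to this CSR; the deterministic scenario in the study supplies the PGA input.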

  7. A new earthquake catalogue for seismic hazard assessment of the NPP (Nuclear Power Plant) Jaslovske Bohunice, Slovakia, site

    NASA Astrophysics Data System (ADS)

    Kysel, Robert; Kristek, Jozef; Moczo, Peter; Csicsay, Kristian; Cipciar, Andrej; Srbecky, Miroslav

    2014-05-01

According to the IAEA (International Atomic Energy Agency) Safety Guide No. SSG-9, an earthquake catalogue should comprise all information on pre-historical, historical and seismometrically recorded earthquakes in the region, which should cover a geographic area not smaller than a circle with a radius of 300 km around the site. Jaslovske Bohunice is an important economic site. Several nuclear facilities are located in Jaslovske Bohunice - either in operation (NPP V2, the national radioactive waste repository) or in decommissioning (NPP A1, NPP V1). Moreover, a new reactor unit is being planned for the site. The Jaslovske Bohunice site is not far from the Dobra Voda seismic source zone, which has been the most active seismic zone in the territory of Slovakia since the beginning of the 20th century. Relatively small distances to Austria, Hungary, the Czech Republic and the Slovak capital Bratislava make the site a prominent priority in terms of seismic hazard assessment. We compiled a new earthquake catalogue for the NPP Jaslovske Bohunice region following the recommendations of the IAEA Safety Guide. The region includes parts of the territories of Slovakia, Hungary, Austria, the Czech Republic and Poland, and it partly extends into Germany, Slovenia, Croatia and Serbia. The catalogue is based on data from six national earthquake catalogues, two regional earthquake catalogues (ACORN, CENEC) and a catalogue from the local NPP network. The primarily compiled catalogue for the time period 350-2011 consists of 9,142 events. We then homogenized and declustered the catalogue. Eventually we checked the catalogue for time completeness. For homogenization, we divided the catalogue into preseismometric (350-1900) and seismometric (1901-2011) periods. For earthquakes characterized by epicentral intensity and local magnitude we adopted the relations proposed for homogenization of the CENEC catalogue (Grünthal et al. 2009). Instead of assuming the equivalency between local magnitudes reported by the
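Declustering, mentioned above, is commonly done with magnitude-dependent time-distance windows: an event is removed as a dependent shock if it falls inside the window of a larger preceding event. The sketch below shows the procedure only; the two window functions are invented placeholders, since the abstract does not state which published windows (e.g. Gardner-Knopoff) were applied:

```python
from datetime import datetime, timedelta

# Sketch of window-based declustering. The window functions are
# illustrative placeholders, NOT published coefficients.

def time_window_days(m):
    return 10.0 ** (0.5 * m - 0.5)    # assumed, grows with magnitude

def dist_window_km(m):
    return 10.0 ** (0.12 * m + 1.0)   # assumed, grows with magnitude

def decluster(catalog):
    """catalog: list of (datetime, magnitude, x_km, y_km), time-sorted.
    Returns the catalogue with dependent events removed."""
    keep = []
    for t, m, x, y in catalog:
        is_dependent = False
        for t0, m0, x0, y0 in keep:
            if m0 >= m:
                dt = (t - t0).total_seconds() / 86400.0
                dr = ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5
                if dt <= time_window_days(m0) and dr <= dist_window_km(m0):
                    is_dependent = True
                    break
        if not is_dependent:
            keep.append((t, m, x, y))
    return keep

t0 = datetime(2000, 1, 1)
catalog = [
    (t0,                       5.8, 0.0, 0.0),   # mainshock
    (t0 + timedelta(days=2),   4.1, 3.0, 1.0),   # falls in its window
    (t0 + timedelta(days=400), 4.5, 2.0, 0.0),   # outside the time window
]
print(len(decluster(catalog)))
```

Declustering matters here because completeness analysis and hazard rates assume a Poissonian catalogue of independent mainshocks.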

  8. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Whether and how such defenses should be rebuilt is hence a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's is too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus, in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain." Society therefore needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total cost of damage plus mitigation. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the

  9. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  10. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  11. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  12. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  13. Implications for prediction and hazard assessment from the 2004 Parkfield earthquake.

    PubMed

    Bakun, W H; Aagaard, B; Dost, B; Ellsworth, W L; Hardebeck, J L; Harris, R A; Ji, C; Johnston, M J S; Langbein, J; Lienkaemper, J J; Michael, A J; Murray, J R; Nadeau, R M; Reasenberg, P A; Reichle, M S; Roeloffs, E A; Shakal, A; Simpson, R W; Waldhauser, F

    2005-10-13

    Obtaining high-quality measurements close to a large earthquake is not easy: one has to be in the right place at the right time with the right instruments. Such a convergence happened, for the first time, when the 28 September 2004 Parkfield, California, earthquake occurred on the San Andreas fault in the middle of a dense network of instruments designed to record it. The resulting data reveal aspects of the earthquake process never before seen. Here we show what these data, when combined with data from earlier Parkfield earthquakes, tell us about earthquake physics and earthquake prediction. The 2004 Parkfield earthquake, with its lack of obvious precursors, demonstrates that reliable short-term earthquake prediction still is not achievable. To reduce the societal impact of earthquakes now, we should focus on developing the next generation of models that can provide better predictions of the strength and location of damaging ground shaking.

  14. An Updated Homogeneous GPS Velocity Field for Studies of Earthquake Hazard Prediction and Assessment in Turkey

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Aktug, B.; Dogru, A.; Tasci, L.; Acar, M.

    2016-12-01

While GPS-based crustal deformation studies in Turkey date back to the early 1990s, a homogeneous velocity field utilizing all the available data has been missing. Regional studies employing different site distributions, observation plans, processing software and methodologies create not only reference frame variations but also heterogeneous stochastic models. While the reference frame effect between different velocity fields can easily be removed by estimating a set of rotations, the homogenization of the stochastic models of the individual velocity fields requires a more detailed analysis. Using a rigorous Variance Component Estimation (VCE) methodology, we estimated the variance factors for each of the contributing velocity fields and combined them into a single homogeneous velocity field covering the whole of Turkey. Results show that variance factors between velocity fields, including both survey-mode and continuous observations, can vary by a few orders of magnitude. In this study, we present the most complete velocity field in Turkey, rigorously combined from 20 individual velocity fields including the 146-station CORS network with 8 years of continuous observations. In addition, two GPS campaigns were performed at 35 stations along the North Anatolian Fault to fill the gap between existing velocity fields. The homogeneously combined new velocity field is nearly complete in terms of geographic coverage, and will serve as the basis for further analyses such as the estimation of deformation rates and the determination of slip rates across the main fault zones. As the Active Fault Map of Turkey was recently revised and 500 faults were tagged as having the potential to generate destructive earthquakes, the new velocity field is also expected to have a direct impact on earthquake hazard studies.
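In reduced form, combining heterogeneous velocity fields involves (i) aligning reference frames at common stations, (ii) rescaling each field's uncertainties by an empirical variance factor, and (iii) inverse-variance averaging. The sketch below compresses this to a translation-only alignment of two fields; a rigorous VCE combination of the kind described would estimate rotations and iterate the variance factors. Station names and velocities are hypothetical:

```python
# Sketch of combining two GPS velocity fields (not the authors' VCE code).

def combine(field_a, field_b, sigma_a, sigma_b):
    """field_*: {station: (ve, vn)} in mm/yr; sigma_*: a priori std (mm/yr)."""
    common = sorted(set(field_a) & set(field_b))
    # 1. Constant offset of frame B relative to frame A (translation only;
    #    a rigorous combination estimates a rotation instead).
    de = sum(field_a[s][0] - field_b[s][0] for s in common) / len(common)
    dn = sum(field_a[s][1] - field_b[s][1] for s in common) / len(common)
    shifted = {s: (ve + de, vn + dn) for s, (ve, vn) in field_b.items()}
    # 2. Empirical variance factor for B from post-alignment residuals.
    res = [(field_a[s][0] - shifted[s][0]) ** 2 +
           (field_a[s][1] - shifted[s][1]) ** 2 for s in common]
    var_b = max(sum(res) / (2 * len(common)), sigma_b ** 2)
    wa, wb = 1.0 / sigma_a ** 2, 1.0 / var_b
    # 3. Inverse-variance weighted mean where both fields observe a station.
    out = {}
    for s in set(field_a) | set(shifted):
        if s in field_a and s in shifted:
            out[s] = tuple((wa * a + wb * b) / (wa + wb)
                           for a, b in zip(field_a[s], shifted[s]))
        else:
            out[s] = field_a.get(s, shifted.get(s))
    return out

field_a = {"ANKR": (2.0, -10.0), "ISTA": (1.5, -9.0)}
field_b = {"ANKR": (1.0, -11.0), "ISTA": (0.5, -10.2), "IZMI": (3.0, -8.0)}
combined = combine(field_a, field_b, sigma_a=0.5, sigma_b=1.0)
print(sorted(combined))
```

The key point mirrored from the abstract is step 2: without rescaling, a field with optimistic formal errors would dominate the weighted mean by orders of magnitude.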

  15. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  16. Development of direct multi-hazard susceptibility assessment method for post-earthquake reconstruction planning in Nepal

    NASA Astrophysics Data System (ADS)

    Mavrouli, Olga; Rana, Sohel; van Westen, Cees; Zhang, Jianqiang

    2017-04-01

After the devastating 2015 Gorkha earthquake in Nepal, reconstruction activities were delayed considerably for many reasons of a political, organizational and technical nature. Due to the widespread occurrence of co-seismic landslides, and the expectation that these may be aggravated or re-activated in future years during the intense monsoon periods, there is a need to evaluate for thousands of sites whether they are suited for reconstruction. This evaluation should take multiple hazards into account, such as rockfall, landslides, debris flows, and flash floods. The application of indirect knowledge-based, data-driven or physically-based approaches is not suitable, for several reasons. Physically-based models generally require a large number of parameters for which data are not available. Data-driven, statistical methods depend on historical information, which is less useful after the occurrence of a major event such as an earthquake. Besides, they would lead to unacceptable levels of generalization, as the analysis is done on the basis of rather general causal factor maps. The same holds for indirect knowledge-driven methods. Instead, location-specific hazard analysis is required, using a simple method that can be applied by many people at the local level. In this research, a direct scientific method was developed by which local technical staff can easily and quickly assess the post-earthquake multi-hazards following a decision tree approach, using an app on a smartphone or tablet. The method assumes that a central organization, such as the Department of Soil Conservation and Watershed Management, generates spatial information beforehand that is used in the direct assessment at a given location. Pre-earthquake, co-seismic and post-seismic landslide inventories are generated through the interpretation of Google Earth multi-temporal images, using anaglyph methods.
Spatial data, such as Digital Elevation Models, land cover maps, and geological maps are

  17. Volunteers in the earthquake hazard reduction program

    USGS Publications Warehouse

    Ward, P.L.

    1978-01-01

With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.

  18. Vulnerability assessment of a port and harbor community to earthquake and tsunami hazards: Integrating technical expert and stakeholder input

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

    Research suggests that the Pacific Northwest could experience catastrophic earthquakes and tsunamis in the near future, posing a significant threat to the numerous ports and harbors along the coast. A collaborative, multiagency initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to these hazards, involving Oregon Sea Grant, Washington Sea Grant, the National Oceanic and Atmospheric Administration Coastal Services Center, and the U.S. Geological Survey Center for Science Policy. One element of this research, planning, and outreach initiative is a natural hazard mitigation and emergency preparedness planning process that combines technical expertise with local stakeholder values and perceptions. This paper summarizes and examines one component of the process, the vulnerability assessment methodology, used in the pilot port and harbor community of Yaquina River, Oregon, as a case study of assessing vulnerability at the local level. In this community, stakeholders were most concerned with potential life loss and other nonstructural vulnerability issues, such as inadequate hazard awareness, communication, and response logistics, rather than structural issues, such as damage to specific buildings or infrastructure.

  19. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old, and has long been a center of the region's political and cultural life. Earthquake risk assessment for Cairo is therefore of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated for the same return periods. The contour maps show that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).
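The hazard values described above come from the classical Cornell-type integration over source rates, a magnitude distribution, and a ground-motion model, summed across the logic-tree branches. A minimal single-source sketch of that integration; the GMPE coefficients, source rate, magnitude range, and distance below are invented for illustration and are not the models used for Cairo:

```python
import math

# Minimal Cornell-style PSHA sketch: one source at a fixed distance, a
# truncated Gutenberg-Richter magnitude distribution, and a toy lognormal
# ground-motion model. Annual rate of exceedance:
#   lambda(a) = nu * sum_m P[M = m] * P[PGA > a | m, r]

def gmpe_ln_pga(m, r_km):
    """Hypothetical GMPE: mean ln(PGA in g) and its sigma."""
    return -4.0 + 0.9 * m - 1.1 * math.log(r_km), 0.6

def prob_exceed(a, m, r_km):
    mu, sigma = gmpe_ln_pga(m, r_km)
    z = (math.log(a) - mu) / sigma
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))  # 1 - normal CDF

def hazard_rate(a, nu, b, m_min, m_max, r_km, dm=0.1):
    """Annual rate of PGA > a; nu is the annual rate of m >= m_min."""
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    rate, m = 0.0, m_min + dm / 2.0
    while m < m_max:
        pmf = math.exp(-beta * (m - m_min)) * beta * dm / norm
        rate += nu * pmf * prob_exceed(a, m, r_km)
        m += dm
    return rate

lam = hazard_rate(0.1, nu=0.5, b=1.0, m_min=4.5, m_max=7.0, r_km=30.0)
print("annual rate of PGA > 0.1 g:", round(lam, 4))
```

Repeating the calculation over a grid of PGA levels gives the hazard curve, from which the ground motion at a chosen return period (e.g. 475 years, i.e. an annual exceedance rate of about 1/475) is read off.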

  20. Foreshocks and short-term hazard assessment of large earthquakes using complex networks: the case of the 2009 L'Aquila earthquake

    NASA Astrophysics Data System (ADS)

    Daskalaki, Eleni; Spiliotis, Konstantinos; Siettos, Constantinos; Minadakis, Georgios; Papadopoulos, Gerassimos A.

    2016-08-01

The monitoring of statistical network properties could be useful for the short-term hazard assessment of the occurrence of mainshocks in the presence of foreshocks. Using successive connections between events acquired from the earthquake catalog of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) for the case of the L'Aquila (Italy) mainshock (Mw = 6.3) of 6 April 2009, we provide evidence that network measures, both global (average clustering coefficient, small-world index) and local (betweenness centrality), could potentially be exploited for forecasting purposes in both time and space. Our results reveal statistically significant increases in the topological measures and a nucleation of the betweenness centrality around the location of the epicenter about 2 months before the mainshock. The results of the analysis are robust even for space windows that are large or off-centered relative to the main event.
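As an illustration of the construction (a sketch, not the authors' code): each event is mapped to a spatial cell, edges link the cells of consecutive events in the catalog, and the global and local measures named in the abstract are computed on the resulting graph with networkx. The toy catalog and grid-cell size below are invented for the example.

```python
import networkx as nx

# Hypothetical time-ordered epicenters (lon, lat) near L'Aquila; made up.
catalog = [
    (13.38, 42.35), (13.40, 42.36), (13.38, 42.35),
    (13.45, 42.30), (13.38, 42.35), (13.40, 42.36),
]

def cell(lon, lat, d=0.02):
    """Map an epicenter to a grid cell (graph node) of size d degrees."""
    return (round(lon / d), round(lat / d))

G = nx.Graph()
nodes = [cell(lon, lat) for lon, lat in catalog]
G.add_edges_from(zip(nodes[:-1], nodes[1:]))   # successive-event links

# Global measure: average clustering coefficient of the event network.
avg_clust = nx.average_clustering(G)
# Local measure: betweenness centrality, whose nucleation near the future
# epicenter is the precursory signature discussed in the abstract.
bc = nx.betweenness_centrality(G)
print(avg_clust, max(bc, key=bc.get))
```

In a monitoring setting these measures would be recomputed in sliding time windows, and significant increases flagged relative to a background period.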

  1. Multicomponent Body and Surface Wave Seismic Analysis using an Urban Land Streamer System: An Integrative Earthquake Hazards Assessment Approach

    NASA Astrophysics Data System (ADS)

    Gribler, G.; Liberty, L. M.

    2014-12-01

We present earthquake site response results from a 48-channel multicomponent seismic land streamer and large weight-drop system. We acquired data along a grid of city streets in western Idaho at a rate of a few km per day, deriving shear wave velocity profiles to a depth of 40-50 m by incorporating vertical and radial geophone signals to capture the complete elliptical Rayleigh wave motion. We also obtained robust P-wave reflection and refraction results by capturing returned signals that arrive at non-vertical incidence angles as a result of the high-velocity road surface layer. By integrating the derived shear wave velocity profiles with P-wave reflection results, we include depositional and tectonic boundaries from the upper few hundred meters in our analysis to help assess whether ground motions may be amplified by shallow bedrock. By including P-wave refraction information, we can identify zones of high liquefaction potential by comparing shear wave and P-wave velocity (Vp/Vs) measurements against refraction-derived water table depths. The use of multicomponent land streamer data improves signal-to-noise ratios over single-component data with no additional field effort. The added multicomponent processing step can be as simple as calculating the vector magnitude for surface wave and refraction arrivals, or rotating the reflected signals to the maximum emergence angle based on near-surface P-wave velocity information. We show example data from a number of Idaho communities where historical earthquakes have been recorded. We also present numerical models and systematic field tests that show the effects of a high-velocity road surface layer on surface and body wave measurements. We conclude that multicomponent seismic information derived from seismic land streamers can provide a significant improvement in earthquake hazard assessment over a standard single-component approach with only a small addition in

  2. 76 FR 18165 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. ] SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... be sent to National Earthquake Hazards Reduction Program Director, National Institute of Standards...

  3. Earthquake Hazard Mitigation Strategy in Indonesia

    NASA Astrophysics Data System (ADS)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.

    2008-05-01

Because of the active tectonic setting of the region, the risk of geological hazards is inevitably high in the Indonesian Archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the local geology is the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancing existing land-use management, establishing public education strategies and methods, strengthening linkages among disaster mitigation institutions and stakeholders, and establishing continuous public consultation are the main strategic programs for building community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts are also discussed as lessons learned. A new approach to developing earthquake hazard maps, initiated by mapping the psychological characteristics of the people living in vulnerable areas, is addressed as well.

  4. Probabilistic Tsunami Hazard Assessment along Nankai Trough (2) a comprehensive assessment including a variety of earthquake source areas other than those that the Earthquake Research Committee, Japanese government (2013) showed

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2016-12-01

For the forthcoming M8- to M9-class Nankai earthquake, the Earthquake Research Committee (ERC) of the Headquarters for Earthquake Research Promotion, Japanese government (2013) showed 15 examples of earthquake source areas (ESAs) as possible combinations of 18 sub-regions (6 segments along the trough and 3 segments normal to the trough) and assessed the occurrence probability within the next 30 years (from Jan. 1, 2013) at 60% to 70%. Hirata et al. (2015, AGU) presented a Probabilistic Tsunami Hazard Assessment (PTHA) along the Nankai Trough for the case where the diversity of the next event's ESA is modeled by only those 15 ESAs. In this study, we newly set 70 ESAs in addition to the previous 15, so that a total of 85 ESAs are considered. By producing tens of fault models with various slip distribution patterns for each of the 85 ESAs, we obtain 2500 fault models in addition to the previous 1400, so that a total of 3900 fault models are used to represent the diversity of the next Nankai earthquake rupture (Toyama et al., 2015, JpGU). For the PTHA, the occurrence probability of the next Nankai earthquake is distributed over the possible 3900 fault models according to their similarity to the 15 ESAs' extents (Abe et al., 2015, JpGU). The main concept of this probability distribution is: (i) earthquakes rupturing any of the 15 ESAs that ERC (2013) showed are most likely to occur; (ii) earthquakes rupturing ESAs whose along-trough extent matches one of the 15 ESAs but whose trough-normal extent differs are second most likely; (iii) earthquakes rupturing ESAs whose along-trough and trough-normal extents both differ from all 15 ESAs rarely occur. Procedures for tsunami simulation and probabilistic tsunami hazard synthesis are the same as in Hirata et al. (2015). A tsunami hazard map, synthesized under the assumption that the Nankai earthquakes can be modeled as a renewal process based on a BPT distribution with a mean recurrence interval of 88.2 years (ERC, 2013) and an
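The renewal-model probability mentioned at the end can be sketched with SciPy's inverse Gaussian, which is the BPT (Brownian Passage Time) distribution. Only the 88.2-year mean recurrence comes from the abstract (ERC, 2013); the elapsed time and aperiodicity values below are illustrative assumptions, not ERC's parameters.

```python
from scipy.stats import invgauss

def bpt_conditional_prob(mu, alpha, elapsed, window):
    """P(event in next `window` yr | quiet for `elapsed` yr) under a BPT model.

    SciPy's invgauss with shape alpha**2 and scale mu/alpha**2 has mean mu
    and coefficient of variation alpha, i.e. the BPT parameterization.
    """
    dist = invgauss(mu=alpha**2, scale=mu / alpha**2)
    f_t, f_tw = dist.cdf(elapsed), dist.cdf(elapsed + window)
    return (f_tw - f_t) / (1.0 - f_t)

# mu = 88.2 yr from the abstract; alpha and elapsed time are illustrative.
p30 = bpt_conditional_prob(mu=88.2, alpha=0.24, elapsed=67.0, window=30.0)
print(f"30-yr conditional probability: {p30:.0%}")
```

The hazard synthesis then distributes this total occurrence probability across the fault models before aggregating tsunami heights.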

  5. EARTHQUAKE HAZARDS IN THE OFFSHORE ENVIRONMENT.

    USGS Publications Warehouse

    Page, Robert A.; Basham, Peter W.

    1985-01-01

    This report discusses earthquake effects and potential hazards in the marine environment, describes and illustrates methods for the evaluation of earthquake hazards, and briefly reviews strategies for mitigating hazards. The report is broadly directed toward engineers, scientists, and others engaged in developing offshore resources. The continental shelves have become a major frontier in the search for new petroleum resources. Much of the current exploration is in areas of moderate to high earthquake activity. If the resources in these areas are to be developed economically and safely, potential earthquake hazards must be identified and mitigated both in planning and regulating activities and in designing, constructing, and operating facilities. Geologic earthquake effects that can be hazardous to marine facilities and operations include surface faulting, tectonic uplift and subsidence, seismic shaking, sea-floor failures, turbidity currents, and tsunamis.

  6. Applications of Multi-Cycle Earthquake Simulations to Earthquake Hazard

    NASA Astrophysics Data System (ADS)

    Gilchrist, Jacquelyn Joan

This dissertation seeks to contribute to earthquake hazard analysis and forecasting through a detailed study of the processes controlling the occurrence, and particularly the clustering, of large earthquakes, the probabilities of these large events, and the dynamics of their ruptures. We use the multi-cycle earthquake simulator RSQSim to investigate several fundamental aspects of earthquake occurrence in order to improve the understanding of earthquake hazard. RSQSim, a 3D boundary-element code that incorporates rate- and state-dependent friction to simulate earthquakes in fully interacting, complex fault systems, has been successful at modeling several aspects of fault slip and earthquake occurrence. Multi-event earthquake models with time-dependent nucleation based on rate- and state-dependent friction, such as RSQSim, provide a viable physics-based method for modeling earthquake processes. These models can provide a better understanding of earthquake hazard by improving our knowledge of earthquake processes and probabilities. RSQSim is fast and efficient, and can therefore simulate very long sequences of earthquakes (from hundreds of thousands to millions of events). This makes RSQSim an ideal instrument for filling current gaps in earthquake data, from short and incomplete earthquake catalogs to the unrealistic initial conditions used for dynamic rupture models. RSQSim catalogs include foreshocks, aftershocks, and occasional clusters of large earthquakes, the statistics of which are important for estimating earthquake probabilities. Additionally, RSQSim finds a near-optimal nucleation location that enables ruptures to propagate under minimal stress conditions, and can thus provide suites of heterogeneous initial conditions for dynamic rupture models that produce reduced ground motions compared to models with homogeneous initial stresses and arbitrary forced nucleation locations.

  7. Historical Earthquakes As Examples To Assess The Seismic Hazard In The Eastern Region of Venezuela

    NASA Astrophysics Data System (ADS)

    Martin, J.; Posadas, A.; Avendaño, J.; Sierra, R.; Bonive, F.

The northeastern region of Venezuela lies on the boundary between the Caribbean and South American tectonic plates, a source of great seismicity. The first written account of an earthquake in the Americas was of the earthquake of September 1530, which damaged Cumaná, the first town founded on the continent. Since then a continuous series of earthquakes has been reported, many of them damaging Cumaná; the effects of the 1929 earthquake (17-01-1929; intensity IX on the Mercalli scale) were well described by Sidney Paige in Vol. 20 of the B.S.S.A. (March 1930). An earthquake of magnitude 5.9 (11-06-1986; 10.26° N, 63.29° W) prompted UNESCO's intention to declare the Estado Sucre a pilot zone for seismological studies. In 1991, a report issued by the International Institute of Earthquake Prediction Theory and Mathematical Geophysics (Academy of Sciences, U.S.S.R.) stated that the occurrence of a great-magnitude earthquake affecting the northeastern region of Venezuela was possible. Other studies of the seismicity of the region have since been carried out. The interest of the authorities and of seismologists peaked with the earthquake of July 1997 (10.456° N, 63.555° W), which had a magnitude of 6.9; the death toll was 73, around 528 people were injured, and more than 2000 houses had to be completely rebuilt. A microzonation study of the city of Cumaná has been carried out recently, and its results will also be presented at this Congress.

  8. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    SciTech Connect

    Wong, I.G.; Green, R.K.; Sun, J.I.; Pezzopane, S.K.; Abrahamson, N.A.; Quittmeyer, R.C.

    1996-12-31

As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (Mw) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to Mw 6¼ was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to Mw 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than the five other, predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and the path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.
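The quoted return periods translate into exceedance probabilities over a design life under the usual Poisson assumption, P = 1 - exp(-t/T); the 50-year design life used below is a conventional illustrative choice, not one stated in the abstract.

```python
import math

def exceedance_prob(t_years, return_period):
    """P(at least one exceedance in t_years) for a Poisson process
    with mean return period return_period (years)."""
    return 1.0 - math.exp(-t_years / return_period)

# Return periods from the abstract, evaluated over a 50-yr design life.
for T in (500, 1000, 2000, 10000):
    print(f"T={T:>6} yr -> P(50 yr) = {exceedance_prob(50, T):.1%}")
```

For example, the 0.50 g motion at the 10,000-year return period corresponds to roughly a 0.5% chance of exceedance in 50 years under this assumption.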

  9. Stochastic ground-motion simulation of two Himalayan earthquakes: seismic hazard assessment perspective

    NASA Astrophysics Data System (ADS)

    Harbindu, Ashish; Sharma, Mukat Lal; Kamal

    2012-04-01

The earthquakes in Uttarkashi (October 20, 1991, Mw 6.8) and Chamoli (March 8, 1999, Mw 6.4) are among the recent well-documented earthquakes that occurred in the Garhwal region of India, causing extensive damage as well as loss of life. Using strong-motion data from these two earthquakes, we estimate their source, path, and site parameters. The quality factor (Qβ) as a function of frequency is derived as Qβ(f) = 140f^1.018. The site amplification functions are evaluated using the horizontal-to-vertical spectral ratio technique. The ground motions of the Uttarkashi and Chamoli earthquakes are simulated using the stochastic method of Boore (Bull Seismol Soc Am 73:1865-1894, 1983), with the estimated source, path, and site parameters as input. Simulated time histories are generated for several stations and compared with the observed data. The simulated response spectra at 5% damping are in fair agreement with the observed response spectra for most stations over a wide range of frequencies, and residual trends confirm the close match between the observed and simulated spectra. The synthetic data are in rough agreement with the ground-motion attenuation equation available for the Himalayas (Sharma, Bull Seismol Soc Am 98:1063-1069, 1998).
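A minimal sketch of the stochastic-method spectral shape (after Boore, 1983) can be written using the attenuation derived in the paper, Qβ(f) = 140f^1.018. The seismic moment, stress drop, distance, and shear-wave speed below are illustrative, and normalization constants (radiation pattern, free-surface factor, site terms) are omitted.

```python
import numpy as np

def fourier_spectrum(f, m0_dyn_cm=1e25, stress_bar=50.0, r_km=50.0,
                     beta_kms=3.5):
    """Un-normalized acceleration Fourier spectrum: Brune omega-squared
    source, 1/R geometric spreading, and whole-path anelastic attenuation."""
    # Brune corner frequency: fc = 4.9e6 * beta * (dsigma / M0)^(1/3)
    fc = 4.9e6 * beta_kms * (stress_bar / m0_dyn_cm) ** (1 / 3)
    source = (2 * np.pi * f) ** 2 * m0_dyn_cm / (1 + (f / fc) ** 2)
    q = 140.0 * f ** 1.018                     # Q(f) from the study
    path = np.exp(-np.pi * f * r_km / (q * beta_kms)) / r_km
    return source * path

f = np.array([0.5, 1.0, 5.0, 10.0])
print(fourier_spectrum(f))
```

In the full method this spectrum shapes windowed Gaussian noise, from which time histories and response spectra are computed.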

  10. The HayWired earthquake scenario—Earthquake hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-01-01

The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  11. After the damages: Lessons learned from recent earthquakes for ground-motion prediction and seismic hazard assessment (C.F. Gauss Lecture)

    NASA Astrophysics Data System (ADS)

    Cotton, Fabrice

    2017-04-01

Recent damaging earthquakes (e.g., Japan 2011, Nepal 2015, Italy 2016) and the associated ground-shaking (ground-motion) records challenge the engineering models used to quantify seismic hazard. The goal of this presentation is to present the lessons learned from these recent events and to discuss their implications for ground-motion prediction and probabilistic seismic hazard assessment. The following points will be particularly addressed: 1) Recent observations clearly illustrate the dependency of ground shaking on earthquake source factors (e.g., fault properties and geometry, earthquake depth, directivity). The weaknesses of classical models and the impact of these factors on hazard evaluation will be analysed and quantified. 2) These observations also show that events of similar magnitude and style of faulting produce ground motions that are highly variable. We will analyse this variability and show that the exponential growth of recorded data gives a unique opportunity to quantify regional and between-event shaking variations. Indeed, most seismic-hazard evaluations do not consider the regional specificities of earthquake or wave-propagation properties. There is little guidance in the literature on how this should be done, and we will show that this challenge is interdisciplinary, as structural geology, neotectonics, and tomographic images can provide key understanding of these regional variations. 3) One of the key lessons of recent earthquakes is that extreme hazard scenarios and ground shaking are difficult to predict. In other words, we need to mobilize "scientific imagination" and define new strategies based on the latest research results to capture epistemic uncertainties and integrate them in engineering seismology projects. We will discuss these strategies and show an example of their implementation in developing new seismic hazard maps of Europe (the SHARE and SERA European projects) and Germany.

  12. Preliminary Earthquake Hazard Map of Afghanistan

    USGS Publications Warehouse

    Boyd, Oliver S.; Mueller, Charles S.; Rukstales, Kenneth S.

    2007-01-01

    Introduction Earthquakes represent a serious threat to the people and institutions of Afghanistan. As part of a United States Agency for International Development (USAID) effort to assess the resource potential and seismic hazards of Afghanistan, the Seismic Hazard Mapping group of the United States Geological Survey (USGS) has prepared a series of probabilistic seismic hazard maps that help quantify the expected frequency and strength of ground shaking nationwide. To construct the maps, we do a complete hazard analysis for each of ~35,000 sites in the study area. We use a probabilistic methodology that accounts for all potential seismic sources and their rates of earthquake activity, and we incorporate modeling uncertainty by using logic trees for source and ground-motion parameters. See the Appendix for an explanation of probabilistic seismic hazard analysis and discussion of seismic risk. Afghanistan occupies a southward-projecting, relatively stable promontory of the Eurasian tectonic plate (Ambraseys and Bilham, 2003; Wheeler and others, 2005). Active plate boundaries, however, surround Afghanistan on the west, south, and east. To the west, the Arabian plate moves northward relative to Eurasia at about 3 cm/yr. The active plate boundary trends northwestward through the Zagros region of southwestern Iran. Deformation is accommodated throughout the territory of Iran; major structures include several north-south-trending, right-lateral strike-slip fault systems in the east and, farther to the north, a series of east-west-trending reverse- and strike-slip faults. This deformation apparently does not cross the border into relatively stable western Afghanistan. In the east, the Indian plate moves northward relative to Eurasia at a rate of about 4 cm/yr. A broad, transpressional plate-boundary zone extends into eastern Afghanistan, trending southwestward from the Hindu Kush in northeast Afghanistan, through Kabul, and along the Afghanistan-Pakistan border

  13. Estimation of fault propagation distance from fold shape: Implications for earthquake hazard assessment

    NASA Astrophysics Data System (ADS)

    Allmendinger, Richard W.; Shaw, John H.

    2000-12-01

    A numerical grid search using the trishear kinematic model can be used to extract both slip and the distance that a fault tip line has propagated during growth of a fault-propagation fold. The propagation distance defines the initial position of the tip line at the onset of slip. In the Santa Fe Springs anticline of the Los Angeles basin, we show that the tip line of the underlying Puente Hills thrust fault initiated at the same position as the 1987 magnitude 6.0 Whittier Narrows earthquake.

  14. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174 Section 120.174 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Policies Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake...

  15. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky and Portsmouth, Ohio

    SciTech Connect

    1997-03-01

Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, the general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering, respectively, and the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites and performed alternative calculations to determine the sensitivity of seismic hazard results to various assumptions and models, in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate values for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah.

  16. Space-time behavior of continental intraplate earthquakes and implications for hazard assessment in China and the Central U.S.

    NASA Astrophysics Data System (ADS)

    Stein, Seth; Liu, Mian; Luo, Gang; Wang, Hui

    2014-05-01

much faster than it accumulates today, suggesting that they result from recent fault activation that releases prestored strain energy in the crust. If so, this earthquake sequence is similar to aftershocks in that the rates of energy release should decay with time and the sequence of earthquakes will eventually end. We use simple physical analysis and numerical simulations to show that the current New Madrid earthquake sequence is likely ending or has ended. Recognizing that mid-continental earthquakes have long aftershock sequences and complex spatiotemporal occurrences is critical to improving hazard assessments.

  17. Quantifying the Seismic Hazard From Natural and Induced Earthquakes (Invited)

    NASA Astrophysics Data System (ADS)

    Rubinstein, J. L.; Llenos, A. L.; Ellsworth, W. L.; McGarr, A.; Michael, A. J.; Mueller, C. S.; Petersen, M. D.

    2013-12-01

    In the past 12 years, seismicity rates in portions of the central and eastern United States (CEUS) have increased. In 2011, the year of peak activity, three M ≥ 5 earthquakes occurred, causing millions of dollars in damage. Much of the increase in seismicity is believed to have been induced by wastewater from oil and gas activity that is injected deep underground. This includes damaging earthquakes in southern Colorado, central Arkansas, and central Oklahoma in 2011. Earthquakes related to oil and gas activities contribute significantly to the total seismic hazard in some areas of the CEUS, but most of the tens of thousands of wastewater disposal wells in the CEUS do not cause damaging earthquakes. The challenge is to better understand this contribution to the hazard in a realistic way for those wells that are inducing earthquakes or wells that may induce earthquakes in the future. We propose a logic-tree approach to estimate the hazard posed by the change in seismicity that deemphasizes the need to evaluate whether the seismicity is natural or man-made. We first compile a list of areas of increased seismicity, including areas of known induced earthquakes. Using areas of increased seismicity (instead of just induced earthquakes) allows us to assess the hazard over a broader region, avoiding the often-difficult task of judging whether an earthquake sequence is induced. With the zones of increased seismicity defined, we then estimate the earthquake hazard for each zone using a four-branch logic tree: (1) The increased seismicity rate is natural, short-term variation within the longer-term background seismicity rate. Thus, these earthquakes would be added to the catalog when computing the background seismicity rate. (2) The increased seismicity rate represents a new and permanent addition to the background seismicity. In this branch, a new background seismicity rate begins at the time of the change in earthquake rate. (3) Induced earthquakes account for the

  18. Fragility analysis of flood protection structures in earthquake and flood prone areas around Cologne, Germany for multi-hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Tyagunov, Sergey; Vorogushyn, Sergiy; Munoz Jimenez, Cristina; Parolai, Stefano; Fleming, Kevin; Merz, Bruno; Zschau, Jochen

    2013-04-01

    The work presents a methodology for fragility analyses of fluvial earthen dikes in earthquake and flood prone areas. Fragility estimates are being integrated into the multi-hazard (earthquake-flood) risk analysis being undertaken within the framework of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) for the city of Cologne, Germany. Scenarios of probable cascading events due to the earthquake-triggered failure of flood protection dikes and the subsequent inundation of surroundings are analyzed for the area between the gauges Andernach and Düsseldorf along the Rhine River. Along this river stretch, urban areas are partly protected by earthen dikes, which may be prone to failure during exceptional floods and/or earthquakes. The seismic fragility of the dikes is considered in terms of liquefaction potential (factor of safety), estimated by the use of the simplified procedure of Seed and Idriss. It is assumed that initiation of liquefaction at any point throughout the earthen dikes' body corresponds to the failure of the dike and, therefore, this should be taken into account for the flood risk calculations. The estimated damage potential of such structures is presented as a two-dimensional surface (as a function of seismic hazard and water level). Uncertainties in geometrical and geotechnical dike parameters are considered within the framework of Monte Carlo simulations. Taking into consideration the spatial configuration of the existing flood protection system within the area under consideration, seismic hazard curves (in terms of PGA) are calculated for sites along the river segment of interest at intervals of 1 km. The obtained estimates are used to calculate the flood risk when considering the temporal coincidence of seismic and flood events. Changes in flood risk for the considered hazard cascade scenarios are quantified and compared to the single-hazard scenarios.
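A hedged sketch of the screening logic described above: the simplified Seed-Idriss procedure compares the cyclic stress ratio, CSR = 0.65 (a_max/g)(σv/σ'v) r_d, against a cyclic resistance ratio CRR, with liquefaction triggering assumed at a factor of safety FS = CRR/CSR below 1, and a Monte Carlo loop propagates parameter uncertainty. All parameter values and distributions below are invented for illustration, not the MATRIX-project inputs.

```python
import random

def factor_of_safety(pga_g, sigma_v, sigma_v_eff, crr, depth_m):
    """FS = CRR / CSR using the simplified Seed-Idriss cyclic stress ratio."""
    rd = 1.0 - 0.00765 * depth_m            # stress-reduction factor (z < 9.15 m)
    csr = 0.65 * pga_g * (sigma_v / sigma_v_eff) * rd
    return crr / csr

random.seed(0)
failures = 0
n = 10_000
for _ in range(n):                           # crude Monte Carlo over uncertainty
    pga = random.uniform(0.10, 0.30)         # seismic load (g), illustrative
    crr = random.gauss(0.18, 0.03)           # cyclic resistance ratio, illustrative
    fs = factor_of_safety(pga, sigma_v=60.0, sigma_v_eff=40.0,
                          crr=crr, depth_m=3.0)
    failures += fs < 1.0                     # triggering assumed at FS < 1
print(f"P(liquefaction | shaking) ~ {failures / n:.2f}")
```

Evaluating this conditional failure probability over a grid of PGA and water levels yields the two-dimensional fragility surface described in the abstract.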

  19. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  20. Spatial variation of seismogenic depths of crustal earthquakes in the Taiwan region: Implications for seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Wu, Wen-Nan; Yen, Yin-Tung; Hsu, Ya-Ju; Wu, Yih-Min; Lin, Jing-Yi; Hsu, Shu-Kun

    2017-06-01

    This paper presents the first Taiwan-wide characterization of spatial variation in the seismogenic zone, based on a high-quality crustal seismicity catalog. The seismicity onset and cutoff depths (i.e., seismogenic depths) are determined from the earthquake depth-moment distribution and used to define the upper and lower boundaries of the seismogenic zone, respectively. Together with published fault geometries and fault area-moment magnitude relations, the difference between the onset and cutoff depths (i.e., the seismogenic thickness) is used as the fault width to determine the moment magnitudes of potential earthquakes on the major seismogenic faults. Results show that the largest (Mw 7.9-8.0) potential earthquake may occur along the Changhua fault in western Taiwan, where the seismic risk is relatively high and seismic hazard mitigation should be a matter of urgent concern. In addition, the first-motion focal mechanism catalog is used to examine the relation between the seismogenic depths and earthquake source parameters. For crustal earthquakes (depth ≤ 50 km), the shallowest onset and cutoff depths are observed for normal and strike-slip events, respectively. This observation differs from the prediction of the conventional continental-rheology model, which states that thrust events have the shallowest cutoff depth; a more sophisticated rheology model is therefore necessary to explain the observed dependence of seismogenic depths on faulting type. Meanwhile, for intermediate to large crustal earthquakes (Mw ≥ 4; depth ≤ 50 km), thrust events tend to occur at the bottom of the seismogenic zone, whereas normal and strike-slip events are distributed over a wide depth range.
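
Fault area-moment magnitude relations of the kind used above map a rupture area, obtained from fault length and seismogenic thickness, to a potential magnitude. A sketch using the widely cited Wells and Coppersmith (1994) all-slip-type regression (the fault dimensions below are illustrative, not the paper's values):

```python
import math

def rupture_area(length_km, seismogenic_thickness_km, dip_deg):
    """Down-dip width from seismogenic thickness and dip; area = length * width."""
    width = seismogenic_thickness_km / math.sin(math.radians(dip_deg))
    return length_km * width

def mw_from_rupture_area(area_km2):
    """Wells & Coppersmith (1994), all slip types: Mw = 4.07 + 0.98 * log10(A)."""
    return 4.07 + 0.98 * math.log10(area_km2)

# e.g. an ~80 km fault, 15 km seismogenic thickness, 30-degree dip (illustrative)
a = rupture_area(80.0, 15.0, 30.0)
print(round(mw_from_rupture_area(a), 1))  # → 7.4
```

A thicker seismogenic zone widens the available rupture plane, which is why mapped seismogenic thickness feeds directly into the maximum-magnitude estimate.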

  1. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy on countermeasures against earthquake disasters, including earthquake hazard evaluation. Before March 11, 2011, Japanese earthquake hazard evaluation was based on the histories of repeating earthquakes and the characteristic earthquake model: the source region of an earthquake was identified, its occurrence history was established, and the conditional probability was estimated using a renewal model. After the megathrust earthquake, however, the authorities changed the policy so that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. Under this policy, three important reports were issued within two years. First, the Central Disaster Management Council (CDMC) issued a new estimate of damage from a hypothetical Mw 9 earthquake along the Nankai trough during 2011 and 2012. The model predicts, at maximum, a 34-m-high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale over most of southwest Japan. Next, in May 2013 the Earthquake Research Council revised the long-term hazard evaluation of earthquakes along the Nankai trough, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes; the so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, given the diversity of earthquake phenomena and current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough with present techniques. These reports created a sensation throughout the country, and local governments are struggling to prepare countermeasures. The reports note large uncertainties in their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including the authors, are involved in

  2. Simulation-Based Hazard Assessment for Long-Period Ground Motions of the Nankai Trough Megathrust Earthquake

    NASA Astrophysics Data System (ADS)

    Maeda, T.; Morikawa, N.; Iwaki, A.; Aoi, S.; Fujiwara, H.

    2014-12-01

    We evaluate the long-period ground motion hazard for Nankai Trough earthquakes (M8~9) in western Japan. Past large earthquakes in the Nankai Trough, which have recurred at intervals of 100~200 years, showed various occurrence patterns and caused serious damage from strong ground motion and tsunami. Such large interplate earthquakes can also cause damage from long-period ground motion, even in distant basins. For evaluating the long-period ground motion of large earthquakes, it is important to take into account the uncertainty of the source model and the effect of 3-D underground structure. In this study, we evaluate long-period ground motion by the finite difference method (FDM) using "characterized source models" and a 3-D underground structure model. We construct various characterized source models (369 scenarios). Although most parameters of the models are determined based on the "recipe" for predicting strong ground motion, we vary possible source parameters including rupture area, asperity configuration, and hypocenter location. To run the large-scale simulation for many source models, we apply a 3-D FDM scheme using discontinuous grids and utilize GPGPU computing. We use the GMS (Ground Motion Simulator) system for the FD simulation. The grid spacing for the shallow region is 200 m horizontally and 100 m vertically; the grid spacing for the deep region is three times coarser. The total number of grid points is about 3.2 billion, roughly one-eighth of the number required with uniform grids. We use GMS adapted for multi-GPU simulation on the TSUBAME supercomputer operated by the Tokyo Institute of Technology. Simulated peak ground velocity (PGV) and velocity response spectra (Sv) are strongly affected by the hypocenter location and vary by up to a factor of 10 at a given site, even among scenarios that share the same source area. We evaluate hazard curves and maps for PGV and Sv using the
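
With a suite of scenario simulations in hand, a hazard curve at a site can be approximated empirically as the fraction of scenarios whose ground motion exceeds each threshold. A minimal sketch with hypothetical PGV values (a full analysis would additionally weight each scenario by its occurrence rate):

```python
def hazard_curve(simulated_pgv, thresholds):
    """Empirical exceedance fraction over a suite of scenario simulations."""
    n = len(simulated_pgv)
    return [sum(1 for v in simulated_pgv if v > t) / n for t in thresholds]

# Hypothetical PGV values (cm/s) at one site from eight scenarios
pgv = [12.0, 35.0, 8.0, 50.0, 22.0, 90.0, 15.0, 40.0]
print(hazard_curve(pgv, [10.0, 30.0, 60.0]))  # → [0.875, 0.5, 0.125]
```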

  3. Earthquake hazards on the cascadia subduction zone

    SciTech Connect

    Heaton, T.H.; Hartzell, S.H.

    1987-04-10

    Large subduction earthquakes on the Cascadia subduction zone pose a potential seismic hazard. Very young oceanic lithosphere (10 million years old) is being subducted beneath North America at a rate of approximately 4 centimeters per year. The Cascadia subduction zone shares many characteristics with subduction zones in southern Chile, southwestern Japan, and Colombia, where comparably young oceanic lithosphere is also subducting. Very large subduction earthquakes, ranging in energy magnitude (Mw) between 8 and 9.5, have occurred along these other subduction zones. If the Cascadia subduction zone is also storing elastic energy, a sequence of several great earthquakes (Mw 8) or a giant earthquake (Mw 9) would be necessary to fill this 1200-kilometer gap. The nature of strong ground motions recorded during subduction earthquakes of Mw less than 8.2 is discussed. Strong ground motions from even larger earthquakes (Mw up to 9.5) are estimated by simple simulations. If large subduction earthquakes occur in the Pacific Northwest, relatively strong shaking can be expected over a large region. Such earthquakes may also be accompanied by large local tsunamis. 35 references, 6 figures.

  4. Earthquake hazards on the cascadia subduction zone.

    PubMed

    Heaton, T H; Hartzell, S H

    1987-04-10

    Large subduction earthquakes on the Cascadia subduction zone pose a potential seismic hazard. Very young oceanic lithosphere (10 million years old) is being subducted beneath North America at a rate of approximately 4 centimeters per year. The Cascadia subduction zone shares many characteristics with subduction zones in southern Chile, southwestern Japan, and Colombia, where comparably young oceanic lithosphere is also subducting. Very large subduction earthquakes, ranging in energy magnitude (Mw) between 8 and 9.5, have occurred along these other subduction zones. If the Cascadia subduction zone is also storing elastic energy, a sequence of several great earthquakes (Mw 8) or a giant earthquake (Mw 9) would be necessary to fill this 1200-kilometer gap. The nature of strong ground motions recorded during subduction earthquakes of Mw less than 8.2 is discussed. Strong ground motions from even larger earthquakes (Mw up to 9.5) are estimated by simple simulations. If large subduction earthquakes occur in the Pacific Northwest, relatively strong shaking can be expected over a large region. Such earthquakes may also be accompanied by large local tsunamis.

  5. 76 FR 64325 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... Directive/PPD-8: National Preparedness to National Earthquake Hazards Reduction Program (NEHRP) activities...

  6. 77 FR 18792 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100 Bureau...

  7. 76 FR 72905 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... should be sent to National Earthquake Hazards Reduction Program Director, National Institute of Standards...

  8. 77 FR 19224 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100 Bureau...

  9. 75 FR 8042 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-23

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a.... Jack Hayes, National Earthquake Hazards Reduction Program Director, National Institute of Standards and...

  10. 77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... needs for existing buildings, to review the National Earthquake Hazards Reduction Program (NEHRP) agency...

  11. 75 FR 18787 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100 Bureau...

  12. 77 FR 27439 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100 Bureau...

  13. 75 FR 75457 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-03

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100 Bureau...

  14. 76 FR 8712 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... Effectiveness of the National Earthquake Hazards Reduction Program (NEHRP). The agenda may change to accommodate...

  15. Continued Earthquake Hazard in Northern Sumatra

    NASA Astrophysics Data System (ADS)

    Sørensen, Mathilde B.; Atakan, Kuvvet

    2008-04-01

    The occurrence of two large earthquakes (Mw = 8.4 and Mw = 7.9) along the Sumatran west coast on 12 September 2007 as well as an Mw = 7.4 event on 20 February 2008 have again put the high earthquake hazard of this region into focus. These events are the most recent in a series of major subduction zone earthquakes that began with the great Mw = 9.3 event of 26 December 2004 followed by an Mw = 8.7 event on 28 March 2005 [Bilham, 2005; Lay et al., 2005; Stein and Okal, 2005]. The major subduction zone earthquakes have been propagating southward along the Sunda trench, and the remaining stress is expected to be released along the subduction zone in a long stretch from the Andaman Sea in the north to the southernmost extension of the recent ruptures, especially in the southernmost part close to the Sunda Strait (Figure 1). Also, there is an additional and significant hazard due to potential earthquakes along the Great Sumatran Fault (GSF), a major right-lateral strike-slip fault parallel to the western coast of Sumatra. The GSF accommodates the component of plate convergence parallel to the trench, where strain partitioning is a result of the oblique collision along the Sunda trench.

  16. Tsunami Hazards From Strike-Slip Earthquakes

    NASA Astrophysics Data System (ADS)

    Legg, M. R.; Borrero, J. C.; Synolakis, C. E.

    2003-12-01

    Strike-slip faulting is often considered unfavorable for tsunami generation during large earthquakes. Although large strike-slip earthquakes that trigger landslides and thereby generate substantial tsunamis are now recognized hazards, many continue to ignore the threat from submarine tectonic displacement during strike-slip earthquakes. Historical data record the occurrence of tsunamis from strike-slip earthquakes, for example, 1906 San Francisco, California; 1994 Mindoro, Philippines; and 1999 Izmit, Turkey. Recognizing that strike-slip fault zones are often curved and comprise numerous en echelon step-overs, we model tsunami generation from realistic strike-slip faulting scenarios. We find that tectonic seafloor uplift at a restraining bend or "pop-up" structure provides an efficient mechanism for generating destructive local tsunamis; likewise for subsidence at divergent pull-apart basin structures. Large earthquakes on complex strike-slip fault systems may involve both types of structures. The California Continental Borderland is a high-relief submarine part of the active Pacific-North America transform plate boundary. Natural harbors and bays created by long-term vertical motion associated with strike-slip structural irregularities are now sites of burgeoning population and major coastal infrastructure. Significant local tsunamis generated by large strike-slip earthquakes pose a serious and previously unrecognized threat. We model several restraining-bend pop-up structures offshore southern California to quantify the local tsunami hazard. Maximum runup derived in our scenarios ranges from one to several meters, similar to runup observed from the 1994 Mindoro, Philippines (M=7.1), earthquake. The runup pattern is highly variable, with local extremes along the coast. We model only the static displacement field for the strike-slip earthquake source; the dynamic effects of moving large islands or submerged banks laterally during strike-slip events remain to be examined.

  17. The 1909 Taipei earthquake: implication for seismic hazard in Taipei

    USGS Publications Warehouse

    Kanamori, Hiroo; Lee, William H.K.; Ma, Kuo-Fong

    2012-01-01

    The 1909 April 14 Taiwan earthquake caused significant damage in Taipei. Most of the information on this earthquake available until now comes from written reports on its macroseismic effects and from seismic station bulletins. In view of the importance of this event for assessing the shaking hazard in present-day Taipei, we collected historical seismograms and station bulletins of this event and investigated them in conjunction with other seismological data. We compared the observed seismograms with those from recent earthquakes in similar tectonic environments to characterize the 1909 earthquake. Despite the inevitably large uncertainties associated with old data, we conclude that the 1909 Taipei earthquake was a relatively deep (50–100 km) intraplate earthquake that occurred within the subducting Philippine Sea Plate beneath Taipei, with an estimated M_W of 7 ± 0.3. Some intraplate events elsewhere in the world are enriched in high-frequency energy, and the resulting ground motions can be very strong. Thus, despite its relatively large depth and only moderately large magnitude, it would be prudent to review the safety of existing structures in Taipei against large intraplate earthquakes like the 1909 Taipei earthquake.

  18. Increased Earthquake Rates in the Central and Eastern US Portend Higher Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Rubinstein, J. L.; Ellsworth, W. L.; Mueller, C. S.; Michael, A. J.; McGarr, A.; Petersen, M. D.; Weingarten, M.; Holland, A. A.

    2014-12-01

    Since 2009 the central and eastern United States has experienced an unprecedented increase in the rate of M≥3 earthquakes that is unlikely to be due to natural variation. Where the rates have increased, so has the seismic hazard, making it important to understand these changes. Areas with significant seismicity increases are limited to areas where oil and gas production takes place. By far the largest contributor to the seismicity increase is Oklahoma, where recent studies suggest that these rate changes may be due to fluid injection (e.g., Keranen et al., Geology, 2013; Science, 2014). Moreover, the area of increased seismicity in northern Oklahoma that began in 2013 coincides with the Mississippi Lime play, where well completions greatly increased the year before the seismicity increase. This suggests a link to oil and gas production, either directly or through the disposal of significant amounts of produced water within the play. For the purpose of assessing the hazard due to these earthquakes, should they be treated differently from natural earthquakes? Previous studies suggest that induced seismicity may differ from natural seismicity in clustering characteristics or frequency-magnitude distributions (e.g., Bachmann et al., GJI, 2011; Llenos and Michael, BSSA, 2013). These differences could affect time-independent hazard computations, which typically assume that clustering and size distribution remain constant. In Oklahoma, as well as in other areas of suspected induced seismicity, we find that earthquakes since 2009 tend to be considerably more clustered in space and time than before 2009. However, differences between various regional and national catalogs leave it unclear whether there are significant changes in magnitude distribution. Whether they are due to natural or industrial causes, the increased earthquake rates in these areas could increase the hazard in ways that are not accounted for in current hazard assessment practice. Clearly the possibility of induced
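
One way to test whether the frequency-magnitude distribution has changed, as discussed above, is to compare Gutenberg-Richter b-values between catalog subsets, e.g. before and after 2009. A sketch using Aki's (1965) maximum-likelihood estimator (the magnitudes below are hypothetical, not catalog data):

```python
import math

def aki_b_value(mags, m_min):
    """Aki (1965) maximum-likelihood Gutenberg-Richter b-value for M >= m_min."""
    sample = [m for m in mags if m >= m_min]
    mean_m = sum(sample) / len(sample)
    return math.log10(math.e) / (mean_m - m_min)

# Hypothetical magnitudes above a completeness threshold of M3.0;
# mean - m_min = 0.5 gives b ≈ 0.87
print(round(aki_b_value([3.2, 3.4, 3.6, 3.8], 3.0), 2))  # → 0.87
```

Comparing b-values estimated this way for pre- and post-2009 subsets (with their uncertainties) is one standard check; catalog completeness differences, as the abstract notes, can confound the comparison.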

  19. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale

    NASA Astrophysics Data System (ADS)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.

    2016-05-01

    The main objective of this paper was to introduce the Environmental Seismic Intensity (ESI) scale, a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which remains a critical parameter for realistic seismic hazard assessment, by allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents case studies from different continents and tectonic settings to illustrate its benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees; in such cases, and in unpopulated areas, ESI offers a unique means of assessing a reliable earthquake intensity. Finally, and importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  20. Assessing earthquake hazards with fault trench and LiDAR maps in the Puget Lowland, Washington, USA (Invited)

    NASA Astrophysics Data System (ADS)

    Nelson, A. R.; Bradley, L.; Personius, S. F.; Johnson, S. Y.

    2010-12-01

    Deciphering the earthquake histories of faults over the past few thousand years in tectonically complex forearc regions relies on detailed site-specific as well as regional geologic maps. Here we present examples of site-specific USGS maps used to reconstruct earthquake histories for faults in the Puget Lowland. Near-surface faults and folds in the Puget Lowland accommodate 4-7 mm/yr of north-south shortening resulting from northward migration of forearc blocks along the Cascadia convergent margin. The shortening has produced east-trending uplifts, basins, and associated reverse faults that traverse urban areas. Near the eastern and northern flanks of the Olympic Mountains, complex interactions between north-south shortening and mountain uplift are reflected by normal, oblique-slip, and reverse surface faults. Holocene oblique-slip movement has also been mapped on Whidbey Island and on faults in the foothills of the Cascade Mountains in the northeastern lowland. The close proximity of lowland faults to urban areas may pose a greater earthquake hazard there than do much longer but more distant plate-boundary faults. LiDAR imagery of the densely forested lowland, acquired over the past 12 years, has revealed many previously unknown 0.5-m to 6-m-high scarps recording Holocene movement on upper-plate faults. LiDAR uses two-way travel times of laser light pulses to detect as little as 0.2 m of relative relief on the forest floor; the returns with the longest travel times yield digital elevation models of the ground surface, which we vertically exaggerate and digitally shade from multiple directions at variable transparencies to enhance identification of scarps. Our maps include imagery at scales of 1:40,000 to 1:2500 with contour spacings of 100 m to 0.5 m. Maps of the vertical walls of fault-scarp trenches show complex stratigraphies and structural relations used to decipher the histories of large surface-rupturing earthquakes.
These logs (field mapping
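
The two-way travel-time geometry behind the LiDAR elevation models mentioned above is simple range-finding: range = c·t/2. A minimal sketch of the conversion (note that resolving 0.2 m of relative relief, as cited in the abstract, implies distinguishing nanosecond-scale time differences):

```python
C = 299_792_458.0  # speed of light (m/s); vacuum value used as an approximation

def range_from_two_way_time(t_seconds):
    """One-way range from a two-way LiDAR pulse travel time: r = c * t / 2."""
    return C * t_seconds / 2.0

def two_way_time_from_relief(dz_m):
    """Two-way time difference corresponding to a relief difference dz."""
    return 2.0 * dz_m / C

# 0.2 m of relative relief corresponds to roughly 1.3 nanoseconds of two-way time
print(two_way_time_from_relief(0.2))
```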

  1. Structural geometry of the source region for the 2013 Mw 6.6 Lushan earthquake: Implication for earthquake hazard assessment along the Longmen Shan

    NASA Astrophysics Data System (ADS)

    Li, Yiquan; Jia, Dong; Wang, Maomao; Shaw, John H.; He, Jiankun; Lin, Aiming; Xiong, Lin; Rao, Gang

    2014-03-01

    The 2013 Mw 6.6 Lushan earthquake occurred in the Longmen Shan fold-and-thrust belt, Sichuan Province, China, near the five-year anniversary of the devastating 2008 Mw 7.8 Wenchuan earthquake. To define the fault that generated the 2013 earthquake and its relationship with the Beichuan fault, which ruptured in the Wenchuan earthquake, we construct several cross sections and a 3D structural model. The sections and models reveal that the mainshock of the Lushan earthquake occurred on a portion of the Range Front blind thrust (RFBT) and that the structural geometry of this fault varies along strike. The Lushan mainshock occurred at a location along strike where the geologic shortening and total fault slip are greatest. A lateral ramp of the RFBT appears to coincide with the northern limit of aftershocks of the Lushan earthquake, leaving a 75-km seismic gap between the Wenchuan earthquake and the 2013 earthquake sequence. Although both the Wenchuan and Lushan earthquakes occurred within the Longmen Shan fold-and-thrust belt, different faults generated the two events. Based on this structural characterization and analysis of the aftershocks of the Wenchuan and Lushan earthquakes, we suggest that the Lushan earthquake may have been triggered by the 2008 rupture but is best considered an independent event rather than an aftershock of the Wenchuan earthquake. The RFBT that generated the Lushan earthquake is linked to a detachment that extends into the Sichuan basin along a Triassic evaporite layer. Coulomb stress-change simulations suggest that other faults linked to this detachment may have been loaded by the 2008 and 2013 earthquakes, posing the risk of future earthquakes along the Longmen Shan and in the densely populated Sichuan basin.
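
The Coulomb stress-change calculation referred to above evaluates ΔCFF = Δτ + μ′Δσn resolved on a receiver fault, where positive values load the fault toward failure. A minimal sketch of the failure criterion itself (the stress values and the effective friction coefficient of 0.4 are common illustrative assumptions, not values from the paper; computing Δτ and Δσn from a slip model is the hard part, typically done with an elastic dislocation code):

```python
def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
    """ΔCFF = Δτ + μ' * Δσn, with Δσn positive in the unclamping (tension) convention.
    Positive ΔCFF moves the receiver fault toward failure."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# e.g. 0.1 MPa of shear-stress increase combined with 0.05 MPa of unclamping
print(round(coulomb_stress_change(0.1, 0.05), 3))  # → 0.12
```

Stress changes of order 0.01 MPa (0.1 bar) are often cited as sufficient to advance or delay earthquakes on favorably oriented faults, which is why detachment-linked faults loaded by both 2008 and 2013 are flagged as a concern.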

  2. Earthquakes Pose a Serious Hazard in Afghanistan

    USGS Publications Warehouse

    Crone, Anthony J.

    2007-01-01

    This report is USGS Afghanistan Project No. 155. This study was funded by an Interagency Agreement between the U.S. Agency for International Development (USAID) and the U.S. Geological Survey. Afghanistan is located in a geologically active part of the world, where the northward-moving Indian plate is colliding with the southern part of the Eurasian plate at a rate of about 1.7 inches per year. This collision has created the world's highest mountains and drives slip on major faults, generating large, often devastating earthquakes. Every few years a powerful earthquake causes significant damage or fatalities. New construction needs to be designed to accommodate the hazards posed by strong earthquakes. The U.S. Geological Survey has developed a preliminary seismic-hazard map of Afghanistan. Although the map is generalized, it provides government officials, engineers, and private companies interested in participating in Afghanistan's growth with crucial information about the location and nature of seismic hazards.

  3. 77 FR 75610 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... Director. Any draft meeting materials will be posted prior to the meeting on the National Earthquake...

  4. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-17

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet....m. The primary purpose of this meeting is to receive information on NEHRP earthquake related...

  5. 78 FR 8109 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... Director. Any draft meeting materials will be posted prior to the meeting on the National Earthquake...

  6. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    NASA Astrophysics Data System (ADS)

    Takarada, S.

    2012-12-01

    The first Workshop of the Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) project was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on formulating strategies to reduce the risks of disasters worldwide caused by earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop, and the G-EVER1 accord was approved by the participants. The accord consists of 10 recommendations, such as enhancing collaboration, sharing resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER working groups and task forces were proposed; one working group was tasked with developing a next-generation real-time volcano hazard assessment system. Such a system would be useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages, and is planned to be developed from volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions, as is compiling eruption scenarios after a major eruption. A high-quality volcanic eruption database, containing compilations of eruption dates, volumes, and styles, is also essential for the next-generation volcano hazard assessment system. The volcanic eruption database is developed from past eruption records, which represent only a subset of possible future scenarios.
Hence, different distributions from the previous deposits are mainly observed due to the differences in

  7. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L.

    1993-01-01

INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-year program of earthquake hazard reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research, to increase the probability that research will be relevant to users' needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for, and transferred to, nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation (likelihood of occurrence, location, and severity of potential hazards) and the three elements needed for effective transfer (delivery, assistance, and encouragement) are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decision makers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and

  8. Earthquake Hazard and Risk in Alaska

    NASA Astrophysics Data System (ADS)

    Black Porto, N.; Nyst, M.

    2014-12-01

Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model for Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model updates several key source parameters, including: extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database; this increases the number of crustal faults from ten in the 2007 model to 91 in the 2015 model, including the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth; in this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent occurrence of magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as strongly to the overall risk. We review these recurrence rates and present the results and their impact on Anchorage. We compare our hazard update to the 2007 USGS hazard map and discuss the changes and their drivers. Finally, we examine the impact the model changes have on Alaska earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the
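The abstract above contrasts Gutenberg-Richter and characteristic recurrence. As a minimal sketch (not the RMS model), the annual rate of events above magnitude M under a Gutenberg-Richter relation, with the b-value estimated by Aki's maximum-likelihood formula from a (here synthetic) declustered catalog, might look like:

```python
import math
import random

def gutenberg_richter_fit(mags, m_min, years):
    """Fit log10 N(>=M) = a - b*M above a completeness magnitude m_min.

    b comes from Aki's maximum-likelihood formula; the annual a-value then
    follows from the total count above m_min divided by the catalog duration.
    """
    mags = [m for m in mags if m >= m_min]
    mean_m = sum(mags) / len(mags)
    b = math.log10(math.e) / (mean_m - m_min)      # Aki (1965) MLE
    a = math.log10(len(mags) / years) + b * m_min  # annual a-value
    return a, b

def annual_rate(a, b, m):
    """Annual rate of events with magnitude >= m under the G-R relation."""
    return 10 ** (a - b * m)

# Synthetic catalog: 1000 magnitudes drawn from a G-R distribution with b = 1
random.seed(0)
cat = [4.0 + random.expovariate(math.log(10)) for _ in range(1000)]
a, b = gutenberg_richter_fit(cat, m_min=4.0, years=50.0)
print(f"a = {a:.2f}, b = {b:.2f}, N(M>=7)/yr = {annual_rate(a, b, 7.0):.4f}")
```

The fitted b should recover the value (1.0) used to generate the synthetic catalog; with real data the catalog would first be declustered, as the abstract describes.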

  9. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
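The PSHA approach debated in this abstract sums, over all source magnitudes, the annual rate of each event times the probability that it exceeds a given ground-motion level. A toy sketch of one point on a hazard curve, using a made-up ground-motion relation rather than a published GMPE:

```python
import math

def p_exceed_lognormal(x, median, sigma_ln):
    """P(PGA > x) given lognormal ground-motion scatter (complementary CDF)."""
    z = (math.log(x) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2))

def hazard_curve_point(x, r_km, mags, rates, sigma_ln=0.6):
    """Annual exceedance rate of PGA level x (in g) at one site.

    Sums over discrete magnitude bins of a single source at distance r_km.
    The ground-motion model is an invented placeholder, NOT a published GMPE:
        median PGA [g] = exp(-3.5 + 0.9*M - 1.1*ln(r + 10))
    """
    lam = 0.0
    for m, nu in zip(mags, rates):
        median = math.exp(-3.5 + 0.9 * m - 1.1 * math.log(r_km + 10.0))
        lam += nu * p_exceed_lognormal(x, median, sigma_ln)
    return lam

# Incremental annual rates for M 5.0..7.5 in 0.5-unit bins (G-R with a=4, b=1)
mags = [5.0 + 0.5 * i for i in range(6)]
rates = [10 ** (4 - m) - 10 ** (4 - (m + 0.5)) for m in mags]
lam = hazard_curve_point(0.2, r_km=30.0, mags=mags, rates=rates)
print(f"annual rate of PGA > 0.2 g: {lam:.5f} (return period ~{1 / lam:.0f} yr)")
```

A DSHA estimate, by contrast, would simply take the largest credible magnitude at the closest distance and report its (median or median-plus-sigma) ground motion, with no occurrence rate attached.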

  10. Assessment of tsunami hazard to the U.S. East Coast using relationships between submarine landslides and earthquakes

    USGS Publications Warehouse

    ten Brink, U.S.; Lee, H.J.; Geist, E.L.; Twichell, D.

    2009-01-01

Submarine landslides along the continental slope of the U.S. Atlantic margin are potential sources for tsunamis along the U.S. East coast. The magnitude of potential tsunamis depends on the volume and location of the landslides, and tsunami frequency depends on their recurrence interval. However, the size and recurrence interval of submarine landslides along the U.S. Atlantic margin are poorly known. Well-studied landslide-generated tsunamis in other parts of the world have been shown to be associated with earthquakes. Because the size distribution and recurrence interval of earthquakes are generally better known than those of submarine landslides, we propose here to estimate the size and recurrence interval of submarine landslides from the size and recurrence interval of earthquakes in the vicinity of the landslides. To do so, we calculate the maximum expected landslide size for a given earthquake magnitude, use the recurrence interval of earthquakes to estimate the recurrence interval of landslides, and assume a threshold landslide size that can generate a destructive tsunami. The maximum expected landslide size for a given earthquake magnitude is calculated in three ways: by slope stability analysis for catastrophic slope failure on the Atlantic continental margin, by using a land-based compilation of maximum observed distances from earthquake to liquefaction, and by using a land-based compilation of maximum observed areas of earthquake-induced landslides. We find that the calculated distances and failure areas from the slope stability analysis are similar to, or slightly smaller than, the maximum triggering distances and failure areas in subaerial observations. The results from all three methods compare well with the slope failure observations of the Mw = 7.2, 1929 Grand Banks earthquake, the only historical tsunamigenic earthquake along the North American Atlantic margin. The results further suggest that a Mw = 7.5 earthquake (the largest expected earthquake in the eastern U

  11. Earthquake Hazard for Aswan High Dam Area

    NASA Astrophysics Data System (ADS)

    Ismail, Awad

    2016-04-01

Earthquake activity and seismic hazard analysis are important components of the seismic assessment of critical structures such as major dams. The Aswan High Dam (AHD) impounds the second-largest man-made reservoir in the world (Lake Nasser) and is constructed near urban areas, posing a high risk potential for downstream life and property. The dam area is one of the seismically active regions in Egypt and is cut by several cross faults, dominantly trending east-west and north-south. Epicenters were found to cluster around active faults in the northern part of the lake and near the AHD. The space-time distribution of the seismicity and its relation to lake water-level fluctuations were studied. The Aswan seismicity separates into shallow (0-14 km) and deep (14-30 km) seismic zones. These two zones behave differently over time, as indicated by the seismicity rate, lateral extent, b-value, and spatial clustering. The activity is characterized by earthquake swarm sequences showing clustering of events over time and space. The effect of the North African drought (1982 to present) is clearly seen in the reservoir water level: as the level decreased and left the most active fault segments uncovered, the shallow activity was found to be more sensitive to rapid discharge than to filling. This study indicates that geology, topography, lineations in seismicity, offsets in the faults, changes in fault trends, and focal mechanisms are closely related. No relation was found between earthquake activity and either ground-water table fluctuations or water temperatures measured in wells located around the Kalabsha area. The peak ground acceleration at the dam site is estimated based on strong ground motion simulation. These seismic hazard analyses indicate that AHD is stable under the present seismicity. Earthquake epicenters have recently occurred approximately 5 km west of the AHD structure. 
This suggests that AHD dam must be

  12. Earthquake Hazard and Risk in New Zealand

    NASA Astrophysics Data System (ADS)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

To quantify risk in New Zealand, we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), updates several key source parameters. These updates include: implementation of a new set of crustal faults, including multi-segment ruptures; updated subduction zone geometry and recurrence rate; and new background rates with a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model; the 2012 model now includes over 500 individual fault sources, including the addition of many offshore faults in the northern, east-central, and southwest regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard in our updated model with that from the previous version, and discuss the changes between the two maps as well as their drivers. We examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city holding the majority of the country's risk exposure (Auckland) lies in the region of lowest hazard, where little is known about the location of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. 
Thus small changes to the background rates

  13. Earthquake hazard after a mainshock in California.

    PubMed

    Reasenberg, P A; Jones, L M

    1989-03-03

    After a strong earthquake, the possibility of the occurrence of either significant aftershocks or an even stronger mainshock is a continuing hazard that threatens the resumption of critical services and reoccupation of essential but partially damaged structures. A stochastic parametric model allows determination of probabilities for aftershocks and larger mainshocks during intervals following the mainshock. The probabilities depend strongly on the model parameters, which are estimated with Bayesian statistics from both the ongoing aftershock sequence and from a suite of historic California aftershock sequences. Probabilities for damaging aftershocks and greater mainshocks are typically well-constrained after the first day of the sequence, with accuracy increasing with time.
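The stochastic parametric model this abstract describes combines a Gutenberg-Richter magnitude distribution with Omori-law temporal decay. A sketch of that combination follows; the generic California parameter values are quoted from memory and should be treated as indicative rather than authoritative:

```python
import math

# Generic California aftershock parameters (indicative values in the style of
# Reasenberg & Jones, 1989): rate a-value, G-R b, Omori p, Omori c in days.
A, B, P, C = -1.67, 0.91, 1.08, 0.05

def aftershock_probability(mainshock_m, m_threshold, t1, t2):
    """P(at least one aftershock with M >= m_threshold in days [t1, t2]).

    The aftershock rate is lambda(t) = 10**(A + B*(Mm - M)) * (t + C)**(-P);
    integrating it gives the expected count N, and a Poisson assumption
    converts that to a probability: 1 - exp(-N).
    """
    k = 10 ** (A + B * (mainshock_m - m_threshold))
    integral = ((t2 + C) ** (1 - P) - (t1 + C) ** (1 - P)) / (1 - P)
    return 1.0 - math.exp(-k * integral)

# Probability of an M >= 6 aftershock in the first week after an M 7 mainshock
p = aftershock_probability(7.0, 6.0, 0.0, 7.0)
print(f"P(M>=6 aftershock in first 7 days) = {p:.2f}")
```

The paper's Bayesian step, which this sketch omits, updates these generic parameters from the ongoing sequence itself, which is why the probabilities become well constrained after the first day.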

  14. Late Holocene liquefaction features in the Dominican Republic: A powerful tool for earthquake hazard assessment in the northeastern Caribbean

    USGS Publications Warehouse

    Tuttle, M.P.; Prentice, C.S.; Dyer-Williams, K.; Pena, L.R.; Burr, G.

    2003-01-01

    Several generations of sand blows and sand dikes, indicative of significant and recurrent liquefaction, are preserved in the late Holocene alluvial deposits of the Cibao Valley in northern Dominican Republic. The Cibao Valley is structurally controlled by the Septentrional fault, an onshore section of the North American-Caribbean strike-slip plate boundary. The Septentrional fault was previously studied in the central part of the valley, where it sinistrally offsets Holocene terrace risers and soil horizons. In the eastern and western parts of the valley, the Septentrional fault is buried by Holocene alluvial deposits, making direct study of the structure difficult. Liquefaction features that formed in these Holocene deposits as a result of strong ground shaking provide a record of earthquakes in these areas. Liquefaction features in the eastern Cibao Valley indicate that at least one historic earthquake, probably the moment magnitude, M 8, 4 August 1946 event, and two to four prehistoric earthquakes of M 7 to 8 struck this area during the past 1100 yr. The prehistoric earthquakes appear to cluster in time and could have resulted from rupture of the central and eastern sections of the Septentrional fault circa A.D. 1200. Liquefaction features in the western Cibao Valley indicate that one historic earthquake, probably the M 8, 7 May 1842 event, and two prehistoric earthquakes of M 7-8 struck this area during the past 1600 yr. Our findings suggest that rupture of the Septentrional fault circa A.D. 1200 may have extended beyond the central Cibao Valley and generated an earthquake of M 8. Additional information regarding the age and size distribution of liquefaction features is needed to reconstruct the prehistoric earthquake history of Hispaniola and to define the long-term behavior and earthquake potential of faults associated with the North American-Caribbean plate boundary.

  15. Update earthquake risk assessment in Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years later, one of the most painful events etched in Egyptians' memory, not because of the strength of the earthquake but because of the accompanying losses and damage (561 dead; 10,000 injured; 3,000 families left homeless). Nowadays, the most frequent and important question is: what if this earthquake were repeated today? In this study, we simulate the ground motion shaking of an earthquake of the same size (12 October 1992) and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the earthquake risk assessment clearly indicates that the losses and damage could be two or three times those of the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk: deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. 
Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety
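A loss-estimation workflow of the HAZUS-MH type chains hazard, inventory, and vulnerability models. The following is only a schematic of that chaining, with invented district names, exposures, and a made-up vulnerability curve, not the HAZUS methodology itself:

```python
import math

def mean_damage_ratio(pga_g, theta=0.45, beta=0.7):
    """Lognormal-CDF vulnerability curve (toy parameters): fraction of
    building value lost at a given peak ground acceleration, in g."""
    z = math.log(pga_g / theta) / beta
    return 0.5 * math.erfc(-z / math.sqrt(2))

# Hypothetical districts: (building exposure in M$, scenario PGA in g)
districts = {
    "District A": (1200.0, 0.30),
    "District B": (800.0, 0.18),
}

for name, (exposure, pga) in districts.items():
    loss = exposure * mean_damage_ratio(pga)
    print(f"{name}: expected loss ~{loss:.0f} M$")
```

Summing such district losses over the building inventory, and applying separate casualty and displacement models, yields the social and economic totals the abstract reports; HAZUS additionally tracks damage states, occupancy classes, and soil amplification, which this sketch omits.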

  16. Update earthquake risk assessment in Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-12-01

The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years later, one of the most painful events etched in Egyptians' memory, not because of the strength of the earthquake but because of the accompanying losses and damage (561 dead; 10,000 injured; 3,000 families left homeless). Nowadays, the most frequent and important question is: what if this earthquake were repeated today? In this study, we simulate the ground motion shaking of an earthquake of the same size (12 October 1992) and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the earthquake risk assessment clearly indicates that the losses and damage could be two or three times those of the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk: deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. 
Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  17. National Earthquake Hazards Reduction Program; time to expand

    USGS Publications Warehouse

    Steinbrugge, K.V.

    1990-01-01

    All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role? 

  18. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    NASA Astrophysics Data System (ADS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-11-01

The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies in an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, and triggered a tsunami. These events caused many casualties and heavy material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of future earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study; collection and processing of seismic data; seismic source characterization; and earthquake hazard analysis by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1.0 s on bedrock, presented as maps for a return period of 2475 years and as earthquake hazard curves for the cities of Medan and Banda Aceh.
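The 2475-year return period quoted in this abstract corresponds, under a Poisson occurrence assumption, to a 2% probability of exceedance in 50 years (and 475 years to the common 10%-in-50-years level). The conversion is a one-liner:

```python
import math

def return_period(p_exceed, t_years):
    """Return period T for exceedance probability p in t years, assuming
    Poisson occurrence: p = 1 - exp(-t/T)  =>  T = -t / ln(1 - p)."""
    return -t_years / math.log1p(-p_exceed)

print(f"10% in 50 yr -> T = {return_period(0.10, 50):.0f} yr")  # ~475 yr
print(f" 2% in 50 yr -> T = {return_period(0.02, 50):.0f} yr")  # ~2475 yr
```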

  19. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 25. Parameters for Specifying Intensity-Related Earthquake Ground Motions.

    DTIC Science & Technology

    1987-09-01

    adjacent to causative faults. … and Sponheuer, W. 1969. Scale of Seismic Intensity: Proc. Fourth World Conf. on Earthquake Engineering, Santiago, Chile. Murphy, J. R., and O'Brien, L

  20. Comprehensive Seismic Monitoring for Emergency Response and Hazards Assessment: Recent Developments at the USGS National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Buland, R. P.; Guy, M.; Kragness, D.; Patton, J.; Erickson, B.; Morrison, M.; Bryon, C.; Ketchum, D.; Benz, H.

    2009-12-01

The USGS National Earthquake Information Center (NEIC) has put into operation a new generation of seismic acquisition, processing, and distribution subsystems that seamlessly integrate regional, national, and global seismic network data for routine monitoring of earthquake activity and response to large, damaging earthquakes. The system, Bulletin Hydra, was designed to meet Advanced National Seismic System (ANSS) design goals to handle thousands of channels of real-time seismic data, compute and distribute time-critical seismic information for emergency response applications, and manage the integration of contributed earthquake products and information, arriving from near-real-time up to six weeks after an event. Bulletin Hydra is able to meet these goals thanks to a modular, scalable, and flexible architecture that supports on-the-fly consumption of new data, readily allows for the addition of new scientific processing modules, and provides distributed client workflow management displays. Through the Edge subsystem, Bulletin Hydra accepts waveforms in half a dozen formats. In addition, Bulletin Hydra accepts contributed seismic information, including hypocenters, magnitudes, moment tensors, unassociated and associated picks, and amplitudes, in a variety of formats including earthworm import/export pairs and EIDS. Bulletin Hydra has state-driven algorithms for computing all IASPEI standard magnitudes (e.g. mb, mb_BB, ML, mb_LG, Ms_20, and Ms_BB) as well as Md and Ms(VMAX); moment tensor algorithms for modeling different portions of the wave-field at different distances (e.g. teleseismic body-wave, centroid, and regional moment tensors); and broadband depth. All contributed and derived data are centrally managed in an Oracle database. To improve on single-station observations, Bulletin Hydra also performs continuous real-time beam forming of high-frequency arrays. Finally, workflow management displays assist NEIC analysts in their day-to-day duties. All combined

  1. Prediction of earthquake hazard by hidden Markov model (around Bilecik, NW Turkey)

    NASA Astrophysics Data System (ADS)

    Can, Ceren Eda; Ergun, Gul; Gokceoglu, Candan

    2014-09-01

Earthquakes are one of the most important natural hazards to be evaluated carefully in engineering projects, due to their severely damaging effects on human life and human-made structures. The hazard of an earthquake is defined by several approaches, from which earthquake parameters such as the peak ground acceleration over the area of interest can be determined. In an earthquake-prone area, identifying seismicity patterns is an important task for assessing seismic activity and evaluating the risk of damage and loss from an earthquake occurrence. As a powerful and flexible framework for characterizing temporal changes in seismicity and revealing unexpected patterns, the Poisson hidden Markov model provides a better understanding of the nature of earthquakes. In this paper, a Poisson hidden Markov model is used to predict the earthquake hazard around Bilecik (NW Turkey), an area chosen for its important geographic location. Bilecik is in close proximity to the North Anatolian Fault Zone and is situated between Ankara and Istanbul, the two biggest cities of Turkey; consequently, major highways and railroads cross the area and many engineering structures are being constructed there. The annual frequencies of earthquakes that occurred within a 100-km-radius area centered on Bilecik, from January 1900 to December 2012, with magnitudes (M) of at least 4.0, are modeled using a Poisson-HMM. The hazard for the next 35 years (2013 to 2047) around the area is obtained from the model by forecasting the annual frequencies of M ≥ 4 earthquakes.
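A Poisson hidden Markov model of annual earthquake counts can be filtered and forecast with the forward algorithm. A minimal two-state sketch with invented parameters (in practice they would be estimated from the catalog, e.g. by EM, as in the paper):

```python
import math

# Hypothetical 2-state Poisson HMM: a "quiet" and an "active" seismicity state.
TRANS = [[0.9, 0.1],   # P(next state | current): quiet -> quiet/active
         [0.2, 0.8]]   # active -> quiet/active
RATES = [2.0, 8.0]     # Poisson mean annual counts of M >= 4 events per state
INIT = [0.5, 0.5]      # prior state probabilities

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def filter_states(counts):
    """Forward algorithm: filtered P(state | counts so far), renormalized
    at each step to avoid underflow."""
    alpha = [INIT[s] * poisson_pmf(counts[0], RATES[s]) for s in (0, 1)]
    alpha = [a / sum(alpha) for a in alpha]
    for k in counts[1:]:
        alpha = [sum(alpha[r] * TRANS[r][s] for r in (0, 1)) *
                 poisson_pmf(k, RATES[s]) for s in (0, 1)]
        alpha = [a / sum(alpha) for a in alpha]
    return alpha

def forecast_mean(counts):
    """One-step-ahead expected annual count of M >= 4 events."""
    alpha = filter_states(counts)
    pred = [sum(alpha[r] * TRANS[r][s] for r in (0, 1)) for s in (0, 1)]
    return sum(pred[s] * RATES[s] for s in (0, 1))

counts = [1, 3, 2, 9, 7, 8]  # hypothetical annual M >= 4 counts
print(f"expected count next year: {forecast_mean(counts):.1f}")
```

Because the last few observed counts are high, the filter places most probability on the active state and the forecast leans toward its higher Poisson rate; repeating the prediction step rolls the forecast forward over multiple years, as the study does for 2013-2047.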

  2. Prediction of earthquake hazard by hidden Markov model (around Bilecik, NW Turkey)

    NASA Astrophysics Data System (ADS)

    Can, Ceren; Ergun, Gul; Gokceoglu, Candan

    2014-09-01

Earthquakes are one of the most important natural hazards to be evaluated carefully in engineering projects, due to their severely damaging effects on human life and human-made structures. The hazard of an earthquake is defined by several approaches, from which earthquake parameters such as the peak ground acceleration over the area of interest can be determined. In an earthquake-prone area, identifying seismicity patterns is an important task for assessing seismic activity and evaluating the risk of damage and loss from an earthquake occurrence. As a powerful and flexible framework for characterizing temporal changes in seismicity and revealing unexpected patterns, the Poisson hidden Markov model provides a better understanding of the nature of earthquakes. In this paper, a Poisson hidden Markov model is used to predict the earthquake hazard around Bilecik (NW Turkey), an area chosen for its important geographic location. Bilecik is in close proximity to the North Anatolian Fault Zone and is situated between Ankara and Istanbul, the two biggest cities of Turkey; consequently, major highways and railroads cross the area and many engineering structures are being constructed there. The annual frequencies of earthquakes that occurred within a 100-km-radius area centered on Bilecik, from January 1900 to December 2012, with magnitudes (M) of at least 4.0, are modeled using a Poisson-HMM. The hazard for the next 35 years (2013 to 2047) around the area is obtained from the model by forecasting the annual frequencies of M ≥ 4 earthquakes.

  3. Vulnerability of port and harbor communities to earthquake and tsunami hazards: The use of GIS in community hazard planning

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.

    2004-01-01

Earthquakes and tsunamis pose significant threats to Pacific Northwest coastal port and harbor communities. Developing holistic mitigation and preparedness strategies to reduce the potential for loss of life and property damage requires community-wide vulnerability assessments that transcend traditional site-specific analyses. The ability of a geographic information system (GIS) to integrate natural, socioeconomic, and hazards information makes it an ideal assessment tool to support community hazard planning efforts. This article summarizes how GIS was used to assess the vulnerability of an Oregon port and harbor community to earthquake and tsunami hazards, as part of a larger risk-reduction planning initiative. The primary purposes of the GIS were to highlight community vulnerability issues and to identify areas that both are susceptible to hazards and contain valued port and harbor community resources. Results of the GIS analyses can help decision makers with limited mitigation resources set priorities for increasing community resiliency to natural hazards.

  4. Active Fault Mapping of Naga-Disang Thrust (Belt of Schuppen) for Assessing Future Earthquake Hazards in NE India

    NASA Astrophysics Data System (ADS)

    Kumar, A.

    2014-12-01

We present a geodynamic appraisal of the Naga-Disang Thrust, northeast India. The Disang thrust extends NE-SW over a length of 480 km and defines the eastern margin of the Neogene basin. It branches out from the Haflong-Naga thrust; in the NE, at Bulbulia on the right bank of the Noa Dihing River, it is terminated by the Mishmi thrust and extends into Myanmar as the 'Sagaing fault', dipping generally towards the SE. It extends between the Dauki fault in the SW and the Mishmi thrust in the NE. When the SW end of the 'Belt of Schuppen' moved upwards and eastwards along the Dauki fault, the NE end moved downwards and westwards along the Mishmi thrust, causing its 'S'-shaped bending. An SRTM-generated DEM is used to map the topographic expression of the schuppen belt, where these thrusts are clearly marked by topographic breaks. Satellite imagery also shows lineaments supporting post-tectonic activity along the Naga-Disang thrusts. The southern part of the 'Belt of Schuppen' extends along the sheared western limb of the southerly plunging Kohima synform, a part of the Indo-Burma Ranges (IBR), and it is seismically active. The crustal velocity SE of the schuppen belt is 39.90 mm/yr with an azimuth of 70.78° at Lumami, 38.84 mm/yr (azimuth 54.09°) at Senapati, and 36.85 mm/yr (azimuth 54.09°) at Imphal. The crustal velocity NW of the schuppen belt is 52.67 mm/yr (azimuth 57.66°) near the Dauki fault in Meghalaya. It becomes 43.60 mm/yr (azimuth 76.50°) to 44.25 mm/yr (azimuth 73.27°) at Tiding and Kamlang Nagar around the Mishmi thrust. The presence of the schuppen belt is marked by a change from high crustal velocity on the Indian plate to low crustal velocity in the Mishmi suture as well as the Indo-Burma Ranges. The difference in crustal velocities results in a build-up of strain along the schuppen belt, which may trigger a large earthquake in NE India in the future. The belt of schuppen appears to be seismically active, although few large earthquakes have been recorded. These observations are significant for the Naga

  5. Nationwide tsunami hazard assessment project in Japan

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2014-12-01

    In 2012, we began a project of nationwide Probabilistic Tsunami Hazard Assessment (PTHA) in Japan to support various measures (Fujiwara et al., 2013, JpGU; Hirata et al., 2014, AOGS). The most important strategy in the nationwide PTHA is the predominance of aleatory uncertainty in the assessment, with the use of epistemic uncertainty limited to a minimum, because the number of possible combinations among epistemic uncertainties diverges quickly as their number increases; we consider only the type of earthquake occurrence probability distribution as an epistemic uncertainty. We briefly outline the nationwide PTHA as follows: (i) we consider all possible future earthquakes, including those that the Headquarters for Earthquake Research Promotion (HERP) of the Japanese Government has already assessed. (ii) We construct a set of simplified earthquake fault models, called "Characterized Earthquake Fault Models (CEFMs)", for all of these earthquakes by following prescribed rules (Toyama et al., 2014, JpGU; Korenaga et al., 2014, JpGU). (iii) For all initial water-surface distributions caused by the CEFMs, we calculate tsunamis by solving a nonlinear long-wave equation with a finite-difference method, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. (iv) Finally, we integrate information about the tsunamis calculated from the numerous CEFMs to obtain nationwide tsunami hazard assessments. One of the most useful representations of the integrated information is a tsunami hazard curve for coastal tsunami heights, incorporating uncertainties inherent in tsunami simulation and earthquake fault slip heterogeneity (Abe et al., 2014, JpGU). We will show a PTHA along the eastern coast of Honshu, Japan, based on approximately 1,800 tsunami sources located within the subduction zone along the Japan Trench, as a prototype of the nationwide PTHA. This study is supported by part of the research
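
The hazard-curve integration in step (iv) can be sketched as follows. This is a minimal illustration rather than the project's actual workflow: each hypothetical source scenario carries an annual occurrence rate and a simulated coastal tsunami height, and exceedance probability at a site is aggregated under a Poisson assumption. The scenario list and rates are invented.

```python
import math

# Invented scenario list: (annual occurrence rate, simulated coastal
# tsunami height in metres at one site). The real project integrates
# ~1,800 CEFM sources run through FDM tsunami simulations.
scenarios = [
    (1 / 50.0,   0.5),
    (1 / 100.0,  1.2),
    (1 / 300.0,  3.5),
    (1 / 1000.0, 6.0),
]

def exceedance_probability(threshold_m, t_years=50.0):
    """Poisson probability that coastal tsunami height >= threshold_m
    occurs at least once in t_years, aggregating all scenario rates."""
    rate = sum(r for r, h in scenarios if h >= threshold_m)
    return 1.0 - math.exp(-rate * t_years)

# A tsunami hazard curve is this probability evaluated over thresholds.
for h in (0.5, 1.0, 3.0, 5.0):
    print(f"P(height >= {h} m within 50 yr) = {exceedance_probability(h):.3f}")
```

Sweeping the threshold traces out the monotonically decreasing hazard curve described in the abstract.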

  6. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences between these sequences in order to draw lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10^-9/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than stresses built up by plate-tectonic loading. The latter model generally underlies a basic assumption of earthquake hazard assessment: that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.
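
The contrast between strain-rate-based and paleoseismic recurrence estimates can be made concrete with back-of-the-envelope numbers. A sketch under assumed values: the stress drop and rigidity below are typical textbook figures, not taken from the paper; the calculation is simply the time needed to re-accumulate coseismically released strain at the quoted loading rate.

```python
# Back-of-the-envelope recurrence implied by steady tectonic loading:
# time needed to re-accumulate the strain released in one event.
strain_rate_per_yr = 1e-9   # order of magnitude quoted for both regions
stress_drop_pa = 3e6        # assumed coseismic stress drop (~3 MPa)
rigidity_pa = 3e10          # assumed crustal shear modulus

coseismic_strain = stress_drop_pa / rigidity_pa        # dimensionless, ~1e-4
recurrence_yr = coseismic_strain / strain_rate_per_yr  # years
print(f"Implied recurrence interval: ~{recurrence_yr:,.0f} years")
```

Even with generous parameter choices the loading-rate estimate comes out in the tens of thousands of years or more, far longer than the hundreds of years inferred from paleoseismology, which is the discrepancy the abstract highlights.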

  7. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto; Murru, Maura; Falcone, Giuseppe; Parsons, Tom

    2017-03-01

    The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100 kyr and containing more than 100,000 events of magnitudes ≥4.5. The model of the fault system upon which we applied the simulator code was obtained from the DISS 3.2.0 database, selecting all the faults that are recognized in the Calabria region, for a total of 22 fault segments. The application of our simulation algorithm provides typical features in time, space and magnitude behavior of the seismicity, which can be compared with those of the real observations. The results of the physics-based simulator algorithm were compared with those obtained by an alternative method using a slip-rate balanced technique. Finally, as an example of a possible use of synthetic catalogs, an attenuation law has been applied to all the events reported in the synthetic catalog for the production of maps showing the exceedance probability of given values of PGA on the territory under investigation.
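
The final step, converting a synthetic catalog into PGA exceedance estimates via an attenuation law, can be sketched as follows. The attenuation coefficients and the tiny catalog below are invented for illustration; the paper's actual relation and its 100-kyr DISS-based catalog are not reproduced here.

```python
import math

def pga_g(mag, dist_km, a=-4.0, b=1.0, c=1.5):
    """Toy attenuation relation (illustrative coefficients only):
    ln PGA[g] = a + b*M - c*ln(R + 10)."""
    return math.exp(a + b * mag - c * math.log(dist_km + 10.0))

# Tiny invented synthetic catalog: (magnitude, distance to site in km),
# standing in for the simulator's 100-kyr output.
catalog = [(4.5, 30), (5.2, 15), (6.1, 40), (7.0, 25), (4.8, 8)]
catalog_years = 1000.0  # assumed time span covered by these events

def annual_exceedance_rate(pga_threshold_g):
    """Events per year whose site PGA meets or exceeds the threshold."""
    n = sum(1 for m, r in catalog if pga_g(m, r) >= pga_threshold_g)
    return n / catalog_years

print(annual_exceedance_rate(0.05))  # only the M 7.0 event exceeds 0.05 g
```

Repeating this count over a grid of sites and thresholds yields exceedance-probability maps of the kind the abstract describes.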

  9. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region

    USGS Publications Warehouse

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto; Murru, Maura; Falcone, Giuseppe; Parsons, Thomas E.

    2017-01-01

    The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100 kyr and containing more than 100,000 events of magnitudes ≥4.5. The model of the fault system upon which we applied the simulator code was obtained from the DISS 3.2.0 database, selecting all the faults that are recognized in the Calabria region, for a total of 22 fault segments. The application of our simulation algorithm provides typical features in time, space and magnitude behavior of the seismicity, which can be compared with those of the real observations. The results of the physics-based simulator algorithm were compared with those obtained by an alternative method using a slip-rate balanced technique. Finally, as an example of a possible use of synthetic catalogs, an attenuation law has been applied to all the events reported in the synthetic catalog for the production of maps showing the exceedance probability of given values of PGA on the territory under investigation.

  11. Fault Imaging with High-Resolution Seismic Reflection for Earthquake Hazard and Geothermal Resource Assessment in Reno, Nevada

    SciTech Connect

    Frary, Roxanna

    2012-05-05

    The Truckee Meadows basin is situated adjacent to the Sierra Nevada microplate, on the western boundary of the Walker Lane. Lying in the transition zone between a range-front normal fault on the west and northwest-striking right-lateral strike-slip faults to the east, the basin is heavily faulted. The Reno-Sparks metropolitan area is located in this basin, and with a significant population living here, it is important to know where these faults are. High-resolution seismic reflection surveys are used to image these faults along the Truckee River, across which only one fault was previously mapped, and in southern Reno near and along Manzanita Lane, where a swarm of short faults has been mapped. The reflection profiles constrain the geometries of these faults and suggest additional faults not seen before. Used in conjunction with depth-to-bedrock calculations and gravity measurements, the seismic reflection surveys provide definitive locations of faults, as well as their orientations. Offsets on these faults indicate how active they are, which in turn has implications for seismic hazard in the area. In addition to seismic hazard, the faults imaged here tell us something about the conduits for geothermal fluid resources in Reno.

  12. Identification and Reduction of Nonstructural Earthquake Hazards in California Schools.

    ERIC Educational Resources Information Center

    Greene, Marjorie; And Others

    It is necessary to identify nonstructural hazards at the school site to reduce the possibility of injury in the event of an earthquake. Nonstructural hazards can occur in every part of a building and all of its contents, with the exception of the structure itself. In other words, nonstructural elements are everything but the columns, beams, floors, load-bearing…

  13. Guide and Checklist for Nonstructural Earthquake Hazards in California Schools.

    ERIC Educational Resources Information Center

    2003

    The recommendations included in this document are intended to reduce seismic hazards associated with the non-structural components of schools buildings, including mechanical systems, ceiling systems, partitions, light fixtures, furnishings, and other building contents. It identifies potential earthquake hazards and provides recommendations for…

  14. Workshop on evaluation of earthquake hazards and risk in the Puget Sound and Portland areas

    SciTech Connect

    Hays, W.W.; Kitzmiller, C.

    1988-01-01

    Three tasks were undertaken in the forum provided by the workshop: (1) assessing the present state-of-knowledge of earthquake hazards in Washington and Oregon including scientific, engineering, and hazard-reduction components; (2) determining the need for additional scientific, engineering, and societal response information to implement an effective earthquake-hazard reduction program; and (3) developing a strategy for implementing programs to reduce potential earthquake losses and to foster preparedness and mitigation. Thirty-five papers were given at the workshop and each of these has been abstracted for the U.S. Department of Energy's Energy Data Base (EDB). In addition, the volume includes a glossary of technical terms used in earthquake engineering in Appendix A.

  15. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury Sequence of earthquakes beginning with the magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued with events in February and June 2011 causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures will be discussed including the use of rapid response teams, selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area and the process of demolition. Through the post-event safety assessment program that occurred throughout the Canterbury Sequence of earthquakes, many important lessons can be learned that will benefit future response to natural hazards that have potential to damage structures.

  16. Problems of seismic hazard estimation in regions with few large earthquakes: Examples from eastern Canada

    NASA Astrophysics Data System (ADS)

    Basham, P. W.; Adams, John

    1989-10-01

    Seismic hazard estimates and seismic zoning maps are based on an assessment of historical and recent seismicity and any correlations with geologic and tectonic features that might define the earthquake potential. Evidence is accumulating that the large earthquakes in eastern Canada (M ~ 7) may be associated with the rift systems that surround or break the integrity of the North American craton. The problem for seismic hazard estimation is that the larger historical earthquakes are not uniformly distributed along the Paleozoic St. Lawrence-Ottawa rift system and are too rare on the Mesozoic eastern margin rift to assess the overall seismogenic potential. Multiple source-zone models for hazard estimation could include hypotheses of future M = 7 earthquakes at any location along these rift systems, but at a moderate probability (such as that used in the Canadian zoning maps) the resultant hazard will be so diluted that it will not result in adequate design against the near-source effects of such earthquakes. The near-source effects of large, rare earthquakes can, however, be accommodated in conservative codes and standards for critical facilities, if society is willing to pay the price.

  17. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

    Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

  18. Investigating Canada's Lithosphere and earthquake hazards with portable arrays

    NASA Astrophysics Data System (ADS)

    Eaton, D. W.; Adams, J.; Asudeh, I.; Atkinson, G. M.; Bostock, M. G.; Cassidy, J. F.; Ferguson, I. J.; Samson, C.; Snyder, D. B.; Tiampo, K. F.; Unsworth, M. J.

    A multi-institutional research initiative, POLARIS, is providing scientists with unprecedented opportunities to map Earth structure and assess earthquake hazards across Canada. By completion of the initiative's installation phase in August 2005, deployments of POLARIS (Portable Observatories for Lithospheric Analysis and Research Investigating Seismicity) instruments will include 100 telemetered broadband seismograph systems, 10 with continuous-recording magnetotelluric (MT) devices (devices that record natural variations in the geomagnetic field). Data from these observatories are transmitted by satellite (with a latency of 5 s) to data acquisition hubs in London (Canada) and Ottawa, where they are made available in near real-time by an automatic data-request manager (AutoDRM). Conceived in 2000 by an interdisciplinary group of 10 geoscientists, the 4-year, C$11 million infrastructure project is fostering strong partnerships between academia, government laboratories, and the private sector.

  19. Tank farms hazards assessment

    SciTech Connect

    Broz, R.E.

    1994-09-30

    Hanford contractors are writing new facility-specific emergency procedures in response to new and revised US Department of Energy (DOE) Orders on emergency preparedness. Emergency procedures are required for each Hanford facility that has the potential to exceed the criteria for the lowest-level emergency, an Alert. The set includes: (1) a facility-specific procedure on Recognition and Classification of Emergencies, (2) area procedures on Initial Emergency Response and (3) an area procedure on Protective Action Guidance. The first steps in developing these procedures are to identify the hazards at each facility, identify the conditions that could release the hazardous material, and calculate the consequences of the releases. These steps are called a Hazards Assessment. The final product is a document that is similar in some respects to a Safety Analysis Report (SAR). The document could be produced in a month for a simple facility but could take much longer for a complex facility; Hanford has both types. A strategy has been adopted to permit completion of the first version of the new emergency procedures before all the facility Hazards Assessments are complete. The procedures will initially be based on input from a task group for each facility. This strategy will put improved emergency procedures in place sooner and therefore enhance Hanford emergency preparedness. The purpose of this document is to summarize the applicable information contained within the Waste Tank Facility Interim Safety Basis Document (WHC-SD-WM-ISB-001) as a resource, since the SARs covering Waste Tank Operations are not current in all cases. This hazards assessment serves to collect, organize, document and present the information utilized during the determination process.

  20. Roaming earthquakes in China highlight midcontinental hazards

    NASA Astrophysics Data System (ADS)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  1. Seismic survey probes urban earthquake hazards in Pacific Northwest

    USGS Publications Warehouse

    Fisher, M.A.; Brocher, T.M.; Hyndman, R.D.; Trehu, A.M.; Weaver, C.S.; Creager, K.C.; Crosson, R.S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B.C.; Hammer, P.T.; Childs, J. R.; Cochrane, G.R.; Chopra, S.; Walia, R.

    1999-01-01

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  2. Seismic survey probes urban earthquake hazards in Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Fisher, M. A.; Brocher, T. M.; Hyndman, R. D.; Trehu, A. M.; Weaver, C. S.; Creager, K. C.; Crosson, R. S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B. C.; Hammer, P. T.; ten Brink, U.; Pratt, T. L.; Miller, K. C.; Childs, J. R.; Cochrane, G. R.; Chopra, S.; Walia, R.

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  3. PUREX facility hazards assessment

    SciTech Connect

    Sutton, L.N.

    1994-09-23

    This report documents the hazards assessment for the Plutonium Uranium Extraction Plant (PUREX) located on the US Department of Energy (DOE) Hanford Site. Operation of PUREX is the responsibility of Westinghouse Hanford Company (WHC). This hazards assessment was conducted to provide the emergency planning technical basis for PUREX. DOE Order 5500.3A requires an emergency planning hazards assessment for each facility that has the potential to reach or exceed the lowest level emergency classification. In October of 1990, WHC was directed to place PUREX in standby. In December of 1992 the DOE Assistant Secretary for Environmental Restoration and Waste Management authorized the termination of PUREX and directed DOE-RL to proceed with shutdown planning and terminal clean out activities. Prior to this action, its mission was to reprocess irradiated fuels for the recovery of uranium and plutonium. The present mission is to establish a passively safe and environmentally secure configuration at the PUREX facility and to preserve that condition for 10 years. The ten year time frame represents the typical duration expended to define, authorize and initiate follow-on decommissioning and decontamination activities.

  4. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems in contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g., Poisson, periodic) or, conversely, delicately designed (e.g., STEP, ETAS) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance).
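
The Poisson model the abstract criticizes can be made explicit with a short calculation; a sketch assuming a once-per-century expected event.

```python
import math

# Under a Poisson model expecting one event per 100 years, occurrence on
# any single day is a low-probability event; occurrence somewhere in the
# century-long window is nevertheless near-certain.
rate_per_yr = 1.0 / 100.0
p_one_day = 1.0 - math.exp(-rate_per_yr / 365.25)
p_century = 1.0 - math.exp(-rate_per_yr * 100.0)
print(f"P(event on a given day)   = {p_one_day:.2e}")
print(f"P(event within 100 years) = {p_century:.3f}")
```

The point of the abstract is that these numbers depend entirely on the chosen probability space: other defensible models for the same expected rate give very different daily probabilities.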

  5. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comment, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.

  6. Central US earthquake catalog for hazard maps of Memphis, Tennessee

    USGS Publications Warehouse

    Wheeler, R.L.; Mueller, C.S.

    2001-01-01

    An updated version of the catalog that was used for the current national probabilistic seismic-hazard maps would suffice for production of large-scale hazard maps of the Memphis urban area. Deaggregation maps provide guidance as to the area that a catalog for calculating Memphis hazard should cover. For the future, the Nuttli and local network catalogs could be examined for earthquakes not presently included in the catalog. Additional work on aftershock removal might reduce hazard uncertainty. Graphs of decadal and annual earthquake rates suggest completeness at and above magnitude 3 for the last three or four decades. Any additional work on completeness should consider the effects of rapid, local population changes during the Nation's westward expansion. © 2001 Elsevier Science B.V. All rights reserved.
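
The completeness check the abstract describes (graphs of decadal earthquake rates at and above magnitude 3) can be sketched with a toy catalog; the (year, magnitude) pairs below are invented placeholders, not data from the Memphis catalog.

```python
from collections import Counter

# Invented (year, magnitude) pairs standing in for a regional catalog.
catalog = [
    (1965, 3.4), (1972, 4.1), (1978, 3.1), (1981, 3.6), (1983, 3.0),
    (1988, 3.9), (1991, 3.2), (1994, 3.7), (1997, 3.0), (1999, 4.4),
]

def decadal_rates(events, m_min=3.0):
    """Count events at or above m_min per decade; roughly constant
    counts across recent decades suggest completeness at that magnitude."""
    counts = Counter((year // 10) * 10 for year, mag in events if mag >= m_min)
    return dict(sorted(counts.items()))

print(decadal_rates(catalog))
```

Rising counts in earlier decades, as in this toy example, typically reflect improving detection rather than a real rate change, which is why only the most recent decades are judged complete.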

  7. Seismic hazard assessments at Islamic Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Khalil, A. E.; Deif, A.; Abdel Hafiez, H. E.

    2015-12-01

    Islamic Cairo is one of the most important Islamic monumental complexes in Egypt, near the center of present-day metropolitan Cairo. These buildings are up to one thousand years old. Unfortunately, many of them suffer from serious mishandling that may lead to mass damage. Many buildings and masjids partially or totally collapsed in the 12 October 1992 Cairo earthquake, which occurred some 25 km from the study area with a magnitude Mw = 5.8. Hence, potential damage assessments there are essential. Deterministic and probabilistic techniques were used to predict the strong-motion characteristics of expected future large earthquakes in the study area. The current study started by compiling the available studies on the distribution of seismogenic sources and earthquake catalogs. The deterministic method is used to provide a description of the effect of the largest earthquake on the area of interest, while the probabilistic method is used to define uniform hazard curves for three return periods: 475, 950 and 2,475 years. Both deterministic and probabilistic results were obtained for bedrock conditions, and the resulting hazard levels were deaggregated to identify the contribution of each seismic source to the total hazard. The results show that the expected seismic activity, combined with the present condition of the buildings, calls for urgent action to protect both the cultural heritage and human lives.

  8. Earthquake induced landslide hazard field observatory in the Avcilar peninsula

    NASA Astrophysics Data System (ADS)

    Bigarre, Pascal; Coccia, Stella; Theoleyre, Fiona; Ergintav, Semih; Özel, Oguz; Yalçinkaya, Esref; Lenti, Luca; Martino, Salvatore; Gamba, Paolo; Zucca, Francesco; Moro, Marco

    2015-04-01

    Analysis of SAR temporal series has been undertaken, providing global yet accurate identification and characterization of the gravitational phenomena covering the area. The resolution and identification of landslide-hazard-related features using space multispectral/hyperspectral image data have been evaluated. The project has benefited from a vast drilling and geological-geotechnical survey program undertaken by the Istanbul Metropolitan Area, yielding important data to complete the geological model of the landslide, as well as one deep borehole to set up permanent instrumentation on a fairly large slow landslide fully encircled by a dense built environment. The selected landslide was instrumented in 2014 with a real-time observational system including GPS, rainfall, piezometer and seismic monitoring. The objective of this permanent monitoring system is threefold: first, to detect and quantify the interaction between seismic motion, rainfall and mass movement, building a database to be opened to the scientific community; second, to help calibrate dynamic numerical geomechanical simulations intended to study the sensitivity to seismic loading; and last but not least, important geophysical field work has been conducted to assess seismic site effects already noticed during the 1999 earthquake. Data, metadata and main results are now being progressively compiled and formatted for integration into the cloud monitoring infrastructure for data sharing.

  9. How Well Do Earthquake Hazard Maps Work and How Good Do They Have to be?

    NASA Astrophysics Data System (ADS)

    Brooks, E.; Stein, S. A.; Spencer, B. D.

    2014-12-01

    Earthquake hazard maps seek to describe the level of earthquake hazard in a region and provide a scientific foundation for earthquake preparation and mitigation. In many cases these maps do reasonably well. However, recent large earthquakes that did great damage in areas predicted to be relatively safe illustrate the need to assess how well these maps are actually performing and how good they need to be to be useful. Economic analysis comparing the cost of mitigation to the expected reduction in loss shows that an inaccurate hazard estimate is still useful as long as it is not too much of an overestimate. Because better hazard forecasts can yield better mitigation policy, we need agreed ways of assessing how well a map performed and thus whether one map performed better than another. The metric implicit in current maps, that during the chosen time interval the predicted ground motion will be exceeded only at a specific fraction of the sites, is useful but permits maps to be nominally successful although they significantly underpredict or overpredict shaking, or to be nominally unsuccessful but do well in terms of predicting shaking. We explore some possible metrics that better measure the effects of overprediction and underprediction and can be weighted to reflect the two differently and to reflect differences in populations and property at risk. Although no single metric alone fully characterizes map behavior, using several metrics can provide useful insight for comparing and improving hazard maps.
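
The metric comparison described above can be sketched in code. The implicit metric of current maps (the fraction of sites where observed shaking exceeded the mapped value) is contrasted with a weighted metric that penalizes underprediction more heavily; the weights and the predicted/observed values are illustrative assumptions, not the paper's proposals.

```python
def exceedance_fraction(predicted, observed):
    """Implicit metric of current maps: fraction of sites where observed
    shaking exceeded the mapped (predicted) value."""
    return sum(o > p for p, o in zip(predicted, observed)) / len(predicted)

def asymmetric_error(predicted, observed, w_under=2.0, w_over=1.0):
    """Weighted error penalising underprediction (dangerous) more
    than overprediction (costly); weights are assumptions."""
    total = 0.0
    for p, o in zip(predicted, observed):
        total += w_under * (o - p) if o > p else w_over * (p - o)
    return total / len(predicted)

# Illustrative predicted vs observed peak ground motions (g) at four sites.
pred = [0.3, 0.2, 0.4, 0.1]
obs  = [0.5, 0.1, 0.4, 0.3]
print(exceedance_fraction(pred, obs), asymmetric_error(pred, obs))
```

The exceedance fraction alone cannot distinguish a map that misses badly at a few sites from one that misses slightly at many, which is why the abstract argues for complementary weighted metrics.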

  10. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  11. Community Exposure and Sensitivity to Earthquake Hazards in Washington State

    NASA Astrophysics Data System (ADS)

    Ratliff, J.; Wood, N. J.; Weaver, C. S.

    2011-12-01

    Communities in Washington State are potentially threatened by earthquakes from many sources, including the Cascadia subduction zone and myriad inland faults (the Seattle fault, Tacoma fault, etc.). The USGS Western Geographic Science Center, in collaboration with the State of Washington Military Department Emergency Management Division, has been working to identify Washington community vulnerability to twenty-one earthquake scenarios to provide assistance for mitigation, preparedness, and outreach. We calculate community earthquake exposure and sensitivity by overlaying demographic and economic data with the peak ground acceleration values of each scenario in a geographic information system. To assist emergency managers, we summarize community and county earthquake vulnerability by the number of earthquake scenarios affecting each area, as well as the number of residents, occupied households, businesses (individual and by sector), and employees at each predicted Modified Mercalli Intensity value (ranging from V to IX). Percentages based on community, county, and scenario totals also give emergency managers insight into community sensitivity to the earthquake scenarios. Results indicate significant spatial and temporal residential variations, as well as spatial economic variations, in exposure and sensitivity to earthquake hazards in the State of Washington, especially for communities west of the Cascade Range.
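
    The overlay described above can be sketched in a few lines; the block data are hypothetical, and the PGA-to-intensity conversion uses the Wald et al. (1999) instrumental-intensity relation as a stand-in for whatever mapping the study actually used.

```python
import numpy as np

# Hypothetical census blocks: scenario PGA (fraction of g) and residents.
pga_g = np.array([0.05, 0.12, 0.30, 0.55])
pop = np.array([1200, 3400, 800, 150])

def pga_to_mmi(pga_g):
    """Wald et al. (1999) instrumental-intensity relation (roughly valid
    for MMI V-VIII); PGA is converted from g to cm/s^2 first."""
    pga_cms2 = np.asarray(pga_g) * 980.665
    return 3.66 * np.log10(pga_cms2) - 1.66

mmi = np.round(pga_to_mmi(pga_g)).astype(int)
for level in range(5, 10):  # MMI V through IX, as in the study
    exposed = int(pop[mmi == level].sum())
    print(f"MMI {level}: {exposed} residents exposed")
```

    Repeating the tally per scenario, per county, and per asset class (households, businesses, employees) yields the exposure summaries described in the abstract.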

  12. Probabilistic Tsunami Hazard Assessment - Application to the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Sorensen, M. B.; Spada, M.; Babeyko, A.; Wiemer, S.; Grünthal, G.

    2009-12-01

    Following several large tsunami events around the world in recent years, tsunami hazard has become an increasing concern. The traditional way of assessing tsunami hazard has been through deterministic scenario calculations, which provide the expected wave heights due to a given tsunami source, usually a worst-case scenario. For quantitative hazard and risk assessment, however, it is necessary to move towards a probabilistic framework. In this study we focus on earthquake-generated tsunamis and present a scheme for probabilistic tsunami hazard assessment (PTHA). Our PTHA methodology is based on Monte Carlo simulations and closely follows probabilistic seismic hazard assessment methodologies. The PTHA is performed in four steps. First, earthquake and tsunami catalogues are analyzed in order to define a number of potential tsunami sources in the study area; for each source, activity rates, maximum earthquake magnitude and uncertainties are assigned. Second, a synthetic earthquake catalogue is established based on the information about the sources. The third step is to calculate synthetic tsunami scenarios for all potentially tsunamigenic earthquakes in the synthetic catalogue. At the fourth step, the tsunami scenarios are combined to generate hazard curves and maps. We implement the PTHA methodology in the Mediterranean Sea, where numerous tsunami events have been reported throughout history. We derive a 100,000-year-long catalogue of potentially tsunamigenic earthquakes and calculate tsunami propagation scenarios for ca. 85,000 M6.5+ earthquakes from the synthetic catalogue. Results show that the highest tsunami hazard is attributed to the Eastern Mediterranean region, but that the Western Mediterranean can also experience significant tsunami waves for long return periods. Hazard maps will be presented for a range of probability levels together with hazard curves for selected critical locations.
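
    The four-step Monte Carlo procedure can be sketched roughly as follows; the Gutenberg-Richter parameters, the magnitude-to-wave-height proxy, and the event rate are all illustrative assumptions standing in for the real source models and tsunami propagation code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Steps 1-2: synthetic catalogue -- sample magnitudes from a truncated
# Gutenberg-Richter distribution (b-value, Mmin, Mmax are assumptions).
def sample_magnitudes(n, b=1.0, m_min=6.5, m_max=8.5):
    u = rng.random(n)
    beta = b * np.log(10)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * c) / beta

# Step 3: stand-in for a tsunami propagation model -- a crude
# magnitude-to-coastal-wave-height proxy with lognormal scatter.
def wave_height(m):
    return np.exp(0.9 * (m - 7.0)) * rng.lognormal(0.0, 0.5, size=m.shape)

# Step 4: hazard curve -- annual rate of exceeding each height threshold.
years = 100_000          # length of the synthetic catalogue
rate_m65 = 0.85          # assumed annual rate of M6.5+ events
n_events = rng.poisson(rate_m65 * years)
mags = sample_magnitudes(n_events)
heights = wave_height(mags)

thresholds = np.array([0.5, 1.0, 2.0, 5.0])
annual_rate = np.array([(heights > h).sum() / years for h in thresholds])
for h, r in zip(thresholds, annual_rate):
    print(f"h > {h:4.1f} m : {r:.4f} /yr")
```

    Inverting the resulting exceedance rates at chosen probability levels gives the hazard maps and site-specific hazard curves the abstract mentions.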

  13. 2016 one-year seismic hazard forecast for the Central and Eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-03-28

    The U.S. Geological Survey (USGS) has produced a 1-year seismic hazard forecast for 2016 for the Central and Eastern United States (CEUS) that includes contributions from both induced and natural earthquakes. The model assumes that earthquake rates calculated from several different time windows will remain relatively stationary and can be used to forecast earthquake hazard and damage intensity for the year 2016. This assessment is the first step in developing an operational earthquake forecast for the CEUS, and the analysis could be revised with updated seismicity and model parameters. Consensus input models consider alternative earthquake catalog durations, smoothing parameters, maximum magnitudes, and ground motion estimates, and represent uncertainties in earthquake occurrence and diversity of opinion in the science community. Ground shaking seismic hazard for 1-percent probability of exceedance in 1 year reaches 0.6 g (as a fraction of standard gravity [g]) in northern Oklahoma and southern Kansas, and about 0.2 g in the Raton Basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas. Near some areas of active induced earthquakes, hazard is higher than in the 2014 USGS National Seismic Hazard Model (NSHM) by more than a factor of 3; the 2014 NSHM did not consider induced earthquakes. In some areas, previously observed induced earthquakes have stopped, so the seismic hazard reverts to the 2014 NSHM. Increased seismic activity, whether defined as induced or natural, produces high hazard. Conversion of ground shaking to seismic intensity indicates that some places in Oklahoma, Kansas, Colorado, New Mexico, Texas, and Arkansas may experience damage if the induced seismicity continues unabated. The chance of having Modified Mercalli Intensity (MMI) VI or greater (damaging earthquake shaking) is 5–12 percent per year in north-central Oklahoma and southern Kansas, similar to the chance of damage caused by natural earthquakes
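
    As a small worked example of what such a per-year damage probability implies over a longer horizon, assuming the annual probability stays constant and years are independent (an assumption, since the forecast is explicitly for one year only):

```python
# Chance of at least one damaging (MMI >= VI) shaking episode over a
# multi-year horizon, given a constant annual probability p_annual.
def prob_over_horizon(p_annual, n_years):
    return 1.0 - (1.0 - p_annual) ** n_years

# 5-12 percent per year, the range quoted for north-central Oklahoma
# and southern Kansas:
for p in (0.05, 0.12):
    print(f"annual {p:.0%} -> 5-year chance {prob_over_horizon(p, 5):.1%}")
```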

  14. Earthquake Hazard Analysis using Geological Characteristics and Geographic Information System (GIS) in the Southeastern Part of Korea

    NASA Astrophysics Data System (ADS)

    Song, Kyo-Young

    2010-05-01

    The purpose of this study is to investigate earthquake hazards using geologic characteristics and a geographic information system (GIS) for the assessment and mitigation of earthquake hazards. The southeastern part of the Korean peninsula, especially the cities of Ulsan and Pohang, was chosen for construction of a GIS database and analysis of earthquake hazards such as liquefaction and landslides. The two municipal areas are among the most populous industrial cities in Korea; however, several large-scale faults, such as the Yangsan fault, lie in their vicinity. In this study, important factors closely related to earthquake hazards, such as seismicity, geology, soil distribution, groundwater depth and ground slope, were compiled into a spatial database using GIS and ranked by relative susceptibility to earthquake hazards. To classify vulnerable areas and analyze the probability of earthquake hazards, each factor was computed and applied to the established datasets for earthquake-induced liquefaction and landslides. At present, the probability of liquefaction in the study area is calculated at about 0.012~0.133 when peak ground acceleration is 0.13~0.14 g, but if the moment magnitude increases to 7.0, the probability of liquefaction increases up to 0.802. The probability of landslides is almost null at present, but it increases rapidly when the moment magnitude reaches 5.0; landslides are expected on all unstable slopes when the moment magnitude exceeds 6.0. The results indicate that earthquake-induced liquefaction and landslides are closely related to the geology. Therefore, general geology, such as rock type and rock age, is a very important factor in analyzing earthquake hazards.

  15. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but these have been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and the sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessments.

  16. Remotely Triggered Earthquakes in Intraplate Regions: Distributed Hazard, Dependent Events

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Mueller, K.; Bilham, R.; Ambraseys, N.; Martin, S.

    2003-12-01

    The central and eastern United States has experienced only 5 historic earthquakes with Mw above 7.0: the 1886 Charleston earthquake and four during the New Madrid sequence of 1811-1812 (three principal mainshocks and the so-called "dawn aftershock" following the first mainshock). Careful consideration of historic accounts yields compelling evidence for a number of remotely triggered earthquakes in both 1812 and 1886, including several events large enough to be potentially damaging. We propose that one of the (alleged) New Madrid mainshocks, on 23 January 1812, may itself have been a remotely triggered earthquake, with a location some 200 km north of the New Madrid Seismic Zone. Our proposed source location is near that of the 1968 southern Illinois earthquake, which occurred on a blind thrust fault at 20-25 km depth. Intensity data for the 1812 event are consistent with expectations for a similarly deep event. Such triggered events presumably do not represent a wholly new source of hazard but rather a potential source of dependent hazard. That is, the common assumption is that the triggering will cause only a "clock advance," rather than causing earthquakes that would not otherwise have occurred. However, in a low strain-rate region, a given dynamic stress change can represent a much larger clock advance than the same change would cause in a high strain-rate region. Moreover, in regions with low seismicity and a short historic record, overlooked remotely triggered historic earthquakes may be important events. It is thus possible that significant events are currently missing from the historic catalogs. Such events--even if large--can be difficult to identify without instrumental data. The (interplate) 1905 Kangra, India earthquake further illustrates this point. In this case, early seismic records provide corroboration of an early triggered event whose existence is suggested--but difficult to prove--based on detailed macroseismic data. In the

  17. EFEHR - the European Facilities for Earthquake Hazard and Risk: beyond the web-platform

    NASA Astrophysics Data System (ADS)

    Danciu, Laurentiu; Wiemer, Stefan; Haslinger, Florian; Kastli, Philipp; Giardini, Domenico

    2017-04-01

    The European Facilities for Earthquake Hazard and Risk (EFEHR) represent the sustainable community resource for seismic hazard and risk in Europe. The EFEHR web platform is the main gateway to access data, models and tools, as well as to provide expertise relevant to the assessment of seismic hazard and risk. The main services (databases and web platform) are hosted at ETH Zurich and operated by the Swiss Seismological Service (Schweizerischer Erdbebendienst, SED). The EFEHR web portal (www.efehr.org) collects and displays (i) harmonized datasets necessary for hazard and risk modeling, e.g. seismic catalogues, fault compilations, site amplifications, vulnerabilities and inventories; (ii) extensive seismic hazard products, namely hazard curves, uniform hazard spectra and maps for national and regional assessments; (iii) standardized configuration files for re-computing the regional seismic hazard models; and (iv) relevant documentation of harmonized datasets, models and web services. Today, EFEHR distributes the full output of the 2013 European Seismic Hazard Model, ESHM13, as developed within the SHARE project (http://www.share-eu.org/); the latest results of the 2014 Earthquake Model of the Middle East (EMME14), derived within the EMME Project (www.emme-gem.org); the 2001 Global Seismic Hazard Assessment Project (GSHAP) results; and the 2015 update of the Swiss seismic hazard. New datasets related to either seismic hazard or risk will be incorporated as they become available. We present the current status of the EFEHR platform, with focus on the challenges, summaries of the up-to-date datasets, user experience and feedback, as well as the roadmap to future technological innovation beyond the web platform. We also show the new services foreseen to fully integrate with the seismological core services of the European Plate Observing System (EPOS).

  18. Ice Mass Fluctuations and Earthquake Hazard

    NASA Technical Reports Server (NTRS)

    Sauber, J.

    2006-01-01

    In south central Alaska, tectonic strain rates are high in a region that includes large glaciers undergoing ice wastage over the last 100-150 years [Sauber et al., 2000; Sauber and Molnia, 2004]. In this study we focus on the region referred to as the Yakataga segment of the Pacific-North American plate boundary zone in Alaska. In this region, the Bering and Malaspina glacier ablation zones have average ice elevation decreases of 1-3 meters/year (see summary and references in Molnia, 2005). The elastic response of the solid Earth to this ice mass decrease alone would cause several mm/yr of horizontal motion and uplift rates of up to 10-12 mm/yr. In this same region, observed horizontal rates of tectonic deformation range from 10 to 40 mm/yr to the north-northwest, and the predicted tectonic uplift rates range from -2 mm/yr near the Gulf of Alaska coast to 12 mm/yr further inland [Savage and Lisowski, 1988; Ma et al., 1990; Sauber et al., 1997, 2000, 2004; Elliot et al., 2005]. The large ice mass changes associated with glacial wastage and surges perturb the tectonic rate of deformation at a variety of temporal and spatial scales. The associated incremental stress change may enhance or inhibit earthquake occurrence. We report recent (seasonal to decadal) ice elevation changes derived from NASA's ICESat satellite laser altimeter, combined with earlier DEMs as a reference surface, to illustrate the characteristics of short-term ice elevation changes [Sauber et al., 2005; Muskett et al., 2005]. Since we are interested in evaluating the effect of ice changes on faulting potential, we calculated the predicted surface displacement changes and incremental stresses over a specified time interval and calculated the change in the fault stability margin using the approach of Wu and Hasegawa [1996]. Additionally, we explored the possibility that these ice mass fluctuations altered the rate of background seismicity. Although we primarily focus on

  19. Model Uncertainty, Earthquake Hazard, and the WGCEP-2002 Forecast

    NASA Astrophysics Data System (ADS)

    Page, M. T.; Carlson, J. M.

    2005-12-01

    Model uncertainty is prevalent in Probabilistic Seismic Hazard Analysis (PSHA) because the true mechanism generating risk is unknown. While it is well-understood how to incorporate parameter uncertainty in PSHA, model uncertainty is more difficult to incorporate due to the high degree of dependence between different earthquake-recurrence models. We find that the method used by the 2002 Working Group on California Earthquake Probabilities (WG02) to combine the probability distributions given by multiple models has several adverse effects on their result. In particular, taking a linear combination of the various models ignores issues of model dependence and leads to large uncertainties in the final hazard estimate. Furthermore, choosing model weights based on data can systematically bias the final probability distribution. The weighting scheme of the WG02 report also depends upon an arbitrary ordering of models. In addition to analyzing current statistical problems, we present alternative methods for rigorously incorporating model uncertainty into PSHA.

  20. Integrating Real-time Earthquakes into Natural Hazard Courses

    NASA Astrophysics Data System (ADS)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.

    2001-12-01

    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made implementing such real-time activities into classroom activities problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response is lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits the tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). 
These students are responsible for reacting to the alarm

  1. Probabilistic Tsunami Hazard Assessment for Nuclear Power Plants in Japan

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2012-12-01

    Tsunami hazard assessments for nuclear power stations (NPS) in Japan had been conducted by a deterministic method, but probabilistic methods are being adopted following the accident at the Fukushima Daiichi NPS. The deterministic tsunami hazard assessment (DTHA), proposed by the Japan Society of Civil Engineers in 2002 (Yanagisawa et al., 2007, Pageoph), considers various uncertainties through parameter studies. The design tsunami height at the Fukushima NPS was set as 6.1 m, based on parameter studies varying the location, depth, and strike, dip and slip angles of the 1938 off-Fukushima earthquake (M 7.4). The maximum tsunami height for a hypothetical "tsunami earthquake" off Fukushima, similar to the 1896 Sanriku earthquake (Mt 8.2), and that for the 869 Jogan earthquake model (Mw 8.4), were estimated as 15.7 m and 8.9 m, respectively, before the 2011 accident (TEPCO report, 2012). The actual tsunami height at the Fukushima NPS on March 11, 2011 was 12 to 16 m. A probabilistic tsunami hazard assessment (PTHA) has also been proposed by JSCE (An'naka et al., 2007, Pageoph), and was recently adopted in the "Implementation Standard of Tsunami Probabilistic Risk Assessment (PRA) of NPPs" published in 2012 by the Atomic Energy Society of Japan. In PTHA, tsunami hazard curves, or probabilities of exceedance for tsunami heights, are constructed by integrating over aleatory uncertainties. The epistemic uncertainties are treated as branches of logic trees. The logic-tree branches for the earthquake source include the earthquake type, magnitude range, recurrence interval and the parameters of the BPT distribution for recurrent earthquakes. Because no "tsunami earthquake" was recorded off the Fukushima NPS, whether or not a "tsunami earthquake" occurs along the Japan trench off Fukushima was one of the logic-tree branches, and its weight was determined by experts' opinions. 
Possibilities for multi-segment earthquakes are now added as logic-tree branches, after the 2011 Tohoku earthquake, which is considered as

  2. Awareness and understanding of earthquake hazards at school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of education activities performed during the last years are presented here. We describe our experience with primary and intermediate schools, where through hands-on activities we explain the earthquake phenomenon and its effects to kids, and we also illustrate some teaching interventions for high school students. During the past years we lectured classes, led laboratory and field activities, and organized summer stages for selected students. In the current year we are leading a project aimed at training high school students in seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts in safety procedures. To combine the objective of disseminating earthquake culture, also through knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab using inexpensive tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  3. Dynamic evaluation of seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2016-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A + B•(6 - M) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L, A characterizes the average annual rate of strong (M = 6) earthquakes, B determines the balance between magnitude ranges, and C estimates the fractal dimension of the seismic locus in projection onto the Earth's surface. The parameters A, B, and C of the USLE are used to assess, first, the expected maximum magnitude in a time interval at each seismically prone cell of a uniform grid covering the region of interest, and then the corresponding expected ground shaking parameters. After rigorous testing against the available seismic evidence from the past (e.g., historically reported macroseismic intensity or paleo data), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure. The hazard maps for a given territory change dramatically when the methodology is applied to a moving time window of a certain size, e.g. about a decade long for an intermediate-term regional assessment, or exponentially increasing intervals for daily local strong-aftershock forecasting. The dynamical assessment of seismic hazard and risks is illustrated by applications to the territory of the Greater Caucasus and Crimea and to the two-year series of aftershocks of the 11 October 2008 Kurchaloy, Chechnya earthquake, whose case history appears encouraging for further systematic testing as a potential short-term forecasting tool.
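
    The USLE formula can be evaluated directly; the coefficients below are illustrative, not fitted values for any of the regions mentioned.

```python
import math

# log10 N(M, L) = A + B*(6 - M) + C*log10 L  (the USLE from the abstract)
def usle_annual_number(M, L, A=-1.0, B=0.9, C=1.2):
    """Expected annual number N(M, L) of magnitude-M earthquakes in a
    seismically prone area of linear dimension L (km). A, B, C are
    illustrative coefficients, not values fitted to any real region."""
    return 10.0 ** (A + B * (6.0 - M) + C * math.log10(L))

# About 25 M=6 events/yr in a 100 km cell under these made-up coefficients,
# and nearly an order of magnitude fewer at M=7: B controls that balance.
print(usle_annual_number(6.0, 100.0))
print(usle_annual_number(7.0, 100.0))
```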

  4. Maximum Earthquake Magnitude Assessments by Japanese Government Committees (Invited)

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2013-12-01

    The 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and such a gigantic earthquake was not foreseen around Japan. After the 2011 disaster, various government committees in Japan have discussed and assessed the maximum credible earthquake size around Japan, but their values vary without definite consensus. I will review them, with earthquakes along the Nankai Trough as an example. The Central Disaster Management Council (CDMC), under the Cabinet Office, set up a policy for future tsunami disaster mitigation. Possible future tsunamis are classified into two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with a low frequency of occurrence, for which saving people's lives is the first priority, using soft measures such as tsunami hazard maps, evacuation facilities and disaster education. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared. The assessments of L1 and L2 events are left to local governments. The CDMC also assigned M 9.1 as the maximum size of an earthquake along the Nankai trough, then computed the ground shaking and tsunami inundation for several scenario earthquakes. The estimated loss is about ten times that of the 2011 disaster, with maximum casualties of 320,000 and an economic loss of 2 trillion dollars. The Headquarters for Earthquake Research Promotion (HERP), under MEXT, was set up after the 1995 Kobe earthquake and has made long-term forecasts of large earthquakes and published national seismic hazard maps. The future probability of earthquake occurrence, for example in the next 30 years, was calculated from past data on large earthquakes, on the basis of the characteristic earthquake model. The HERP recently revised the long-term forecast of the Nankai trough earthquake; while the 30-year probability (60 - 70 %) is similar to the previous estimate, they noted the size can be M 8 to 9, considering the variability of past
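
    A conditional 30-year probability of the kind HERP publishes can be sketched with a Brownian Passage Time (inverse Gaussian) renewal model; the recurrence parameters below are illustrative only, not HERP's actual values for the Nankai trough.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) renewal model
    with mean recurrence interval mu and aperiodicity alpha."""
    if t <= 0.0:
        return 0.0
    lam = mu / alpha ** 2
    a = math.sqrt(lam / t)
    return (norm_cdf(a * (t / mu - 1.0))
            + math.exp(2.0 * lam / mu) * norm_cdf(-a * (t / mu + 1.0)))

def conditional_prob(elapsed, horizon, mu, alpha):
    """P(event within `horizon` yr | no event in the last `elapsed` yr)."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + horizon, mu, alpha)
    return (f1 - f0) / (1.0 - f0)

# Illustrative parameters only (not HERP's actual Nankai values):
print(conditional_prob(elapsed=70.0, horizon=30.0, mu=110.0, alpha=0.25))
```

    The conditional probability grows as the elapsed quiet time approaches the mean recurrence interval, which is why such forecasts are updated as time passes without an event.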

  5. 2017 one-year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert A.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce the 2017 one-year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one-year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one-year) in five focus areas: Oklahoma-Kansas, the Raton Basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 magnitude (M) ≥ 4 and three M ≥ 5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma-Kansas focus area two earthquakes with M ≥ 4 occurred near Trinidad, Colorado (in the Raton Basin focus area), but no earthquakes with M ≥ 2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared to 2015, which may be related to decreased wastewater injection, caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  6. 2017 One‐year seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.
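The forecast's threshold of "greater than 1% probability of potentially damaging ground shaking in one year" follows from treating event occurrence as a Poisson process. A minimal illustration of that conversion (the rate value below is invented, not taken from the forecast):

```python
import math

def prob_at_least_one(annual_rate: float, years: float = 1.0) -> float:
    """Probability of at least one event in `years` under a Poisson
    occurrence model: P = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

# A zone whose rate of damaging shaking is 0.0101 events/yr just exceeds
# the 1%-in-one-year threshold.
p = prob_at_least_one(0.0101)  # ~0.01005
```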

  7. Earthquake Hazard in the New Madrid Seismic Zone Remains a Concern

    USGS Publications Warehouse

    Frankel, A.D.; Applegate, D.; Tuttle, M.P.; Williams, R.A.

    2009-01-01

    There is broad agreement in the scientific community that a continuing concern exists for a major destructive earthquake in the New Madrid seismic zone. Many structures in Memphis, Tenn., St. Louis, Mo., and other communities in the central Mississippi River Valley region are vulnerable and at risk from severe ground shaking. This assessment is based on decades of research on New Madrid earthquakes and related phenomena by dozens of Federal, university, State, and consulting earth scientists. Considerable interest has developed recently from media reports that the New Madrid seismic zone may be shutting down. These reports stem from published research using global positioning system (GPS) instruments with results of geodetic measurements of strain in the Earth's crust. Because of a lack of measurable strain at the surface in some areas of the seismic zone over the past 14 years, arguments have been advanced that there is no buildup of stress at depth within the New Madrid seismic zone and that the zone may no longer pose a significant hazard. As part of the consensus-building process used to develop the national seismic hazard maps, the U.S. Geological Survey (USGS) convened a workshop of experts in 2006 to evaluate the latest findings in earthquake hazards in the Eastern United States. These experts considered the GPS data from New Madrid available at that time that also showed little to no ground movement at the surface. The experts did not find the GPS data to be a convincing reason to lower the assessment of earthquake hazard in the New Madrid region, especially in light of the many other types of data that are used to construct the hazard assessment, several of which are described here.

  8. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks, which are generally omitted from hazard assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates makes them difficult to forecast even on timescales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones (Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area) with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations, and changes in rate can be detected by applying change point analysis in ETAS-transformed time with methods already developed for Poisson processes.
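The ETAS model cited above (Ogata, 1988) expresses the seismicity rate as a background rate plus aftershock triggering from every past event. A minimal sketch of the conditional intensity, with illustrative (not fitted) parameter values:

```python
def etas_intensity(t, catalog, mu, K, alpha, c, p, m_min):
    """Conditional intensity lambda(t) of an ETAS model (events/day):
    background rate mu plus a modified-Omori contribution from each
    earlier event, scaled by its magnitude above m_min."""
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:
            rate += K * 10.0 ** (alpha * (m_i - m_min)) / (t - t_i + c) ** p
    return rate

# Illustrative parameters and a single past M5 event at t = 0 days
params = dict(mu=0.5, K=0.02, alpha=1.0, c=0.01, p=1.2, m_min=3.0)
rate = etas_intensity(1.0, [(0.0, 5.0)], **params)  # elevated above mu = 0.5
```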

  9. Seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir; Nekrasova, Anastasia

    2014-05-01

    Losses from natural disasters continue to increase, mainly due to poor understanding by the majority of the scientific community, decision makers, and the public of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science has not coped with the challenging changes in Exposure and Vulnerability inflicted by a growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe it to Society to address this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. Any kind of risk estimate R(g) at location g results from a convolution of the natural hazard H(g) with the exposed object under consideration O(g) along with its vulnerability V(O(g)). Note that g could be a point, a line, or a cell on or under the Earth's surface, and that the distribution of hazards, as well as the objects of concern and their vulnerability, could be time-dependent. There exist many different risk estimates even if the same object of risk and the same hazard are involved. This may result from different laws of convolution, as well as from different kinds of vulnerability of an object of risk under specific environments and conditions. Both conceptual issues must be resolved in multidisciplinary, problem-oriented research performed by specialists in the fields of hazard, objects of risk, and object vulnerability, i.e., specialists in earthquake engineering, social sciences, and economics. To illustrate this general concept, we first construct seismic hazard assessment maps based on the Unified Scaling Law for Earthquakes (USLE). The parameters A, B, and C of the USLE, i.e., log N(M,L) = A - B•(M-6) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within an area of linear size L, are used to estimate the expected maximum
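The USLE relation quoted in the abstract, log N(M,L) = A - B·(M-6) + C·log L, can be inverted directly for the expected annual rate. A small sketch (the coefficient values below are placeholders; real values are fitted per region):

```python
import math

def usle_annual_number(M, L_km, A, B, C):
    """Expected annual number of earthquakes of magnitude M within an
    area of linear size L_km, per log10 N = A - B*(M - 6) + C*log10 L."""
    return 10.0 ** (A - B * (M - 6.0) + C * math.log10(L_km))

# Placeholder coefficients: at M = 6 and L = 1 km the rate is 10**A by construction.
n = usle_annual_number(6.0, 1.0, A=-1.0, B=0.9, C=1.2)  # 0.1 events/yr
```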

  10. Seismic hazard assessment in Greece: Revisited

    NASA Astrophysics Data System (ADS)

    Makropoulos, Kostas; Chousianitis, Kostas; Kaviris, George; Kassaras, Ioannis

    2013-04-01

    Greece is the most earthquake-prone country in the eastern Mediterranean territory and one of the most active areas globally. Seismic Hazard Assessment (SHA) is a useful procedure to estimate the expected earthquake magnitude and strong ground-motion parameters, which are necessary for earthquake-resistant design. Several studies on the SHA of Greece are available, constituting the basis of the National Seismic Code. However, the recently available more complete, accurate and homogeneous seismological data (the new earthquake catalogue of Makropoulos et al., 2012), the revised seismic zones determined within the framework of the SHARE project (2012), new empirical attenuation formulas extracted for several regions in Greece, as well as new algorithms of SHA, are innovations that motivated the present study. Here, the expected earthquake magnitude for Greece is evaluated by applying the zone-free, upper-bounded Gumbel's third asymptotic distribution of extreme values method. The peak ground acceleration (PGA), velocity (PGV) and displacement (PGD) are calculated at the seismic bedrock using two methods: (a) Gumbel's first asymptotic distribution of extreme values, since it is valid for initial open-end distributions, and (b) the Cornell-McGuire approach, using the CRISIS2007 (Ordaz et al., 2007) software. The latter takes into account seismic source zones for which seismicity parameters are assigned following a Poisson recurrence model. Thus, each source is characterized by a series of seismic parameters, such as the magnitude recurrence and the recurrence rate for threshold magnitude, while different predictive equations can be assigned to different seismic source zones. Recent available attenuation parameters were considered. Moreover, new attenuation parameters for the very seismically active Corinth Gulf, deduced during this study from recordings of the RASMON accelerometric array, were used.
The hazard parameters such as the most probable annual maximum
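The Gumbel type-I approach used in the study fits an extreme-value distribution to annual maxima (e.g., of PGA). A minimal method-of-moments fit, shown here as an illustration rather than the authors' exact procedure:

```python
import math

EULER_GAMMA = 0.57721566490153286

def fit_gumbel_i(annual_maxima):
    """Method-of-moments fit of Gumbel type-I location u and scale beta."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    u = mean - EULER_GAMMA * beta
    return u, beta

def annual_exceedance(x, u, beta):
    """P(annual maximum > x) = 1 - exp(-exp(-(x - u)/beta))."""
    return 1.0 - math.exp(-math.exp(-(x - u) / beta))

# Toy annual PGA maxima (g); real inputs would come from a catalogue
u, beta = fit_gumbel_i([0.10, 0.12, 0.09, 0.20, 0.15])
```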

  11. Volcano and earthquake hazards in the Crater Lake region, Oregon

    USGS Publications Warehouse

    Bacon, Charles R.; Mastin, Larry G.; Scott, Kevin M.; Nathenson, Manuel

    1997-01-01

    Crater Lake lies in a basin, or caldera, formed by collapse of the Cascade volcano known as Mount Mazama during a violent, climactic eruption about 7,700 years ago. This event dramatically changed the character of the volcano so that many potential types of future events have no precedent there. This potentially active volcanic center is contained within Crater Lake National Park, visited by 500,000 people per year, and is adjacent to the main transportation corridor east of the Cascade Range. Because a lake is now present within the most likely site of future volcanic activity, many of the hazards at Crater Lake are different from those at most other Cascade volcanoes. Also significant are many faults near Crater Lake that clearly have been active in the recent past. These faults, and historic seismicity, indicate that damaging earthquakes can occur there in the future. This report describes the various types of volcano and earthquake hazards in the Crater Lake area, estimates of the likelihood of future events, recommendations for mitigation, and a map of hazard zones. The main conclusions are summarized below.

  12. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    SciTech Connect

    Thenhaus, P.C. )

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-12 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in US history. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern US are generally regarded as only historical phenomena. A fundamental problem in the Eastern US, therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the tools that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term earthquake risk, on the other hand, refers to aspects of the expected damage to manmade structures and to lifelines as a result of the earthquake hazard.

  13. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as subcontractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Determination of the most likely location of the source of the earthquake using a regional seismotectonic database and basic source parameters, and, if and when possible, by the estimation of fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimation of the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and using shear wave velocity distributions (Shake Mapping). 4. Incorporation of strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimation of the losses (damage, casualty and economic) at different levels of sophistication (0, 1 and 2) commensurate with the availability of the inventory of the human-built environment (Loss Mapping). Level 2 analysis of the ELER Software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties and macro-economic loss quantifiers) in urban areas. 
The basic Shake Mapping is similar to the Level 0 and Level 1 analysis; however, options are available for more sophisticated treatment of site response through externally entered data and improvement of the shake map through incorporation
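Loss-estimation tools like ELER, HAZUS, and SELENA typically map a shaking level to damage-state probabilities through fragility curves. A generic lognormal fragility curve is sketched below; the median and dispersion values are illustrative placeholders, not ELER's actual parameters:

```python
import math

def p_damage_exceedance(pga_g, median_g, beta):
    """Probability of reaching or exceeding a damage state at a given PGA,
    using a lognormal fragility curve with median `median_g` (in g) and
    logarithmic standard deviation `beta`."""
    z = math.log(pga_g / median_g) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# At the median PGA the exceedance probability is 0.5 by definition.
p = p_damage_exceedance(0.3, median_g=0.3, beta=0.6)  # 0.5
```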

  14. How Can Museum Exhibits Enhance Earthquake and Tsunami Hazard Resiliency?

    NASA Astrophysics Data System (ADS)

    Olds, S. E.

    2015-12-01

    Creating a natural disaster-ready community requires interoperating scientific, technical, and social systems. In addition to the technical elements that need to be in place, communities and individuals need to be prepared to react when a natural hazard event occurs. Natural hazard awareness and preparedness training and education often take place through informal learning at science centers and formal K-12 education programs, as well as through awareness raising via strategically placed informational tsunami warning signs and placards. Museums and science centers are influential in raising science literacy within a community; however, can science centers enhance earthquake and tsunami resiliency by providing hazard science content and preparedness exhibits? Museum docents and informal educators are uniquely situated within the community. They are transmitters and translators of science information to broad audiences. Through interaction with the public, docents are well positioned to be informants of the knowledge, beliefs, and feelings of science center visitors. They themselves are life-long learners, constantly learning from the museum content around them and sharing this content with visitors. They are also members of the communities where they live. In-depth interviews with museum informal educators and docents were conducted at a science center in the coastal Pacific Northwest. This region has the potential to be struck by a great Mw 9+ earthquake and subsequent tsunami. During the interviews, docents described how they applied learning from natural hazard exhibits at a science visitor center to their daily lives. During the individual interviews, the museum docents described their awareness (knowledge, attitudes, and behaviors) of natural hazards where they live and work, the feelings evoked as they learned about their hazard vulnerability, the extent to which they applied this learning and awareness to their lives, such as creating an evacuation plan, whether

  15. Nationwide Assessment of Seismic Hazard for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.; Mumladze, T.

    2014-12-01

    The work presents a framework for the assessment of seismic hazard at the national level for Georgia. A historical review of the compilation of seismic hazard zoning maps for Georgia made it evident that there were gaps in seismic hazard assessment and that the current normative seismic hazard map needed a careful recalculation. The methodology for the probabilistic assessment of seismic hazard used here includes the following steps: produce a comprehensive catalogue of historical earthquakes (up to 1900) and of the period of instrumental observations, with a uniform magnitude scale; produce models of seismic source zones (SSZ) and their parameterization; develop appropriate ground motion prediction equation (GMPE) models; and develop seismic hazard curves for spectral amplitudes at each period and maps in digital format. Firstly, a new seismic catalogue of Georgia was created, with about 1700 earthquakes of Mw ≥ 4.0 from ancient times to 2012. Secondly, seismic source zones (SSZ) were delineated. The identification of areal SSZ was based on structural geology and on parameters of seismicity and seismotectonics. In constructing the SSZ, the slope of the corresponding active fault plane, the width of the dynamic influence of the fault, and the thickness of the seismoactive layer are taken into account. Each SSZ was then defined by the following parameters: geometry, percentage of focal mechanisms, predominant azimuth and dip angle values, activity rates, maximum magnitude, hypocenter depth distribution, and lower and upper seismogenic depth values. Thirdly, seismic hazard maps were calculated based on a modern approach to selecting and ranking global and regional ground motion prediction equations for the region. Finally, probabilistic seismic hazard assessments in terms of ground acceleration were calculated for the territory of Georgia. 
On the basis of the obtained areal seismic sources, probabilistic seismic hazard maps were calculated showing peak ground acceleration (PGA) and spectral accelerations (SA) at
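The seismicity parameters assigned to each source zone typically include a Gutenberg-Richter activity rate. A hedged sketch of how such a rate is evaluated (the a- and b-values below are invented for illustration):

```python
def gr_rate(m, a, b):
    """Annual rate of earthquakes with magnitude >= m in a source zone,
    from the Gutenberg-Richter relation log10 N = a - b*m."""
    return 10.0 ** (a - b * m)

# Example: a zone with a = 3.0, b = 1.0 yields 1 event/yr with M >= 3.0
rate = gr_rate(3.0, a=3.0, b=1.0)  # 1.0
```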

  16. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country started in 1968 with the installation of the first seismographic station in Tirana, and more effectively after the Albanian Seismological Network began operating in 1976. There is rich evidence that over two thousand years Albania has been hit by many disastrous earthquakes; the highest estimated magnitude is 7.2. After the end of the Communist era and the opening of the country, a construction boom started in Albania that continues even now, making it all the more indispensable to produce accurate seismic hazard maps to mitigate the damage from probable future earthquakes. Some efforts have already been made in seismic hazard assessment (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique has been used in a joint work between the Seismological Institute of Tirana, Albania, and the Department of Geophysics of the Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically compiled for this seismic hazard analysis and contains 530 events with magnitude M>4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, assigning to each the most representative fault characteristics. The computer code used for the hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were Ambraseys et al., 1996; Sabetta and Pugliese, 1996; and Margaris et al., 2001. The hazard maps are obtained for 100, 475, 2375 and 4746 year return periods, for rock soil conditions. Analyzing the map of PGA values for a return period of 475 years, 5 zones with different ranges of PGA values can be separated: 1) the zone with PGA (0.20 - 0.24 g), 1.8 percent of Albanian territory; 2) the zone with PGA (0.16 - 0.20 g), 22.6 percent of Albanian territory; 3) the
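The 475-year return period cited above is the conventional equivalent of a 10% probability of exceedance in a 50-year exposure time, under a Poisson assumption. The conversion:

```python
import math

def return_period(p_exceed, exposure_years):
    """Return period T implied by an exceedance probability p over a given
    exposure time, assuming Poisson occurrence: p = 1 - exp(-t/T)."""
    return -exposure_years / math.log(1.0 - p_exceed)

T = return_period(0.10, 50.0)  # ~475 years
```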

  17. Earthquake hazard mapping for lifeline engineering Coquitlam, British Columbia

    SciTech Connect

    Gohl, W.B.; Hawson, H.H.; Dou, H.; Nyberg, N.; Lee, R.; Wong, H.

    1995-12-31

    A series of maps plotted at a 1:15,000 scale were prepared to illustrate geotechnical aspects of seismic hazard for the 475 year return period earthquake event within the City of Coquitlam located in the Vancouver Lower Mainland of British Columbia. The maps were prepared to facilitate evaluation of lifeline damage potential within the City of Coquitlam (e.g. roads, sewers, water supply lines, oil/gas pipelines, power lines, compressor/pumping stations, water reservoirs, bridges, and rail lines) and to assist in evaluation of the impact of seismic ground shaking on new infrastructure.

  18. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    SciTech Connect

    Blanchard, A.

    2000-02-28

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program.

  19. The Pacific Northwest; linkage between earthquake and volcano hazards

    USGS Publications Warehouse

    Crosson, R.S.

    1990-01-01

    The Pacific Northwest (Oregon, Washington, and northern California) is experiencing rapid industrial and population growth. The same conditions that make the region attractive (close proximity to both mountains and oceans, volcanoes and spectacular inland waters) also present significant geologic hazards that are easily overlooked in the normal timetable of human activities. The catastrophic eruption of Mount St. Helens 10 years ago serves as a dramatic reminder of the forces of nature that can be unleashed through volcanism. Other volcanoes, such as Mount Rainier, a majestic symbol of Washington, or Mount Hood in Oregon, lie closer to population centers and could present far greater hazards should they become active. Earthquakes may affect even larger regions, producing more cumulative damage. 

  20. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. By contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude over all earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included in our computer program not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014). In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. 
Its epicenter was located
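The performance-based (Kramer and Mayfield) framework integrates the conditional probability of liquefaction over the entire ground-motion hazard curve rather than using a single scenario. A highly simplified sketch of that integration; the hazard bins and the conditional-probability function below are placeholders, not values from the study:

```python
def annual_liquefaction_rate(hazard_bins, p_liq_given_shaking):
    """Mean annual rate of liquefaction: sum over ground-motion bins of
    the annual occurrence rate of each shaking level times the
    conditional probability of liquefaction at that level."""
    return sum(rate * p_liq_given_shaking(pga)
               for pga, rate in hazard_bins)

# Placeholder hazard bins (PGA in g, annual occurrence rate) and a toy
# conditional-probability model
bins = [(0.1, 0.02), (0.2, 0.005), (0.4, 0.001)]
rate = annual_liquefaction_rate(bins, lambda pga: min(1.0, pga))
return_period_liq = 1.0 / rate  # years
```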

  1. KSC VAB Aeroacoustic Hazard Assessment

    DTIC Science & Technology

    2010-07-01

    Justin M. Oliveira, Sabrina Yedo, and Michael D. Campbell (NASA, Kennedy Space Center, FL, 32899), with Joseph... ...of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace, who modeled transmission...

  2. Using a physics-based earthquake simulator to evaluate seismic hazard in NW Iran

    NASA Astrophysics Data System (ADS)

    Khodaverdian, A.; Zafarani, H.; Rahimian, M.

    2016-07-01

    NW Iran is a region of active deformation in the Eurasia-Arabia collision zone. This high-strain field has caused intensive faulting accompanied by several major (M > 6.5) earthquakes, as is evident from historical records. Because seismic data (i.e. instrumental and historical catalogues) are either short, or inaccurate and inhomogeneous, physics-based long-term simulations are beneficial for better assessing seismic hazard. In this study, a deterministic seismicity model comprising the major active faults is first constructed and used to generate a synthetic catalogue of large-magnitude (M > 5.5) earthquakes. The frequency-magnitude distribution of the synthetic earthquake catalogue, which is based on the physical characteristics and slip rates of the mapped faults, is consistent with the empirical distribution evaluated from the record of instrumental and historical events. The obtained results are also in accordance with palaeoseismic studies and other independent kinematic deformation models of the Iranian Plateau. Using the synthetic catalogue, the characteristic magnitude of each of the 16 active faults in the study area is determined. The magnitudes and epicentres of these earthquakes are comparable with the historical records. Large-earthquake recurrence times and their variations are evaluated, both for individual faults and for the region as a whole. Goodness-of-fit tests revealed that the recurrence times can be well described by the Weibull distribution. Time-dependent conditional probabilities for large earthquakes in the study area are also estimated for different time intervals. The resulting synthetic catalogue can be utilized as a useful data set for hazard and risk assessment instead of the short, incomplete and inhomogeneous available catalogues.
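With recurrence times found to follow a Weibull distribution, the time-dependent conditional probability of a large earthquake in the next Δt years, given t years of quiescence, follows directly from the Weibull CDF. A sketch with invented shape and scale parameters:

```python
import math

def weibull_cdf(t, shape, scale):
    """CDF of the Weibull distribution: F(t) = 1 - exp(-(t/scale)**shape)."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def conditional_probability(t_elapsed, dt, shape, scale):
    """P(event within the next dt years | no event in the past t_elapsed
    years), for Weibull-distributed recurrence intervals."""
    F1 = weibull_cdf(t_elapsed, shape, scale)
    F2 = weibull_cdf(t_elapsed + dt, shape, scale)
    return (F2 - F1) / (1.0 - F1)

# Invented parameters: shape > 1 means the hazard grows with elapsed time.
p = conditional_probability(t_elapsed=200.0, dt=50.0, shape=2.0, scale=400.0)
```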

  3. Seismic Hazard and Risk Assessment in Multi-Hazard Prone Urban Areas: The Case Study of Cologne, Germany

    NASA Astrophysics Data System (ADS)

    Tyagunov, S.; Fleming, K.; Parolai, S.; Pittore, M.; Vorogushyn, S.; Wieland, M.; Zschau, J.

    2012-04-01

    Most hazard and risk assessment studies usually analyze and represent different kinds of hazards and risks separately, although risk assessment and mitigation programs in multi-hazard prone urban areas should take into consideration possible interactions of different hazards. This is particularly true for communities located in seismically active zones, where, on the one hand, earthquakes are capable of triggering other types of hazards, while, on the other hand, one should bear in mind that temporal coincidence or succession of different hazardous events may influence the vulnerability of the existing built environment and, correspondingly, the level of the total risk. Therefore, possible inter-dependencies and inter-influences of different hazards should be reflected properly in the hazard, vulnerability and risk analyses. This work presents some methodological aspects and preliminary results of a study being implemented within the framework of the MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) project. One of the test cases of the MATRIX project is the city of Cologne, which is one of the largest cities of Germany. The area of Cologne, being exposed to windstorm, flood and earthquake hazards, has already been considered in comparative risk assessments. However, possible interactions of these different hazards have been neglected. The present study is aimed at the further development of a holistic multi-risk assessment methodology, taking into consideration possible time coincidence and inter-influences of flooding and earthquakes in the area.

  4. Reducing Vulnerability of Ports and Harbors to Earthquake and Tsunami Hazards

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

    Recent scientific research suggests the Pacific Northwest could experience catastrophic earthquakes in the near future, both from distant and local sources, posing a significant threat to coastal communities. Damage could result from numerous earthquake-related hazards, such as severe ground shaking, soil liquefaction, landslides, land subsidence/uplift, and tsunami inundation. Because of their geographic location, ports and harbors are especially vulnerable to these hazards. Ports and harbors, however, are important components of many coastal communities, supporting numerous activities critical to the local and regional economy and possibly serving as vital post-event, response-recovery transportation links. A collaborative, multi-year initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to earthquake and tsunami hazards, involving Oregon Sea Grant (OSG), Washington Sea Grant (WSG), the National Oceanic and Atmospheric Administration Coastal Services Center (CSC), and the U.S. Geological Survey Center for Science Policy (CSP). Specific products of this research, planning, and outreach initiative include a regional stakeholder issues and needs assessment, a community-based mitigation planning process, a Geographic Information System (GIS)-based vulnerability assessment methodology, an educational website and a regional data archive. This paper summarizes these efforts, including results of two pilot port-harbor community projects, one in Yaquina Bay, Oregon and the other in Sinclair Inlet, Washington. Finally, plans are outlined for outreach to other port and harbor communities in the Pacific Northwest and beyond, using "getting started" workshops and a web-based tutorial.

  5. Cruise report for 01-99-SC: southern California earthquake hazards project

    USGS Publications Warehouse

    Normark, William R.; Reid, Jane A.; Sliter, Ray W.; Holton, David; Gutmacher, Christina E.; Fisher, Michael A.; Childs, Jonathan R.

    1999-01-01

    The focus of the Southern California Earthquake Hazards project is to identify the landslide and earthquake hazards and related ground-deformation processes occurring in the offshore areas that have significant potential to impact the inhabitants of the Southern California coastal region. The project activity is supported through the Coastal and Marine Geology Program of the Geologic Division of the U.S. Geological Survey (USGS) and is a component of the Geologic Division's Science Strategy under Goal 1, Conduct Geologic Hazard Assessments for Mitigation Planning (Bohlen et al., 1998). The project research is specifically stated under Activity 1.1.2 of the Science Strategy: Earthquake Hazard Assessments and Loss Reduction Products in Urban Regions. This activity involves "research, seismic and geodetic monitoring, field studies, geologic mapping, and analyses needed to provide seismic hazard assessments of major urban centers in earthquake-prone regions including adjoining coastal and offshore areas." The southern California urban areas, which form the most populated urban corridor along the U.S. Pacific margin, are among a few specifically designated for special emphasis under the Division's science strategy (Bohlen et al., 1998). The primary objective of the project is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this objective, we are conducting field investigations to observe the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (Fig. 1). In addition, acoustic imaging should help determine the subsurface dimensions of the faults and identify the size and frequency of submarine landslides, both of which are necessary for evaluating the potential for

  6. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
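    The empirical casualty model described above pairs a fatality-rate curve with population exposure per shaking-intensity level. A minimal sketch, assuming a lognormal fatality-rate curve with entirely hypothetical parameter values (theta, beta) and exposure counts, not PAGER's calibrated country-specific values:

```python
import math

def fatality_rate(mmi, theta, beta):
    """Lognormal fatality-rate curve nu(S) = Phi(ln(S / theta) / beta).
    theta and beta stand in for country-specific parameters (hypothetical here)."""
    return 0.5 * (1.0 + math.erf(math.log(mmi / theta) / (beta * math.sqrt(2.0))))

def expected_fatalities(exposure, theta, beta):
    """Sum the fatality rate times the population exposed at each intensity level."""
    return sum(pop * fatality_rate(mmi, theta, beta) for mmi, pop in exposure.items())

# Hypothetical exposure: population counts by shaking intensity (MMI)
exposure = {6.0: 500_000, 7.0: 200_000, 8.0: 50_000}
loss = expected_fatalities(exposure, theta=12.0, beta=0.2)
```

    Because the fatality rate rises steeply with intensity, the relatively few people exposed to the strongest shaking can dominate the loss estimate.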

  7. Salient beliefs about earthquake hazards and household preparedness.

    PubMed

    Becker, Julia S; Paton, Douglas; Johnston, David M; Ronan, Kevin R

    2013-09-01

    Prior research has found little or no direct link between beliefs about earthquake risk and household preparedness. Furthermore, only limited work has been conducted on how people's beliefs influence the nature and number of preparedness measures adopted. To address this gap, 48 qualitative interviews were undertaken with residents in three urban locations in New Zealand subject to seismic risk. The study aimed to identify the diverse hazard and preparedness-related beliefs people hold and to articulate how these are influenced by public education to encourage preparedness. The study also explored how beliefs and competencies at personal, social, and environmental levels interact to influence people's risk management choices. Three main categories of beliefs were found: hazard beliefs; preparedness beliefs; and personal beliefs. Several salient beliefs found previously to influence the preparedness process were confirmed by this study, including beliefs related to earthquakes being an inevitable and imminent threat, self-efficacy, outcome expectancy, personal responsibility, responsibility for others, and beliefs related to denial, fatalism, normalization bias, and optimistic bias. New salient beliefs were also identified (e.g., preparedness being a "way of life"), as well as insight into how some of these beliefs interact within the wider informational and societal context.

  8. Exploration of resilience assessments for natural hazards

    NASA Astrophysics Data System (ADS)

    Lo Jacomo, Anna; Han, Dawei; Champneys, Alan

    2017-04-01

    measures of resilience are hazard dependent, and require hazard information. In those cases, the type of hazard information required varies from long term information such as the general probability of occurrence of a particular hazard, to short term information such as the observed damage following a specific earthquake occurrence. The required information also varies from national scale, such as census data, to local scale, such as stakeholder perceptions of a threat. This is shown through examples of resilience assessments, along with a discussion of their ability to inform decision making.

  9. Evansville Area Earthquake Hazards Mapping Project (EAEHMP) - Progress Report, 2008

    USGS Publications Warehouse

    Boyd, Oliver S.; Haase, Jennifer L.; Moore, David W.

    2009-01-01

    Maps of surficial geology, deterministic and probabilistic seismic hazard, and liquefaction potential index have been prepared by various members of the Evansville Area Earthquake Hazard Mapping Project for seven quadrangles in the Evansville, Indiana, and Henderson, Kentucky, metropolitan areas. The surficial geologic maps feature 23 types of surficial geologic deposits, artificial fill, and undifferentiated bedrock outcrop and include alluvial and lake deposits of the Ohio River valley. Probabilistic and deterministic seismic hazard and liquefaction hazard mapping is made possible by drawing on a wealth of information including surficial geologic maps, water well logs, and in-situ testing profiles using the cone penetration test, standard penetration test, down-hole shear wave velocity tests, and seismic refraction tests. These data were compiled and collected with contributions from the Indiana Geological Survey, Kentucky Geological Survey, Illinois State Geological Survey, United States Geological Survey, and Purdue University. Hazard map products are in progress and are expected to be completed by the end of 2009, with a public roll out in early 2010. Preliminary results suggest that there is a 2 percent probability that peak ground accelerations of about 0.3 g will be exceeded in much of the study area within 50 years, which is similar to the 2002 USGS National Seismic Hazard Maps for a firm rock site value. Accelerations as high as 0.4-0.5 g may be exceeded along the edge of the Ohio River basin. Most of the region outside of the river basin has a low liquefaction potential index (LPI), where the probability that LPI is greater than 5 (that is, there is a high potential for liquefaction) for a M7.7 New Madrid type event is only 20-30 percent. Within the river basin, most of the region has high LPI, where the probability that LPI is greater than 5 for a New Madrid type event is 80-100 percent.
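    The hazard level quoted above, a 2 percent probability of exceedance in 50 years, maps to a mean return period under the usual Poisson occurrence assumption. A small sketch of that conversion:

```python
import math

def return_period(p_exceed, t_years):
    """Mean return period implied by an exceedance probability p over t years,
    assuming Poissonian (memoryless) occurrence: T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

# 2% in 50 years corresponds to roughly a 2,475-year mean return period
T = return_period(0.02, 50.0)
```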

  10. Numerical earthquake model of the 20 April 2015 southern Ryukyu subduction zone M6.4 event and its impact on seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong

    2015-10-01

    The M6.4 earthquake that took place on 20 April 2015 off the shore of eastern Taiwan was the largest event in the vicinity of Taiwan during 2015. The mainshock was located in the southern Ryukyu subduction zone, which is the interface between the Philippine Sea Plate and the Eurasian Plate. People in Taipei experienced strong ground shaking for more than 40 s, even though the epicenter was located more than 150 km away. In order to understand the origin of ground motions from this earthquake and how it caused such strong shaking in Taipei, a numerical earthquake model is analyzed, including models of source rupture and wave propagation. First, a joint source inversion was performed using teleseismic body wave and local ground motion data. Source inversion results show that a large slip occurred near the hypocenter, which rapidly released seismic energy in the first 2 s. Then, the rupture propagated toward the shallow fault plane. A large amount of seismic energy was released during this rupture stage, which slipped for more than 8 s before the end of the rupture. The estimated stress drop is 2.48 MPa, which is consistent with values for subduction zone earthquakes. Forward simulation using this inverted source rupture model and a 3D seismic velocity model based on the spectral-element method was then performed. Results indicate that the strong ground motion in Taipei resulted from two factors: (1) the Taipei basin amplification effect and (2) the specific source radiation pattern. The results of this numerical earthquake model imply that future subduction zone events that occur offshore eastern Taiwan are likely to cause relatively strong ground shaking in northern Taiwan, especially in the Taipei metropolitan area.
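    For context, the reported stress drop can be related to seismic moment and rupture dimensions through the standard circular-crack expression; the rupture radius below is an illustrative assumption, not a value from the study:

```python
def moment_from_magnitude(mw):
    """Seismic moment in N*m from moment magnitude (Hanks and Kanamori, 1979)."""
    return 10.0 ** (1.5 * mw + 9.05)

def stress_drop_circular(m0, radius_m):
    """Static stress drop in Pa for a circular crack of radius R:
    delta_sigma = (7 / 16) * M0 / R**3."""
    return (7.0 / 16.0) * m0 / radius_m ** 3

m0 = moment_from_magnitude(6.4)           # ~4.5e18 N*m
# A rupture radius of about 9.2 km (assumed) reproduces a stress drop
# near the 2.48 MPa value reported above
dsigma = stress_drop_circular(m0, 9.2e3)
```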

  11. Numerical earthquake model of the 20 April 2015 southern Ryukyu subduction zone M6.4 event and its impact on seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Lee, S. J.

    2015-12-01

    The M6.4 earthquake that took place on 20 April 2015 off the shore of eastern Taiwan was the largest event that occurred in the vicinity of Taiwan during 2015. The mainshock was located in the southern Ryukyu subduction zone, which is the interface between the Philippine Sea Plate and the Eurasian Plate. People in Taipei experienced strong ground shaking for more than 40 s, even though the epicenter was located more than 150 km away. In order to understand the origin of this earthquake and how it caused such strong shaking in Taipei, a numerical earthquake model is analyzed, including models of source rupture and wave propagation. First, a joint source inversion is performed using teleseismic body wave and local ground motion data. Source inversion results show that a large slip occurred near the hypocenter, which rapidly released seismic energy in the first 2 s. Then, the rupture propagated toward the shallow fault plane. A large amount of seismic energy was released during this rupture stage, which slipped for more than 8 s before the end of the rupture. The estimated stress drop is 2.48 MPa, which is consistent with values for subduction zone earthquakes. Forward simulation using this inverted source rupture model based on the spectral-element method is then performed. Results indicate that the strong ground motion in Taipei resulted from two factors: (1) the Taipei basin amplification effect and (2) the specific source radiation pattern. The results of this numerical earthquake model imply that future subduction zone events that occur offshore eastern Taiwan are likely to cause relatively strong ground shaking in northern Taiwan, especially in the Taipei metropolitan area.

  12. Secondary impact hazard assessment

    NASA Astrophysics Data System (ADS)

    1986-06-01

    A series of light gas gun shots (4 to 7 km/sec) were performed with 5 mg nylon and aluminum projectiles to determine the size, mass, velocity, and spatial distribution of spall and ejecta from a number of graphite/epoxy targets. Similar determinations were also performed on a few aluminum targets. Target thickness and material were chosen to be representative of proposed Space Station structure. The data from these shots and other information were used to predict the hazard to Space Station elements from secondary particles resulting from impacts of micrometeoroids and orbital debris on the Space Station. This hazard was quantified as an additional flux over and above the primary micrometeoroid and orbital debris flux that must be considered in the design process. In order to simplify the calculations, ejecta and spall mass were assumed to scale directly with the energy of the projectile. Other scaling systems may be closer to reality. The secondary particles considered are only those particles that may impact other structure immediately after the primary impact. The addition to the orbital debris problem from these primary impacts was not addressed. Data from this study should be fed into the orbital debris model to see if Space Station secondaries make a significant contribution to orbital debris. The hazard to a Space Station element from secondary particles above and beyond the micrometeoroid and orbital debris hazard is categorized in terms of two factors: (1) the 'view factor' of the element to other Space Station structure, or the geometry of placement of the element, and (2) the sensitivity to damage, stated in terms of energy. Several example cases were chosen: the Space Station module windows, the windows of a Shuttle docked to the Space Station, the habitat module walls, and the photovoltaic solar cell arrays. For the examples chosen, the secondary flux contributed no more than 10 percent to the total flux (primary and secondary) above a given calculated

  13. Secondary impact hazard assessment

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A series of light gas gun shots (4 to 7 km/sec) were performed with 5 mg nylon and aluminum projectiles to determine the size, mass, velocity, and spatial distribution of spall and ejecta from a number of graphite/epoxy targets. Similar determinations were also performed on a few aluminum targets. Target thickness and material were chosen to be representative of proposed Space Station structure. The data from these shots and other information were used to predict the hazard to Space Station elements from secondary particles resulting from impacts of micrometeoroids and orbital debris on the Space Station. This hazard was quantified as an additional flux over and above the primary micrometeoroid and orbital debris flux that must be considered in the design process. In order to simplify the calculations, ejecta and spall mass were assumed to scale directly with the energy of the projectile. Other scaling systems may be closer to reality. The secondary particles considered are only those particles that may impact other structure immediately after the primary impact. The addition to the orbital debris problem from these primary impacts was not addressed. Data from this study should be fed into the orbital debris model to see if Space Station secondaries make a significant contribution to orbital debris. The hazard to a Space Station element from secondary particles above and beyond the micrometeoroid and orbital debris hazard is categorized in terms of two factors: (1) the 'view factor' of the element to other Space Station structure, or the geometry of placement of the element, and (2) the sensitivity to damage, stated in terms of energy. Several example cases were chosen: the Space Station module windows, the windows of a Shuttle docked to the Space Station, the habitat module walls, and the photovoltaic solar cell arrays. For the examples chosen, the secondary flux contributed no more than 10 percent to the total flux (primary and secondary) above a given calculated
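    Since ejecta and spall mass are assumed to scale with projectile energy, the kinetic energies of the test shots set the scale of the experiment. A quick check of the 5 mg, 4-7 km/sec shots:

```python
def projectile_energy_joules(mass_kg, velocity_m_s):
    """Kinetic energy of a hypervelocity projectile: E = 0.5 * m * v**2."""
    return 0.5 * mass_kg * velocity_m_s ** 2

# The 5 mg projectiles at 4 to 7 km/sec span roughly 40 to 122.5 J
e_low = projectile_energy_joules(5e-6, 4000.0)
e_high = projectile_energy_joules(5e-6, 7000.0)
```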

  14. Errors in Seismic Hazard Assessment are Creating Huge Human Losses

    NASA Astrophysics Data System (ADS)

    Bela, J.

    2015-12-01

    The current practice of representing earthquake hazards to the public based upon their perceived likelihood or probability of occurrence is proven now, by the global record of actual earthquakes, to be not only erroneous and unreliable, but also too deadly! Earthquake occurrence is sporadic, and therefore assumptions of earthquake frequency and return period are not only misleading but categorically false. More than 700,000 people lost their lives between 2000 and 2011, wherein 11 of the world's deadliest earthquakes occurred in locations where probability-based seismic hazard assessments had predicted only low seismic hazard. Unless seismic hazard assessment and the setting of minimum earthquake design safety standards for buildings and bridges are based on a more realistic deterministic recognition of "what can happen" rather than on what mathematical models suggest is "most likely to happen," such huge human losses can only be expected to continue! The actual earthquake events that did occur were at or near the maximum potential-size event that either had already occurred in the past or was geologically known to be possible. Haiti's M7 earthquake of 2010 (with > 222,000 fatalities) meant the dead could not even be buried with dignity. Japan's catastrophic Tohoku earthquake of 2011, a M9 megathrust event, unleashed a tsunami that not only obliterated coastal communities along the northern Japanese coast, but also claimed > 20,000 lives. This tsunami flooded nuclear reactors at Fukushima, causing 4 explosions and 3 reactors to melt down. But while this history of huge human losses due to erroneous and misleading seismic hazard estimates, despite its wrenching pain, cannot be unlived; if faced with courage and a more realistic deterministic estimate of "what is possible," it need not be lived again. An objective testing of the results of global probability-based seismic hazard maps against real occurrences has never been done by the

  15. Preparation of Synthetic Earthquake Catalogue and Tsunami Hazard Curves in Marmara Sea using Monte Carlo Simulations

    NASA Astrophysics Data System (ADS)

    Bayraktar, Başak; Özer Sözdinler, Ceren; Necmioǧlu, Öcal; Meral Özel, Nurcan

    2017-04-01

    The Marmara Sea and its surroundings form one of the most populated areas in Turkey. Many densely populated cities, such as the megacity Istanbul with a population of more than 14 million, along with a great number of industrial facilities of large capacity and potential, refineries, ports and harbors, are located along the coasts of the Marmara Sea. The region is highly seismically active, and a wide range of studies has addressed the fault mechanisms, seismic activity, earthquakes and triggered tsunamis in the Sea of Marmara. Historical documents reveal that the region has experienced many earthquakes and tsunamis in the past. According to Altinok et al. (2011), 35 tsunami events occurred in the Marmara Sea between 330 BC and 1999. As earthquakes are expected in the Marmara Sea with the future breaking of segments of the North Anatolian Fault (NAF), the region should be investigated for the possibility of tsunamis generated by earthquakes of specific return periods. This study aims to perform a probabilistic tsunami hazard analysis in the Marmara Sea. For this purpose, the possible sources of tsunami scenarios are specified by compiling the earthquake catalogues, historical records and scientific studies conducted in the region. After compiling all these data, a synthetic earthquake and tsunami catalogue is prepared using Monte Carlo simulations. For specific return periods, the possible epicenters, rupture lengths, widths and displacements are determined with Monte Carlo simulations, treating the angles of the fault segments as deterministic. For each earthquake of the synthetic catalogue, the tsunami wave heights will be calculated at specific locations along the Marmara Sea. As a further objective, this study will determine the tsunami hazard curves for specific locations in the Marmara Sea, including the tsunami wave heights and their probability of exceedance. This work is supported by SATREPS-MarDim Project (Earthquake and Tsunami Disaster Mitigation in the
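    One common way to build such a synthetic catalogue is inverse-CDF sampling of magnitudes from a doubly truncated Gutenberg-Richter distribution; the b-value and magnitude bounds below are illustrative, not the values adopted for the Marmara region:

```python
import math
import random

def sample_gr_magnitude(rng, b, m_min, m_max):
    """Inverse-CDF draw from a doubly truncated Gutenberg-Richter distribution:
    F(m) = (1 - 10**(-b * (m - m_min))) / (1 - 10**(-b * (m_max - m_min)))."""
    u = rng.random()
    c = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return m_min - math.log10(1.0 - u * c) / b

rng = random.Random(42)
# A toy synthetic catalogue: 1000 events with b = 1.0 and 5.0 <= M <= 7.5
catalogue = [sample_gr_magnitude(rng, b=1.0, m_min=5.0, m_max=7.5)
             for _ in range(1000)]
```

    Small magnitudes dominate the draw, as the exponential magnitude-frequency relation implies; rupture length, width and displacement would then be assigned per event, for example through magnitude scaling relations.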

  16. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    USGS Publications Warehouse

    Thenhaus, P.C.

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-1812 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt, and caused damage, at far greater distances than any other earthquakes in U.S. history. The large population currently living within the damage area of these earthquakes means that widespread destruction and loss of life are likely if the sequence were repeated. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern U.S. (1755, Cape Ann, Mass.; 1811-12, New Madrid, Mo.; 1886, Charleston, S.C.; and 1897, Giles County, Va.) are generally regarded as only historical phenomena (fig. 1). The social memory of these earthquakes no longer exists. A fundamental problem in the Eastern U.S., therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the "tools" that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term "earthquake risk," on the other hand, refers to aspects of the expected damage to man-made structures and to lifelines as a result of the earthquake hazard.

  17. Transparent Global Seismic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen

    2013-04-01

    Vulnerability to earthquakes is increasing, yet advanced reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, that integrates all above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and to share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and data on their vulnerability, for loss assessment around the globe. 
Furthermore, for a true integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits
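    The hazard-exposure-vulnerability combination at the core of such a loss assessment can be sketched with a discretized hazard curve; the intensity levels, rates and damage ratios below are hypothetical, and the calculation is a generic expected-annual-loss sum rather than the OpenQuake implementation:

```python
def expected_annual_loss(hazard_curve, vulnerability, exposed_value):
    """Combine a hazard curve (annual exceedance rate per intensity level),
    a vulnerability function (mean damage ratio per level) and exposed value."""
    eal = 0.0
    for i, (intensity, rate_exceed) in enumerate(hazard_curve):
        # Occurrence rate of this bin = difference of successive exceedance rates
        next_rate = hazard_curve[i + 1][1] if i + 1 < len(hazard_curve) else 0.0
        eal += (rate_exceed - next_rate) * vulnerability[intensity] * exposed_value
    return eal

# Hypothetical three-level hazard curve (PGA in g, annual exceedance rate)
hazard = [(0.1, 0.02), (0.2, 0.005), (0.4, 0.001)]
damage = {0.1: 0.01, 0.2: 0.10, 0.4: 0.40}
eal = expected_annual_loss(hazard, damage, exposed_value=1_000_000)
```

    Rare, strong shaking contributes through a low rate multiplied by a high damage ratio, so both tails of the hazard curve matter to the total.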

  18. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  19. Large Historical Earthquakes and Tsunami Hazards in the Western Mediterranean: Source Characteristics and Modelling

    NASA Astrophysics Data System (ADS)

    Harbi, Assia; Meghraoui, Mustapha; Belabbes, Samir; Maouche, Said

    2010-05-01

    The western Mediterranean region was the site of numerous large earthquakes in the past. Most of these earthquakes are located at the east-west-trending Africa-Eurasia plate boundary and along the coastline of North Africa. The most recent recorded tsunamigenic earthquake occurred in 2003 at Zemmouri-Boumerdes (Mw 6.8) and generated a ~2-m-high tsunami wave. The destructive wave affected the Balearic Islands and Almeria in southern Spain and Carloforte in southern Sardinia (Italy). The earthquake provided a unique opportunity to gather instrumental records of seismic waves and tide gauges in the western Mediterranean. A database that includes a historical catalogue of main events, seismic sources and related fault parameters was prepared in order to assess the tsunami hazard of this region. In addition to the analysis of the 2003 records, we study the 1790 Oran and 1856 Jijel historical tsunamigenic earthquakes (Io = IX and X, respectively), which provide detailed observations on the heights and extension of past tsunamis and on damage in coastal zones. We performed the modelling of wave propagation using the NAMI-DANCE code and tested different fault sources against synthetic tide gauges. We observe that the characteristics of the seismic sources control the size and directivity of tsunami wave propagation on both the northern and southern coasts of the western Mediterranean.

  20. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 15. Tsunamis, Seiches, and Landslide-Induced Water Waves.

    DTIC Science & Technology

    1979-11-01

    Landslides or subaqueous slides can produce extreme wave elevations (e.g., Lituya Bay, Alaska). The largest landslide-generated wave that has been documented was generated in 1958 by a landslide that was triggered by an earthquake and slid into Lituya Bay, Alaska. Earthquakes also generated waves in Lituya Bay in 1853, 1874, and 1936 (Miller, 1960). Subaqueous landslides triggered by the 1964 Alaskan earthquake caused widespread

  1. Probabilistic Seismic Hazard Assessment from Incomplete and Uncertain Data

    NASA Astrophysics Data System (ADS)

    Smit, Ansie; Kijko, Andrzej

    2016-04-01

    A question that frequently arises in seismic hazard assessment is: why are our assessments so poor? Often the answer is that the standard applied methodologies do not take into account the nature of seismic event catalogues. In reality, these catalogues are incomplete, their magnitude estimates are uncertain, and there is often a significant discrepancy between the empirical data and the applied occurrence model. Most probabilistic seismic hazard analysis procedures require knowledge of at least three seismic source parameters: the mean seismic activity rate λ, the Gutenberg-Richter b-value, and the area-characteristic (seismogenic source) maximum possible earthquake magnitude Mmax. In almost all currently used seismic hazard assessment procedures utilizing these three parameters, it is explicitly assumed that all three remain constant over a specified time and space. However, closer examination of most earthquake catalogues indicates that there are significant spatial and temporal variations in the seismic activity rate λ as well as in the Gutenberg-Richter b-value. In the proposed methodology, the maximum likelihood estimation of these earthquake hazard parameters takes into account the incompleteness of the catalogues, the uncertainty in earthquake magnitude determination, and the uncertainty associated with the applied earthquake occurrence models. The uncertainty in the earthquake occurrence models is introduced by assuming that both the mean seismic activity rate λ and the Gutenberg-Richter b-value are random variables, each described by a Gamma distribution. The approach results in the extension of the classic frequency-magnitude Gutenberg-Richter relation and the Poisson distribution of the number of earthquakes to their compound counterparts. The proposed procedure is applied in the estimation of the seismic parameters for the area of Ceres-Tulbagh, South Africa, which experienced the strongest earthquake in the country's recorded history. In this example it is
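    The compounding step can be illustrated numerically: when the Poisson rate λ is itself Gamma-distributed, the marginal earthquake count follows a negative binomial distribution. A small simulation check, with arbitrary Gamma parameters rather than estimates for any real region:

```python
import math
import random

def neg_binomial_pmf(n, k, theta):
    """Marginal count pmf when the Poisson rate is Gamma(k, theta) distributed,
    i.e. the compound (negative binomial) counterpart of the Poisson model."""
    p = theta / (1.0 + theta)
    return (math.gamma(n + k) / (math.gamma(k) * math.factorial(n))
            * (1.0 - p) ** k * p ** n)

def simulate_counts(rng, k, theta, trials):
    """Draw a Gamma activity rate, then a Poisson count, for each trial."""
    counts = []
    for _ in range(trials):
        lam = rng.gammavariate(k, theta)
        # Poisson draw by CDF inversion (adequate for small rates)
        n, term, u = 0, math.exp(-lam), rng.random()
        cdf = term
        while u > cdf:
            n += 1
            term *= lam / n
            cdf += term
        counts.append(n)
    return counts

rng = random.Random(1)
counts = simulate_counts(rng, k=2.0, theta=1.5, trials=20000)
empirical_p0 = counts.count(0) / len(counts)
analytic_p0 = neg_binomial_pmf(0, 2.0, 1.5)  # (1 / 2.5)**2 = 0.16
```

    The empirical fraction of zero-count trials matches the analytic negative binomial probability, which is the essence of replacing the Poisson model with its compound counterpart.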

  2. Hazard maps of earthquake induced permanent displacements validated by site numerical simulation

    NASA Astrophysics Data System (ADS)

    Vessia, Giovanna; Pisano, Luca; Parise, Mario; Tromba, Giuseppe

    2016-04-01

    Hazard maps of seismically induced instability at the urban scale can be drawn by means of GIS spatial interpolation tools, starting from (1) a digital terrain model (DTM) and (2) geological and geotechnical hydro-mechanical site characterization. These maps are commonly related to a fixed return period of the natural phenomenon under study, or to a particular hazard scenario from the most significant past events. The maps can be used to guide planning activity as well as emergency actions, but their main limitation is that typically no reliability analysis is performed. Spatial variability and uncertainties in subsoil properties, poor description of geomorphological evidence of active instability, and geometrical approximations and simplifications in DTMs, among others, can be responsible for inaccurate maps. In this study, a method is proposed to control and increase the overall reliability of a hazard scenario map for earthquake-induced slope instability. The procedure can be summarized as follows: (1) GIS statistical tools are used to improve the spatial distribution of the hydro-mechanical properties of the surface lithologies; (2) hazard maps are drawn from the preceding information layer on both groundwater and mechanical properties of surficial deposits, combined with seismic parameters propagated by means of ground-motion prediction equations; (3) point numerical stability analyses, carried out by means of the finite element method (e.g., GeoStudio 2004), are performed to anchor the hazard map predictions to point quantitative analyses. These numerical analyses are used to generate a conversion scale from urban-scale to point estimates in terms of permanent displacements. Although this conversion scale differs from case to case, it could be suggested as a general method to convert the results of large-scale map analyses to site hazard assessment.
In this study, the procedure is applied to the urban area of Castelfranci (Avellino province
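
Step (1) of the procedure above, spreading sparse borehole measurements onto DTM grid nodes, can be sketched with a simple inverse-distance-weighted interpolation; the borehole locations and friction angles below are hypothetical, and a real application would use geostatistical tools such as kriging:

```python
import numpy as np

def idw_interpolate(xy_obs, values, xy_grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of sparse point data
    (e.g. borehole friction angles) onto map grid nodes."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w * values).sum(axis=1) / w.sum(axis=1)

# hypothetical borehole locations (km) and friction angles (degrees)
obs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
phi = np.array([28.0, 32.0, 30.0, 34.0])
grid = np.array([[0.5, 0.5]])  # one DTM cell centre
print(idw_interpolate(obs, phi, grid))  # stays within the data range
```

The interpolated surface then feeds the hazard-map layer; reliability checks compare it against the point stability analyses described in step (3).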

  3. Effects of Strike-Slip Fault Segmentation on Earthquake Energy and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Madden, E. H.; Cooke, M. L.; Savage, H. M.; McBeck, J.

    2014-12-01

    Many major strike-slip faults are segmented along strike, including those along plate boundaries in California and Turkey. Failure of distinct fault segments at depth may be the source of multiple pulses of seismic radiation observed for single earthquakes. However, how and when segmentation affects fault behavior and energy release is the basis of many outstanding questions related to the physics of faulting and seismic hazard. These include the probability for a single earthquake to rupture multiple fault segments and the effects of segmentation on earthquake magnitude, radiated seismic energy, and ground motions. Using numerical models, we quantify components of the earthquake energy budget, including the tectonic work acting externally on the system, the energy of internal rock strain, the energy required to overcome fault strength and initiate slip, the energy required to overcome frictional resistance during slip, and the radiated seismic energy. We compare the energy budgets of systems of two en echelon fault segments with various spacing that include both releasing and restraining steps. First, we allow the fault segments to fail simultaneously and capture the effects of segmentation geometry on the earthquake energy budget and on the efficiency with which applied displacement is accommodated. Assuming that higher efficiency correlates with higher probability for a single, larger earthquake, this approach has utility for assessing the seismic hazard of segmented faults. Second, we nucleate slip along a weak portion of one fault segment and let the quasi-static rupture propagate across the system. Allowing fractures to form near faults in these models shows that damage develops within releasing steps and promotes slip along the second fault, while damage develops outside of restraining steps and can prohibit slip along the second fault. 
Work is consumed in both the propagation of and frictional slip along these new fractures, impacting the energy available
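
The segmentation question above, whether one through-going rupture differs from two separate segment failures, can be framed with standard scaling relations rather than the paper's numerical models; the magnitudes below are illustrative:

```python
import math

def moment_from_mw(mw):
    """Seismic moment in N*m (Hanks and Kanamori, 1979)."""
    return 10 ** (1.5 * mw + 9.05)

def mw_from_moment(m0):
    return (math.log10(m0) - 9.05) / 1.5

def radiated_energy(mw):
    """Radiated seismic energy in J (Gutenberg-Richter energy relation)."""
    return 10 ** (1.5 * mw + 4.8)

# two Mw 6.8 segments failing separately vs. in one through-going rupture
m0 = moment_from_mw(6.8)
mw_joint = mw_from_moment(2 * m0)      # roughly Mw 7.0
e_separate = 2 * radiated_energy(6.8)
e_joint = radiated_energy(mw_joint)
print(mw_joint, e_joint / e_separate)
```

Under these relations radiated energy scales linearly with moment, so the total energy is the same either way; what changes is that a multi-segment rupture releases it in a single, larger-magnitude event, which is the hazard distinction the models quantify.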

  4. Response of the Laprak Landslide to the 2015 Nepal Earthquake and Implications for the Utility of Simple Infinite Slope Models in Regional Landslide Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Haneberg, W. C.; Gurung, N.

    2016-12-01

    The village of Laprak, located in the Gorkha District of western Nepal, was built on a large colluvial landslide about 10 km from the epicenter of the 25 April 2015 M 7.8 Nepal earthquake. Recent episodic movement began during a wet period in 1999 and continued in at least 2002, 2006, and 2007, destroying 24 homes, removing 23 hectares of land from agricultural production, and claiming one life. Reconnaissance mapping, soil sampling and testing, and slope stability analyses undertaken before the 2015 earthquake suggested that the hillside should be stable under dry conditions, unstable to marginally stable under static wet conditions, and wholly unstable under wet seismic conditions. Most of the buildings in Laprak, which were predominantly of dry-fitted stone masonry, were destroyed by Intensity IX shaking during the 2015 earthquake. Interpretation of remotely sensed imagery and published photographs shows new landslide features; hence, some downslope movement occurred, but the landslide did not mobilize into a long run-out flow. Monte Carlo simulations based upon a pseudostatic infinite slope model, constrained by reasonable distributions of soil shear strength, pore pressure, and slope angle from earlier work and by seismic coefficients based upon the observed Intensity IX shaking (and inferred PGA), yield high probabilities of failure for steep portions of the slope above and below the village but moderate probabilities of failure for the more gentle portion of the slope upon which most of the village was constructed. In retrospect, the seismic coefficient selected for the pre-earthquake analysis proved to be remarkably prescient. Similar results were obtained using a first-order, second-moment (FOSM) approach that is convenient for GIS-based regional analyses. 
Predictions of permanent displacement made using a variety of published empirical formulae based upon sliding block analyses range from about 10 cm to about 200 cm, also broadly consistent with the observed
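
A Monte Carlo pseudostatic infinite-slope analysis of the kind described above can be sketched as follows; this is a generic, simplified formulation, and all soil parameters, distributions, and the seismic coefficient are assumed values, not those of the Laprak study:

```python
import math
import random

def fs_pseudostatic(c, phi_deg, gamma, z, beta_deg, k, ru):
    """Factor of safety for an infinite slope with a horizontal seismic
    coefficient k and pore-pressure ratio ru (simplified formulation)."""
    b = math.radians(beta_deg)
    t = math.tan(math.radians(phi_deg))
    tau = gamma * z * (math.sin(b) * math.cos(b) + k * math.cos(b) ** 2)
    sig_eff = gamma * z * (math.cos(b) ** 2 - k * math.sin(b) * math.cos(b)) - ru * gamma * z
    return (c + max(sig_eff, 0.0) * t) / tau

def prob_failure(n=20000, seed=1):
    """Monte Carlo probability that FS < 1 under wet seismic conditions."""
    random.seed(seed)
    fails = 0
    for _ in range(n):
        phi = random.gauss(30.0, 3.0)   # friction angle, deg (assumed)
        ru = random.uniform(0.0, 0.4)   # pore-pressure ratio (assumed)
        if fs_pseudostatic(c=5.0, phi_deg=phi, gamma=19.0, z=3.0,
                           beta_deg=30.0, k=0.3, ru=ru) < 1.0:
            fails += 1
    return fails / n

print(prob_failure())  # near 1: wholly unstable under wet seismic loading
```

With k set to zero and dry conditions the same slope is stable (FS > 1), reproducing the qualitative stable-dry / unstable-wet-seismic pattern the abstract describes.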

  5. Physical vulnerability modelling in natural hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Douglas, J.

    2007-04-01

    An evaluation of the risk to an exposed element from a hazardous event requires a consideration of the element's vulnerability, which expresses its propensity to suffer damage. This concept allows the assessed level of hazard to be translated into an estimated level of risk, and is often used to evaluate the risk from earthquakes and cyclones. However, for other natural perils, such as mass movements, coastal erosion and volcanoes, the incorporation of vulnerability within risk assessment is not well established, and consequently quantitative risk estimates are rarely made. This impedes the study of the relative contributions of different hazards to the overall risk at a site. Physical vulnerability is poorly modelled for many reasons: human casualties caused directly by the event itself rather than by building damage; lack of observational data on the hazard, the elements at risk and the induced damage; the complexity of the structural damage mechanisms; the temporal and geographical scales; and the ability to modify the hazard level. Many of these causes are related to the nature of the peril; therefore, for some hazards, such as coastal erosion, the benefits of considering an element's physical vulnerability may be limited. However, for hazards such as volcanoes and mass movements the modelling of vulnerability should be improved, for example by following the efforts made in earthquake risk assessment: additional observational data on induced building damage and the hazardous event should be routinely collected and correlated, and numerical modelling of building behaviour during a damaging event should be attempted.
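
In earthquake risk assessment, the vulnerability models the abstract points to are typically expressed as fragility curves, often a lognormal CDF of an intensity measure; the median capacity and dispersion below are hypothetical:

```python
import math

def fragility(im, median, beta):
    """Probability of exceeding a damage state at intensity measure im,
    modelled as a lognormal CDF with median capacity and dispersion beta."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

# hypothetical curve: median capacity 0.4 g PGA, dispersion 0.5
for pga in (0.1, 0.4, 0.8):
    print(pga, round(fragility(pga, 0.4, 0.5), 3))
```

By construction the exceedance probability is 0.5 at the median capacity and rises monotonically with shaking, which is what lets an assessed hazard level be translated into an estimated risk level.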

  6. Earthquake Risk Assessment and Risk Transfer

    NASA Astrophysics Data System (ADS)

    Liechti, D.; Zbinden, A.; Rüttener, E.

    Research on risk assessment of natural catastrophes is very important for estimating their economic and social impact. The loss potentials of such disasters (e.g. earthquakes and storms) for property owners, insurers and national economies are driven by the hazard and the damageability (vulnerability) of buildings and infrastructure, and depend on the ability to transfer these losses to different parties. In addition, the geographic distribution of the exposed values, the uncertainty of building vulnerability and the individual deductible are the main factors determining the size of a loss. The deductible is the key element that steers the distribution of losses between insured and insurer. Therefore, the risk analysis concentrates on the deductibles and vulnerability of insured buildings and maps their variations to allow efficient decisions. Using stochastic event sets, the corresponding event losses can be modelled as expected loss grades of a Beta probability density function. Based on the deductible and the standard deviation of expected loss grades, the loss for the insured and for the insurer can be quantified. In addition, the varying impact of the deductible on different geographic regions can be described. This analysis has been carried out for earthquake insurance portfolios with various building types and different deductibles. Besides quantifying loss distributions between insured and insurer based on uncertainty assumptions and deductible considerations, the mapping yields ideas for optimising the risk-transfer process and can be used for developing risk mitigation strategies.
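
The deductible split described above can be sketched numerically: with loss grades (loss as a fraction of sum insured) following a Beta density, the insured retains losses up to the deductible and the insurer pays the excess. The Beta parameters and deductible below are hypothetical, and a simple midpoint rule stands in for the analytical treatment:

```python
import math

def beta_pdf(x, a, b):
    c = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return c * x ** (a - 1) * (1.0 - x) ** (b - 1)

def split_expected_loss(a, b, deductible, n=200000):
    """Expected loss grade retained by the insured vs. paid by the insurer,
    for a Beta(a, b) loss-grade distribution and a deductible expressed as
    a fraction of the sum insured (midpoint-rule integration)."""
    retained = paid = 0.0
    for i in range(n):
        x = (i + 0.5) / n
        p = beta_pdf(x, a, b) / n
        retained += min(x, deductible) * p
        paid += max(x - deductible, 0.0) * p
    return retained, paid

ret, paid = split_expected_loss(a=0.5, b=8.0, deductible=0.02)
print(round(ret, 4), round(paid, 4))  # the two shares sum to the mean a/(a+b)
```

Varying the deductible in this calculation, region by region, is what produces the maps of the insured/insurer loss split that the abstract describes.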

  7. Earthquake Hazard and Risk in Sub-Saharan Africa: current status of the Global Earthquake model (GEM) initiative in the region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay; Midzi, Vunganai; Ateba, Bekoa; Mulabisana, Thifhelimbilu; Marimira, Kwangwari; Hlatywayo, Dumisani J.; Akpan, Ofonime; Amponsah, Paulina; Georges, Tuluka M.; Durrheim, Ray

    2013-04-01

    Large magnitude earthquakes have been observed in Sub-Saharan Africa in the recent past, such as the Machaze event of 2006 (Mw 7.0) in Mozambique and the 2009 Karonga earthquake (Mw 6.2) in Malawi. The December 13, 1910 earthquake (Ms = 7.3) in the Rukwa rift (Tanzania) is the largest of all instrumentally recorded events known to have occurred in East Africa. The overall earthquake hazard in the region is lower than in other earthquake-prone areas of the globe. However, the risk level is high enough to warrant the attention of African governments and the donor community. The latest earthquake hazard map for sub-Saharan Africa was produced in 1999, and an update is long overdue as construction activity booms all over the region. To this end, regional seismologists are working together under the GEM (Global Earthquake Model) framework to improve incomplete, inhomogeneous and uncertain catalogues. The working group is also contributing to the UNESCO-IGCP (SIDA) 601 project and is assessing all possible sources of data for the catalogue, as well as for the seismotectonic characteristics that will help to develop a reasonable hazard model for the region. Current progress indicates that the region is more seismically active than previously thought. This demands a coordinated effort by the regional experts to systematically compile all available information so as to mitigate earthquake risk in sub-Saharan Africa.

  8. Towards Practical, Real-Time Estimation of Spatial Aftershock Probabilities: A Feasibility Study in Earthquake Hazard

    NASA Astrophysics Data System (ADS)

    Morrow, P.; McCloskey, J.; Steacy, S.

    2001-12-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, of how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current microcomputers, the large number of local telemetered seismic networks, and the rapid acquisition of satellite data, coupled with the speed of modern telecommunications and data transfer, mean that these techniques could be applied in a forward sense: it is theoretically possible today to predict the likely spatial distribution of aftershocks in near-real time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days of the mainshock and might be continually refined and updated over the following 100 days. The European Commission has recently funded a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame, so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real time as better-quality data become available over the following days to tens of days. 
Specifically, the project aims to assess the
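
The core quantity behind the Coulomb stress technique is the Coulomb failure stress change on a receiver fault, ΔCFS = Δτ + μ′Δσn; a minimal sketch follows, with the stress changes and effective friction coefficient chosen purely for illustration:

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault (MPa).
    d_shear: shear stress change resolved in the slip direction;
    d_normal: normal stress change, positive meaning unclamping."""
    return d_shear + mu_eff * d_normal

# a receiver fault loaded by 0.2 MPa of shear despite 0.1 MPa of clamping
print(round(coulomb_stress_change(0.2, -0.1), 3))  # 0.16 (promotes failure)
```

Regions where ΔCFS is positive are where aftershock probability is elevated, which is what a forward, near-real-time application of CST would map after a mainshock.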

  9. Regional liquefaction hazard evaluation following the 2010-2011 Christchurch (New Zealand) earthquake sequence

    NASA Astrophysics Data System (ADS)

    Begg, John; Brackley, Hannah; Irwin, Marion; Grant, Helen; Berryman, Kelvin; Dellow, Grant; Scott, David; Jones, Katie; Barrell, David; Lee, Julie; Townsend, Dougal; Jacka, Mike; Harwood, Nick; McCahon, Ian; Christensen, Steve

    2013-04-01

    Following the damaging 4 Sept 2010 Mw7.1 Darfield Earthquake, the 22 Feb 2011 Christchurch Earthquake and subsequent damaging aftershocks, we completed a liquefaction hazard evaluation for c. 2700 km2 of the coastal Canterbury region. Its purpose was to distinguish at a regional scale areas of land that, in the event of strong ground shaking, may be susceptible to damaging liquefaction from areas where damaging liquefaction is unlikely. This information will be used by local government for defining liquefaction-related geotechnical investigation requirements for consent applications. Following a review of historic records of liquefaction and existing liquefaction assessment maps, we undertook comprehensive new work that included: a geologic context from existing geologic maps; geomorphic mapping using LiDAR and integrating existing soil map data; compilation of lithological data for the surficial 10 m from an extensive drillhole database; modelling of depth to unconfined groundwater from existing subsurface and surface water data. Integrating and honouring all these sources of information, we mapped areas underlain by materials susceptible to liquefaction (liquefaction-prone lithologies present, or likely, in the near-surface, with shallow unconfined groundwater) from areas unlikely to suffer widespread liquefaction damage. Comparison of this work with more detailed liquefaction susceptibility assessment based on closely spaced geotechnical probes in Christchurch City provides a level of confidence in these results. We tested our susceptibility map by assigning a matrix of liquefaction susceptibility rankings to lithologies recorded in drillhole logs and local groundwater depths, then applying peak ground accelerations for four earthquake scenarios from the regional probabilistic seismic hazard model (25 year return = 0.13g; 100 year return = 0.22g; 500 year return = 0.38g and 2500 year return = 0.6g). 
Our mapped boundary between liquefaction-prone areas and areas
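
A scenario-PGA test of liquefaction demand like the one described above can be sketched with the simplified Seed-Idriss cyclic stress ratio; the unit weights, depth, water-table position, and the stress-reduction coefficient form are illustrative textbook values, not those of the Canterbury study:

```python
def cyclic_stress_ratio(pga_g, z, gwt):
    """Simplified Seed-Idriss cyclic stress ratio at depth z (m), for a
    water table at gwt (m), assuming a total unit weight of 18 kN/m^3
    and unit weight of water 9.81 kN/m^3 (illustrative values)."""
    gamma, gamma_w = 18.0, 9.81
    sigma_v = gamma * z                                   # total stress, kPa
    sigma_v_eff = sigma_v - gamma_w * max(z - gwt, 0.0)   # effective stress
    rd = 1.0 - 0.00765 * z if z <= 9.15 else 1.174 - 0.0267 * z
    return 0.65 * pga_g * (sigma_v / sigma_v_eff) * rd

# demand at 5 m depth, shallow water table, for the four scenario PGAs
for pga in (0.13, 0.22, 0.38, 0.60):
    print(pga, round(cyclic_stress_ratio(pga, z=5.0, gwt=1.0), 3))
```

Comparing this demand against the cyclic resistance of each mapped lithology is the point-scale analogue of the susceptibility-ranking matrix applied across the region.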

  10. Multi-hazards risk assessment at different levels

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2012-04-01

    Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to these events increase annually, which is clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of integrated natural and technological risk assessment and mapping at different levels [1, 2]. At the country level, the most hazardous natural processes that may result in fatalities, injuries and economic loss in the Russian Federation are considered: earthquakes, landslides, mudflows, floods, storms and avalanches. A special GIS environment for the country's territory was developed, which includes information about hazard levels and recurrence, impact databases for the last 20 years, and models for estimating damage and casualties caused by these hazards. Federal maps of seismic individual and collective risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment, taking into account secondary accidents at fire-, explosion- and chemically hazardous facilities, and of regional integrated risk assessment are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations for scenario earthquakes, taking into account accidents triggered by strong events at critical facilities, including fire and chemically hazardous facilities and oil pipeline routes located in earthquake-prone areas. The estimates of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures aimed at saving lives and protecting property against future disastrous events. The

  11. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios, and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, ISC-GEM and global ANSS Comprehensive catalogues, the Seismological Bureau of the Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize the earthquake parameters from the various catalogue sources, we remove duplicate events and unify magnitudes into the same scale. Our active fault database includes active fault data from previous studies, e.g. the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on parameters derived from analysis of the databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined earthquake recurrence models for the seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests matching the felt intensities of historical earthquakes to ground motions modelled using ground motion prediction equations (GMPEs). By incorporating the best-fitting GMPEs and site conditions, we accounted for site effects and assessed probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially generate an M8-class earthquake, brings significant hazard to the western coast of Myanmar and eastern Bangladesh. In addition, we find a notable hazard level in northern Vietnam and at the boundary between Myanmar, Thailand and Laos, due to a series of strike-slip faults, which could
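
The duplicate-removal step of catalogue harmonization can be sketched with a simple time-distance window: solutions for the same event reported by different agencies are merged, keeping the preferred agency's entry. The window sizes and the catalogue entries below are illustrative only:

```python
from datetime import datetime, timedelta

def deduplicate(events, dt=timedelta(seconds=60), dx_deg=0.5):
    """Merge duplicate entries reported by more than one agency: events
    closer than dt in origin time and dx_deg in epicentre are treated as
    one event, keeping the first (preferred-agency) entry."""
    kept = []
    for ev in sorted(events, key=lambda e: e["time"]):
        if not any(abs(ev["time"] - k["time"]) <= dt
                   and abs(ev["lat"] - k["lat"]) <= dx_deg
                   and abs(ev["lon"] - k["lon"]) <= dx_deg
                   for k in kept):
            kept.append(ev)
    return kept

# illustrative entries: two agency solutions for one event, plus an aftershock
cat = [
    {"time": datetime(2011, 3, 24, 13, 55, 12), "lat": 20.69, "lon": 99.82, "mw": 6.8},
    {"time": datetime(2011, 3, 24, 13, 55, 20), "lat": 20.70, "lon": 99.95, "mw": 6.9},
    {"time": datetime(2011, 3, 25, 6, 12, 0), "lat": 21.10, "lon": 100.10, "mw": 4.9},
]
print(len(deduplicate(cat)))  # 2
```

In practice the merge also carries the magnitude-scale unification, converting, e.g., mb and Ms entries to Mw with region-specific relations before recurrence parameters are fitted.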

  12. Comprehensive seismic hazard assessment of Tripura and Mizoram states

    NASA Astrophysics Data System (ADS)

    Sitharam, T. G.; Sil, Arjun

    2014-06-01

    Northeast India is one of the most seismically active regions in the world, with an average of more than seven earthquakes of magnitude 5.0 or above per year. Reliable seismic hazard assessment can provide the necessary design inputs for earthquake-resistant design of structures in this region. In this study, deterministic as well as probabilistic methods have been applied for seismic hazard assessment of Tripura and Mizoram states at bedrock level. An updated earthquake catalogue was compiled from various national and international seismological agencies for the period from 1731 to 2011. Homogenization, declustering and data completeness analysis of the events were carried out before hazard evaluation. Seismicity parameters were estimated using the G-R relationship for each source zone. Based on the seismicity, tectonic features and fault rupture mechanisms, the region was divided into six major subzones. Region-specific correlations were used for magnitude conversion in homogenizing earthquake size. Ground motion equations (Atkinson and Boore 2003; Gupta 2010) were validated against observed PGA (peak ground acceleration) values before use in the hazard evaluation. The hazard is estimated using linear sources identified in and around the study area. Results are presented in the form of PGA using both DSHA (deterministic seismic hazard analysis) and PSHA (probabilistic seismic hazard analysis) with 2 and 10% probability of exceedance in 50 years, and of spectral acceleration (T = 0.2 s and 1.0 s) for both states (2% probability of exceedance in 50 years). The results provide inputs for planning risk-reduction strategies, developing risk-acceptance criteria and analysing the financial consequences of possible damage in the study area, based on a comprehensive analysis and higher-resolution hazard mapping.
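
The G-R seismicity-parameter step referred to above is commonly done with the Aki (1965) maximum-likelihood b-value estimator; the sketch below applies it to a synthetic catalogue drawn from a known G-R law, so the completeness magnitude and sample size are illustrative:

```python
import math
import random

def aki_b_value(mags, m_c, dm=0.1):
    """Maximum-likelihood b-value (Aki, 1965) for magnitudes at or above
    the completeness magnitude m_c, with a bin-width correction dm/2."""
    m = [x for x in mags if x >= m_c]
    return math.log10(math.e) / (sum(m) / len(m) - (m_c - dm / 2.0))

# synthetic catalogue drawn from a G-R law with b = 1 (illustrative)
random.seed(0)
mags = [4.0 + random.expovariate(math.log(10)) for _ in range(5000)]
print(round(aki_b_value(mags, m_c=4.0, dm=0.0), 2))  # close to 1.0
```

For binned catalogue magnitudes (e.g. reported to 0.1 units) the dm/2 correction matters; for the continuous synthetic magnitudes above it is set to zero.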

  13. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse


    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasize the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  14. Tsunami hazard warning and risk prediction based on inaccurate earthquake source parameters

    NASA Astrophysics Data System (ADS)

    Goda, Katsuichiro; Abilova, Kamilla

    2016-02-01

    This study investigates the issues related to underestimation of the earthquake source parameters in the context of tsunami early warning and tsunami risk assessment. The magnitude of a very large event may be underestimated significantly during the early stage of the disaster, resulting in the issuance of incorrect tsunami warnings. Tsunamigenic events in the Tohoku region of Japan, where the 2011 tsunami occurred, are focused on as a case study to illustrate the significance of the problems. The effects of biases in the estimated earthquake magnitude on tsunami loss are investigated using a rigorous probabilistic tsunami loss calculation tool that can be applied to a range of earthquake magnitudes by accounting for uncertainties of earthquake source parameters (e.g., geometry, mean slip, and spatial slip distribution). The quantitative tsunami loss results provide valuable insights regarding the importance of deriving accurate seismic information as well as the potential biases of the anticipated tsunami consequences. Finally, the usefulness of rigorous tsunami risk assessment is discussed in defining critical hazard scenarios based on the potential consequences due to tsunami disasters.
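
The leverage that an early magnitude underestimate has on anticipated tsunami size can be illustrated with standard scaling relations; the rupture dimensions and rigidity below are assumed round numbers, not values from the paper:

```python
def mean_slip_m(mw, length_km, width_km, mu=3.0e10):
    """Mean fault slip (m) implied by a moment magnitude, combining
    M0 = 10**(1.5*Mw + 9.05) N*m (Hanks and Kanamori, 1979) with
    M0 = mu * area * slip, for rigidity mu in Pa."""
    m0 = 10 ** (1.5 * mw + 9.05)
    return m0 / (mu * length_km * 1e3 * width_km * 1e3)

# early (Mw 7.9) vs. final (Mw 9.0) estimates for a Tohoku-sized plane
length_km, width_km = 500.0, 200.0  # assumed rupture dimensions
early = mean_slip_m(7.9, length_km, width_km)
final = mean_slip_m(9.0, length_km, width_km)
print(round(early, 1), round(final, 1), round(final / early, 1))
```

Because moment grows as 10^(1.5 Mw), an underestimate of 1.1 magnitude units implies a roughly 45-fold underestimate of mean slip on the same rupture plane, which is the mechanism by which early warnings can drastically understate tsunami potential.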

  15. Tsunami hazard warning and risk prediction based on inaccurate earthquake source parameters

    NASA Astrophysics Data System (ADS)

    Goda, K.; Abilova, K.

    2015-12-01

    This study investigates the issues related to underestimation of the earthquake source parameters in the context of tsunami early warning and tsunami risk assessment. The magnitude of a very large event may be underestimated significantly during the early stage of the disaster, resulting in the issuance of incorrect tsunami warnings. Tsunamigenic events in the Tohoku region of Japan, where the 2011 tsunami occurred, are focused on as a case study to illustrate the significance of the problems. The effects of biases in the estimated earthquake magnitude on tsunami loss are investigated using a rigorous probabilistic tsunami loss calculation tool that can be applied to a range of earthquake magnitudes by accounting for uncertainties of earthquake source parameters (e.g. geometry, mean slip, and spatial slip distribution). The quantitative tsunami loss results provide valuable insights regarding the importance of deriving accurate seismic information as well as the potential biases of the anticipated tsunami consequences. Finally, the usefulness of rigorous tsunami risk assessment is discussed in defining critical hazard scenarios based on the potential consequences due to tsunami disasters.

  16. A Quantitative Appraisal of Earthquake Hazard Parameters Evaluated from Bayesian Approach for Different Regions in Iranian Plateau

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Türker, Tügba; Bayrak, Yusuf

    2016-06-01

    In this study, we used the program for Bayesian estimation of seismic hazard elaborated by Alexey Lyubushin; our study is the next in a sequence of applications of this software to seismic hazard assessment in different regions of the world. The earthquake hazard parameters of maximum regional magnitude (Mmax), b-value (β), and seismic activity rate or intensity (λ), together with their uncertainties, have been evaluated for 15 source regions of the Iranian Plateau using a complete and homogeneous earthquake catalogue for the period 1900-2014 with Mw ≥ 4.0. The estimated Mmax values vary between 6.25 and 8.37; the lowest value is observed in the Zagros foredeep, whereas the highest is observed in the Makran. A strong relationship is also observed between the maximum earthquake magnitudes estimated by the Bayesian approach and the maximum observed magnitudes. Moreover, quantiles of the distributions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50 and 100 years are calculated, with confidence limits for probability levels of 50, 70 and 90%, in the 15 source regions. The computed earthquake hazard parameters identify the most seismically active regions of the Iranian Plateau: the Makran and East Iran are expected to experience earthquakes of magnitude greater than 8.0 within the next 100 years at the 90% probability level, indicating that these regions are more susceptible to the occurrence of large earthquakes. The outcomes obtained in this study may have useful implications for probabilistic seismic hazard studies of the Iranian Plateau.

  17. Tsunami Hazard Assessment in Guam

    NASA Astrophysics Data System (ADS)

    Arcas, D.; Uslu, B.; Titov, V.; Chamberlin, C.

    2008-12-01

    The island of Guam is located approximately 1500 miles south of Japan, in the vicinity of the Mariana Trench. It is surrounded in close proximity by three subduction zones (Nankai-Taiwan, East Philippines and the Mariana Trench) that pose a considerable near- to intermediate-field tsunami threat. Tsunami catalogues list 14 tsunamigenic earthquakes with Mw ≥ 8.0 since 1900 in this region alone (Soloviev and Go, 1974; Iida, 1984; Lander, 1993; Lander and Lowell, 2002); however, the island has not been significantly affected by some of the largest far-field events of the past century, such as the 1952 Kamchatka, 1960 Chile, and 1964 Great Alaska earthquakes. An assessment of the tsunami threat to the island from both near- and far-field sources, using forecast tools originally developed at NOAA's Pacific Marine Environmental Laboratory (PMEL) for real-time forecasting of tsunamis, is presented here. Tide gauge records from the 1952 Kamchatka, 1964 Alaska, and 1960 Chile earthquakes at Apra Harbor are used to validate our model setup and to explain the limited impact of these historical events on Guam. Identification of worst-case scenarios and determination of tsunamigenic effective source regions are presented for five vulnerable locations on the island via a tsunami sensitivity study. Apra Harbor is the site of a National Ocean Service (NOS) tide gauge and the largest harbor on the island; Tumon Bay, Pago Bay, Agana Bay and Inarajan Bay are densely populated areas that require careful investigation. The sensitivity study shows that earthquakes from the East Philippines zone present a major threat to west-facing sites, whereas the Mariana Trench poses the biggest concern to east-facing sites.

  18. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a one-year (2016) probabilistic seismic-hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes, constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short-term seismic activity rates (primarily 2014-2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake-rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground-motion models. These alternatives represent uncertainties in how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and to the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake-rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. Both submodel hazard maps depict high hazard, and they are combined in the final model. Results depict several ground-shaking measures as well as intensity, and include maps showing a high hazard level (1% probability of exceedance in 1 year or greater). Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north-central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in

  19. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    USGS Publications Warehouse

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near-surface shear-wave velocity model in a 1D equivalent-linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm-rock-site condition, the new probabilistic seismic-hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess-covered till and drift deposits) show up to twice the ground-motion values for peak ground acceleration (PGA), and similar ground-motion values for 1.0 s spectral acceleration (SA). Probabilistic ground-motion levels for lowland alluvial floodplain sites (generally the 20–40-m-thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground-motion levels for PGA, and up to three times the ground-motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%-in-50-year probabilistic ground-shaking model. The liquefaction hazard ranges from low in the uplands to high (more than 60% of the area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated

  20. A Probabilistic Tsunami Hazard Assessment Methodology

    NASA Astrophysics Data System (ADS)

    Gonzalez, Frank; Geist, Eric; Jaffe, Bruce; Kanoglu, Utku; Mofjeld, Harold; Synolakis, Costas; Titov, Vasily; Arcas, Diego

    2010-05-01

    A methodology for probabilistic tsunami hazard assessment (PTHA) will be described for multiple near- and far-field seismic sources. The method integrates tsunami inundation modeling with the approach of probabilistic seismic hazard assessment (PSHA). A database of inundation simulations is developed, with each simulation corresponding to an earthquake source for which the seismic parameters and mean interevent time have been estimated. A Poissonian model is then adopted for estimating the probability that tsunami flooding will exceed a given level during a specified period of time, taking into account multiple sources and multiple causes of uncertainty. Uncertainty in the tidal stage at tsunami arrival is dealt with by developing a parametric expression for the probability density function of the sum of the tides and a tsunami; uncertainty in the slip distribution of the near-field source is dealt with probabilistically by considering multiple sources in which width and slip values vary, subject to the constraint of a constant moment magnitude. The method was applied to Seaside, Oregon, to obtain estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. These results will be presented and discussed, including the primary remaining sources of uncertainty -- those associated with interevent time estimates, the modeling of background sea level, and temporal changes in bathymetry and topography. PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk.
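The Poissonian exceedance model described above can be sketched in a few lines: given the annual rates of the sources whose tsunamis exceed a given flooding level at the site, the probability of at least one exceedance in a time window follows from treating the sources as independent Poisson processes. The source list and rates below are hypothetical illustrations, not values from the Seaside study.

```python
import math

def poisson_exceedance(rates, years):
    """Probability that at least one event from any source exceeds the
    flooding level within `years`, assuming each source is an independent
    Poisson process with the given annual rate (1 / mean interevent time)."""
    total_rate = sum(rates)
    return 1.0 - math.exp(-total_rate * years)

# Hypothetical example: three sources whose tsunamis exceed the level,
# with mean interevent times of 500, 1000 and 2500 years.
rates = [1 / 500, 1 / 1000, 1 / 2500]
p50 = poisson_exceedance(rates, 50)   # ~0.156 over a 50-year window
```

Uncertainty in tidal stage or slip distribution enters by replacing each single source with a family of weighted scenarios, which simply contributes more terms to the summed rate.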

  1. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variations in tectonics and seismicity characteristics. The seismicity parameters estimated for each of these source zones are the input variables for seismic hazard estimation. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
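The correspondence between probability of exceedance in a time window and return period quoted above follows from the Poisson assumption; a minimal sketch:

```python
import math

def return_period(prob, window_years):
    """Return period T for a Poisson process such that the probability of at
    least one exceedance in `window_years` equals `prob`:
        prob = 1 - exp(-window_years / T)  =>  T = -window_years / ln(1 - prob)
    """
    return -window_years / math.log(1.0 - prob)

# Two of the hazard levels used in the abstract:
t10 = return_period(0.10, 50)   # ~475 years (10% PE in 50 years)
t02 = return_period(0.02, 50)   # ~2475 years (2% PE in 50 years)
```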

  2. Earthquake and Flood Risk Assessments for Europe and Central Asia

    NASA Astrophysics Data System (ADS)

    Murnane, R. J.; Daniell, J. E.; Ward, P.; Winsemius, H.; Tijssen, A.; Toro, J.

    2015-12-01

    We report on a flood and earthquake risk assessment for 32 countries in Europe and Central Asia with a focus on how current flood and earthquake risk might evolve in the future due to changes in climate, population, and GDP. The future hazard and exposure conditions used for the risk assessment are consistent with selected IPCC AR5 Representative Concentration Pathways (RCPs) and Shared Socioeconomic Pathways (SSPs). Estimates of 2030 and 2080 population and GDP are derived using the IMAGE model forced by the socioeconomic conditions associated with the SSPs. Flood risk is modeled using the probabilistic GLOFRIS global flood risk modeling cascade, which starts with meteorological fields derived from reanalysis data or climate models. For 2030 and 2080 climate conditions, the meteorological fields are generated from five climate models forced by the RCP4.5 and RCP8.5 scenarios. Future flood risk is estimated using population and GDP exposures consistent with the SSP2 and SSP3 scenarios. Population and GDP are defined as being affected by a flood when a grid cell receives any depth of flood inundation. The earthquake hazard is quantified using a 10,000-year stochastic catalog of over 15.8 million synthetic earthquake events of at least magnitude 5. Ground motion prediction and estimates of local site conditions are used to determine PGA. Future earthquake risk is estimated using population and GDP exposures consistent with all five SSPs. Population and GDP are defined as being affected by an earthquake when a grid cell experiences ground motion equaling or exceeding MMI VI. For most countries, changes in exposure alter flood risk to a greater extent than changes in climate. For both flood and earthquake, the spread in risk grows over time. Given the large methodological uncertainties, the results are not meant to be definitive; instead, they will be used to initiate discussions with governments regarding efforts to manage disaster risk.
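The exposure rule described above (population counted as affected when its grid cell reaches MMI VI) amounts to a thresholded sum over the hazard grid. A minimal sketch with hypothetical cell values, not data from the study:

```python
def affected_population(cells, mmi_threshold=6.0):
    """Sum population over grid cells whose ground motion reaches the
    MMI threshold, following the exposure rule described in the abstract."""
    return sum(c["pop"] for c in cells if c["mmi"] >= mmi_threshold)

# Hypothetical flattened grid; real models use 2-D rasters of PGA/MMI.
cells = [
    {"pop": 1200, "mmi": 6.5},
    {"pop": 800,  "mmi": 5.2},   # below MMI VI: not counted
    {"pop": 500,  "mmi": 7.1},
]
exposed = affected_population(cells)   # 1200 + 500 = 1700
```

The flood rule is the same computation with "any inundation depth > 0" as the threshold.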

  3. Earthquake engineering research: 1982

    NASA Astrophysics Data System (ADS)

    The Committee on Earthquake Engineering Research addressed two questions: What progress has research produced in earthquake engineering, and which elements of the problem should future earthquake engineering research pursue? It examined these issues in separate chapters of the report: Applications of Past Research; Assessment of Earthquake Hazard; Earthquake Ground Motion; Soil Mechanics and Earth Structures; Analytical and Experimental Structural Dynamics; Earthquake Design of Structures; Seismic Interaction of Structures and Fluids; Social and Economic Aspects; Earthquake Engineering Education; and Research in Japan.

  4. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    NASA Astrophysics Data System (ADS)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, drew a very large response, partly because the media had highlighted the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction vs. probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take a day off and leave town or stay in public parks), we helped to reduce this fear and therefore the social cost of this strange Roman day. Another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach Laboratory, the Graphics Laboratory and SissaMedialab. P.S.: no large earthquake happened.

  5. RISMUR II: New seismic hazard and risk study in Murcia Region after the Lorca Earthquake, 2011

    NASA Astrophysics Data System (ADS)

    Benito, Belen; Gaspar, Jorge; Rivas, Alicia; Quiros, Ligia; Ruiz, Sandra; Hernandez, Roman; Torres, Yolanda; Staller, Sandra

    2016-04-01

    The Murcia Region, located in the SE Iberian Peninsula, is one of the areas of highest seismic activity in Spain. A system of active faults crosses the region, where the most recent damaging earthquakes in the country took place: 1999, 2002, 2005 and 2011. The last one occurred in Lorca, causing 9 deaths and notable material losses, including damage to the artistic heritage. The seismic emergency plan of the Murcia Region was developed in 2006, based on the results of the risk project RISMUR I, which among other conclusions pointed out Lorca as one of the municipalities with the highest risk in the province. After the Lorca earthquake in 2011, a revision of the previous study was carried out through the project RISMUR II, including data from this earthquake as well as updated databases of seismicity, active faults, strong-motion records, cadastre, vulnerability, etc. In addition, the new study includes some methodological innovations: modeling of faults as independent units for hazard assessment, and analytic methods for risk estimation using data from the earthquake to calibrate capacity and fragility curves. In this work the results of RISMUR II are presented and compared with those reached in RISMUR I. The main conclusion is an increase of the hazard along the SW-NE central fault system (Alhama de Murcia, Totana and Carrascoy), which implies higher expected damage in the populations nearest to these faults: Lorca, Totana, Alcantarilla and Murcia.

  6. Earthquake and tsunami hazard in West Sumatra: integrating science, outreach, and local stakeholder needs

    NASA Astrophysics Data System (ADS)

    McCaughey, J.; Lubis, A. M.; Huang, Z.; Yao, Y.; Hill, E. M.; Eriksson, S.; Sieh, K.

    2012-04-01

    The Earth Observatory of Singapore (EOS) is building partnerships with local and provincial government agencies, NGOs, and educators in West Sumatra to inform their policymaking, disaster-risk-reduction, and education efforts. Geodetic and paleoseismic studies show that an earthquake as large as M 8.8 is likely sometime in the coming decades on the Mentawai patch of the Sunda megathrust. This earthquake and its tsunami would be devastating for the Mentawai Islands and neighboring areas of the western Sumatra coast. The low-lying coastal Sumatran city of Padang (pop. ~800,000) has been the object of many research and outreach efforts, especially since 2004. Padang experienced deadly earthquakes in 2007 and 2009 that, though tragedies in their own right, also served as wake-up calls for a larger earthquake to come. However, there remain significant barriers to linking science to policy: extant hazard information is sometimes contradictory or confusing for non-scientists, while turnover of agency leadership and staff means that, in the words of one local advocate, "we keep having to start from zero." Both better hazard knowledge and major infrastructure changes are necessary for risk reduction in Padang. In contrast, the small, isolated villages on the outlying Mentawai Islands have received relatively fewer outreach efforts, yet many villages have the potential for timely evacuation with existing infrastructure; there, knowledge alone can go far toward risk reduction. The tragic October 2010 Mentawai tsunami has inspired further disaster-risk-reduction work by local stakeholders. In both locations, we are engaging policymakers and local NGOs, providing science to help inform their work. Through outreach contacts, the Mentawai government requested that we produce the first-ever tsunami hazard map for their islands; this aligns well with scientific interests at EOS.
We will work with the Mentawai government on the presentation and explanation of the hazard map, as

  7. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called the OpenQuake-engine (http://globalquakemodel.org). In this communication we provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently ongoing initiatives like the creation of a suite of tools for building PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.

  8. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    USGS Publications Warehouse

    McNamara, Daniel E.; Yeck, William; Barnhart, William D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, Amod; Hough, S.E.; Benz, Harley M.; Earle, Paul

    2017-01-01

    The Gorkha earthquake on April 25th, 2015 was a long-anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a ~150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10–15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; and (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high-hazard for future damaging earthquakes.

  9. Urban Heat Wave Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Quattrochi, D. A.; Jedlovec, G.; Crane, D. L.; Meyer, P. J.; LaFontaine, F.

    2016-12-01

    Heat waves are one of the largest causes of environmentally-related deaths globally and are likely to become more numerous as a result of climate change. The intensification of heat waves by the urban heat island effect and elevated humidity, combined with urban demographics, are key elements leading to these disasters. Better warning of the potential hazards may help lower risks associated with heat waves. Moderate resolution thermal data from NASA satellites is used to derive high spatial resolution estimates of apparent temperature (heat index) over urban regions. These data, combined with demographic data, are used to produce a daily heat hazard/risk map for selected cities. MODIS data are used to derive daily composite maximum and minimum land surface temperature (LST) fields to represent the amplitude of the diurnal temperature cycle and identify extreme heat days. Compositing routines are used to generate representative daily maximum and minimum LSTs for the urban environment. The limited effect of relative humidity on the apparent temperature (typically 10-15%) allows for the use of modeled moisture fields to convert LST to apparent temperature without loss of spatial variability. The daily max/min apparent temperature fields are used to identify abnormally extreme heat days relative to climatological values in order to produce a heat wave hazard map. Reference to climatological values normalizes the hazard for a particular region (e.g., the impact of an extreme heat day). A heat wave hazard map has been produced for several case study periods and then computed on a quasi-operational basis during the summer of 2016 for Atlanta, GA, Chicago, IL, St. Louis, MO, and Huntsville, AL. A hazard does not become a risk until someone or something is exposed to that hazard at a level that might do harm. Demographic information is used to assess the urban risk associated with the heat wave hazard. Collectively, the heat wave hazard product can warn people in urban
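The abstract does not specify the apparent-temperature formula used to combine LST with modeled humidity; a common choice is the NWS Rothfusz heat index regression, sketched here under that assumption (it is not necessarily the product's formula, and it is valid only for warm, humid conditions):

```python
def heat_index_f(t_f, rh):
    """Apparent temperature (deg F) from air temperature t_f (deg F) and
    relative humidity rh (percent), using the NWS Rothfusz regression.
    Valid roughly for t_f >= 80 deg F and rh >= 40%."""
    return (-42.379 + 2.04901523 * t_f + 10.14333127 * rh
            - 0.22475541 * t_f * rh - 6.83783e-3 * t_f**2
            - 5.481717e-2 * rh**2 + 1.22874e-3 * t_f**2 * rh
            + 8.5282e-4 * t_f * rh**2 - 1.99e-6 * t_f**2 * rh**2)

hi = heat_index_f(90.0, 70.0)   # roughly 105-106 deg F
```

Because humidity shifts the result only modestly (the 10-15% effect noted above), coarse modeled moisture fields can be applied to the high-resolution LST without washing out its spatial variability.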

  10. Sedimentary Basins: A Deeper Look at Seattle and Portland's Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Thompson, M.; Frankel, A. D.; Wirth, E. A.; Vidale, J. E.; Han, J.

    2015-12-01

    to assess the shaking hazards for Portland due to local earthquakes and great earthquakes on the CSZ.

  11. Bayesian network learning for natural hazard assessments

    NASA Astrophysics Data System (ADS)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise revert to familiar (mostly deterministic) procedures. In the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependences into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow learning about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
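The kind of query mentioned above (e.g. the effect of precautionary measures on damage) reduces, in a discrete Bayesian network, to summing out the unobserved variables. A minimal, entirely hypothetical two-node sketch with hand-set conditional probability tables rather than learned ones:

```python
# Binary variables: H = severe hazard event occurs, D = high damage.
# Precaution enters through the conditional table P(D=True | H, precaution).
p_h = 0.1                        # prior P(H = True)
p_d = {                          # P(D = True | H, precaution)
    (True,  True):  0.3,
    (True,  False): 0.7,
    (False, True):  0.01,
    (False, False): 0.05,
}

def p_damage(precaution):
    """Marginalize over the hazard variable: P(D=True | precaution)."""
    return p_h * p_d[(True, precaution)] + (1 - p_h) * p_d[(False, precaution)]

# p_damage(True)  = 0.1*0.3 + 0.9*0.01 = 0.039
# p_damage(False) = 0.1*0.7 + 0.9*0.05 = 0.115
```

In a learned network the tables (and the graph structure itself) would be estimated from event observations, which is where the continuous-variable difficulties mentioned above arise.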

  12. Challenges in assessing seismic hazard in intraplate Europe

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Liu, Mian; Camelbeeck, Thierry; Merino, Miguel; Landgraf, Angela; Hintersberger, Esther; Kübler, Simon

    2016-04-01

    Intraplate seismicity is often characterized by episodic, clustered and migrating earthquakes and extended aftershock sequences. Can these observations - primarily from North America, China and Australia - usefully be applied to seismic hazard assessment for intraplate Europe? Existing assessments are based on instrumental and historical seismicity of the past c. 1000 years, as well as some data for active faults. This time span probably fails to capture typical large-event recurrence intervals of the order of tens of thousands of years. Palaeoseismology helps to lengthen the observation window, but preferentially produces data in regions suspected to be seismically active. Thus the expected maximum magnitudes of future earthquakes are fairly uncertain, possibly underestimated, and earthquakes are likely to occur in unexpected locations. These issues particularly arise in considering the hazards posed by low-probability events to both heavily populated areas and critical facilities. For example, are the variations in seismicity (and thus assumed seismic hazard) along the Rhine Graben a result of short sampling or are they real? In addition to a better assessment of hazards with new data and models, it is important to recognize and communicate uncertainties in hazard estimates. The more users know about how much confidence to place in hazard maps, the more effectively the maps can be used.

  13. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    SciTech Connect

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper, probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is carried out. Tehran is the capital and most populated city of Iran. From economical, political and social points of view, Tehran is the most significant city of Iran. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, probabilistic seismic hazard assessment of this city for the Arias intensity parameter is useful. Iso-intensity contour maps of Tehran on the basis of different attenuation relationships for different earthquake periods are plotted. Maps of iso-intensity points in the Tehran region are presented using appropriate attenuation relationships for rock and soil beds for two hazard levels of 10% and 2% in 50 years. Seismicity parameters on the basis of historical and instrumental earthquakes, for a time period extending from the 4th century BC to the present, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used, and the SEISRISK III software has been employed. Effects of different parameters such as seismicity parameters, fault rupture length relationships and attenuation relationships are considered using a logic tree.
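For readers unfamiliar with the parameter, Arias intensity is defined as Ia = (pi / 2g) times the integral of the squared ground acceleration over the record duration. A minimal numerical sketch (the record below is a hypothetical illustration, not data from the study):

```python
import math

def arias_intensity(accel, dt, g=9.81):
    """Arias intensity Ia = (pi / (2 g)) * integral of a(t)^2 dt (units m/s),
    for an acceleration record `accel` (m/s^2) sampled every `dt` seconds,
    integrated with the trapezoidal rule."""
    integral = 0.0
    for a0, a1 in zip(accel, accel[1:]):
        integral += 0.5 * (a0**2 + a1**2) * dt
    return math.pi / (2.0 * g) * integral

# Hypothetical record: 1 s of constant 1 m/s^2 shaking sampled at 100 Hz.
record = [1.0] * 101
ia = arias_intensity(record, 0.01)   # pi / (2 * 9.81) ~ 0.16 m/s
```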

  14. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    NASA Astrophysics Data System (ADS)

    Nyst, M.

    2012-12-01

    The recent Mw 7.1, 4 September 2010 Darfield, and Mw 6.2, 22 February 2011 Christchurch, New Zealand earthquakes and the following aftershock activity completely changed the existing view on earthquake hazard of the Christchurch area. Not only have several faults been added to the New Zealand fault database, the main shocks were also followed by significant increases in seismicity due to high aftershock activity throughout the Christchurch region that is still ongoing. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to look at their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return period loss level, and then reserve the amount of money needed to account for that return-period loss level, their so-called capacity. This component of risk management is not too sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because the secondary uncertainties are taken into account when calculating the exceedance probability, even the longer return-period losses can still experience significant impact from the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting the annual premiums.
By annualizing the expected losses
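The two risk metrics described above can be sketched directly from a stochastic event set of (loss, annual rate) pairs; the three-event set below is a hypothetical illustration:

```python
def average_annual_loss(event_set):
    """AAL: sum of loss * annual rate over all events in the stochastic set."""
    return sum(loss * rate for loss, rate in event_set)

def exceedance_curve(event_set):
    """Annual exceedance rate at each loss level: sort events by loss
    (descending) and accumulate their annual rates."""
    curve, cum = [], 0.0
    for loss, rate in sorted(event_set, reverse=True):
        cum += rate
        curve.append((loss, cum))
    return curve

# Hypothetical three-event set: (loss in $M, annual rate of occurrence).
events = [(500.0, 0.001), (100.0, 0.01), (10.0, 0.1)]
aal = average_annual_loss(events)   # 0.5 + 1.0 + 1.0 = 2.5 $M per year
```

Aftershock-driven rate increases enter simply as larger `rate` values on the smaller events, which moves the AAL more than the long-return-period tail of the curve, matching the sensitivity argument above.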

  15. PAGER - Rapid Assessment of an Earthquake's Impact

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.

    2007-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system to rapidly assess the number of people and regions exposed to severe shaking by an earthquake, and inform emergency responders, government agencies, and the media of the scope of the potential disaster. PAGER monitors the U.S. Geological Survey's near-real-time U.S. and global earthquake detections and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts.

  16. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, have shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are by far unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely the NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility to efficiently compute synthetic seismograms in complex laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined, at both the national and local scale, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer.
At the scenario scale, quick parametric studies can be easily

  17. Global assessment of human losses due to earthquakes

    USGS Publications Warehouse

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Current studies have demonstrated a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses, and location of the most affected regions. Recent developments in global and uniform datasets such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.

  18. The application of the geography census data in seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Yuan, Shen; Ying, Zhang

    2017-04-01

    The limited timeliness of the basic data in the Sichuan Province earthquake emergency database means there can be a considerable gap between rapid earthquake disaster assessment results and the actual damage. In 2015, Sichuan completed its first provincial geography census, covering topography, traffic networks, vegetation coverage, water areas, desert and bare ground, residential areas and facilities, geographical units and geological hazards, as well as town planning and ecological restoration in the Lushan earthquake-stricken area. On this basis, combining existing basic geographic information data and high-resolution imagery, supplemented by remote sensing image interpretation and geological survey, we carried out statistical analysis and information extraction of the distribution and change of exposed elements (land cover, roads, structures and infrastructure) in Lushan County before the 2013 earthquake and after 2015. At the same time, we converted and updated geography census data into earthquake emergency basic data by studying the data types, structures and relationships involved. Finally, based on multi-source disaster information, including the changed exposure data and the coseismic displacement field of the Lushan M 7.0 earthquake from the CORS network, intensity control points were obtained through information fusion. The seismic intensity field was then corrected and the earthquake disaster re-assessed through the Sichuan earthquake relief headquarters technology platform. Comparison of the revised assessment result, the original assessment result and the actual earthquake losses shows that the revised result is much closer to the actual losses. In the future, geography census data can be routinely updated into earthquake emergency basic data, ensuring the timeliness of the earthquake emergency database while improving the

  19. Earthquake stress triggers, stress shadows, and seismic hazard

    USGS Publications Warehouse

    Harris, R.A.

    2000-01-01

    Many aspects of earthquake mechanics remain an enigma at the beginning of the twenty-first century. One potential bright spot is the realization that simple calculations of stress changes may explain some earthquake interactions, just as previous and ongoing studies of stress changes have begun to explain human-induced seismicity. This paper, which is an update of Harris [1], reviews many published works and presents a compilation of quantitative earthquake-interaction studies from a stress change perspective. This synthesis supplies some clues about certain aspects of earthquake mechanics. It also demonstrates that much work remains to be done before we have a complete story of how earthquakes work.
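
    The stress-change calculations the review surveys typically reduce to the Coulomb failure criterion. A minimal sketch, assuming a generic effective friction coefficient of 0.4 (an illustrative value, not one taken from the paper):

```python
# Coulomb failure stress change on a receiver fault, the quantity at the
# heart of stress-trigger / stress-shadow studies.
# delta_tau: shear stress change in the slip direction (MPa)
# delta_sigma_n: normal stress change (MPa, positive = unclamping)
# mu_eff: effective friction coefficient (illustrative value)

def coulomb_stress_change(delta_tau, delta_sigma_n, mu_eff=0.4):
    """Delta CFS = delta_tau + mu' * delta_sigma_n.
    Positive values move the receiver fault toward failure (a trigger);
    negative values form a stress shadow."""
    return delta_tau + mu_eff * delta_sigma_n

# Example: 0.1 MPa shear increase combined with 0.05 MPa of clamping
dcfs = coulomb_stress_change(0.1, -0.05)  # -> 0.08 MPa, a (weak) trigger
```

    In practice the two stress-change terms come from an elastic dislocation model of the source earthquake resolved onto the receiver fault geometry; the function above only combines them.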

  20. Monogenetic volcanic hazards and assessment

    NASA Astrophysics Data System (ADS)

    Connor, C.; Connor, L. J.; Richardson, J. A.

    2012-12-01

    Many of the Earth's major cities are built on the products of monogenetic volcanic eruptions and within geologically active basaltic volcanic fields. These cities include Mexico City (Mexico), Auckland (New Zealand), Melbourne (Australia), and Portland (USA), to name a few. Volcanic hazards in these areas are complex and involve the potential formation of new volcanic vents and associated hazards, such as lava flows, tephra fallout, and ballistic impacts. Hazard assessment is complicated by the low recurrence rate of volcanism in most volcanic fields. We have developed a two-stage process for probabilistic modeling of monogenetic volcanic hazards. The first stage is an estimation of the possible locations of future eruptive vents, based on kernel density estimation, together with the recurrence rate of volcanism, estimated by Monte Carlo simulation while accounting for uncertainties in age determinations. The second stage is convolution of this spatial density / recurrence rate model with hazard codes that model lava inundation, tephra fallout, and ballistic impacts. A methodology using this two-stage approach is presented to estimate lava flow hazard in several monogenetic volcanic fields, including at a nuclear power plant site near the Shamiram Plateau, a Quaternary volcanic field in Armenia. The location of possible future vents is determined by estimating spatial density from a distribution of 18 mapped vents using a 2-D elliptical Gaussian kernel function. The SAMSE method, a modified asymptotic mean squared error approach, uses the distribution of known eruptive vents to optimally determine a smoothing bandwidth for the Gaussian kernel function. The result is a probability map of vent density. A large random sample (N=10000) of vent locations is drawn from this probability map. For each randomly sampled vent location, a lava flow inundation model is executed. Lava flow input parameters (volume and average thickness) are determined from distributions fit to field observations of the low
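
    The first stage of the workflow can be sketched in simplified form. This pure-Python illustration substitutes an isotropic Gaussian kernel with a hand-picked bandwidth for the 2-D elliptical SAMSE-optimized kernel described above, and uses synthetic vent coordinates:

```python
# Sketch of stage one: (1) build a vent spatial-density map from mapped
# vents by kernel density estimation, (2) draw random future vent
# locations from that probability map (Monte Carlo sampling).
import math
import random

def vent_density(grid, vents, h):
    """Isotropic Gaussian KDE evaluated at each grid point; h = bandwidth."""
    dens = []
    for gx, gy in grid:
        s = sum(math.exp(-((gx - vx) ** 2 + (gy - vy) ** 2) / (2 * h * h))
                for vx, vy in vents)
        dens.append(s / (2 * math.pi * h * h * len(vents)))
    total = sum(dens)
    return [d / total for d in dens]  # normalised into a probability map

def sample_vents(grid, probs, n, seed=0):
    """Draw n candidate future vent locations from the probability map."""
    rng = random.Random(seed)
    return rng.choices(grid, weights=probs, k=n)

grid = [(x / 10, y / 10) for x in range(20) for y in range(20)]
vents = [(0.5, 0.5), (0.7, 0.6), (1.2, 1.0)]   # synthetic mapped vents
probs = vent_density(grid, vents, h=0.2)
future = sample_vents(grid, probs, n=1000)
```

    In the full method, each sampled location would then feed the lava flow inundation code (stage two); here the sample is simply returned.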

  1. Magnetohydrodynamics and its hazard assessment

    NASA Astrophysics Data System (ADS)

    Chan, W.-T.

    1981-11-01

    Potential occupational and environmental hazards of a typical combined open-cycle MHD/steam cycle power plant are critically assessed on the basis of direct and indirect research information. Among the potential occupational hazards, explosion at the coal feed system or at the superconducting magnet, combustor rupture in a confined pit, high-intensity dc magnetic field exposure at the channel, and combustion products leakage from the pressurized systems are of primary concern. While environmental emissions of SO(x), NO(x), and fine particulates are considered under control at experimental scale, control effectiveness at high-capacity operation remains uncertain. Gaseous emission of some highly toxic trace elements, including radioactive species, may be of concern without a gas-cleaning device in the MHD design.

  2. Earthquake hazards to domestic water distribution systems in Salt Lake County, Utah

    USGS Publications Warehouse

    Highland, Lynn M.

    1985-01-01

    A magnitude-7.5 earthquake occurring along the central portion of the Wasatch Fault, Utah, may cause significant damage to Salt Lake County's domestic water system. This system is composed of water treatment plants, aqueducts, distribution mains, and other facilities that are vulnerable to ground shaking, liquefaction, fault movement, and slope failures. Recent investigations into surface faulting, landslide potential, and earthquake intensity provide basic data for evaluating the potential earthquake hazards to water-distribution systems in the event of a large earthquake. Water supply system components may be vulnerable to one or more earthquake-related effects, depending on site geology and topography. Case studies of water-system damage by recent large earthquakes in Utah and in other regions of the United States offer valuable insights for evaluating water system vulnerability to earthquakes.

  3. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented becomes too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at

  4. Hazards assessment for the Hazardous Waste Storage Facility

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency.
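
    The screening step described above amounts to comparing each material's inventory against its threshold quantity. A hypothetical sketch; the quantities and thresholds below are illustrative placeholders, not the actual HWSF inventory or DOE Order 5500.3A values:

```python
# Threshold-quantity screening: flag any material whose on-site inventory
# exceeds its threshold, so it receives a release-scenario analysis.
inventory_kg = {"lead": 1200.0, "mercury": 80.0, "acetone": 5.0}   # assumed
threshold_kg = {"lead": 500.0, "mercury": 50.0, "acetone": 100.0}  # assumed

exceeding = sorted(m for m, qty in inventory_kg.items()
                   if qty > threshold_kg[m])
# With these illustrative numbers, lead and mercury exceed their
# thresholds and would proceed to consequence analysis.
```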

  5. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  6. NGNP SITE 2 HAZARDS ASSESSMENT

    SciTech Connect

    Wayne Moe

    2011-10-01

    The Next Generation Nuclear Plant (NGNP) Project, initiated at Idaho National Laboratory (INL) by the U.S. Department of Energy pursuant to the 2005 Energy Policy Act, is based on research and development activities supported by the Generation IV Nuclear Energy Systems Initiative. The principal objective of the NGNP Project is to support commercialization of the high temperature gas-cooled reactor (HTGR) technology. The HTGR is a helium-cooled and graphite-moderated reactor that can operate at temperatures much higher than those of conventional light water reactor (LWR) technologies. Accordingly, it can be applied in many industrial applications as a substitute for burning fossil fuels, such as natural gas, to generate process heat in addition to producing electricity, which is the principal application of current LWRs. Nuclear energy in the form of LWRs has been used in the U.S. and internationally principally for the generation of electricity. However, because the HTGR operates at higher temperatures than LWRs, it can be used to displace the use of fossil fuels in many industrial applications. It also provides a carbon emission-free energy supply. For example, the energy needs for the recovery and refining of petroleum, for the petrochemical industry, and for production of transportation fuels and feedstocks using coal conversion processes require process heat provided at temperatures approaching 800 °C. This temperature range is readily achieved by the HTGR technology. This report summarizes a site assessment authorized by INL under the NGNP Project to determine hazards and potential challenges that site owners and HTGR designers need to be aware of when developing the HTGR design for co-location at industrial facilities, and to evaluate the site for suitability considering certain site characteristics. 
The objectives of the NGNP site hazard assessments are to do an initial screening of representative sites in order to identify potential challenges and restraints

  7. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances for process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than an order of magnitude higher than that associated with internal failure causes. Critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to containment loss. Failure of minor process equipment having a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes.
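
    Equipment-vulnerability models of the kind the study compares are commonly expressed as lognormal fragility curves giving the probability of reaching a damage state at a given peak ground acceleration. A minimal sketch; the median capacities and dispersion below are illustrative placeholders, not the paper's fitted values:

```python
# Lognormal fragility: P(damage state reached | PGA) = Phi(ln(pga/median)/beta)
import math

def fragility(pga_g, median_g, beta):
    """Probability of reaching the damage state at the given PGA (in g)."""
    z = math.log(pga_g / median_g) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative comparison at PGA = 0.3 g: an atmospheric tank with a low
# median capacity is far more likely to lose containment than a
# pressurized tank with a higher median capacity.
p_atm = fragility(0.3, median_g=0.30, beta=0.6)   # = 0.5 at its median
p_pres = fragility(0.3, median_g=0.80, beta=0.6)  # much smaller
```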

  8. Identification of Potential Hazard using Hazard Identification and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Sari, R. M.; Syahputri, K.; Rizkya, I.; Siregar, I.

    2017-03-01

    This research was conducted at a paper manufacturing company whose products are used as cigarette paper. During production, the company's machines and equipment are operated by workers, all of whom may potentially be injured; such exposures are known as potential hazards. Hazard identification and risk assessment is one part of a safety and health program within the risk-management stage, and is very important for preventing occupational injuries and diseases resulting from work. The problem addressed by this research is that the potential hazards and risks faced by workers during the production process had not been identified. The purpose of this study was therefore to identify potential hazards using hazard identification and risk assessment methods, with the risk assessment based on severity criteria and the probability of an accident. The research identified 23 potential hazards of varying severity and probability. A Risk Assessment Code (RAC) was then determined for each potential hazard, yielding 3 extreme risks, 10 high risks, 6 medium risks, and 3 low risks.
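
    A Risk Assessment Code is typically assigned by crossing a severity rating with a probability rating in a lookup matrix. A hedged sketch; the 4x4 matrix and category names below follow a common generic layout and are not necessarily the exact matrix used in this study:

```python
# Generic Risk Assessment Code (RAC) lookup: severity rows x probability columns.
SEVERITY = ["catastrophic", "critical", "marginal", "negligible"]   # rows
PROBABILITY = ["frequent", "probable", "occasional", "remote"]      # columns

RAC_MATRIX = [
    # frequent   probable   occasional  remote
    ["extreme", "extreme", "high",     "medium"],  # catastrophic
    ["extreme", "high",    "medium",   "low"],     # critical
    ["high",    "medium",  "medium",   "low"],     # marginal
    ["medium",  "low",     "low",      "low"],     # negligible
]

def rac(severity, probability):
    """Look up the Risk Assessment Code for a hazard's two ratings."""
    return RAC_MATRIX[SEVERITY.index(severity)][PROBABILITY.index(probability)]

code = rac("critical", "occasional")  # -> "medium"
```

    Each of the 23 identified hazards would be scored this way and then grouped by code to produce the extreme/high/medium/low tallies reported above.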

  9. A fractal approach to probabilistic seismic hazard assessment

    NASA Technical Reports Server (NTRS)

    Turcotte, D. L.

    1989-01-01

    The definition of a fractal distribution is that the number of objects (events) N with a characteristic size greater than r satisfies the relation N ∝ r^(-D), where D is the fractal dimension. The applicability of a fractal relation implies that the underlying physical process is scale-invariant over the range of applicability of the relation. The empirical frequency-magnitude relation for earthquakes defining a b-value is a fractal relation with D = 2b. Accepting the fractal distribution, the level of regional seismicity can be related to the rate of regional strain and the magnitude of the largest characteristic earthquake. High levels of seismic activity indicate either a large regional strain or a low-magnitude maximum characteristic earthquake (or both). If the regional seismicity has a weak time dependence, the approach can be used to make probabilistic seismic hazard assessments.
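
    The relation D = 2b connects the fractal dimension directly to a catalogue statistic. A sketch using Aki's maximum-likelihood b-value estimator (a standard formula, though not named in the abstract); the magnitudes below are synthetic illustration data:

```python
# Estimate the Gutenberg-Richter b-value by maximum likelihood
# (b = log10(e) / (mean(M) - Mmin), Aki 1965), then D = 2b.
import math

def b_value(mags, m_min):
    """Maximum-likelihood b-value for magnitudes at or above m_min."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

mags = [4.0, 4.1, 4.3, 4.2, 4.8, 5.1, 4.0, 4.5, 4.4, 6.0]  # synthetic
b = b_value(mags, m_min=4.0)
D = 2.0 * b  # fractal dimension implied by the frequency-magnitude relation
```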

  10. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  11. WEB-based System for Aftershock Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Baranov, Sergey; Shebalin, Peter

    2017-04-01

    The first version of a web-based system for automatic aftershock hazard assessment is available at http://afcast.org/afcast. The system software downloads earthquake data every 2 hours from the ANSS Comprehensive Earthquake Catalog (ComCat, http://earthquake.usgs.gov/data/comcat/) provided online by the USGS. Currently the system is aimed at assessing the hazard of M5.5+ aftershocks following earthquakes of M6.5+. Access to the system is limited to registered users. First, the system estimates, in quasi-real-time mode, the area where strong aftershocks are expected. This area is modeled by an ellipse and by a stadium (the locus of points within a given distance of a line segment), both centered and oriented according to the main shock rupture as estimated from the epicenters of the first 12 hours of aftershocks. The sizes of the areas are controlled by the fraction q of earthquakes, during the 12 hours after the mainshock, falling inside the enclosing circle of radius 0.03 x 10^(M/2). The chosen q-values are based on a retrospective (1980-2015) analysis of the error diagram and imply three forecasting strategies: "soft", "neutral", and "hard". The "soft" strategy minimizes false alarms at a reasonable rate of failures to predict. The "hard" strategy, on the contrary, minimizes the rate of failures to predict at a reasonable area of alarms. The "neutral" strategy equalizes the two types of error. Three concentric ellipses or stadiums may serve as benchmarks for choosing among the three strategies according to specific hazard reduction measures. Next, the system estimates the period and magnitude of the strongest aftershock expected inside the alarm area. This research was carried out at the expense of the Russian Science Foundation (Project No. 16-17-00093).
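
    The stadium-shaped alarm zone can be sketched as a point-in-region test: a point is inside if its distance to the rupture segment is at most the radius. The radius scaling 0.03 x 10^(M/2) follows the abstract; the coordinate units and the sample geometry below are assumptions for illustration:

```python
# Stadium alarm zone: all points within distance R of the rupture segment a-b.
import math

def dist_to_segment(p, a, b):
    """Euclidean distance from point p to the segment with endpoints a, b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                 # clamp to the segment
    cx, cy = ax + t * dx, ay + t * dy         # closest point on the segment
    return math.hypot(px - cx, py - cy)

def in_stadium(p, a, b, mainshock_mag):
    """True if p lies inside the alarm stadium for the given mainshock."""
    radius = 0.03 * 10 ** (mainshock_mag / 2)  # scaling from the abstract
    return dist_to_segment(p, a, b) <= radius

# Hypothetical M6.5 mainshock with rupture segment from (0, 0) to (1, 0)
inside = in_stadium((0.5, 1.0), (0, 0), (1, 0), 6.5)
```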

  12. Seismic hazard assessment in Central Asia using smoothed seismicity approaches

    NASA Astrophysics Data System (ADS)

    Ullah, Shahid; Bindi, Dino; Zuccolo, Elisa; Mikhailova, Natalia; Danciu, Laurentiu; Parolai, Stefano

    2014-05-01

    Central Asia has a long history of frequent moderate to large seismicity and is therefore considered one of the most seismically active regions in the world, with a high hazard level. In the global-scale hazard map produced by the GSHAP project in 1999 (Giardini, 1999), Central Asia is characterized by peak ground accelerations for a 475-year return period as high as 4.8 m/s^2. Central Asia was therefore selected as the target area for the EMCA project (Earthquake Model Central Asia), a regional project of GEM (Global Earthquake Model). In the framework of EMCA, a new generation of seismic hazard maps is foreseen in terms of macroseismic intensity, in turn to be used to obtain seismic risk maps for the region. An intensity prediction equation (IPE) was therefore developed for the region, based on the distribution of intensity data for different earthquakes that occurred in Central Asia since the end of the 19th century (Bindi et al. 2011). The same observed intensity distribution was used to assess the seismic hazard following the site approach (Bindi et al. 2012). In this study, we present a probabilistic seismic hazard assessment of Central Asia in terms of MSK-64 intensity, based on two kernel estimation methods. We consider the smoothed seismicity approaches of Frankel (1995), modified to use the adaptive kernel proposed by Stock and Smith (2002), and of Woo (1996), modified to consider a grid of sites and to estimate a separate bandwidth for each site. Activity rate maps from the Frankel approach are shown, illustrating the effects of fixed and adaptive kernels. The hazard is estimated for rock site conditions at a 10% probability of exceedance in 50 years. A maximum intensity of about 9 is observed in the Hindu Kush region.
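
    The fixed-kernel (Frankel-style) smoothing of gridded earthquake counts into activity rates can be illustrated in one dimension; real implementations smooth 2-D grids, and the counts and correlation distance below are synthetic:

```python
# Frankel (1995)-style fixed-kernel smoothing: each cell's smoothed count is
# a Gaussian-weighted average of neighbouring counts with correlation
# distance c:  n_i = sum_j n_j exp(-d_ij^2/c^2) / sum_j exp(-d_ij^2/c^2).
import math

def smooth_counts(counts, c):
    """Smooth gridded event counts with a fixed Gaussian kernel."""
    n = len(counts)
    smoothed = []
    for i in range(n):
        weights = [math.exp(-((i - j) ** 2) / (c * c)) for j in range(n)]
        num = sum(counts[j] * weights[j] for j in range(n))
        smoothed.append(num / sum(weights))
    return smoothed

# Synthetic counts with two clusters of seismicity
rates = smooth_counts([0, 0, 5, 0, 0, 2, 0], c=1.5)
```

    An adaptive kernel (as in Stock and Smith, 2002) would instead vary c from cell to cell, shrinking it where seismicity is dense.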

  13. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional, and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is on the west coast of Sumatra, the south coast of Java, and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
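
    The Monte Carlo core of such an assessment can be sketched in toy form: simulate many years of Gutenberg-Richter-distributed events, attach an uncertain coastal wave height to each (the aleatory scatter), and count years in which a threshold is exceeded. Every numerical parameter below, including the height model, is an illustrative assumption, not a value from the study:

```python
# Toy Monte Carlo exceedance calculation in the spirit of Monte Carlo
# PSHA adapted to tsunami.
import math
import random

rng = random.Random(42)
YEARS = 20_000          # simulated catalogue length
RATE_M5 = 0.2           # assumed mean annual rate of M>=5 source events
B = 1.0                 # assumed Gutenberg-Richter b-value

def sample_magnitude():
    """Inverse-transform sample of the GR magnitude distribution above M5."""
    return 5.0 - math.log10(1.0 - rng.random()) / B

exceed_years = 0
for _ in range(YEARS):
    # approximate Poisson event count via 100 Bernoulli trials
    n = sum(1 for _ in range(100) if rng.random() < RATE_M5 / 100)
    # toy magnitude-to-height model with lognormal aleatory scatter
    heights = [10 ** (sample_magnitude() - 7.0) * rng.lognormvariate(0, 0.5)
               for _ in range(n)]
    if any(h > 0.5 for h in heights):
        exceed_years += 1

annual_prob = exceed_years / YEARS  # P(coastal height > 0.5 m in a year)
```

    A real PTHA replaces the toy height model with tsunami propagation modelling and wraps this loop in a logic tree over source-zone alternatives.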

  14. Challenges in Assessing Seismic Hazard in Intraplate Europe

    NASA Astrophysics Data System (ADS)

    Hintersberger, E.; Kuebler, S.; Landgraf, A.; Stein, S. A.

    2014-12-01

    Intraplate regions are often characterized by scattered, clustered, and migrating seismicity and by the occurrence of low-strain areas next to high-strain ones. Increasing evidence for large paleoearthquakes in such regions, together with population growth and the development of critical facilities, calls for better assessments of earthquake hazards. Existing seismic hazard assessment for intraplate Europe is based on instrumental and historical seismicity of the past 1000 years, as well as some active-fault data. These observations face important limitations due to the quantity and quality of the available databases. Even considering the long record of historical events in some populated areas of Europe, this thousand-year time span likely fails to capture some faults' typical large-event recurrence intervals, which are on the order of tens of thousands of years. Paleoseismology helps lengthen the observation window, but only produces point measurements, and preferentially in regions suspected to be seismically active. As a result, the expected maximum magnitudes of future earthquakes are quite uncertain, likely to be underestimated, and earthquakes are likely to occur in unexpected locations. These issues arise in particular in the heavily populated Rhine Graben and Vienna Basin areas, and in considering the hazard to critical facilities like nuclear power plants posed by low-probability events.

  15. Multiple-site estimations in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Sokolov, Vladimir; Ismail-Zadeh, Alik

    2016-04-01

    We analyze specific features of multiple-site probabilistic seismic hazard assessment (PSHA), i.e., the annual rate at which a ground motion level is exceeded in at least one of several sites of interest located within an area or along a linear extended object. The relation between multiple-site hazard estimates and the strong ground-motion records obtained during the 2008 Wenchuan (China) Mw 7.9 earthquake is discussed. The ground-motion records may be considered an example of ground motion exceeding the design level estimated using classical point-wise PSHA. We show that multiple-site hazard (MSH) assessment, when performed for the standard return period of 475 years, provides reasonable estimates of the ground motions that may occur during an earthquake whose parameters are close to the maximum possible events accepted in PSHA for the region. Thus MSH assessment may be useful for estimating maximum considered earthquake ground motion for a territory, taking its spatial extent into account.
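
    The basic difference between point-wise and multiple-site hazard can be sketched under the simplifying assumption that exceedances at the sites are independent in a given year (real ground motions are spatially correlated, so this is only an upper-bound illustration):

```python
# "At least one site" exceedance probability for k sites, assuming
# independence between sites (a deliberate simplification).
def multi_site_prob(p_single, k):
    """P(level exceeded at >= 1 of k sites in a year)."""
    return 1.0 - (1.0 - p_single) ** k

p1 = 1.0 / 475                  # point-wise annual exceedance, 475-yr motion
p10 = multi_site_prob(p1, 10)   # considerably larger for 10 sites
```

    This is why a design level that looks rare at any single site can be exceeded somewhere within an extended territory far more often, which is the effect the Wenchuan records illustrate.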

  16. Probabilistic Seismic Hazard assessment for Sultanate of Oman

    NASA Astrophysics Data System (ADS)

    El Hussain, I. W.; Deif, A.; El-Hady, S.; Toksoz, M. N.; Al-Jabri, K.; Al-Hashmi, S.; Al-Toubi, K. I.; Al-Shijbi, Y.; Al-Saifi, M.

    2010-12-01

    Seismic hazard assessment for Oman is conducted using a probabilistic approach. Probabilistic Seismic Hazard Assessment (PSHA) has been performed within a logic tree framework. An earthquake catalogue for Oman was compiled and declustered to include only independent earthquakes. The declustered catalogue was used to define a seismotectonic source model with 26 source zones that characterize earthquakes in the tectonic environments in and around Oman. The recurrence parameters for all seismogenic zones were determined using the doubly bounded exponential distribution, except for the Makran subduction zone, which was modeled using the characteristic distribution. Maximum earthquakes on known faults were determined geologically; for the remaining zones they were determined statistically from the compiled catalogue. Horizontal ground accelerations in terms of the geometric mean were calculated using ground-motion prediction relationships developed from seismic data obtained from the shallow active environment, the stable craton environment, and subduction earthquakes. In this analysis, we used alternative seismotectonic source models, maximum magnitudes, and attenuation models, weighted to account for epistemic uncertainty. The application of this methodology leads to 5%-damped seismic hazard maps at rock sites for 72-, 475-, and 2475-year return periods, for spectral accelerations at periods of 0.0 (corresponding to peak ground acceleration), 0.1, 0.2, 0.3, 1.0, and 2.0 s. Mean and 84th-percentile acceleration contour maps are presented. The results are also displayed as uniform hazard spectra for rock sites in the cities of Khasab, Diba, Sohar, Muscat, Nizwa, Sur, and Salalah in Oman and the cities of Abu Dhabi and Dubai in the UAE. The PGA across Oman ranges from 20 cm/s^2 in the mid-west to 115 cm/s^2 in the northern part for the 475-year return period, and between 40 cm/s^2 and 180 cm/s^2 for 2475 years
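
    The logic-tree combination step, weighting alternative models to handle epistemic uncertainty, reduces to a weighted average of hazard curves. A minimal sketch; the branch weights and annual exceedance probabilities below are illustrative placeholders, not values from the Oman study:

```python
# Combine alternative-model hazard curves with logic-tree branch weights
# into a weighted-mean hazard curve.
branches = [
    # (weight, annual exceedance probs at PGA levels [50, 100, 150] cm/s^2)
    (0.5, [2.0e-3, 8.0e-4, 3.0e-4]),
    (0.3, [3.0e-3, 1.2e-3, 5.0e-4]),
    (0.2, [1.5e-3, 6.0e-4, 2.0e-4]),
]
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12  # weights sum to 1

mean_curve = [sum(w * probs[i] for w, probs in branches) for i in range(3)]
# mean_curve decreases with PGA level, as a hazard curve must
```

    Fractile curves (such as the 84th percentile mapped in the study) come from the distribution of branch values at each level rather than the mean.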

  17. Microzonation of Seismic Hazards and Estimation of Human Fatality for Scenario Earthquakes in Chianan Area, Taiwan

    NASA Astrophysics Data System (ADS)

    Liu, K. S.; Chiang, C. L.; Ho, T. T.; Tsai, Y. B.

    2015-12-01

    In this study, we assess seismic hazards in the 57 administrative districts of the Chianan area, Taiwan, in the form of ShakeMaps, and estimate potential human fatalities from scenario earthquakes on the three Type I active faults in this area. Two regions show high MMI intensity, greater than IX, in the map of maximum ground motion. One is in the Chiayi area around Minsyong, Dalin, and Meishan, due to the presence of the Meishan fault and large site amplification factors, which reach 2.38 and 2.09 for PGA and PGV, respectively, in Minsyong. The other is in the Tainan area around Jiali, Madou, Siaying, Syuejia, Jiangjyun, and Yanshuei, due to a disastrous Mw 6.83 earthquake that occurred in 1862 near the border between Jiali and Madou, and large site amplification factors, which reach 2.89 and 2.97 for PGA and PGV, respectively, in Madou. In addition, the probabilities over 10-, 30-, and 50-year periods of seismic intensity exceeding MMI VIII in these areas are greater than 45%, 80%, and 95%, respectively. Moreover, the probability distributions show values greater than 95% over a 10-year period for seismic intensity corresponding to CWBI V and MMI VI in central and northern Chiayi and northern Tainan. Finally, from the estimation of human fatalities for scenario earthquakes on the three active faults in the Chianan area, the number of fatalities increases rapidly for people above age 45. Compared to the 1946 Hsinhua earthquake, the number of fatalities estimated for the scenario earthquake on the Hsinhua active fault is significantly higher; this higher estimate is nevertheless plausible considering the probable contributing factors. We therefore urge the local and central governments to pay special attention to seismic hazard mitigation in this highly urbanized area with a large number of old buildings.
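
    The quoted 10-, 30-, and 50-year exceedance probabilities follow from the standard Poisson conversion of an annual rate into a window probability. A sketch; the 20-year return period below is an illustrative value, not one from the study:

```python
# Poisson model: P(at least one exceedance in t years) = 1 - exp(-t / T),
# where T is the mean return period of the intensity level.
import math

def exceedance_prob(t_years, return_period):
    """Probability of exceeding the level at least once in t_years."""
    return 1.0 - math.exp(-t_years / return_period)

T = 20.0  # assumed return period of the intensity level, in years
p10, p30, p50 = (exceedance_prob(t, T) for t in (10, 30, 50))
# probabilities grow with the exposure window: p10 < p30 < p50
```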

  18. Natural hazard assessment through an oriented GIS

    NASA Astrophysics Data System (ADS)

    Giammarinaro, M. S.; Alletti, M.; Azzara, R. M.; Canzoneri, V.; Maiorana, S.; Rovelli, A.; Tertulliani, A.; Vallone, P.

    2003-04-01

    Natural and in particular seismic hazard assessment for urban areas, characterized by variable geological and physical-mechanical properties, requires high spatial resolution. This difficult task becomes feasible when a great deal of stratigraphic, geotechnical, and macroseismic data are available. However, these data are affected by inhomogeneities and large dispersion, deriving from the different sources and acquisition procedures. Suitable data processing, assisted by dedicated tools, makes data homogenization possible and reduces the dispersion, yielding a reliable dataset that is the starting point for risk assessment. The mutual spatial relationships and the correlations between different kinds of data are easily visualized in a GIS framework, increasing the information available about the studied area. In an oriented GIS, dedicated tools and query functions carry out specific elaborations useful for natural hazard evaluation. For these reasons, a dedicated GIS is the optimal tool to identify and delimit, at high spatial resolution, areas subject to higher natural hazard. Such a GIS, called City-GIS, has been developed at the Department of Geology of Palermo University. It is equipped in particular with tools and query functions devoted to seismic hazard. A very reliable dataset concerning the urban area of Palermo was elaborated through City-GIS and stored in it. City-GIS was used successfully during the sequence following the September 6, 2002, ML 5.6 earthquake. During the emergency, the system was an efficient support tool for correlating surface geology with the damage concentration observed in the southeastern sector of the city. Moreover, City-GIS was used to select the optimal sites where eight seismological stations were installed to quantify the variability of seismic response across the city. The recorded data confirmed that large variations of ground motion occur in the urban area of Palermo, according to predictions based

  19. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice, Italy, 8 November 2006 The term ``hazard'' can lead to some misunderstanding. In English, hazard has the generic meaning ``potential source of danger,'' but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative sense, namely, ``the probability of a certain hazardous event in a specific time-space window.'' However, many volcanologists still use ``hazard'' and ``volcanic hazard'' in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled ``Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes'' (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is ``probabilistic volcanic hazard assessment'' (PVHA).

  20. Characterisation of active faulting and earthquake hazard in the Mongolian Altay Mountains based on previously unknown ancient earthquake surface ruptures

    NASA Astrophysics Data System (ADS)

    Gregory, L. C.; Walker, R.; Nissen, E.; Mac Niocaill, C.; Gantulga, B.; Amgalan, B.

    2012-12-01

Earthquakes in continental collision zones are typically distributed across a region that may be several thousands of kilometres away from the main collisional margin. This far-field deformation is poorly understood in terms of how strain is distributed onto upper crustal faults, particularly because active faults can be difficult to identify in regions where historical seismicity is sparse. The collision between India and Asia forms the most impressive example of active continental deformation on Earth, with several zones of faulting and uplift extending across a region over 2500 km wide. The Altay Mountains, in western Mongolia, are at the northern edge of the India-Asia collision zone. Active dextral strike-slip faults in the Altay have produced M 8 earthquakes (such as the 1931 Fu Yun earthquake), and according to GPS measurements, the region accommodates approximately 7 mm/yr of shortening. Surface ruptures of prehistoric earthquakes are exceptionally well preserved due to the cold and arid climate of the Altay. Observed surface ruptures are an effective extension of the historical seismic record, because the size and expression of ruptures may reveal important characteristics of the Altay active faults, such as typical earthquake magnitudes and definitive locations of active faults. We present observations of previously unknown surface ruptures and active faulting from the central Altay. The moment magnitudes of the ancient earthquakes are estimated from the length of the ruptures using classic earthquake scaling laws. The newly discovered ruptures are combined with previously described earthquake ruptures to estimate the combined strike-slip rates of the Altay faults over the past ~1000 years on the basis of total moment release. This strike-slip rate will be discussed in the context of modern-day estimates of shortening rate and the implications for the earthquake hazard in western Mongolia.
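The magnitude-from-rupture-length step described above can be sketched with one of the classic scaling laws. A minimal example using the Wells and Coppersmith (1994) regression for strike-slip surface rupture length; the abstract does not say which scaling relation was actually used, so this particular regression is an assumption:

```python
import math

def magnitude_from_rupture_length(srl_km: float) -> float:
    """Moment magnitude from surface rupture length (km) using the
    Wells & Coppersmith (1994) strike-slip regression:
        Mw = 5.16 + 1.12 * log10(SRL)."""
    return 5.16 + 1.12 * math.log10(srl_km)

# A 100-km strike-slip surface rupture implies roughly Mw 7.4:
print(round(magnitude_from_rupture_length(100.0), 2))
```

Such regressions carry substantial scatter (roughly a quarter of a magnitude unit), which matters when converting rupture lengths into moment-release budgets.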

  1. Assessing volcanic hazards with Vhub

    NASA Astrophysics Data System (ADS)

    Palma, J. L.; Charbonnier, S.; Courtland, L.; Valentine, G.; Connor, C.; Connor, L.

    2012-04-01

Vhub (online at vhub.org) is a virtual organization and community cyberinfrastructure designed for collaboration in volcanology research, education, and outreach. One of the core objectives of this project is to accelerate the transfer of research tools to organizations and stakeholders charged with volcano hazard and risk mitigation (such as volcano observatories). Vhub offers a clearinghouse for computational models of volcanic processes and data analysis, documentation of those models, and capabilities for online collaborative groups focused on issues such as code development, configuration management, benchmarking, and validation. Vhub supports computer simulations and numerical modeling at two levels: (1) some models can be executed online via Vhub, without needing to download code and compile on the user's local machine; (2) other models are available only for download and offline use on the user's computer. Vhub also has wikis, blogs and group functions around specific topics to encourage collaboration, communication and discussion. Some of the simulation tools currently available to Vhub users are: Energy Cone (rapid delineation of the impact zone of pyroclastic density currents), Tephra2 (tephra dispersion forecast tool), Bent (atmospheric plume analysis), Hazmap (simulation of the sedimentation of volcanic particles) and TITAN2D (mass flow simulation tool). The list of online simulations available on Vhub is expected to expand considerably as the volcanological community becomes more involved in the project. This presentation focuses on the implementation of online simulation tools, and other Vhub features, for assessing volcanic hazards following approaches similar to those reported in the literature. Attention is drawn to the minimum computational resources needed by the user to carry out such analyses, and to the tools and media provided to facilitate the effective use of Vhub's infrastructure for hazard and risk assessment. 
Currently the project

  2. Damage-consistent hazard assessment - the revival of intensities

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2016-04-01

Proposed keynote speech (introduction of session). Current civil engineering standards for residential buildings in many countries are based on (frequently probabilistic) seismic hazard assessments using ground motion parameters such as peak ground acceleration or pseudo-displacement as hazard parameters. This approach has its roots in the still widespread force-based design of structures, which uses simplified methods like linear response spectra in combination with equivalent static force procedures. In engineering practice this has led to problems, because it is not economical to design structures against the maximum forces of earthquakes; furthermore, a completely linear-elastic response of structures is seldom required. Different types of reduction factors (performance-dependent response factors), accounting for example for overstrength, structural redundancy and structural ductility, have been developed in different countries to compensate for the use of simplified and conservative design methods. The practical consequence is that the methods used in engineering, as well as the output of hazard assessment studies, are poorly related to the physics of damage. Reliable predictions of the response of structures under earthquake loading are not feasible with such simplified design methods. Depending on the type of structure, damage may be controlled by hazard parameters other than ground motion acceleration. Furthermore, a realistic risk assessment has to be based on reliable predictions of damage; this is crucial for effective decision-making. This opens the space for a return to the use of intensities as the key output parameter of seismic hazard assessment. Site intensities (e.g., EMS-98) correlate very well with the damage of structures. They can easily be converted into the required set of engineering parameters, or even directly into earthquake time-histories suitable for structural analysis.
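The conversion between intensities and engineering ground-motion parameters mentioned above is usually done with a ground-motion-to-intensity conversion equation (GMICE). A minimal sketch of the common linear form MMI = c1 + c2 log10(PGA); the default coefficients below are of the order reported in published GMICE studies (e.g., Worden et al., 2012) but should be treated as illustrative, since real relationships are typically bilinear and region-dependent:

```python
import math

def intensity_from_pga(pga_cm_s2: float, c1: float = 1.78, c2: float = 1.55) -> float:
    """Macroseismic intensity from PGA (cm/s^2) via a linear GMICE,
        MMI = c1 + c2 * log10(PGA).
    Coefficient values here are illustrative, not a vetted regional fit."""
    return c1 + c2 * math.log10(pga_cm_s2)

# 100 cm/s^2 (~0.1 g) maps to an intensity near V with these coefficients:
print(round(intensity_from_pga(100.0), 2))
```

In practice such equations are applied in both directions, so the inverse (PGA from intensity) and its uncertainty matter just as much as the forward form.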

  3. Hazard Assessment in a Big Data World

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir; Nekrasova, Anastasia

    2017-04-01

Open data in a Big Data World provide unprecedented opportunities for enhancing scientific studies and better understanding of the Earth System. At the same time, they open wide avenues for deceptive associations in inter- and transdisciplinary data, leading to erroneous predictions that are unacceptable for implementation. Even advanced tools of data analysis may lead to wrong assessments when inappropriately used to describe the phenomenon under consideration. A (self-)deceptive conclusion can be avoided only by verification of candidate models in experiments on empirical data. Seismology is not an exception. Moreover, the seismic evidence accumulated to date clearly demonstrates that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when subjected to objective hypothesis testing. In many cases of seismic hazard assessment (SHA), either probabilistic or deterministic, time-independent or short-term, claims of a high forecasting potential for a model are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers; this situation creates numerous points of deception and resulting controversies. So far, most, if not all, standard probabilistic methods to assess seismic hazard and the associated risks are based on subjective, commonly unrealistic, and even erroneous assumptions about seismic recurrence, and none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Accurate testing against real observations must be done before claiming that areas and/or times are seismically hazardous. The rates of errors of the first and second kind in such a comparison permit evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a user-defined cost-benefit function. 
The information obtained in testing experiments may supply
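The evaluation via errors of the first and second kind described above is commonly visualized with a Molchan error diagram, which plots the fraction of missed target events against the fraction of space-time occupied by alarms. A minimal, purely illustrative sketch for a binned forecast test; the binning scheme and flag arrays are assumptions for the example, not taken from the abstract:

```python
def molchan_point(alarm_flags, event_flags):
    """One point on a Molchan error diagram for a binned forecast test:
        tau = fraction of space-time bins covered by alarms,
        nu  = fraction of target events missed (outside any alarm).
    `alarm_flags[i]` / `event_flags[i]` mark alarm coverage and event
    occurrence in bin i."""
    n_bins = len(alarm_flags)
    tau = sum(alarm_flags) / n_bins
    events = [(a, e) for a, e in zip(alarm_flags, event_flags) if e]
    nu = sum(1 for a, _ in events if not a) / max(len(events), 1)
    return tau, nu

# Two of ten bins alarmed; one of the two events falls outside the alarms:
print(molchan_point([1, 1, 0, 0, 0, 0, 0, 0, 0, 0],
                    [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]))
```

A forecast with skill plots below the diagonal from (0, 1) to (1, 0); a random guess sits on it, which is the null hypothesis such testing is meant to reject.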

  4. Evaluating the Use of Declustering for Induced Seismicity Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Michael, A. J.

    2016-12-01

The recent dramatic seismicity rate increase in the central and eastern US (CEUS) has motivated the development of seismic hazard assessments for induced seismicity (e.g., Petersen et al., 2016). Standard probabilistic seismic hazard assessment (PSHA) relies fundamentally on the assumption that seismicity is Poissonian (Cornell, BSSA, 1968); therefore, the earthquake catalogs used in PSHA are typically declustered (e.g., Petersen et al., 2014) even though this may remove earthquakes that may cause damage or concern (Petersen et al., 2015; 2016). In some induced earthquake sequences in the CEUS, the standard declustering can remove up to 90% of the sequence, reducing the estimated seismicity rate by a factor of 10 compared to estimates from the complete catalog. In tectonic regions the reduction is often only about a factor of 2. We investigate how three declustering methods treat induced seismicity: the window-based Gardner-Knopoff (GK) algorithm, often used for PSHA (Gardner and Knopoff, BSSA, 1974); the link-based Reasenberg algorithm (Reasenberg, JGR, 1985); and a stochastic declustering method based on a space-time Epidemic-Type Aftershock Sequence model (Ogata, JASA, 1988; Zhuang et al., JASA, 2002). We apply these methods to three catalogs that likely contain some induced seismicity. For the Guy-Greenbrier, AR earthquake swarm from 2010-2013, declustering reduces the seismicity rate by factors of 6-14, depending on the algorithm. In northern Oklahoma and southern Kansas from 2010-2015, the reduction varies from factors of 1.5-20. In the Salton Trough of southern California from 1975-2013, the rate is reduced by factors of 3-20. Stochastic declustering tends to remove the most events, followed by the GK method, while the Reasenberg method removes the fewest. 
Given that declustering and choice of algorithm have such a large impact on the resulting seismicity rate estimates, we suggest that more accurate hazard assessments may be found using the complete catalog.
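The window-based Gardner-Knopoff declustering discussed above flags any event falling within a magnitude-dependent space-time window of a larger shock as an aftershock. A sketch of the standard window sizes; the coefficients are the widely used analytic fits to the original Gardner-Knopoff table, and the exact implementation used in the study may differ:

```python
import math

def gk_window(mag: float):
    """Gardner-Knopoff (1974) aftershock identification window for a
    mainshock of magnitude `mag`. Returns (distance_km, time_days),
    using the common analytic fits to the original published table."""
    dist_km = 10 ** (0.1238 * mag + 0.983)
    if mag >= 6.5:
        time_days = 10 ** (0.032 * mag + 2.7389)
    else:
        time_days = 10 ** (0.5409 * mag - 0.547)
    return dist_km, time_days

# An M5 mainshock gathers events within roughly 40 km and ~140 days:
d, t = gk_window(5.0)
print(round(d, 1), round(t, 1))
```

Because the windows grow rapidly with magnitude, a single large event in a dense induced sequence can swallow most of the catalog, which is exactly the factor-of-10 rate reduction the abstract reports.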

  5. Statistical analysis of time-dependent earthquake occurrence and its impact on hazard in the low seismicity region Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Hainzl, Sebastian; Scherbaum, Frank; Beauval, Céline

    2007-11-01

The time-dependence of earthquake occurrence is mostly ignored in standard seismic hazard assessment even though earthquake clustering is well known. In this work, we attempt to quantify the impact of more realistic dynamics on the seismic hazard estimations. We include the time and space dependences between earthquakes into the hazard analysis via Monte Carlo simulations. Our target region is the Lower Rhine Embayment, a low seismicity area in Germany. Including aftershock sequences by using the epidemic type aftershock-sequence (ETAS) model, we find that on average the hypothesis of uncorrelated random earthquake activity underestimates the hazard by 5-10 per cent. Furthermore, we show that aftershock activity of past large earthquakes can locally increase the hazard even centuries later. We also analyse the impact of the so-called long-term behaviour, assuming a quasi-periodic occurrence of main events on a major fault in that region. We find that a significant impact on hazard is only expected for the special case of a very regular recurrence of the main shocks.
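The ETAS model used above specifies a conditional seismicity rate in which every past earthquake excites its own aftershocks following the Omori-Utsu decay. A minimal temporal-only sketch; the parameter values are illustrative placeholders, not the fitted values from the study, and the model applied in the paper is spatio-temporal:

```python
import math

def etas_rate(t, catalog, mu=0.2, K=0.02, alpha=1.0, c=0.01, p=1.1, m_ref=3.0):
    """Conditional intensity of a purely temporal ETAS model at time t (days):

        lambda(t) = mu + sum_{t_i < t} K * exp(alpha*(m_i - m_ref))
                                         * (t - t_i + c)**(-p)

    `catalog` is a list of (t_i, m_i) pairs. Parameter values here are
    illustrative, not fitted."""
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_ref)) * (t - t_i + c) ** (-p)
    return rate

catalog = [(0.0, 5.5), (1.0, 4.0)]
print(etas_rate(2.0, catalog))  # elevated above the background rate mu
```

Monte Carlo hazard simulation then amounts to generating synthetic catalogs from this intensity (background events plus cascading aftershocks) and tallying ground-motion exceedances over many realizations.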

  6. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    USGS Publications Warehouse

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. This progress report summarizes the

  7. Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)

    SciTech Connect

    Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.

    2010-09-24

The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that existing non-seismic NPH assessments are consistent with current design methodology and site-specific data.

  8. The U.S. Geological Survey Earthquake Hazards Program Website: Summary of Recent and Ongoing Developments

    NASA Astrophysics Data System (ADS)

    Wald, L. A.; Zirbes, M.; Robert, S.; Wald, D.; Presgrace, B.; Earle, P.; Schwarz, S.; Haefner, S.; Haller, K.; Rhea, S.

    2003-12-01

The U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) website (http://earthquake.usgs.gov/) focuses on 1) earthquake reporting for informed decisions after an earthquake, 2) hazards information for informed decisions and planning before an earthquake, and 3) the basics of earthquake science to help the users of the information understand what is presented. The majority of website visitors are looking for information about current earthquakes in the U.S. and around the world, and the second most visited portion of the website is the education-related pages. People are eager for information, and they are most interested in "what's in my backyard?" Recent and future web developments are aimed at answering this question, making the information more relevant to users, and enabling users to more quickly and easily find the information they are looking for. Recent and/or current web developments include the new enhanced Recent Global Earthquakes and U.S. Earthquakes webpages, the Earthquake in the News system, the Rapid Accurate Tectonic Summaries (RATS), online Significant Earthquake Summary Posters (ESPs), and the U.S. Quaternary Fault & Fold Database, which are covered in greater detail in this or other sessions. Future planned developments include a consistent look across all EHP webpages, an integrated one-stop-shopping earthquake notification (EQMail) subscription webpage, new navigation tabs, and a backend database allowing the user to search for earthquake information across all the various EHP websites (on different webservers) based on a topic or region. Another goal is to eventually allow a user to input their address (Zip Code?) and in return receive all the relevant EHP information (and links to more detailed information) such as closest fault, the last significant nearby earthquake, a local seismicity map, and a local hazard map, for example. 
This would essentially be a dynamic report based on the entered location

  9. Aftereffects of Subduction-Zone Earthquakes: Potential Tsunami Hazards along the Japan Sea Coast.

    PubMed

    Minoura, Koji; Sugawara, Daisuke; Yamanoi, Tohru; Yamada, Tsutomu

    2015-10-01

The 2011 Tohoku-Oki Earthquake is a typical subduction-zone earthquake and is the fourth-largest earthquake since the beginning of instrumental observation of earthquakes in the 19th century. The 2011 Tohoku-Oki Earthquake displaced the northeast Japan island arc horizontally and vertically, and this displacement largely changed the tectonic situation of the arc from compressive to tensile. The 9th century in Japan was a period of natural hazards caused by frequent large-scale earthquakes. The aseismic tsunamis that inflicted damage on the Japan Sea coast in the 11th century were related to the occurrence of massive earthquakes that represented the final stage of a period of high seismic activity. Anti-compressive tectonics triggered by the subduction-zone earthquakes induced gravitational instability, which resulted in the generation of tsunamis caused by slope failure at the arc-back-arc boundary. The crustal displacement after the 2011 earthquake implies an increased risk of unexpected local tsunami flooding in the Japan Sea coastal areas.

  10. Assessment and Prediction of Natural Hazards from Satellite Imagery.

    PubMed

    Gillespie, Thomas W; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan

    2007-10-01

Since 2000, there have been a number of spaceborne satellites that have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth's surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate-resolution active sensors that can quantify the vertical and horizontal movement of the earth's surface. High-resolution passive sensors have been used successfully to assess flood damage, while predictive maps of flood vulnerability are possible based on physical variables collected from passive and active sensors. Recent moderate-resolution sensors are able to provide near real time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and satellite constellations with improved resolution (0.5 m to 1 m pixels for active sensors) and revisit times (daily ≤ 2.5 m resolution images from passive sensors) will be launched in the next five years, significantly improving our ability to assess and predict natural hazards from space.

  11. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  12. Increasing seismicity in the U. S. midcontinent: Implications for earthquake hazard

    USGS Publications Warehouse

    Ellsworth, William L.; Llenos, Andrea L.; McGarr, Arthur F.; Michael, Andrew J.; Rubinstein, Justin L.; Mueller, Charles S.; Petersen, Mark D.; Calais, Eric

    2015-01-01

    Earthquake activity in parts of the central United States has increased dramatically in recent years. The space-time distribution of the increased seismicity, as well as numerous published case studies, indicates that the increase is of anthropogenic origin, principally driven by injection of wastewater coproduced with oil and gas from tight formations. Enhanced oil recovery and long-term production also contribute to seismicity at a few locations. Preliminary hazard models indicate that areas experiencing the highest rate of earthquakes in 2014 have a short-term (one-year) hazard comparable to or higher than the hazard in the source region of tectonic earthquakes in the New Madrid and Charleston seismic zones.

  13. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code redaction, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main strength of this approach lies in the fact that it is presently a standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors that could significantly influence the expected motion at a site, such as site effects and source characteristics like the duration of strong motion and directivity, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation, and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for a magnitude less than 6. We focus on sites liquefiable because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions: the looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions
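Under the Poissonian occurrence assumption that the abstract identifies as the basis of standard PSHA, the probability of at least one exceedance during an exposure time follows directly from the annual exceedance rate. A short sketch:

```python
import math

def poisson_exceedance_prob(annual_rate: float, years: float) -> float:
    """Probability of at least one exceedance in `years`, given the
    annual exceedance rate, under the Poisson assumption of standard
    PSHA: P = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

# The common 10%-in-50-years design level corresponds to a return
# period of about 475 years:
print(round(poisson_exceedance_prob(1.0 / 475.0, 50.0), 3))
```

The deterministic alternative discussed in the abstract sidesteps this rate-based framing entirely and instead models the ground motion of a specific scenario earthquake.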

  14. Earthquake related tsunami hazard along the western coast of Thailand

    NASA Astrophysics Data System (ADS)

    Løvholt, F.; Bungum, H.; Harbitz, C. B.; Glimsdal, S.; Lindholm, C. D.; Pedersen, G.

    2006-11-01

The primary background for the present study was a project to assist the authorities in Thailand with the development of plans for dealing with future tsunami risk in both short- and long-term perspectives, in the wake of the devastating 26 December 2004 Sumatra-Andaman earthquake and tsunami. The study is focused on defining and analyzing a number of possible future earthquake scenarios (magnitudes 8.5, 8.0 and 7.5) with associated return periods, each one accompanied by specific tsunami modelling. Along the most affected part of the western coast of Thailand, the 2004 tsunami wave caused a maximum water level ranging from 5 to 15 m above mean sea level. These levels and their spatial distributions have been confirmed by detailed numerical simulations. The applied earthquake source is developed based on available seismological and geodetic inversions, and the simulation using the source as initial condition agrees well with sea level records and run-up observations. A conclusion from the study is that another megathrust earthquake generating a tsunami affecting the coastline of western Thailand is not likely to occur again for several hundred years. This is based in part on the assumption that the Southern Andaman Microplate Boundary near the Simeulue Islands constitutes a geologic barrier that will prohibit significant rupture across it, and in part on the decreasing subduction rates north of the Banda Aceh region. It is also concluded that the largest credible earthquake to be prepared for along the part of the Sunda-Andaman arc that could affect Thailand is, within the next 50-100 years, an earthquake of magnitude 8.5, which is expected to occur with more spatial and temporal irregularity than the megathrust events. Numerical simulations have shown such earthquakes to cause tsunamis with maximum water levels up to 1.5-2.0 m along the western coast of Thailand, possibly 2.5-3.0 m on a high tide. 
However, in a longer time perspective (say more than 50-100 years

  15. Hazards assessment for the INEL Landfill Complex

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-02-01

This report documents the hazards assessment for the INEL Landfill Complex (LC) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and the DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes the hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding the LC, the buildings and structures at the LC, and the processes that are used at the LC are described in this report. All hazardous materials, both radiological and nonradiological, at the LC were identified and screened against threshold quantities according to DOE Order 5500.3A guidance. Asbestos at the Asbestos Pit was the only hazardous material that exceeded its specified threshold quantity. However, the type of asbestos received and the packaging practices used are believed to limit the potential for an airborne release of asbestos fibers. Therefore, in accordance with DOE Order 5500.3A guidance, no further hazardous material characterization or analysis was required for this hazards assessment.

  16. Earthquakes

    EPA Pesticide Factsheets

Information on this page will help you understand environmental dangers related to earthquakes and what you can do to prepare and recover. It will also help you recognize possible environmental hazards and learn what you can do to protect yourself and your family.

  17. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    USGS Publications Warehouse

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011 Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  18. Near real-time aftershock hazard maps for earthquakes

    NASA Astrophysics Data System (ADS)

    McCloskey, J.; Nalbant, S. S.

    2009-04-01

Stress interaction modelling is routinely used to explain the spatial relationships between earthquakes and their aftershocks. On 28 October 2008 a M6.4 earthquake occurred near the Pakistan-Afghanistan border, killing several hundred people and causing widespread devastation. A second M6.4 event occurred 12 hours later, 20 km to the south east. By making some well-supported assumptions concerning the source event and the geometry of any likely triggered event, it was possible to map those areas most likely to experience further activity. Using Google Earth, it would further have been possible to identify particular settlements in the source area that were particularly at risk, and to publish their locations globally within about 3 hours of the first earthquake. Such actions could have significantly focused the initial emergency response management. We argue for routine prospective testing of such forecasts, and for dialogue between social and physical scientists and emergency response professionals around the practical application of these techniques.
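Stress interaction modelling of the kind described above usually reduces to evaluating the Coulomb failure stress change on candidate receiver faults. A minimal sketch of the final combination step only; computing the shear and normal stress changes themselves requires an elastic dislocation model (e.g., Okada, 1985), which is not shown, and the friction value is a commonly assumed one rather than one taken from this study:

```python
def coulomb_stress_change(d_shear_mpa: float, d_normal_mpa: float,
                          mu_eff: float = 0.4) -> float:
    """Coulomb failure stress change (MPa) on a receiver fault:

        dCFS = d_tau + mu' * d_sigma_n

    with the normal stress change positive for unclamping (tension).
    mu_eff is the effective friction coefficient; 0.4 is a commonly
    assumed value. Positive dCFS loads the fault toward failure."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# 0.05 MPa of shear loading plus 0.02 MPa of unclamping:
print(coulomb_stress_change(0.05, 0.02))  # positive -> promotes failure
```

Stress changes of even ~0.01 MPa (0.1 bar) are often cited as sufficient to advance or delay failure, which is why mapped dCFS lobes can usefully delineate where further activity is most likely.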

  19. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning and imagery differencing are important methods for augmenting seismic sensors. During response to recent earthquakes (the 1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm, and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data; seismic, geologic, and imagery data alone would miss important details. That is, GNSS results provide additional insight into the earthquake source properties, which in turn helps in understanding the relationship between shaking and damage patterns. GNSS also constrains the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site and operated for many hours, and the data were then retrieved, processed, and modeled through a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing have led to sub-second overall system latency. Within the past few years, the final challenges of

  20. Earthquakes in Hawai‘i—an underappreciated but serious hazard

    USGS Publications Warehouse

    Okubo, Paul G.; Nakata, Jennifer S.

    2011-01-01

    The State of Hawaii has a history of damaging earthquakes. Earthquakes in the State are primarily the result of active volcanism and related geologic processes. It is not a question of "if" a devastating quake will strike Hawai‘i but rather "when." Tsunamis generated by both distant and local quakes are also an associated threat and have caused many deaths in the State. The U.S. Geological Survey (USGS) and its cooperators monitor seismic activity in the State and are providing crucial information needed to help better prepare emergency managers and residents of Hawai‘i for the quakes that are certain to strike in the future.

  1. Investigating the Radiation Pattern of Earthquakes in the Central and Eastern United States and Comments on Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Bekaert, D. P.; Hooper, A. J.; Samsonov, S. V.; Wright, T. J.; González, P. J.; Pathier, E.; Kostoglodov, V.

    2014-12-01

    The radiation pattern emitted from earthquakes is not currently considered in many seismic hazard assessments. This may be due to the fact that the focal mechanisms of potential ruptures are not well studied or are assumed to be random. In this case, all mechanisms are given equal likelihood, and the effect of radiation pattern is essentially averaged. But for about a dozen earthquake sources in the central and eastern United States (CEUS), faults with known mechanism are incorporated into the hazard assessment, but the radiation pattern is not included. In this study, we investigate the radiation pattern from larger CEUS earthquakes, one of which, the 2011 M5.7 Prague earthquake, was sampled by the relatively uniform and broad coverage of USArray. The radiation pattern from this event is readily apparent below about 1 Hz out to several hundred kilometers from the epicenter and decays with increasing frequency and distance, consistent with the effects of scattering attenuation. This decay is modeled with an apparent attenuation that is 5-10 times greater than the attenuation of Lg waves for the CEUS. We consider the radiation pattern of potential sources in the New Madrid seismic zone to show the effect of radiation pattern on the seismic hazard assessment of major metropolitan areas in the region, including Memphis, Tenn., Evansville, Ind., St. Louis, Mo., and Little Rock, Ark. For the scenarios we choose, earthquakes with expected mechanisms within the seismic zone, both strike-slip and thrust, tend to focus energy to the southwest towards Little Rock and to the northeast towards Evansville. Eastern Memphis and St. Louis, on the other hand, tend to be in lobes of reduced seismic shaking. This can have a significant impact on seismic hazard assessment for these cities, increasing hazard for the former and decreasing it for the latter, particularly for larger structures that are sensitive to longer shaking periods. It is more complicated, however, when considering

  2. Investigating the Radiation Pattern of Earthquakes in the Central and Eastern United States and Comments on Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Boyd, O. S.

    2015-12-01

    The radiation pattern emitted from earthquakes is not currently considered in many seismic hazard assessments. This may be due to the fact that the focal mechanisms of potential ruptures are not well studied or are assumed to be random. In this case, all mechanisms are given equal likelihood, and the effect of radiation pattern is essentially averaged. But for about a dozen earthquake sources in the central and eastern United States (CEUS), faults with known mechanism are incorporated into the hazard assessment, but the radiation pattern is not included. In this study, we investigate the radiation pattern from larger CEUS earthquakes, one of which, the 2011 M5.7 Prague earthquake, was sampled by the relatively uniform and broad coverage of USArray. The radiation pattern from this event is readily apparent below about 1 Hz out to several hundred kilometers from the epicenter and decays with increasing frequency and distance, consistent with the effects of scattering attenuation. This decay is modeled with an apparent attenuation that is 5-10 times greater than the attenuation of Lg waves for the CEUS. We consider the radiation pattern of potential sources in the New Madrid seismic zone to show the effect of radiation pattern on the seismic hazard assessment of major metropolitan areas in the region, including Memphis, Tenn., Evansville, Ind., St. Louis, Mo., and Little Rock, Ark. For the scenarios we choose, earthquakes with expected mechanisms within the seismic zone, both strike-slip and thrust, tend to focus energy to the southwest towards Little Rock and to the northeast towards Evansville. Eastern Memphis and St. Louis, on the other hand, tend to be in lobes of reduced seismic shaking. This can have a significant impact on seismic hazard assessment for these cities, increasing hazard for the former and decreasing it for the latter, particularly for larger structures that are sensitive to longer shaking periods. It is more complicated, however, when considering

  3. Probabilistic seismic hazard assessment for the two layer fault system of Antalya (SW Turkey) area

    NASA Astrophysics Data System (ADS)

    Dipova, Nihat; Cangir, Bülent

    2017-09-01

    Southwest Turkey, along the Mediterranean coast, is prone to large earthquakes resulting from subduction of the African plate under the Eurasian plate and from shallow crustal faults. The maximum observed magnitude of subduction earthquakes is Mw = 6.5, whereas that of crustal earthquakes is Mw = 6.6. Crustal earthquakes are sourced from faults related to the Isparta Angle and Cyprus Arc tectonic structures. The primary goal of this study is to assess seismic hazard for the Antalya area (SW Turkey) using a probabilistic approach. A new earthquake catalog for the Antalya area, with a unified moment magnitude scale, was prepared in the scope of the study. Seismicity of the area has been evaluated with the Gutenberg-Richter recurrence relationship. For hazard computation, the CRISIS2007 software was used following the standard Cornell-McGuire methodology. The attenuation model of Youngs et al. (Seismol Res Lett 68(1):58-73, 1997) was used for deep subduction earthquakes, and the Chiou and Youngs (Earthq Spectra 24(1):173-215, 2008) model was used for shallow crustal earthquakes. A seismic hazard map was developed for peak ground acceleration on rock ground at a hazard level of 10% probability of exceedance in 50 years. Results of the study show that peak ground acceleration values on bedrock range between 0.215 and 0.23 g in the center of Antalya.
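    The "10% probability of exceedance in 50 years" hazard level quoted above assumes Poissonian earthquake occurrence, under which it corresponds to a roughly 475-year return period. A short sketch of that standard conversion (not part of the CRISIS2007 workflow itself):

```python
import math

# Sketch: the Poissonian link between "P probability of exceedance in
# t years" and the return period of the corresponding ground motion.

def return_period(p_exceed, t_years):
    """Return period T = 1 / lambda, with lambda = -ln(1 - P) / t."""
    annual_rate = -math.log(1.0 - p_exceed) / t_years
    return 1.0 / annual_rate

T = return_period(0.10, 50.0)
print(f"10% in 50 years -> return period ~ {T:.0f} years")  # ~475 years
```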

  4. Probabilistic seismic hazard assessment for the two layer fault system of Antalya (SW Turkey) area

    NASA Astrophysics Data System (ADS)

    Dipova, Nihat; Cangir, Bülent

    2017-03-01

    Southwest Turkey, along the Mediterranean coast, is prone to large earthquakes resulting from subduction of the African plate under the Eurasian plate and from shallow crustal faults. The maximum observed magnitude of subduction earthquakes is Mw = 6.5, whereas that of crustal earthquakes is Mw = 6.6. Crustal earthquakes are sourced from faults related to the Isparta Angle and Cyprus Arc tectonic structures. The primary goal of this study is to assess seismic hazard for the Antalya area (SW Turkey) using a probabilistic approach. A new earthquake catalog for the Antalya area, with a unified moment magnitude scale, was prepared in the scope of the study. Seismicity of the area has been evaluated with the Gutenberg-Richter recurrence relationship. For hazard computation, the CRISIS2007 software was used following the standard Cornell-McGuire methodology. The attenuation model of Youngs et al. (Seismol Res Lett 68(1):58-73, 1997) was used for deep subduction earthquakes, and the Chiou and Youngs (Earthq Spectra 24(1):173-215, 2008) model was used for shallow crustal earthquakes. A seismic hazard map was developed for peak ground acceleration on rock ground at a hazard level of 10% probability of exceedance in 50 years. Results of the study show that peak ground acceleration values on bedrock range between 0.215 and 0.23 g in the center of Antalya.

  5. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

    The Chiapas State, in southeastern Mexico, is a very active seismic region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas State. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic tree procedure to account for different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Centre and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps, and the SSZ were finally defined by the analysis of geophysical data. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were revised, and for each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000- and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain, and the Motagua and Polochic Fault Zone; intermediate hazard values occur in the Chiapas Batholith Zone and the Strike-Slip Faults Province. The hazard decreases
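    Gutenberg-Richter parameters of the kind computed above are often estimated from a declustered catalogue with the Aki (1965) maximum-likelihood b-value. A minimal sketch, using a hypothetical magnitude list and completeness magnitude rather than the study's actual catalogue:

```python
import math

# Sketch: Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
# b-value, b = log10(e) / (mean(M) - Mc). The catalog below is hypothetical;
# a real application uses the declustered catalog at or above the
# completeness magnitude Mc.

def aki_b_value(mags, mc):
    """Maximum-likelihood b-value for magnitudes mags >= completeness mc."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - mc)

catalog = [4.1, 4.3, 4.0, 4.8, 4.2, 5.1, 4.0, 4.4, 4.6, 4.0]  # hypothetical Mw
print(f"b-value ~ {aki_b_value(catalog, mc=4.0):.2f}")
```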

  6. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-person Earthquake Investigation Commission, along with some 25 geologists, seismologists, geodesists, biologists and engineers and some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting and detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. The earthquake not only marked a seminal event in the history of California but also served as the impetus for the birth of modern earthquake science in the United States.

  7. Preliminary Probabilistic Tsunami Hazard Assessment of Canadian Coastlines

    NASA Astrophysics Data System (ADS)

    Leonard, L. J.; Rogers, G. C.; Mazzotti, S.

    2012-12-01

    We present a preliminary probabilistic tsunami hazard assessment of Canadian coastlines from local and far-field earthquake and large landslide sources. Our multifaceted analysis is based on published historical, paleotsunami and paleoseismic data, modelling, and empirical relations between fault area, earthquake magnitude and tsunami runup. We consider geological sources with known tsunami impacts on Canadian coasts (e.g., Cascadia and other Pacific subduction zones; the 1755 Lisbon tsunami source; Atlantic continental slope failures) as well as potential sources with previously unknown impact (e.g., Explorer plate subduction; Caribbean subduction zones; crustal faults). The cumulative estimated tsunami hazard for potentially damaging runup (≥ 1.5 m) on the outer Canadian Pacific coastline is ~40-80% in 50 y, respectively one and two orders of magnitude greater than on the outer Atlantic (~1-15%) and Arctic (< 1%) coastlines. For larger runup with significant damage potential (≥ 3 m), the Pacific hazard is ~10-30% in 50 y, again much larger than both the Atlantic (~1-5%) and Arctic (< 1%). For outer Pacific coastlines, the ≥ 1.5 m runup hazard is dominated by far-field subduction zone sources, but the probability of runup ≥ 3 m is highest for local megathrust sources, particularly the Cascadia subduction zone; potential thrust sources along the Explorer and Queen Charlotte margins may also be significant for the more northern coasts of British Columbia, where known paleo-event data are lacking. For the more sheltered inner Pacific coastlines of the Juan de Fuca and Georgia Straits, the hazard at both levels is contributed mainly by Cascadia megathrust events. Tsunami hazard on the Atlantic coastline is dominated by poorly constrained far-field subduction sources; a lesser hazard is posed by near-field continental slope failures similar to the 1929 Grand Banks event. Tsunami hazard on the Arctic coastline is poorly constrained, but is likely dominated by continental
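    Cumulative probabilities like the ~40-80% in 50 y quoted above can be built from independent sources by summing the Poisson rates each source implies. A sketch with hypothetical per-source probabilities (not the study's actual values):

```python
import math

# Sketch: combining 50-year exceedance probabilities from independent
# tsunami sources by summing their implied Poisson rates. The per-source
# probabilities below are hypothetical, not the study's values.

def combined_probability(probs, t=50.0):
    """P_total = 1 - exp(-t * sum(rate_i)), with rate_i = -ln(1 - P_i) / t."""
    total_rate = sum(-math.log(1.0 - p) / t for p in probs)
    return 1.0 - math.exp(-total_rate * t)

sources = [0.30, 0.20, 0.10]  # hypothetical P(runup >= 1.5 m) in 50 y
print(f"combined 50-year probability ~ {combined_probability(sources):.2f}")
```

    For independent sources this is algebraically the same as 1 - (1 - P1)(1 - P2)(1 - P3); working through rates makes it easy to change the time window.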

  8. Understanding Earthquake Hazard & Disaster in Himalaya - A Perspective on Earthquake Forecast in Himalayan Region of South Central Tibet

    NASA Astrophysics Data System (ADS)

    Shanker, D.; Paudyal, ,; Singh, H.

    2010-12-01

    Understanding earthquake hazard requires not only the basic physics of earthquakes and the resistance offered by engineered structures, but also socio-economic factors, the engineering properties of indigenous materials, local skills, and technology transfer models, and the engineering aspects of mitigation should be made part of public policy documents. Earthquakes are thought of as one of the worst enemies of mankind: by the very nature of the energy release, damage is evident, but it will not culminate in a disaster unless the earthquake strikes a populated area. Mitigation may be defined as the reduction in severity of something; earthquake disaster mitigation therefore implies measures that help reduce the severity of damage caused by earthquakes to life, property and environment. "Earthquake disaster mitigation" usually refers primarily to interventions to strengthen the built environment, while "earthquake protection" is now considered to include the human, social and administrative aspects of reducing earthquake effects. Reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies. However, earthquake prediction does not guarantee safety: even a correctly predicted earthquake causes damage to life and property on a scale that warrants the other aspects of mitigation, so mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of the occurrence of future seismic activity. In all cases, the identified episodes of anomalous seismic activity were

  9. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted to the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

  10. Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S. A.; Spencer, B. D.

    2015-12-01

    The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well these maps actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted ground motion should be exceeded at only a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that the factors controlling map performance are complicated, and that learning more about how maps perform and why would be valuable in making more effective policy.
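    The two performance metrics described above can be made concrete with a toy example: the fraction of sites where observed shaking exceeded the map value (which, under the maps' implicit metric, should match the design exceedance probability) and the squared misfit between observed and predicted shaking. The site values below are hypothetical, not the study's data.

```python
# Sketch: the two map-performance metrics on a toy data set of five sites.
# "observed" is the maximum shaking recorded at each site over the interval;
# "predicted" is the hazard-map value. All numbers are hypothetical.

def fraction_exceeded(observed, predicted):
    """Fraction of sites where observed shaking exceeded the map value."""
    return sum(o > p for o, p in zip(observed, predicted)) / len(observed)

def squared_misfit(observed, predicted):
    """Sum of squared differences between observed and predicted shaking."""
    return sum((o - p) ** 2 for o, p in zip(observed, predicted))

obs = [0.12, 0.45, 0.30, 0.08, 0.60]
pred = [0.20, 0.40, 0.35, 0.10, 0.50]

print(fraction_exceeded(obs, pred))  # 0.4: two of five sites exceeded the map
print(squared_misfit(obs, pred))
```

    Under the first metric a map does well when the exceedance fraction matches the design probability; under the second, when the total misfit is small. The paper's point is that these two criteria can rank the same maps differently.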

  11. Oklahoma experiences largest earthquake during ongoing regional wastewater injection hazard mitigation efforts

    USGS Publications Warehouse

    Yeck, William; Hayes, Gavin; McNamara, Daniel E.; Rubinstein, Justin L.; Barnhart, William; Earle, Paul; Benz, Harley M.

    2017-01-01

    The 3 September 2016 Mw 5.8 Pawnee earthquake was the largest recorded earthquake in the state of Oklahoma. Seismic and geodetic observations of the Pawnee sequence, including precise hypocenter locations and moment tensor modeling, show that the Pawnee earthquake occurred on a previously unknown left-lateral strike-slip basement fault that intersects the mapped right-lateral Labette fault zone. The Pawnee earthquake is part of an unprecedented increase in the earthquake rate in Oklahoma that is largely considered the result of the deep injection of waste fluids from oil and gas production. If this is indeed the case for the M5.8 Pawnee earthquake, then it would be the largest event to have been induced by fluid injection. Since 2015, Oklahoma has undergone wide-scale mitigation efforts primarily aimed at reducing injection volumes. Thus far in 2016, the rate of M3 and greater earthquakes has decreased compared to 2015, while the cumulative moment, or energy released from earthquakes, has increased. This highlights the difficulty in earthquake hazard mitigation efforts given the poorly understood long-term diffusive effects of wastewater injection and their connection to seismicity.

  12. Oklahoma experiences largest earthquake during ongoing regional wastewater injection hazard mitigation efforts

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Hayes, G. P.; McNamara, D. E.; Rubinstein, J. L.; Barnhart, W. D.; Earle, P. S.; Benz, H. M.

    2017-01-01

    The 3 September 2016 Mw 5.8 Pawnee earthquake was the largest recorded earthquake in the state of Oklahoma. Seismic and geodetic observations of the Pawnee sequence, including precise hypocenter locations and moment tensor modeling, show that the Pawnee earthquake occurred on a previously unknown left-lateral strike-slip basement fault that intersects the mapped right-lateral Labette fault zone. The Pawnee earthquake is part of an unprecedented increase in the earthquake rate in Oklahoma that is largely considered the result of the deep injection of waste fluids from oil and gas production. If this is indeed the case for the M5.8 Pawnee earthquake, then it would be the largest event to have been induced by fluid injection. Since 2015, Oklahoma has undergone wide-scale mitigation efforts primarily aimed at reducing injection volumes. Thus far in 2016, the rate of M3 and greater earthquakes has decreased compared to 2015, while the cumulative moment, or energy released from earthquakes, has increased. This highlights the difficulty in earthquake hazard mitigation efforts given the poorly understood long-term diffusive effects of wastewater injection and their connection to seismicity.

  13. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions, exposure...

  14. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions, exposure...

  15. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions, exposure...

  16. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions, exposure...

  17. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions, exposure...

  18. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

    In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings, and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database, which resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups.
Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the

  19. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  1. Too generous to a fault? Is reliable earthquake safety a lost art? Errors in expected human losses due to incorrect seismic hazard estimates

    NASA Astrophysics Data System (ADS)

    Bela, James

    2014-11-01

    "One is well advised, when traveling to a new territory, to take a good map and then to check the map with the actual territory during the journey." In just such a reality check, Global Seismic Hazard Assessment Program (GSHAP) maps (prepared using PSHA) portrayed a "low seismic hazard," which was then also assumed to be the "risk to which the populations were exposed." But time-after-time-after-time the actual earthquakes that occurred were not only "surprises" (many times larger than those implied on the maps), but they were often near the maximum potential size (Maximum Credible Earthquake or MCE) that geologically could occur. Given these "errors in expected human losses due to incorrect seismic hazard estimates" revealed globally in these past performances of the GSHAP maps (> 700,000 deaths 2001-2011), we need to ask not only "Is reliable earthquake safety a lost art?" but also "Who and what were the 'Raiders of the Lost Art'?"

  2. Tsunami Forecast Technology for Asteroid Impact Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Moore, C. W.

    2015-12-01

    Over 75% of all historically documented tsunamis have been generated by earthquakes. As a result, all existing tsunami warning and forecast systems focus almost exclusively on detecting, warning of, and forecasting earthquake-generated tsunamis. The sequence of devastating tsunamis across the globe over the past 10 years has significantly heightened awareness and preparation activities associated with these high-impact events. Since the catastrophic 2004 Sumatra tsunami, NOAA has invested significant effort in modernizing the U.S. tsunami warning system. Recent developments in tsunami modeling capability, inundation forecasting, sensing networks, dissemination capability, and local preparation and mitigation activities have gone a long way toward enhancing tsunami resilience within the United States. The remaining quarter of the tsunami hazard problem is related to other mechanisms of tsunami generation that may not have received adequate attention. Among those tsunami sources, the asteroid impact may be the most exotic, but possibly one of the most devastating, tsunami generation mechanisms. Tsunami forecast capabilities developed for the tsunami warning system can be used to explore both hazard assessment and the forecast of a tsunami generated by an asteroid impact. Existing tsunami flooding forecast technology allows forecasts for non-seismically generated tsunamis (asteroid impact, meteo-generated tsunamis, landslides, etc.), given adequate data for the tsunami source parameters. Problems and opportunities for forecasting tsunamis from asteroid impacts will be discussed, and preliminary results of impact-generated tsunami analysis for forecast and hazard assessment will be presented.

  3. Seismic hazard assessment of Chennai city considering local site effects

    NASA Astrophysics Data System (ADS)

    Boominathan, A.; Dodagoudar, G. R.; Suganthi, A.; Uma Maheswari, R.

    2008-11-01

    Chennai city suffered moderate tremors during the 2001 Bhuj and Pondicherry earthquakes and the 2004 Sumatra earthquake. After the Bhuj earthquake, Indian Standard IS: 1893 was revised and Chennai city was upgraded from zone II to zone III, which led to a substantial increase in the design ground motion parameters. Therefore, a comprehensive study is carried out to assess the seismic hazard of Chennai city based on a deterministic approach. The seismicity and seismotectonic details within a 100 km radius of the study area have been considered. One-dimensional ground response analysis was carried out for 38 representative sites by the equivalent linear method, using the SHAKE91 program to estimate the ground motion parameters considering local site effects. The shear wave velocity profile was inferred from the corrected blow counts and verified against a Multichannel Analysis of Surface Waves (MASW) test performed at a representative site. The seismic hazard is represented in terms of characteristic site period and Spectral Acceleration Ratio (SAR) contours for the entire city. It is found that structures with low natural period undergo significant amplification mostly in the central and southern parts of Chennai city, due to the presence of deep soil sites with clayey or sandy deposits, while the remaining parts undergo marginal amplification.

  4. Assessing natural hazard risk using images and data

    NASA Astrophysics Data System (ADS)

    Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.

    2012-12-01

    Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping and historic records. Instrumental data only reveal a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of the event. The volume of visual data is increasing rapidly as portable high-resolution cameras and video recorders become more widely available. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context to the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.

  5. Earthquake hazard and damage on traditional rural structures in Turkey

    NASA Astrophysics Data System (ADS)

    Korkmaz, H. H.; Korkmaz, S. Z.; Donduren, M. S.

    2010-03-01

    During recent earthquakes in Turkey, reinforced concrete structures in the cities and masonry structures in rural areas were exposed to damage and failure. Masonry houses such as earthen, brick and stone structures are composed of building blocks with weak inter-binding action and low tension capacity. Bending and shear forces generate tensile stresses which cannot be well tolerated. In this paper, the performance of masonry structures during recent earthquakes in Turkey is discussed with illustrative photographs taken after the earthquakes. The main weaknesses of the materials and of unreinforced masonry construction, and the other reasons for the extensive damage to masonry buildings, are the following: very low tensile and shear strength, particularly with poor mortar; brittle behaviour in tension as well as compression; stress concentration at the corners of windows and doors; overall asymmetry in the plan and elevation of the building; asymmetry due to imbalance in the sizes and positions of walls and openings in the walls; and construction defects such as the use of substandard materials, unfilled joints between bricks, walls that are not plumb, and improper bonding between walls at right angles.

  6. Where, when and why does earthquake hazard turn to become a disaster?

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2014-05-01

    This presentation highlights the importance of analysing and understanding the components contributing to earthquake disaster risk, and tries to answer the questions posed in the title of the paper. The basic goal of such an analysis should be a society without disasters. Observing and modelling capabilities to reduce uncertainties in forecasting large earthquakes will be discussed. A novel approach to probabilistic seismic hazard analysis will be presented and compared to the current approach. The contributions of vulnerability and hazard to earthquake risk will be analysed. I will consider the economic and political factors, as well as the factors of awareness and preparedness, that brought about the humanitarian tragedies of the early twenty-first century, and discuss a trans-disciplinary approach to disaster risk research.

  7. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-01-01

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses from potential earthquakes and in preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping (NSHM) Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault to the east of the study area. Earthquake scenarios are intended to depict the potential consequences of significant earthquakes. They are not necessarily the largest or most damaging earthquakes possible. Earthquake scenarios are both large enough and likely enough that emergency planners should consider them in regional emergency response plans. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). For the Hilton Creek Fault, two alternative scenarios were developed in addition to the NSHM scenario to account for differing opinions on how far north the fault extends into the Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice

  8. Insight on fault segmentation, linkage and hazard from the 2016 Mw6.2 Amatrice earthquake (central Italy)

    NASA Astrophysics Data System (ADS)

    Walters, R. J.; Gregory, L. C.; Wedmore, L. N. J.; Craig, T. J.; Elliott, J. R.; Wilkinson, M. W.; McCaffrey, K. J. W.; Michetti, A.; Vittori, E.; Livio, F.; Iezzi, F.; Chen, J.; Li, Z.; Roberts, G.

    2016-12-01

    detailed historical and palaeoseismological record of the region, but more study is needed to understand their absence, and to ensure we can rule out such large events in the future. This earthquake prompts reassessment of the state of linkage between other faults in central Italy, with the aim of better assessing the increased seismic hazard posed by multi-fault ruptures.

  9. KSC VAB Aeroacoustic Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.

    2010-01-01

    NASA Kennedy Space Center (KSC) carried out an analysis of the effects of aeroacoustics produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the source of ignition and plume can be severe; acoustic hazards in the far-field are significantly lower.

  10. Apparent stress, fault maturity and seismic hazard for normal-fault earthquakes at subduction zones

    USGS Publications Warehouse

    Choy, G.L.; Kirby, S.H.

    2004-01-01

    The behavior of apparent stress for normal-fault earthquakes at subduction zones is derived by examining the apparent stress (τa = μEs/M0, where Es is radiated energy, M0 is seismic moment and μ is rigidity) of all globally distributed shallow earthquakes. Events with higher τa (> 1 MPa) are also generally intraslab, but occur where the lithosphere has just begun subduction beneath the overriding plate. They usually occur in cold slabs near trenches where the direction of plate motion across the trench is oblique to the trench axis, or where there are local contortions or geometrical complexities of the plate boundary. Lower τa (< 1 MPa) is associated with events occurring at the outer rise (OR) complex (between the OR and the trench axis), as well as with intracrustal events occurring just landward of the trench. The average apparent stress of intraslab normal-fault earthquakes is considerably higher than the average apparent stress of interplate thrust-fault earthquakes. In turn, the average τa of strike-slip earthquakes in intraoceanic environments is considerably higher than that of intraslab normal-fault earthquakes. The variation of average τa with focal mechanism and tectonic regime suggests that the level of τa is related to fault maturity. Lower stress drops are needed to rupture mature faults, such as those found at plate interfaces that have been smoothed by large cumulative displacements (from hundreds to thousands of kilometres). In contrast, immature faults, such as those on which intraslab normal-fault earthquakes generally occur, are found in cold and intact lithosphere in which total fault displacement has been much less (from hundreds of metres to a few kilometres). Also, the faults on which high-τa oceanic strike-slip earthquakes occur are predominantly intraplate or at the evolving ends of transforms. At subduction zones, earthquakes occurring on immature faults are likely to be more hazardous, as they tend to generate higher amounts of radiated energy per unit of moment than
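The apparent-stress definition quoted in this abstract is straightforward to evaluate; the sketch below assumes a typical rigidity of μ = 3×10^10 Pa, an illustrative value not taken from the paper.

```python
def apparent_stress_mpa(radiated_energy_j, seismic_moment_nm, rigidity_pa=3.0e10):
    """Apparent stress tau_a = mu * Es / M0, returned in MPa.

    rigidity_pa is an assumed typical shear modulus, for illustration only.
    """
    return rigidity_pa * radiated_energy_j / seismic_moment_nm / 1.0e6

# An event radiating 1e13 J with a moment of 1e18 N*m has tau_a = 0.3 MPa,
# i.e. it falls in the lower (< 1 MPa) apparent-stress class discussed above.
print(apparent_stress_mpa(1.0e13, 1.0e18))
```

Because τa depends only on the energy-to-moment ratio (scaled by rigidity), it discriminates between fault populations independently of event size.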

  11. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    USGS Publications Warehouse

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes that have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network, supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large earthquake rates would be grossly under-estimated by extrapolating small earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes.
Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude
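The "kinked" magnitude distribution described in this abstract can be sketched as a piecewise Gutenberg-Richter model; the a-value and kink parameters below are illustrative placeholders, not the fitted Hawaiian values.

```python
def annual_rate(m, a=4.0, b_low=1.0, b_high=0.6, m_kink=5.0):
    """Cumulative annual rate N(>= m) for a kinked Gutenberg-Richter model:
    log10 N = a - b*m, with slope b_low below the kink magnitude and
    b_high above it, continuous at the kink. Parameters are illustrative."""
    if m <= m_kink:
        log_n = a - b_low * m
    else:
        log_n = a - b_low * m_kink - b_high * (m - m_kink)
    return 10.0 ** log_n

# Extrapolating the small-earthquake slope (b = 1.0) to M 7 predicts
# 10**(4 - 7) = 0.001/yr; the kinked model gives ~0.0063/yr, roughly six
# times higher -- the under-estimation the abstract warns about.
```

This illustrates why a long catalog matters: only the large-magnitude branch constrains `b_high`, and fitting small events alone grossly underestimates large-earthquake rates.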

  12. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations.

  13. 283-E and 283-W hazards assessment

    SciTech Connect

    Sutton, L.N.

    1994-09-26

    This report documents the hazards assessment for the 200 Area water treatment plants 283-E and 283-W located on the US DOE Hanford Site. Operation of the water treatment plants is the responsibility of ICF Kaiser Hanford Company (ICF KH). This hazards assessment was conducted to provide the technical basis for emergency planning at the water treatment plants. This document represents an acceptable interpretation of the implementing guidance document for DOE Order 5500.3A, which requires an emergency planning hazards assessment for each facility that has the potential to reach or exceed the lowest level emergency classification.

  14. Cruise report for A1-00-SC southern California earthquake hazards project, part A

    USGS Publications Warehouse

    Gutmacher, Christina E.; Normark, William R.; Ross, Stephanie L.; Edwards, Brian D.; Sliter, Ray; Hart, Patrick; Cooper, Becky; Childs, Jon; Reid, Jane A.

    2000-01-01

    A three-week cruise to obtain high-resolution boomer and multichannel seismic-reflection profiles supported two project activities of the USGS Coastal and Marine Geology (CMG) Program: (1) evaluating the earthquake and related geologic hazards posed by faults in the near offshore area of southern California and (2) determining the pathways through which sea-water is intruding into aquifers of Los Angeles County in the area of the Long Beach and Los Angeles harbors. The 2000 cruise, A1-00-SC, is the third major data-collection effort in support of the first objective (Normark et al., 1999a, b); one more cruise is planned for 2002. This report deals primarily with the shipboard operations related to the earthquake-hazard activity. The sea-water intrusion survey is confined to shallow water and the techniques used are somewhat different from that of the hazards survey (see Edwards et al., in preparation).

  15. Earthquakes and faults at Mt. Etna (Italy): time-dependent approach to the seismic hazard of the eastern flank

    NASA Astrophysics Data System (ADS)

    Peruzza, L.; Azzaro, R.; D'Amico, S.; Tuve', T.

    2009-04-01

    A time-dependent approach to seismic hazard assessment, based on a renewal model using the Brownian Passage Time (BPT) distribution, has been applied to the best-known seismogenic faults at Mt. Etna volcano. These structures have been characterised by frequent coseismic surface displacement and a long list of historically well-documented earthquakes that occurred over the last 200 years (CMTE catalogue, Azzaro et al., 2000, 2002, 2006). Seismic hazard estimates, given in terms of earthquake rupture forecast, are conditioned on the time elapsed since the last event: impending events are expected on the S. Tecla Fault, and secondly on the Moscatello Fault, both involved in the highly active geodynamic processes affecting the eastern flank of Mt. Etna. The mean recurrence time of major events is calibrated by merging the inter-event times observed at each fault; aperiodicity is tuned on b-values, following the approach proposed by Zoeller et al. (2008). Finally, we compare these mean recurrence times with the values obtained using only geometrical and kinematic information, as defined in Peruzza et al. (2008) for faults in Italy. The time-dependent hazard assessment is compared with the stationary assumption of seismicity and validated in a retrospective forward model. Forecasted rates for a five-year window (1 April 2009 to 1 April 2014), on magnitude bins compatible with macroseismic data, are available for testing in the framework of the CSEP (Collaboratory for the Study of Earthquake Predictability, www.cseptesting.org) project. Azzaro R., Barbano M.S., Antichi B., Rigano R.; 2000: Macroseismic catalogue of Mt. Etna earthquakes from 1832 to 1998. Acta Volcanol., with CD-ROM, 12 (1), 3-36. http://www.ct.ingv.it/Sismologia/macro/default.htm Azzaro R., D'Amico S., Mostaccio A., Scarfì L.; 2002: Terremoti con effetti macrosismici in Sicilia orientale - Calabria meridionale nel periodo Gennaio 1999 - Dicembre 2001. Quad. di Geof., 27, 1-59. Azzaro R., D'Amico S., Mostaccio A
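The BPT renewal forecast described in this abstract reduces to a conditional probability from the inverse-Gaussian CDF. The sketch below uses the closed-form CDF; the mean recurrence, aperiodicity and elapsed time in the comment are illustrative values, not the CMTE-derived parameters.

```python
import math

def _phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence mu and aperiodicity alpha."""
    if t <= 0.0:
        return 0.0
    lam = mu / alpha ** 2          # inverse-Gaussian shape parameter
    a = math.sqrt(lam / t)
    return (_phi(a * (t / mu - 1.0))
            + math.exp(2.0 * lam / mu) * _phi(-a * (t / mu + 1.0)))

def conditional_prob(elapsed, window, mu, alpha):
    """P(event within `window` years | no event for `elapsed` years)."""
    f_t = bpt_cdf(elapsed, mu, alpha)
    return (bpt_cdf(elapsed + window, mu, alpha) - f_t) / (1.0 - f_t)

# e.g. conditional_prob(300.0, 50.0, 500.0, 0.5): the 50-yr probability on a
# fault with 500-yr mean recurrence, 300 yr after its last event.
```

The conditioning on elapsed time is what makes the forecast "time-dependent": the same fault yields a different probability every year it stays quiet, unlike a Poisson model.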

  16. Citizen Seismology Provides Insights into Ground Motions and Hazard from Injection-Induced Earthquakes

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2014-12-01

    The US Geological Survey "Did You Feel It?" (DYFI) system is a highly successful example of citizen seismology. Users around the world now routinely report felt earthquakes via the Web; this information is used to determine Community Decimal Intensity values. These data can be enormously valuable for helping address a key issue that has arisen recently: quantifying the shaking/hazard associated with injection-induced earthquakes. I consider the shaking from 11 moderate (Mw3.9-5.7) earthquakes in the central and eastern United States that are believed to be induced by fluid injection. The distance decay of intensities for all events is consistent with that observed for regional tectonic earthquakes, but for all of the events the intensities are lower than values predicted from an intensity prediction equation derived using data from tectonic events. I introduce an effective intensity magnitude, MIE, defined as the magnitude that on average would generate a given intensity distribution. For all 11 events, MIE is lower than the event magnitude by 0.4-1.3 units, with an average difference of 0.8 units. This suggests that stress drops of injection-induced earthquakes are lower than those of tectonic earthquakes by a factor of 2-10. However, relatively limited data suggest that intensities at epicentral distances less than 10 km are more commensurate with expectations for the event magnitude, which can be explained by the shallow focal depth of the events. The results suggest that damage from injection-induced earthquakes will be especially concentrated in the immediate epicentral region. These results further suggest a potential new discriminant for the identification of induced events. For example, while a systematic analysis of California earthquakes remains to be done, DYFI data from the 2014 Mw5.1 La Habra, California, earthquake reveal no evidence for unusually low intensities, adding to a growing volume of evidence that this was a natural tectonic event.
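The effective intensity magnitude MIE defined in this abstract can be sketched by inverting an intensity prediction equation per observation and averaging. The IPE form and coefficients below are hypothetical placeholders, not the equation used in the study.

```python
import math

def effective_intensity_magnitude(observations, c0=1.0, c1=1.5, c2=-3.0):
    """MIE: the magnitude that, on average, a simple (hypothetical) intensity
    prediction equation
        I = c0 + c1*M + c2*log10(R_km)
    would require to reproduce each observed (intensity, distance_km) pair."""
    ms = [(i - c0 - c2 * math.log10(r)) / c1 for i, r in observations]
    return sum(ms) / len(ms)

# Synthetic check: intensities generated exactly by an M 5.0 event under the
# placeholder IPE recover MIE = 5.0.
obs = [(5.5, 10.0), (2.5, 100.0)]
print(effective_intensity_magnitude(obs))
```

Comparing MIE with the instrumental magnitude then quantifies the intensity deficit (0.4-1.3 units for the induced events above) without fitting a new attenuation model per event.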

  17. Simulations of seismic hazard for the Pacific Northwest of the United States from earthquakes associated with the Cascadia subduction zone

    USGS Publications Warehouse

    Petersen, M.D.; Cramer, C.H.; Frankel, A.D.

    2002-01-01

    We investigate the impact of different rupture and attenuation models for the Cascadia subduction zone by simulating seismic hazard models for the Pacific Northwest of the U.S. at 2% probability of exceedance in 50 years. We calculate the sensitivity of hazard (probabilistic ground motions) to the source parameters and the attenuation relations for both intraslab and interface earthquakes and present these in the framework of the standard USGS hazard model that includes crustal earthquakes. Our results indicate that allowing the deep intraslab earthquakes to occur anywhere along the subduction zone increases the peak ground acceleration hazard near Portland, Oregon by about 20%. Alternative attenuation relations for deep earthquakes can result in ground motions that differ by a factor of two. The hazard uncertainty for the plate interface and intraslab earthquakes is analyzed through a Monte-Carlo logic tree approach and indicates a seismic hazard exceeding 1 g (0.2 s spectral acceleration) consistent with the U.S. National Seismic Hazard Maps in western Washington, Oregon, and California and an overall coefficient of variation that ranges from 0.1 to 0.4. Sensitivity studies indicate that the paleoseismic chronology and the magnitude of great plate interface earthquakes contribute significantly to the hazard uncertainty estimates for this region. Paleoseismic data indicate that the mean earthquake recurrence interval for great earthquakes is about 500 years and that it has been 300 years since the last great earthquake. We calculate the probability of such a great earthquake along the Cascadia plate interface to be about 14% when considering a time-dependent model and about 10% when considering a time-independent Poisson model during the next 50-year interval.
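The time-independent figure quoted at the end of this abstract follows directly from the Poisson model; the sketch below reproduces the "about 10%" value from the stated 500-year mean recurrence and 50-year window.

```python
import math

def poisson_prob(window_years, mean_recurrence_years):
    """Time-independent (Poisson) probability of at least one event
    in the exposure window."""
    return 1.0 - math.exp(-window_years / mean_recurrence_years)

# Mean recurrence ~500 yr, 50-yr window: ~9.5%, i.e. the "about 10%"
# Poisson probability quoted in the abstract.
print(round(poisson_prob(50.0, 500.0), 3))
```

The higher time-dependent value (~14%) arises because 300 years have already elapsed on a renewal model, which a memoryless Poisson process cannot account for.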

  18. Evolution trends in vulnerability of R/C buildings exposed to earthquake induced landslide hazard

    NASA Astrophysics Data System (ADS)

    Fotopoulou, S.; Pitilakis, K.

    2012-04-01

    The assessment of landslide risk depends on the evaluation of landslide hazard and of the vulnerability of exposed structures, both of which change with time. The real, dynamic vulnerability of structures exposed to landslides may be significantly affected by aging, anthropogenic actions, cumulative damage from past landslide events and retrofitting measures. The present work aims at developing an efficient analytical methodology to assess the evolution with time of the vulnerability of buildings exposed to earthquake-induced landslide hazard. In particular, the aging of typical RC buildings is considered by including probabilistic models of corrosion deterioration of the RC elements within the vulnerability modeling framework. Two potential adverse corrosion scenarios are examined: chloride- and carbonation-induced corrosion of the steel reinforcement. An application of the proposed methodology to reference low-rise RC buildings exposed to the combined effect of seismically induced landslide differential displacements and reinforcement corrosion is provided. Buildings with both stiff and flexible foundation systems standing near the crest of a potentially precarious soil slope are examined. Non-linear static time-history analyses of the buildings are performed using a fibre-based finite element code. In this analysis type, the applied loads (displacements) at the foundation level vary in the pseudo-time domain, according to a load pattern prescribed by the differential permanent landslide displacement (versus time) curves triggered by the earthquake. The distribution of the corrosion initiation time is assessed through Monte Carlo simulation using appropriate probabilistic models for carbonation- and chloride-induced corrosion. Then, the loss of steel area over time due to corrosion of the RC elements is modeled as a reduction in longitudinal reinforcing bar cross-sectional area in the fibre section model. Time dependent structural limit
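The two corrosion-modeling steps described in this abstract (sampling an initiation time, then reducing rebar area) can be sketched as below. The lognormal distribution, its parameters, and the section-loss rate are all illustrative assumptions, not the probabilistic models used in the study.

```python
import math
import random

def corrosion_initiation_times(n, mean_years=15.0, cov=0.5, seed=42):
    """Monte Carlo sample of corrosion initiation times, here drawn from a
    lognormal distribution (distribution and parameters are illustrative)."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_years) - 0.5 * sigma ** 2
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

def bar_area_mm2(d0_mm, t_years, t_init_years, rate_mm_per_yr=0.1):
    """Remaining rebar cross-sectional area under uniform corrosion: the
    diameter shrinks linearly once corrosion has initiated."""
    loss = 2.0 * rate_mm_per_yr * max(t_years - t_init_years, 0.0)
    d = max(d0_mm - loss, 0.0)
    return math.pi * d ** 2 / 4.0
```

Feeding the sampled initiation times into the area-loss function gives a distribution of residual reinforcement at any structural age, which is what a fibre-section model would consume.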

  19. An evaluation of earthquake hazard parameters in the Iranian Plateau based on the Gumbel III distribution

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Bayrak, Yusuf

    2016-04-01

    The Gumbel third asymptotic distribution (GIII) of the extreme value method is employed to evaluate earthquake hazard parameters in the Iranian Plateau. The study maps earthquake hazard parameters such as the annual and 100-year modes, together with their 90% probability of not being exceeded (NBE), across the Iranian Plateau. We used a homogeneous and complete earthquake catalogue for the period 1900-2013 with magnitude Mw ≥ 4.0, and the Iranian Plateau was divided into an equal-area mesh of 1° lat × 1° long. The annual mode with 90% probability of NBE is expected to exceed Mw 6.0 in the eastern part of Makran, most parts of Central and East Iran, Kopeh Dagh, Alborz, Azerbaijan, and SE Zagros. The 100-year mode with 90% probability of NBE is expected to exceed Mw 7.0 in the eastern part of Makran, Central and East Iran, Alborz, Kopeh Dagh, and Azerbaijan. The spatial distribution of the 100-year mode with 90% probability of NBE reveals high values of the earthquake hazard parameters that are generally associated with the main tectonic regimes of the studied area. There appears to be a close connection between the seismicity and the tectonics of the region.
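The GIII quantities described in this abstract can be sketched from the bounded (Gumbel III) CDF for annual maximum magnitudes. The upper-bound magnitude ω, characteristic value u and shape k below are hypothetical, not values fitted for the Iranian Plateau.

```python
import math

def gumbel3_cdf(x, omega, u, k):
    """Gumbel III CDF for the annual maximum magnitude: bounded above by
    omega, with characteristic value u and shape parameter k."""
    if x >= omega:
        return 1.0
    return math.exp(-(((omega - x) / (omega - u)) ** k))

def magnitude_nbe(p, t_years, omega, u, k):
    """Magnitude with probability p of Not Being Exceeded in t_years,
    assuming independent annual maxima: solve F(x)**t_years = p."""
    p_annual = p ** (1.0 / t_years)
    return omega - (omega - u) * (-math.log(p_annual)) ** (1.0 / k)

# Magnitude with 90% probability of NBE in 100 years, for the hypothetical
# parameters omega=8.5, u=5.5, k=2.0.
print(round(magnitude_nbe(0.9, 100.0, 8.5, 5.5, 2.0), 2))
```

The upper bound ω is what distinguishes GIII from the unbounded Gumbel I form and makes it attractive for magnitude data, which are physically capped.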

  20. Earthquake Hazard When the Rate Is Non-Stationary: The Challenge of the U. S. Midcontinent

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.; Cochran, E. S.; Llenos, A. L.; McGarr, A.; Michael, A. J.; Mueller, C. S.; Petersen, M. D.; Rubinstein, J. L.

    2014-12-01

    In July 2014, the U.S. Geological Survey released an update of the 2008 National Seismic Hazard Map for the conterminous U.S. The Map provides guidance for the seismic provisions of the building codes and portrays ground motions with a 2% chance of being exceeded in an exposure time of 50 years. Over most of the midcontinent the hazard model is derived by projecting the long-term historic, declustered earthquake rate forward in time. However, parts of the midcontinent have experienced increased seismicity levels since 2009 - locally by 2 orders of magnitude - which is incompatible with the underlying assumption of a constant-rate Poisson process. The 2014 Map acknowledged this problem and, for its intended purpose of underpinning seismic design, used seismicity rates that are consistent with the entire historic record. Both the developers of the Map and its critics acknowledge that the remarkable rise of seismicity in Oklahoma and nearby states must be addressed if we are to fully capture the hazard in both space and time. The nature of the space/time distribution of the increased seismicity, as well as numerous published case studies, strongly suggests that much of the increase is of anthropogenic origin. If so, the assumptions and procedures used to forecast natural earthquake rates from past rates may not be appropriate. Key issues that must be resolved include: the geographic location of areas with elevated seismicity, either active now or potentially active in the future; local geologic conditions, including faults and the state of stress; the spatial smoothing of catalog seismicity; the temporal evolution of the earthquake rate change; earthquake sequence statistics, including clustering behavior; the magnitude-frequency distribution of the excess earthquakes, particularly at higher and as yet unobserved magnitudes; possible source process differences between natural and induced earthquakes; and the appropriate ground motion prediction equations.

  1. The Cascadia Subduction Zone and related subduction systems: seismic structure, intraslab earthquakes and processes, and earthquake hazards

    USGS Publications Warehouse

    Kirby, Stephen H.; Wang, Kelin; Dunlop, Susan

    2002-01-01

    The following report is the principal product of an international workshop titled “Intraslab Earthquakes in the Cascadia Subduction System: Science and Hazards” sponsored by the U.S. Geological Survey, the Geological Survey of Canada and the University of Victoria. The meeting was held at the University of Victoria’s Dunsmuir Lodge, Vancouver Island, British Columbia, Canada on September 18–21, 2000 and brought together 46 participants from the U.S., Canada, Latin America and Japan. This gathering was organized to bring together active research investigators in the science of subduction and intraslab earthquake hazards. Special emphasis was given to “warm-slab” subduction systems, i.e., those involving young oceanic lithosphere subducting at moderate to slow rates, such as the Cascadia system in the U.S. and Canada and the Nankai system in Japan. All the speakers and poster presenters provided abstracts of their presentations, which were made available in an abstract volume at the workshop. Most of the authors subsequently provided full articles or extended abstracts for this volume on the topics that they discussed at the workshop. Where updated versions were not provided, the original workshop abstracts have been included. By organizing this workshop and assembling this volume, our aim is to provide a global perspective on the science of warm-slab subduction, to thereby advance our understanding of internal slab processes, and to use this understanding to improve appraisals of the hazards associated with large intraslab earthquakes in the Cascadia system. These events have been the most frequent and damaging earthquakes in western Washington State over the last century. As if to underscore this fact, just six months after this workshop was held, the magnitude 6.8 Nisqually earthquake occurred on February 28th, 2001 at a depth of about 55 km in the Juan de Fuca slab beneath the southern Puget Sound region of western Washington. The Governor

  2. Review Article: A comparison of flood and earthquake vulnerability assessment indicators

    NASA Astrophysics Data System (ADS)

    de Ruiter, Marleen C.; Ward, Philip J.; Daniell, James E.; Aerts, Jeroen C. J. H.

    2017-07-01

    In a cross-disciplinary study, we carried out an extensive literature review to increase understanding of the vulnerability indicators used in the disciplines of earthquake and flood vulnerability assessment. We provide insights into potential improvements in both fields by identifying and comparing quantitative vulnerability indicators grouped into physical and social categories. Next, a selection of index- and curve-based vulnerability models that use these indicators is described, comparing several characteristics such as temporal and spatial aspects. Earthquake vulnerability methods traditionally have a strong focus on object-based physical attributes used in vulnerability curve-based models, while flood vulnerability studies focus more on indicators applied to aggregated land-use classes in curve-based models. In assessing the differences and similarities between indicators used in earthquake and flood vulnerability models, we only include models that separately assess either of the two hazard types. Flood vulnerability studies could be improved using approaches from earthquake studies, such as developing object-based physical vulnerability curve assessments and incorporating time-of-day-based building occupation patterns. Likewise, earthquake assessments could learn from flood studies by refining their selection of social vulnerability indicators. Based on the lessons obtained in this study, we recommend that future studies explore risk assessment methodologies across different hazard types.

  3. Virtual California, ETAS, and OpenHazards web services: Responding to earthquakes in the age of Big Data

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Schultz, K.; Rundle, J. B.; Glasscoe, M. T.; Donnellan, A.

    2014-12-01

    The response to the 2014 m=6 Napa earthquake showcased data-driven services and technologies that helped first responders and decision makers quickly assess damage, estimate aftershock hazard, and efficiently allocate resources where they were most needed. These tools have been developed from fundamental research as part of a broad collaboration between researchers, policy makers, and executive decision makers -- facilitated in no small part by the California Earthquake Clearinghouse -- and have been practiced and honed during numerous disaster response exercises over the past several years. On 24 August 2014 and in the weeks following the m=6 Napa event, it became evident that these technologies will play an important role in the response to natural (and other) disasters in the 21st century. Given the continued rapid growth of computational capabilities, remote sensing technologies, and data gathering capacities -- including by unpiloted aerial vehicles (UAVs) -- it is reasonable to expect that both the volume and variety of data available during a response scenario will grow significantly in the decades to come. Inevitably, modern Data Science will be critical to effective disaster response in the 21st century. In this work, we discuss the roles that earthquake simulators, statistical seismicity models, and remote sensing technologies played in the 2014 Napa earthquake response. We further discuss "Big Data" technologies and data models that facilitate the transformation of raw data into disseminable information and actionable products, and we outline a framework for the next generation of disaster response data infrastructure.
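
    Among the statistical seismicity models named in this record's title, ETAS forecasts aftershock rates from a conditional intensity: a background rate plus Omori-Utsu decay from each prior event, scaled by magnitude-dependent productivity. The sketch below uses illustrative parameter values (mu, K, alpha, c, p are placeholders, not the values used in the Napa response):

```python
def etas_rate(t, events, mu=0.5, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Conditional intensity (events/day) of a simple temporal ETAS model:
    background rate mu plus Omori-Utsu aftershock decay from each past event,
    scaled by 10**(alpha * (m - m0)) productivity."""
    return mu + sum(
        K * 10.0 ** (alpha * (m - m0)) * (t - ti + c) ** (-p)
        for ti, m in events if ti < t
    )

# Events as (time in days, magnitude); the rate spikes just after the
# m=6.0 mainshock and relaxes back toward the background rate.
events = [(0.0, 6.0), (0.5, 4.2)]
r_soon = etas_rate(0.1, events)   # shortly after the mainshock
r_late = etas_rate(30.0, events)  # a month later
```

    The aftershock hazard estimate a responder sees is essentially this rate integrated over a forecast window.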

  4. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

    Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare the results in order to find the best-suited model. This paper reviews research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches likewise play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in detecting, mapping and monitoring landslide processes, and GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geospatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.

  5. St. Louis Area Earthquake Hazards Mapping Project - December 2008-June 2009 Progress Report

    USGS Publications Warehouse

    Williams, R.A.; Bauer, R.A.; Boyd, O.S.; Chung, J.; Cramer, C.H.; Gaunt, D.A.; Hempen, G.L.; Hoffman, D.; McCallister, N.S.; Prewett, J.L.; Rogers, J.D.; Steckel, P.J.; Watkins, C.M.

    2009-01-01

    This report summarizes the mission, the project background, the participants, and the progress of the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) for the period from December 2008 through June 2009. During this period, the SLAEHMP held five conference calls and two face-to-face meetings in St. Louis, participated in several earthquake awareness public meetings, held one outreach field trip for the business and government community, collected and compiled new borehole and digital elevation data from partners, and published a project summary.

  6. St. Louis Area Earthquake Hazards Mapping Project - A PowerPoint Presentation

    USGS Publications Warehouse

    Williams, Robert A.

    2009-01-01

    This Open-File Report contains illustrative materials, in the form of PowerPoint slides, used for an oral presentation given at the Earthquake Insight St. Louis, Mo., field trip held on May 28, 2009. The presentation focused on summarizing the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) justification, goals, achievements, and products, for an audience of business and public officials. The individual PowerPoint slides highlight, in an abbreviated format, the topics addressed; they are discussed below and are explained with additional text as appropriate.

  7. Reducing the Risks of Nonstructural Earthquake Damage: A Practical Guide. Earthquake Hazards Reduction Series 1.

    ERIC Educational Resources Information Center

    Reitherman, Robert

    The purpose of this booklet is to provide practical information to owners, operators, and occupants of office and commercial buildings on the vulnerabilities posed by earthquake damage to nonstructural items and the means available to deal with these potential problems. Examples of dangerous nonstructural damages that have occurred in past…

  8. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    PubMed

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.


  10. Assessment of seismic hazards along the northern Gulf of Aqaba

    NASA Astrophysics Data System (ADS)

    Abueladas, Abdel-Rahman Aqel

    Aqaba and Elat are very important port and recreation cities for the Hashemite Kingdom of Jordan and Israel, respectively. The two cities are highly susceptible to damage from a future destructive earthquake because they are located over the tectonically active Dead Sea transform fault (DST), the source of most of the major historical earthquakes in the region. The largest twentieth-century earthquake on the DST, the magnitude Mw 7.2 Nuweiba earthquake of November 22, 1995, caused damage to structures in both cities. The integration of geological, geophysical, and earthquake engineering studies helps to assess the seismic hazards by determining the location and slip potential of active faults and by mapping areas of high liquefaction susceptibility. Ground Penetrating Radar (GPR), a high-resolution shallow geophysical tool, was used to map the shallow active faults in Aqaba, the Taba Sabkha area, and Elat. The GPR data revealed the onshore continuation of the Evrona, West Aqaba, and Aqaba fault zones, and several transverse faults. The integration of offshore and onshore data confirms the extension of these faults along both sides of the Gulf of Aqaba. A 3D model of GPR data at one site in Aqaba indicates that the NW-trending transverse faults right-laterally offset older NE-trending faults. The most hazardous fault is the Evrona fault, which extends north to the Taba Sabkha. A geographic information system (GIS) database of the seismic hazard was created in order to facilitate the analysis, manipulation, and updating of the input parameters. Liquefaction potential maps were created for the region based on analysis of borehole data. The liquefaction map shows high and moderate liquefaction susceptibility zones along the northern coast of the Gulf of Aqaba. In Aqaba several hotels are located within high and moderate liquefaction zones, and the Yacht Club in Aqaba, the Ayla archaeological site, and part of the commercial area are also situated in a risk area.

  11. Cruise report for A1-98-SC southern California Earthquake Hazards Project

    USGS Publications Warehouse

    Normark, William R.; Bohannon, Robert G.; Sliter, Ray; Dunhill, Gita; Scholl, David W.; Laursen, Jane; Reid, Jane A.; Holton, David

    1999-01-01

    The focus of the Southern California Earthquake Hazards project, within the Western Region Coastal and Marine Geology team (WRCMG), is to identify the landslide and earthquake hazards and related ground-deformation processes that can potentially impact the social and economic well-being of the inhabitants of the Southern California coastal region, the most populated urban corridor along the U.S. Pacific margin. The primary objective is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this overall objective, we are investigating the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (see Fig. 1). In addition, the project will examine the Pliocene-Pleistocene record of how this deformation has shifted in space and time. The results of this study should improve our knowledge of shifting deformation for both the long-term (10^5 to several 10^6 yr) and short-term (<50 ky) time frames and enable us to identify actively deforming structures that may constitute current significant seismic hazards.

  12. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources in PTHA is considerably more difficult because of a general lack of information relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Approaches to resolving these and several other outstanding problems are described that will further advance PTHA methodologies, leading to a more accurate understanding of tsunami hazard.

  13. The Magnitude Frequency Distribution of Induced Earthquakes and Its Implications for Crustal Heterogeneity and Hazard

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.

    2015-12-01

    Alternatively, the MFD of induced earthquakes may be controlled by small-scale stress concentrations in a spatially variable stress field. Resolving the underlying causes of the MFD for induced earthquakes may provide key insights into the hazard posed by induced earthquakes.
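
    The magnitude-frequency distribution (MFD) discussed in this record is commonly summarized by the Gutenberg-Richter b-value. As a sketch of the standard tooling rather than the author's analysis, the Aki (1965) maximum-likelihood estimator can be applied to a synthetic catalog:

```python
import math
import random

def b_value_aki(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= mc.
    dm is the magnitude bin width: pass e.g. 0.1 for binned catalogs
    (Utsu's correction), 0.0 for continuous magnitudes."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalog: b = 1.0 above completeness mc = 2.0.
# Magnitudes above mc are exponentially distributed with rate b * ln(10).
random.seed(0)
mc, b_true = 2.0, 1.0
beta = b_true * math.log(10.0)
mags = [mc + random.expovariate(beta) for _ in range(20000)]
b_hat = b_value_aki(mags, mc)  # should recover b close to 1.0
```

    A low b-value for an induced sequence, relative to the regional background, would shift the hazard toward larger magnitudes.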

  14. Multi Hazard Assessment: The Azores Archipelagos (PT) case

    NASA Astrophysics Data System (ADS)

    Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos

    2016-04-01

    The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support the efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelagos case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential-impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructures and environment). The risk assessment was based on hazard and on the vulnerability of structural elements, road network characteristics, etc. The integration of different hazards and risks was taken into account in establishing the necessary first-response/first-aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information, common to the assessment of all of the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for, and the risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and a mathematical model to calculate seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based Lava Flow Hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100 year extreme monthly rainfall
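
    The Cornell-type PSHA chain this record describes (earthquake recurrence and magnitude distribution, a ground motion model, and a hazard integral) can be sketched for a single source at a fixed distance. The GMPE coefficients, source rate, and distance below are hypothetical placeholders, not values from CRISIS2007:

```python
import math

def gr_pmf(mmin, mmax, b, dm):
    """Discretized, truncated Gutenberg-Richter magnitude distribution."""
    beta = b * math.log(10.0)
    ms = [mmin + dm / 2 + i * dm for i in range(int(round((mmax - mmin) / dm)))]
    w = [math.exp(-beta * (m - mmin)) for m in ms]
    s = sum(w)
    return [(m, wi / s) for m, wi in zip(ms, w)]

def p_exceed(a, m, r, sigma=0.6):
    """P(PGA > a | m, r) under a lognormal GMPE.
    Hypothetical coefficients: ln PGA[g] = -4.0 + 1.0*m - 1.5*ln(r + 10)."""
    mu = -4.0 + 1.0 * m - 1.5 * math.log(r + 10.0)
    z = (math.log(a) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def hazard_curve(a, nu=0.2, r=30.0):
    """Annual rate of exceeding PGA level a (g) for one source of annual
    rate nu (events/yr with M >= mmin) at a fixed distance r (km)."""
    return nu * sum(p * p_exceed(a, m, r) for m, p in gr_pmf(5.0, 7.5, 1.0, 0.1))

lam = hazard_curve(0.1)           # annual rate of exceeding 0.1 g
p50 = 1.0 - math.exp(-lam * 50.0)  # Poisson exceedance probability in 50 yr
```

    A real assessment sums this integral over all sources and site distances; the map value is the PGA whose 50-year exceedance probability matches the target (e.g. 10%).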

  15. Assessment and Prediction of Natural Hazards from Satellite Imagery

    PubMed Central

    Gillespie, Thomas W.; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan

    2013-01-01

    Since 2000, there have been a number of spaceborne satellites that have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth’s surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate resolution active sensors that can quantify the vertical and horizontal movement of the earth’s surface. High-resolution passive sensors have been used to successfully assess flood damage while predictive maps of flood vulnerability areas are possible based on physical variables collected from passive and active sensors. Recent moderate resolution sensors are able to provide near real time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and constellations of satellites will be launched in the next five years that contain increased resolution (0.5 m to 1 m pixel resolution for active sensors) and revisit times (daily ≤ 2.5 m resolution images from passive sensors) that will significantly improve our ability to assess and predict natural hazards from space. PMID:25170186

  16. Earthquake catalogs for the 2017 Central and Eastern U.S. short-term seismic hazard model

    USGS Publications Warehouse

    Mueller, Charles S.

    2017-01-01

    The U.S. Geological Survey (USGS) makes long-term seismic hazard forecasts that are used in building codes. The hazard models usually consider only natural seismicity; non-tectonic (man-made) earthquakes are excluded because they are transitory or too small. In the past decade, however, thousands of earthquakes related to underground fluid injection have occurred in the central and eastern U.S. (CEUS), and some have caused damage. In response, the USGS is now also making short-term forecasts that account for the hazard from these induced earthquakes. Seismicity statistics are analyzed to develop recurrence models, accounting for catalog completeness. In the USGS hazard modeling methodology, earthquakes are counted on a map grid, recurrence models are applied to estimate the rates of future earthquakes in each grid cell, and these rates are combined with maximum-magnitude models and ground-motion models to compute the hazard. The USGS published a forecast for the years 2016 and 2017. Here, we document the development of the seismicity catalogs for the 2017 CEUS short-term hazard model. A uniform earthquake catalog is assembled by combining and winnowing pre-existing source catalogs. The initial, final, and supporting earthquake catalogs are made available here.
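
    The grid-counting step of the methodology described above can be illustrated with a minimal sketch. The toy catalog and cell size are chosen for illustration only; the actual USGS workflow additionally handles completeness, declustering, and rate smoothing, all omitted here:

```python
from collections import Counter

def grid_rates(catalog, years, cell=0.5):
    """Count epicenters on a lon/lat grid and convert counts to annual
    rates per cell. catalog: iterable of (lon, lat) pairs for events
    above the completeness magnitude; cell: grid spacing in degrees."""
    counts = Counter(
        (int(lon // cell), int(lat // cell)) for lon, lat in catalog
    )
    return {c: n / years for c, n in counts.items()}

# Three events cluster in one cell, one event falls in another.
cat = [(-97.1, 35.6), (-97.2, 35.7), (-97.1, 35.55), (-92.0, 36.0)]
rates = grid_rates(cat, years=10.0)  # events per year, per cell
```

    Each cell's rate then feeds a recurrence model and is convolved with maximum-magnitude and ground-motion models to produce the hazard map.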

  17. Post-earthquake ignition vulnerability assessment of Küçükçekmece District

    NASA Astrophysics Data System (ADS)

    Yildiz, S. S.; Karaman, H.

    2013-12-01

    In this study, a geographic information system (GIS)-based model was developed to calculate the post-earthquake ignition probability of a building, considering damage to the building's interior gas and electrical distribution system and the overturning of appliances. In order to make our model more reliable and realistic, a weighting factor was used to define the possible existence of each appliance or other contents in the given occupancy. A questionnaire was prepared to weigh the relevance of the different components of post-earthquake ignitions using the analytical hierarchy process (AHP), and was evaluated by researchers experienced in earthquake engineering and post-earthquake fires. The developed model was implemented in HAZTURK (Hazards Turkey), the earthquake loss assessment software developed by the Mid-America Earthquake Center with the help of Istanbul Technical University. The developed post-earthquake ignition tool was applied to Küçükçekmece, Istanbul, in Turkey. The results were evaluated according to structure types, occupancy types, the number of storeys, building codes and specified districts. The evaluated results support the theory that post-earthquake ignition probability is inversely proportional to the number of storeys and the construction year, depending upon the building code.

  18. Assessment of landslide hazards resulting from the February 13, 2001, El Salvador earthquake; a report to the government of El Salvador and the U. S. Agency for International Development

    USGS Publications Warehouse

    Baum, Rex L.; Crone, Anthony J.; Escobar, Demetreo; Harp, Edwin L.; Major, Jon J.; Martinez, Mauricio; Pullinger, Carlos; Smith, Mark E.

    2001-01-01

    On February 13, 2001, a magnitude 6.5 earthquake occurred about 40 km east-southeast of the capital city of San Salvador in central El Salvador and triggered thousands of landslides in the area east of Lago de Ilopango. The landslides are concentrated in a 2,500-km² area and are particularly abundant in areas underlain by thick deposits of poorly consolidated, late Pleistocene and Holocene Tierra Blanca rhyolitic tephras that were erupted from Ilopango caldera. Drainages in the tephra deposits are deeply incised, and steep valley walls failed during the strong shaking. Many drainages are clogged with landslide debris that locally buries the adjacent valley floor. The fine grain size of the tephra facilitates its easy mobilization by rainfall runoff. The potential for remobilizing the landslide debris as debris flows and in floods is significant as this sediment is transported through the drainage systems during the upcoming rainy season. In addition to thousands of shallow failures, two very large landslides occurred that blocked the Rio El Desague and the Rio Jiboa. The Rio El Desague landslide has an estimated volume of 1.5 million m³, and the Rio Jiboa landslide has an estimated volume of 12 million m³. Field studies indicate that catastrophic draining of the Rio El Desague landslide-dammed lake would pose a minimal flooding hazard, whereas catastrophic draining of the Rio Jiboa lake would pose a serious hazard and warrants immediate action. Construction of a spillway across part of the dam could moderate the impact of catastrophic lake draining and the associated flood. Two major slope failures on the northern side of Volcan San Vicente occurred in the upper reaches of the Quebrada Del Muerto and the Quebrada El Blanco. The landslide debris in the Quebrada Del Muerto consists dominantly of blocks of well-lithified andesite, whereas the debris in the Quebrada El Blanco consists of poorly consolidated pyroclastic sediment. The large blocks of lithified rock in

  19. Assessment of multi hazards in Semarang city

    NASA Astrophysics Data System (ADS)

    Nugraha, Arief Laila; Hani'ah, Pratiwi, Rosika D.

    2017-07-01

    Semarang city is the centre for the people of Central Java province, having become the province's economic and administrative hub. The city is vulnerable because it is exposed to at least four natural hazards, which negatively impact its people: floods, tidal flooding, landslides, and drought. Identifying which areas face the highest levels of threat requires multi-hazard mapping. Multi-hazard mapping is done by weighting the hazard-forming parameters and processing them in GIS; the resulting individual hazard maps are then overlaid to obtain the overall hazard level. The overlay can be performed by two methods: GIS and AHP (Analytical Hierarchy Process). The districts of Semarang city with the highest multi-hazard level are Genuk, Semarang Utara, and Tugu; the total area at a high multi-hazard level is 61,944.14 hectares, or 30.77% of the city's total area.
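
    The AHP weighting step this record describes can be sketched with the standard normalized-column-average approximation of Saaty's method. The pairwise judgments below are hypothetical placeholders, not the values used in the study:

```python
def ahp_weights(M):
    """Priority weights from a square pairwise-comparison matrix via the
    normalized-column-average approximation of Saaty's AHP. Entry M[i][j]
    states how much more important criterion i is than criterion j."""
    n = len(M)
    col = [sum(M[i][j] for i in range(n)) for j in range(n)]
    # Normalize each column, then average across each row.
    return [sum(M[i][j] / col[j] for j in range(n)) / n for i in range(n)]

# Hypothetical comparisons: flood vs tidal flood vs landslide vs drought.
M = [
    [1,     2,     3,   4],
    [1 / 2, 1,     2,   3],
    [1 / 3, 1 / 2, 1,   2],
    [1 / 4, 1 / 3, 1 / 2, 1],
]
w = ahp_weights(M)  # weights sum to 1, ordered by stated importance
```

    The weights then scale each hazard layer in the GIS overlay; a consistency-ratio check on the judgment matrix (omitted here) normally precedes their use.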

  20. Studies of crustal structure, seismic precursors to volcanic eruptions and earthquake hazard in the eastern provinces of the Democratic Republic of Congo

    NASA Astrophysics Data System (ADS)

    Mavonga, T.; Zana, N.; Durrheim, R. J.

    2010-11-01

    In recent decades, civil wars in the eastern provinces of the Democratic Republic of Congo have caused massive social disruptions, which have been exacerbated by volcanic and earthquake disasters. Seismic data were gathered and analysed as part of an effort to monitor the volcanoes and quantitatively assess the earthquake hazard. This information can be used to regulate the settlement of displaced people and to "build back better". In order to investigate volcanic processes in the Virunga area, a local seismic velocity model was derived and used to relocate earthquake hypocenters. It was found that swarm-type seismicity, composed mainly of long-period earthquakes, preceded both the 2004 and 2006 eruptions of Nyamuragira. A steady increase in seismicity was observed to commence ten or eleven months prior to the eruption, which is attributed to the movement of magma in a deep conduit. In the last stage (1 or 2 months) before the eruption, the hypocenters of long-period earthquakes became shallower. Seismic hazard maps were prepared for the DRC using a 90-year catalogue compiled for homogeneous Mw magnitudes, various published attenuation relations, and the EZ-Frisk software package. The highest levels of seismic hazard were found in the Lake Tanganyika Rift seismic zone, where peak ground accelerations (PGA) in excess of 0.32 g, 0.22 g and 0.16 g are expected to occur with 2%, 5% and 10% chance of exceedance in 50 years, respectively.
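
    The exceedance probabilities quoted above map to mean return periods under the usual Poisson assumption, T = -t / ln(1 - p) for exposure time t. A minimal sketch of this standard conversion (not code from the study):

```python
import math

def return_period(p, t=50.0):
    """Mean return period (years) of a ground motion with exceedance
    probability p over an exposure time of t years, assuming Poisson
    occurrence of exceedances."""
    return -t / math.log(1.0 - p)

# The three hazard levels used in the paper's maps.
periods = {p: return_period(p) for p in (0.02, 0.05, 0.10)}
# 2% in 50 yr -> ~2,475 yr; 10% in 50 yr -> ~475 yr
```

    The 10%-in-50-years level (roughly a 475-year return period) is the one traditionally used in building codes.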

  1. Earthquake potential at Parkfield, CA inferred from geodetic data spanning two earthquake cycles with assessment of model resolution and uncertainty

    NASA Astrophysics Data System (ADS)

    Murray, J. R.; Langbein, J.

    2005-12-01

    Although some models used in long-term earthquake forecasting treat earthquake occurrence as a time-independent process, it is generally accepted that an earthquake relieves accumulated stress on a fault and that time is required to rebuild stress before another large event. This idea is embodied in the time-predictable model for earthquake recurrence, which is used in hazard forecasting. The related slip-predictable model states that earthquake size is proportional to the time since the last event and the fault stressing rate. Geodetic data offer a means of measuring strain accumulation and release in the Earth's crust throughout the earthquake cycle. Inversion of these data can provide useful inputs, such as estimates of slip in earthquakes or the long-term slip rate, to recurrence models. However, the slip resolution of geodetic observations decreases with depth on the fault. If the results of geodetic modeling are to be properly incorporated into hazard forecasts, it is critical to assess which features of the models are robust by quantifying the model resolution and the uncertainties of estimated parameters. With geodetic measurements for the three most recent earthquakes (in 1934, 1966, and 2004), Parkfield, California is one of the few locales where geodetic data span multiple earthquake cycles. Sparse observations exist for the 1934 - 1966 time period, and geodetic monitoring steadily increased during the 1966 - 2004 interseismic period. Through joint inversion of the variety of Parkfield geodetic measurements (triangulation, trilateration, two-color laser, and GPS) we obtain the most detailed image yet of the evolution of slip on the fault since the 1934 earthquake. Obtaining the model resolution and model covariance matrices is straightforward for linear inversions. However, due to the inclusion of non-negativity constraints, the inversions of Parkfield data are nonlinear. 
We apply an alternative technique for calculating the model resolution and use the

  2. Earthquake resistant construction of gas and liquid fuel pipeline systems serving, or regulated by, the Federal government. Earthquake hazard reduction series No. 67

    SciTech Connect

    Yokel, F.Y.; Mathey, R.G.

    1992-07-01

    The vulnerability of gas and liquid fuel pipeline systems to damage in past earthquakes is reviewed, along with available standards and technologies that can protect these facilities against earthquake damage. An overview is presented of measures taken by various Federal Agencies to protect pipeline systems under their jurisdiction against earthquake hazards. It is concluded that the overall performance of pipeline systems in past earthquakes was relatively good; however, older pipelines and above-ground storage tanks were damaged in many earthquakes. Standards and regulations for liquid fuel pipelines contain only general references to seismic loads, whereas standards and regulations for above-ground fuel storage tanks and for liquefied natural gas facilities contain explicit seismic design provisions. It is recommended that a guideline for earthquake-resistant design of gas and liquid fuel pipeline systems be prepared for Federal Agencies to ensure a uniform approach to the protection of these systems.

  3. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  4. Protocol for aquatic hazard assessment of selenium

    SciTech Connect

    Lemly, A.D.

    1995-05-24

    A procedure is described for conducting an aquatic hazard assessment of selenium. Hazard is characterized in terms of the potential for food-chain bioaccumulation and reproductive impairment in fish and aquatic birds, which are the most sensitive biological responses for estimating ecosystem-level impacts of selenium contamination. Five degrees of hazard are possible depending on the expected environmental concentrations of selenium, exposure of fish and aquatic birds to toxic concentrations, and resultant potential for reproductive impairment. An example is given to illustrate how the protocol is applied to selenium data from a typical contaminant monitoring program.

  5. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  6. Protection of the human race against natural hazards (asteroids, comets, volcanoes, earthquakes)

    NASA Astrophysics Data System (ADS)

    Smith, Joseph V.

    1985-10-01

    Although we justifiably worry about the danger of nuclear war to civilization, and perhaps even to survival of the human race, we tend to consider natural hazards (e.g., comets, asteroids, volcanoes, earthquakes) as unavoidable acts of God. In any human lifetime, a truly catastrophic natural event is very unlikely, but ultimately one will occur. For the first time in human history we have sufficient technical skills to begin protection of Earth from some natural hazards. We could decide collectively throughout the world to reassign resources: in particular, reduction of nuclear and conventional weapons to a less dangerous level would allow concomitant increase of international programs for detection and prevention of natural hazards. Worldwide cooperation to mitigate natural hazards might help psychologically to lead us away from the divisive bickering that triggers wars. Future generations could hail us as pioneers of peace and safety rather than curse us as agents of death and destruction.

  7. Measuring the environmental context of social vulnerability to urban earthquake hazards: An integrative remote sensing and GIS approach

    NASA Astrophysics Data System (ADS)

    Rashed, Tarek Mohamed Gamal Eldin

    Although vulnerability represents an essential concept in the development of mitigation strategies at the local, national, and international levels, there is little consensus among researchers, planners, and disaster managers regarding the best way to undertake vulnerability analysis. The basic objective of this research is to move that discussion forward by integrating remote sensing and GIS analysis into new ways of thinking about urban vulnerability. The research conceptualizes urban vulnerability as a characteristic of an urban community that can be assessed through a combination of ecological factors associated with the physical conditions of the geographic space in which the urban community is situated and the social conditions of the population in that place. The basic hypothesis of the research is that these physical and social conditions are so inextricably bound together in many disaster situations that we can use the former as indicative of the latter. The research proposes an approach through which areas with high levels of vulnerability (hot spots) are first located and differentiated from other areas within a defined urban region. The methodology of this research is tested for the Los Angeles metropolitan area, employing data from the 1990 US census. The findings of this research add to our understanding of how earthquake hazards respond to natural and human-induced changes, and of the consequences of land cover alteration for the increasing occurrence of earthquake disasters worldwide. From an empirical viewpoint, the study shows how advanced GIS and remote sensing procedures can be combined to allow planners and decision makers to focus on the more vulnerable communities in their midst, and thus to help develop mitigation measures that could prevent earthquake hazards from becoming major human disasters. Finally, this study tests the importance of using remote sensing data in vulnerability analysis at the local level, thus laying the foundation of

  8. Induced and Natural Seismicity: Earthquake Hazards and Risks in Ohio:

    NASA Astrophysics Data System (ADS)

    Besana-Ostman, G. M.; Worstall, R.; Tomastik, T.; Simmers, R.

    2013-12-01

    To adapt to the increasing need to regulate all operations related to both the Utica and Marcellus shale plays within the state, ODNR recently strengthened its regulatory capability through stricter permit requirements, additional human resources, and improved infrastructure. These ODNR efforts on seismic risk reduction related to induced seismicity led to stricter regulations and many infrastructure changes, particularly for Class II wells. Permit requirement changes and more seismic monitoring stations were implemented, together with additional injection data reporting from selected Class II well operators. Considering the possible risks related to seismic events in a region with relatively low seismicity, correlations between limited seismic data and injection volume information were undertaken. Interestingly, initial results showed some indications of both plugging and fracturing episodes. Real-time data transmission from seismic stations and the availability of injection volume data enabled ODNR to interact with operators and manage wells dynamically. Furthermore, initial geomorphic and structural analyses indicated possible active faults, oriented NE-SW, in the northern and western portions of the state. The newly mapped structures imply the possibility of relatively larger earthquakes in the region and consequently higher seismic risk. With the above-mentioned recent changes, ODNR has not only made critical improvements to its principal regulatory role for oil and gas operations in the state but also an important contribution to the state's seismic risk reduction endeavors. Close collaboration with other government agencies and the public, and working together with well operators, enhanced ODNR's capability to build a safety culture and achieve further public and industry participation towards a safer environment. Keywords: induced seismicity, injection wells, seismic risks

  9. Active tectonics of the Seattle fault and central Puget sound, Washington - Implications for earthquake hazards

    USGS Publications Warehouse

    Johnson, S.Y.; Dadisman, S.V.; Childs, J. R.; Stanley, W.D.

    1999-01-01

    We use an extensive network of marine high-resolution and conventional industry seismic-reflection data to constrain the location, shallow structure, and displacement rates of the Seattle fault zone and crosscutting high-angle faults in the Puget Lowland of western Washington. Analysis of seismic profiles extending 50 km across the Puget Lowland from Lake Washington to Hood Canal indicates that the west-trending Seattle fault comprises a broad (4-6 km) zone of three or more south-dipping reverse faults. Quaternary sediment has been folded and faulted along all faults in the zone, but deformation is clearly most pronounced along fault A, the northernmost fault, which forms the boundary between the Seattle uplift and Seattle basin. Analysis of growth strata deposited across fault A indicates minimum Quaternary slip rates of about 0.6 mm/yr. Slip rates across the entire zone are estimated to be 0.7-1.1 mm/yr. The Seattle fault is cut into two main segments by an active, north-trending, high-angle, strike-slip fault zone with cumulative dextral displacement of about 2.4 km. Faults in this zone truncate and warp reflections in Tertiary and Quaternary strata and locally coincide with bathymetric lineaments. Cumulative slip rates on these faults may exceed 0.2 mm/yr. Assuming no other crosscutting faults, this north-trending fault zone divides the Seattle fault into 30-40-km-long western and eastern segments. Although this geometry could limit the area ruptured in some Seattle fault earthquakes, a large event ca. A.D. 900 appears to have involved both segments. Regional seismic-hazard assessments must (1) incorporate new information on fault length, geometry, and displacement rates on the Seattle fault, and (2) consider the hazard presented by the previously unrecognized, north-trending fault zone.

  10. Probabilistic seismic hazard assessment of Italy using kernel estimation methods

    NASA Astrophysics Data System (ADS)

    Zuccolo, Elisa; Corigliano, Mirko; Lai, Carlo G.

    2013-07-01

    A representation of seismic hazard is proposed for Italy based on the zone-free approach developed by Woo (BSSA 86(2):353-362, 1996a), a kernel estimation method governed by concepts of fractal geometry and self-organized seismicity that does not require the definition of seismogenic zones. The purpose is to assess the influence of seismogenic zoning on the results of probabilistic seismic hazard analysis (PSHA) for Italy obtained using the standard Cornell method. The hazard has been estimated for outcropping rock site conditions in terms of maps and uniform hazard spectra for a selected site, with 10 % probability of exceedance in 50 years. Both spectral acceleration and spectral displacement have been considered as ground motion parameters. Differences between the results of the two PSHA methods are compared and discussed. The analysis shows that, in areas such as Italy, characterized by a reliable earthquake catalog but where faults are generally not easily identifiable, a zone-free approach can be considered a valuable tool to address epistemic uncertainty within a logic tree framework.
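    The zone-free approach named in this abstract replaces seismogenic zones with kernel smoothing of the earthquake catalog itself. A minimal sketch of that idea follows, using an illustrative magnitude-dependent Gaussian kernel; the bandwidth parameters `h0` and `d` and the toy catalog are invented for illustration, not Woo's calibrated values.

```python
import numpy as np

def activity_rate_density(epicenters, mags, grid_xy, catalog_years,
                          h0=20.0, d=1.5):
    """Zone-free seismicity rate density (events / yr / km^2) at grid
    points, via Gaussian kernel smoothing of catalog epicenters.
    The bandwidth grows with magnitude, h(M) = h0 * exp(d * (M - Mmin)),
    loosely mimicking the magnitude-dependent kernels of the zone-free
    approach (h0 and d here are illustrative, not calibrated)."""
    mmin = mags.min()
    h = h0 * np.exp(d * (mags - mmin))            # per-event bandwidth (km)
    # pairwise squared distances, shape (n_grid, n_events)
    diff = grid_xy[:, None, :] - epicenters[None, :, :]
    r2 = (diff ** 2).sum(axis=-1)
    kern = np.exp(-r2 / (2.0 * h ** 2)) / (2.0 * np.pi * h ** 2)
    return kern.sum(axis=1) / catalog_years

# toy catalog: 3 events, rate evaluated on a 2-point grid (coords in km)
eps = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
mags = np.array([4.0, 5.0, 6.0])
grid = np.array([[0.0, 0.0], [50.0, 50.0]])
rates = activity_rate_density(eps, mags, grid, catalog_years=100.0)
print(rates)  # higher density near the event cluster than far from it
```

    The smoothed rate density would then feed a hazard integral in place of zone-based activity rates.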

  11. Earthquake shaking hazard estimates and exposure changes in the conterminous United States

    USGS Publications Warehouse

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.

    2015-01-01

    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify population and infrastructure exposure within the conterminous U.S. that are subjected to varying levels of earthquake ground motions by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008 and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g level of shaking at relatively frequent intervals (annual rate of 1 in 72 years or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years or 2% PE in 50 years). We also show that there is a significant number of critical infrastructure facilities located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with moderately frequent recurrence interval).
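    The exceedance levels quoted in this abstract (50 %, 10 % and 2 % PE in 50 years versus 72-, 475- and 2,475-year return periods) are related through the standard Poisson occurrence assumption; a minimal check:

```python
import math

def return_period(pe: float, t_years: float) -> float:
    """Mean return period for a given probability of exceedance (PE)
    over an exposure time, assuming a Poisson occurrence model:
    PE = 1 - exp(-t / T)  =>  T = -t / ln(1 - PE)."""
    return -t_years / math.log(1.0 - pe)

for pe in (0.50, 0.10, 0.02):
    print(f"{pe:.0%} PE in 50 yr -> ~{return_period(pe, 50):.0f}-yr return period")
```

    Running this reproduces the abstract's 72-, 475- and 2,475-year figures (to rounding).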

  12. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2009-04-01

    This special issue includes selected papers on the topic of earthquake impact from the sessions held in 2004 in Nice, France, and in 2005 in Vienna, Austria, at the first and second European Geosciences Union General Assemblies, respectively. Since its start in 1999 in The Hague, Netherlands, the earthquake hazard session has been the most popular. The call in 2004 read: nature's forces, including earthquakes, floods, landslides, high winds and volcanic eruptions, can inflict losses on urban settlements and man-made structures such as infrastructure. In Europe, recent years have seen significant losses from earthquakes in south and south-eastern Europe, floods in central Europe, and wind storms in western Europe. Meanwhile, significant progress has been made in understanding disasters. Several scientific fields contribute to a holistic approach to evaluating capacities, vulnerabilities and hazards, the main factors in mitigating urban disasters due to natural hazards. An important part of the session is devoted to the assessment of earthquake shaking and loss scenarios, including both physical damage and human casualties. Early warning and rapid damage evaluation are of utmost importance for the safety of many essential facilities, for the emergency management of events and for disaster response. When an earthquake occurs, strong-motion networks, data processing and interpretation lead to preliminary estimates (scenarios) of the geographical distribution of damage. Factual information on inflicted damage, such as that obtained from shaking maps or aerial imagery, permits comparison with simulated damage maps in order to build a more accurate picture of the overall losses. Most recent developments towards quantitative and qualitative simulation of natural hazard impacts on urban areas, which provide decision-making support for urban disaster management, and success stories of and lessons learned from disaster

  13. Earthquake damage potential and critical scour depth of bridges exposed to flood and seismic hazards under lateral seismic loads

    NASA Astrophysics Data System (ADS)

    Song, Shin-Tai; Wang, Chun-Yao; Huang, Wen-Hsiu

    2015-12-01

    Many bridges located in seismic hazard regions suffer from serious foundation exposure caused by riverbed scour. Loss of surrounding soil significantly reduces the lateral strength of pile foundations. When the scour depth exceeds a critical level, the strength of the foundation is insufficient to withstand the imposed seismic demand, which induces the potential for unacceptable damage to the piles during an earthquake. This paper presents an analytical approach to assess the earthquake damage potential of bridges with foundation exposure and identify the critical scour depth that causes the seismic performance of a bridge to differ from the original design. The approach employs the well-accepted response spectrum analysis method to determine the maximum seismic response of a bridge. The damage potential of a bridge is assessed by comparing the imposed seismic demand with the strengths of the column and the foundation. The versatility of the analytical approach is illustrated with a numerical example and verified by the nonlinear finite element analysis. The analytical approach is also demonstrated to successfully determine the critical scour depth. Results highlight that relatively shallow scour depths can cause foundation damage during an earthquake, even for bridges designed to provide satisfactory seismic performance.

  14. Recent Achievements of the Neo-Deterministic Seismic Hazard Assessment in the CEI Region

    SciTech Connect

    Panza, G. F.; Kouteva, M.; Vaccari, F.; Peresan, A.; Romanelli, F.; Cioflan, C. O.; Radulian, M.; Marmureanu, G.; Paskaleva, I.; Gribovszki, K.; Varga, P.; Herak, M.; Zaichenco, A.; Zivcic, M.

    2008-07-08

    A review of the recent achievements of the innovative neo-deterministic approach to seismic hazard assessment through realistic earthquake scenarios has been performed. The procedure provides strong ground motion parameters for the purposes of earthquake engineering, based on deterministic seismic wave propagation modelling at different scales: regional, national and metropolitan. The main advantage of this neo-deterministic procedure is the simultaneous treatment of the contributions of the earthquake source and of the seismic wave propagation media to the strong motion at the target site/region, as required by basic physical principles. The neo-deterministic seismic microzonation procedure has been successfully applied to numerous metropolitan areas all over the world in the framework of several international projects. In this study, some examples focused on the CEI region, concerning both regional seismic hazard assessment and seismic microzonation of selected metropolitan areas, are shown.

  15. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
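    The final step described in this abstract, combining heuristic, landslide index, and weights-of-evidence outputs through a normalization-based rule, can be sketched as follows. The min-max normalization and the equal-weight average are assumptions for illustration, not necessarily the exact rule used in the GAR2013 exercise, and the input scores are invented.

```python
import numpy as np

def combine_hazard_maps(method_scores):
    """Combine hazard scores from several methods (e.g. a Mora & Vahrson
    style heuristic, the landslide index method, and weights of evidence)
    by min-max normalizing each to [0, 1] and averaging. Equal weights
    are an assumption; the actual combination rule may differ."""
    normed = []
    for s in method_scores:
        s = np.asarray(s, dtype=float)
        normed.append((s - s.min()) / (s.max() - s.min()))
    return np.mean(normed, axis=0)

heuristic = [2.0, 5.0, 9.0]   # Mora & Vahrson style product scores
index     = [0.1, 0.4, 0.8]   # landslide index weights
woe       = [-1.0, 0.5, 2.0]  # weights-of-evidence contrasts
combined = combine_hazard_maps([heuristic, index, woe])
print(combined)  # normalized scores on a common [0, 1] scale
```

    Normalizing first keeps one method's larger numeric range from dominating the combined map.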

  16. Citizen Monitoring during Hazards: The Case of Fukushima Radiation after the 2011 Japanese Earthquake

    NASA Astrophysics Data System (ADS)

    Hultquist, C.; Cervone, G.

    2015-12-01

    Citizen-led movements producing scientific environmental information are increasingly common during hazards. After the Japanese earthquake-triggered tsunami in 2011, the government produced airborne remote sensing data of radiation levels following the Fukushima nuclear reactor failures. Advances in technology enabled citizens to monitor radiation with innovative mobile devices built from components bought on the Internet. The citizen-led Safecast project measured on-ground radiation levels in the Fukushima prefecture, totalling 14 million entries to date in Japan. This non-authoritative citizen science collection, which recorded radiation levels at specific coordinates and times, is available online, yet its reliability and validity had not been assessed. The nuclear incident provided a case for assessment with comparable dimensions of citizen science and authoritative data. Standardization was required to compare the datasets. The sensors were calibrated scientifically, but data were collected in different units of measure. Radiation decays over time, so temporal interpolation was necessary to compare measurements within the same time frame. Finally, GPS-located points were selected within an overlapping spatial extent of 500 meters. This study spatially analyzes and statistically compares citizen-volunteered and government-generated radiation data. Quantitative measures are used to assess the similarity and difference between the datasets. Radiation measurements from the same geographic extents show similar spatial variations, which suggests that citizen science data can be comparable with government-generated measurements. Validation of Safecast demonstrates that we can infer scientific data from unstructured, unvetted data. Citizen science can provide real-time data for situational awareness, which is crucial for decision making during disasters. This project provides a methodology for comparing datasets of radiological measurements
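    The temporal interpolation step named in this abstract, projecting measurements taken on different days to a common reference time, can be sketched as a single-isotope exponential decay correction. Using the Cs-134 half-life alone is a simplifying assumption, since Fukushima deposition mixed Cs-134 and Cs-137; a real comparison would model the isotope mix.

```python
import math

def decay_correct(dose_rate, t_measured_days, t_ref_days,
                  half_life_days=2.065 * 365.25):
    """Project a measured dose rate to a common reference time using
    exponential decay, so measurements taken on different days can be
    compared. The default half-life is Cs-134 (~2.065 yr); treating the
    source as a single isotope is a deliberate simplification."""
    lam = math.log(2) / half_life_days          # decay constant (1/day)
    return dose_rate * math.exp(-lam * (t_ref_days - t_measured_days))

# a 1.0 uSv/h reading, projected one Cs-134 half-life forward:
print(decay_correct(1.0, 0.0, 2.065 * 365.25))  # ~0.5
```

    After decay correction, spatially co-located citizen and government readings can be compared on equal footing.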

  17. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  18. Seismic Hazard Assessment in the Aspropirgos Area, Athens - Greece

    NASA Astrophysics Data System (ADS)

    Voulgaris, N.; Drakatos, G.; Lekkas, E.; Karastathis, V.; Valadaki, A.; Plessas, S.

    2005-12-01

    The extensive damage and loss of human life related to the September 7, 1999 earthquake in the Athens area (Greece) initiated an effort to re-evaluate seismic hazard in various regions around the capital. One of the target areas selected within the framework of the specially designed research project ESTIA was the industrial area of Aspropirgos, where the epicenter of the main shock was located. The multidisciplinary approach to seismic hazard assessment included a microseismicity survey and detailed geological and tectonic studies in the area, in order to delineate and define the recently activated seismic sources. Initially, a portable network consisting of seventeen (17) digital seismographs was installed and operated for 2 months during the autumn of 2004. A total of five hundred forty-five (545) earthquakes (M<3) were recorded. The results of the geological survey in the region were summarised in two maps compiled at scales of 1:5,000 and 1:25,000, respectively. These data sets were combined with all the available historical and instrumental seismological data, and a revised seismic source zone model was defined for the broader area and subsequently used for hazard assessment calculations. The results were presented as maximum expected peak ground acceleration and velocity distribution maps for 475- and 949-year return periods, or 90% probability of NBE for the next 50 and 100 years, respectively. Finally, in order to facilitate the implementation of the above results according to the current Greek Aseismic Code, the required distributions for the 3 different soil types were mapped using the results of the geological survey. By combining the above types of data, the engineer is able to calculate specific design spectra for every site, while combination with available vulnerability estimates could lead to more realistic seismic risk calculations. 
Acknowledgments: We would like to thank the General Secretariat for Research and Technology of Greece for

  19. Stochastic Slip Distributions in Seismic Probabilistic Tsunami Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Murphy, Shane; Scala, Antonio; Herrero, Andre; Lorito, Stefano; Festa, Gaetano; Trasatti, Elisa; Romano, Fabrizio; Nielsen, Stefan; Tonini, Roberto; Volpe, Manuela

    2017-04-01

    Accurate representation of heterogeneous slip is critical in Seismic Probabilistic Tsunami Hazard Assessment (SPTHA) for coastlines close to subduction zones. Stochastic slip distributions are generally used to feed aleatory slip variability into SPTHA. These distributions generally assume self-similar or self-affine slip spectra based on seismological and/or geological observations. However, earthquakes such as the 2011 Mw 9 Tohoku earthquake, which contained significant slip concentrated at shallow depth, raise the question of whether source parameters based on generic observations across a wide range of fault types and environments are valid for a specific fault. This presentation looks at the calculation and evaluation of stochastic source models used in SPTHA. Taking the Tohoku region as a case study, we compare the variation with depth of stochastic source models against 2D numerical rupture simulations. A metric entitled the Slip Probability Density Function (SPDF), which measures the spatial coverage of slip across the fault plane in an ensemble of slip distributions, is used to compare the ensembles. We show that, for a large collection of 2D dynamic rupture simulations for the Tohoku region, the shape of the SPDF varies greatly depending on whether rupture reaches the shallow part of the subduction interface or not. Using this modification, we perform 500 simulations and compute the conditional probability of the maximum tsunami wave height for a M9 earthquake along the eastern Japanese coastline. Compared with the same conditional probability calculated using a traditional stochastic source, the modified stochastic source returned a higher maximum wave height and hence a larger hazard. However, while the numerical simulations provide insight into the effects of fault geometry and the free surface on slip distributions, they also inherently imply a number of assumptions about the state of the subsurface (e.g. slip weakening
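    The SPDF metric named in this abstract, measuring the spatial coverage of slip across an ensemble, might be sketched as the per-cell fraction of models with non-negligible slip. This is one plausible reading for illustration, not the authors' exact definition, and the toy ensemble is invented.

```python
import numpy as np

def slip_pdf(ensemble, threshold=0.0):
    """Sketch of a Slip Probability Density Function (SPDF) over an
    ensemble of slip distributions: at each subfault cell, the fraction
    of models in which slip exceeds a threshold, i.e. a map of how often
    slip covers each part of the fault plane across the ensemble."""
    ensemble = np.asarray(ensemble, dtype=float)  # (n_models, n_dip, n_strike)
    return (ensemble > threshold).mean(axis=0)

# toy ensemble: 3 models on a 2x2 fault; the shallow row slips in every model
models = np.array([
    [[1.0, 2.0], [0.0, 0.5]],
    [[0.5, 1.5], [0.0, 0.0]],
    [[2.0, 1.0], [1.0, 0.0]],
])
print(slip_pdf(models))  # shallow cells -> 1.0; deep cells rupture less often
```

    Comparing such maps between stochastic and dynamic-rupture ensembles would expose the depth dependence the abstract describes.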

  20. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with the increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that a large fraction of the world's population still resides in informal, poorly constructed and non-engineered dwellings with high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation for this work, we hope that the global building inventory database described herein will find widespread use in other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the

  1. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, B.; Siu, Y. L.; Mitchell, G.

    2015-12-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
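    The four relationship types named in this abstract (independent, mutex, parallel, series) suggest different rules for the joint probability of two hazards. The sketch below is a toy reading under simplified assumptions; the paper's actual formulas may differ, and the parallel and series rules in particular are illustrative guesses.

```python
def combined_probability(p_a: float, p_b: float, relationship: str) -> float:
    """Toy joint occurrence probability for hazards A and B under the
    four interaction relationships. Assumed readings (not the paper's
    definitions): 'independent' multiplies marginals; 'mutex' hazards
    cannot co-occur; 'parallel' hazards share a trigger, so co-occurrence
    is bounded by the rarer hazard; 'series' forms a cascade where p_b
    is read as the conditional probability P(B given A)."""
    if relationship == "independent":
        return p_a * p_b
    if relationship == "mutex":
        return 0.0
    if relationship == "parallel":
        return min(p_a, p_b)
    if relationship == "series":
        return p_a * p_b  # here p_b plays the role of P(B | A)
    raise ValueError(f"unknown relationship: {relationship}")

for rel in ("independent", "mutex", "parallel", "series"):
    print(rel, combined_probability(0.2, 0.5, rel))
```

    Even a toy rule like this shows why classifying the relationship matters before multiplying hazard probabilities in a multi-hazard assessment.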

  3. The global tsunami hazard due to long return period subduction zone earthquakes

    NASA Astrophysics Data System (ADS)

    Løvholt, Finn; Bonnevie Harbitz, Carl; Glimsdal, Sylfest; Horspool, Nick; Smebye, Helge; de Bono, Andrea; Nadim, Farrokh

    2014-05-01

    Historical tsunamis and paleotsunami evidence indicate that massive megathrust earthquakes cause the majority of tsunami losses. There is a need to quantify the tsunami hazard from megathrust events in order to compare tsunamis with other natural hazards on a global level, since previous attempts have been lacking. The global tsunami hazard induced by earthquakes is therefore computed for a return period of 500 years. To this end, the exposed elements at risk, such as population, produced capital, and nuclear power plants, are determined. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. The methods used to quantify the global hazard are obviously crude, and hence the accuracy to be expected of global methods is discussed.

  4. Widespread seismicity excitation following the 2011 M=9.0 Tohoku, Japan, earthquake and its implications for seismic hazard

    NASA Astrophysics Data System (ADS)

    Toda, S.; Stein, R. S.; Lin, J.

    2011-12-01

    The 11 March 2011 Tohoku-chiho Taiheiyo-oki earthquake (Tohoku earthquake) was followed by massive offshore aftershocks, including 6 M≧7 and 94 M≧6 shocks during the following 4.5 months (until 26 July). It is also unprecedented that a broad increase in seismicity was observed across inland Japan at distances of up to 425 km from the locus of high seismic slip on the megathrust. Such an increase was not seen after the 2004 M=9.1 Sumatra or 2010 M=8.8 Chile earthquakes, but those regions lacked the seismic networks necessary to detect such small events. Here we explore the possibility that the rate changes are the product of static Coulomb stress transfer to small faults. We use the nodal planes of M≧3.5 earthquakes as proxies for such small active faults, and find that of fifteen regions averaging ~80 by 80 km in size, eleven show a positive association between calculated stress changes and the observed seismicity rate change, three show a negative correlation, and for one the changes are too small to assess. This work demonstrates that seismicity can turn on in the nominal stress shadow of a mainshock as long as small, geometrically diverse active faults exist there, which is likely quite common in areas with a complex geologic background such as Tohoku. In central Japan, however, there are several regions where the usual tectonic stress has been enhanced by the Tohoku earthquake and moderate and large faults have been brought closer to failure, producing M~5 to 6 shocks near Nagano, near Mt. Fuji, and in the Tokyo metropolitan area and its offshore. We confirmed that at least five of the seven large, exotic, or remote aftershocks were brought ≧0.3 bars closer to failure. Validated by such correlations, we evaluate the effects of the Tohoku event on the other subduction zones nearby and on major active faults inland. The majority of thrust faults inland of Tohoku are brought farther from failure by the M9 event. However, we found that large sections of the Japan trench megathrust, the outer
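    The static Coulomb stress criterion underlying this kind of analysis is commonly written ΔCFF = Δτ + μ′Δσn, where Δτ is the shear-stress change resolved in the slip direction of the receiver fault, Δσn the normal-stress change (unclamping positive), and μ′ the effective friction coefficient. A minimal numeric sketch, with arbitrary illustrative values:

```python
# Coulomb failure stress change on a receiver fault (all stresses in bars).
# A positive result means the fault is brought closer to failure.
# The 0.4 effective friction and the example inputs are generic choices,
# not values from this study.

def coulomb_stress_change(d_shear_bar, d_normal_bar, eff_friction=0.4):
    """ΔCFF = Δτ + μ' Δσ_n (unclamping counted as positive Δσ_n)."""
    return d_shear_bar + eff_friction * d_normal_bar

# Shear loading partly offset by clamping still yields a positive ΔCFF:
print(coulomb_stress_change(0.5, -0.5))  # 0.3 bars, i.e. closer to failure
```

Note the ≧0.3 bar threshold quoted in the abstract is of exactly this quantity.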

  5. The 2011 Mineral, Virginia, earthquake and its significance for seismic hazards in eastern North America: overview and synthesis

    USGS Publications Warehouse

    Horton, J. Wright; Chapman, Martin C.; Green, Russell A.

    2015-01-01

    The earthquake and aftershocks occurred in crystalline rocks within Paleozoic thrust sheets of the Chopawamsic terrane. The main shock and majority of aftershocks delineated the newly named Quail fault zone in the subsurface, and shallow aftershocks defined outlying faults. The earthquake induced minor liquefaction sand boils, but notably there was no evidence of a surface fault rupture. Recurrence intervals, and evidence for larger earthquakes in the Quaternary in this area, remain important unknowns. This event, along with similar events during historical time, is a reminder that earthquakes of similar or larger magnitude pose a real hazard in eastern North America.

  6. Advanced Materials Laboratory hazards assessment document

    SciTech Connect

    Barnett, B.; Banda, Z.

    1995-10-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the AML. The entire inventory was screened according to the potential airborne impact on onsite and offsite individuals. The air dispersion model ALOHA estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event would produce consequences exceeding the Early Severe Health Effects threshold is 23 meters. The highest emergency classification is a General Emergency. The Emergency Planning Zone is a nominal area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets.

  7. Techniques for assessing industrial hazards: a manual

    SciTech Connect

    Not Available

    1988-01-01

    This manual provides guidelines for the identification of the potential hazards of new or existing plants or processes in the chemical and energy industries, and for the assessment of the consequences of the release of toxic, flammable, or explosive materials to the atmosphere. It presents a structured, simplified approach for identifying the most serious potential hazards and for calculating their effect distances or damage ranges. The intention is that, by presenting a simplified approach, the manual should be amenable to use by engineers and scientists with little or no experience of hazard analysis. Further analysis with a view to mitigating the hazards identified may be appropriate in many cases; at this stage, it may be necessary to seek the advice of a specialist. The basic procedure in a hazard analysis is: identify potential failures, calculate release quantities for each failure, and calculate the impact of each release on people and property. For large plants this can become highly complex, and therefore a simplified method is presented in which the analysis is divided into 14 steps. A spreadsheet technique was devised to permit the analyses to be carried out on a programmable calculator or personal computer. After the introductory material, the manual outlines the 14 steps that make up the hazard analysis.

  8. Quantifying potential earthquake and tsunami hazard in the Lesser Antilles subduction zone of the Caribbean region

    USGS Publications Warehouse

    Hayes, Gavin P.; McNamara, Daniel E.; Seidman, Lily; Roger, Jean

    2013-01-01

    In this study, we quantify the seismic and tsunami hazard in the Lesser Antilles subduction zone, focusing on the plate interface offshore of Guadeloupe. We compare potential strain accumulated via GPS-derived plate motions to strain release due to earthquakes that have occurred over the past 110 yr, and compute the resulting moment deficit. Our results suggest that enough strain is currently stored in the seismogenic zone of the Lesser Antilles subduction arc in the region of Guadeloupe to cause a large and damaging earthquake of magnitude Mw ∼ 8.2 ± 0.4. We model several scenario earthquakes over this magnitude range, using a variety of earthquake magnitudes and rupture areas, and utilizing the USGS ShakeMap and PAGER software packages. Strong ground shaking during the earthquake will likely cause loss of life and damage estimated to be in the range of several tens to several hundreds of fatalities and hundreds of millions to potentially billions of U.S. dollars of damage. In addition, such an event could produce a significant tsunami. Modelled tsunamis resulting from these scenario earthquakes predict meter-scale wave amplitudes even for events at the lower end of our magnitude range (M 7.8), and heights of over 3 m in several locations with our favoured scenario (M 8.0, partially locked interface from 15–45 km depth). In all scenarios, only short lead-times (on the order of tens of minutes) would be possible in the Caribbean before the arrival of damaging waves.
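    The moment-deficit argument above compares moment accumulated through plate convergence with moment released seismically. A back-of-the-envelope version, using the standard M0 = μ·A·s relation and the Hanks-Kanamori magnitude conversion, is sketched below; the interface dimensions, slip rate, and coupling fraction are illustrative round numbers, not the values used in the study.

```python
import math

# Seismic moment stored on a locked megathrust patch, and its equivalent
# moment magnitude. Parameter values are illustrative assumptions.

MU = 3.0e10  # shear modulus, Pa (typical crustal value)

def accumulated_moment(area_m2, slip_rate_m_per_yr, years, coupling=1.0):
    """Seismic moment (N·m) accumulated on a partially locked interface."""
    return MU * area_m2 * slip_rate_m_per_yr * years * coupling

def moment_to_mw(m0):
    """Hanks-Kanamori conversion from seismic moment (N·m) to magnitude."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.05)

# 300 km x 100 km interface, 20 mm/yr convergence, 110 yr, 50% coupled:
m0 = accumulated_moment(300e3 * 100e3, 0.020, 110, coupling=0.5)
print(round(moment_to_mw(m0), 1))  # ~Mw 8, same order as the study's estimate
```

Even this crude calculation lands in the Mw ~8 range, showing why a century of accumulated, unreleased strain on a subduction interface is alarming.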

  9. The hazard in using probabilistic seismic hazard analysis

    SciTech Connect

    Krinitzsky, E.L. (Geotechnical Lab.)

    1993-11-01

    Earthquake experts rely on probabilistic seismic hazard analysis for everything from emergency-response planning to the development of building codes. Unfortunately, says the author, the analysis is defective for the large earthquakes that pose the greatest risks. Structures have short lifetimes, and the distances over which earthquakes cause damage are relatively small. Exceptions serve to prove the rule. To be useful in engineering, earthquake hazard assessment must focus narrowly in both time and space.

  10. Earthquake scenario in West Bengal with emphasis on seismic hazard microzonation of the city of Kolkata, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Maiti, S. K.; Devaraj, N.; Srivastava, N.; Mohapatra, L. D.

    2014-09-01

    Seismic microzonation is the process of estimating site-specific effects of an earthquake on urban centers to support disaster mitigation and management. The state of West Bengal, located in the western foreland of the Assam-Arakan Orogenic Belt, the Himalayan foothills and the Surma Valley, has been struck by several devastating earthquakes in the past, indicating the need for a seismotectonic review of the province, especially in light of the probable seismic threat to its capital city of Kolkata, a major industrial and commercial hub of the eastern and northeastern region of India. A synoptic probabilistic seismic hazard model of Kolkata is initially generated at engineering bedrock (Vs30 ~ 760 m s-1) considering 33 polygonal seismogenic sources at two hypocentral depth ranges, 0-25 and 25-70 km; 158 tectonic sources; appropriate seismicity modeling; 14 ground motion prediction equations for three seismotectonic provinces, viz. the east-central Himalaya, the Bengal Basin and Northeast India, selected through suitability testing; and appropriate weighting in a logic tree framework. Site classification of Kolkata, performed following in-depth geophysical and geotechnical investigations, places the city in classes D1, D2, D3 and E. Probabilistic seismic hazard assessment at a surface-consistent level (i.e., the local seismic hazard related to site amplification, performed by propagating the bedrock ground motion with 10% probability of exceedance in 50 years through a 1-D sediment column using an equivalent linear analysis) predicts a peak ground acceleration (PGA) range of 0.176 to 0.253 g in the city. A deterministic liquefaction scenario, in terms of the spatial distribution of the liquefaction potential index corresponding to the surface PGA distribution, places 50% of the city in the possibly liquefiable zone. A multicriteria seismic hazard microzonation framework is proposed for judicious integration of multiple themes, namely PGA at the surface, liquefaction potential

  11. Tsunami hazard assessment for the island of Rhodes, Greece

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Zaniboni, Filippo; Tinti, Stefano

    2013-04-01

    The island of Rhodes is part of the Dodecanese archipelago and is one of the many islands found in the Aegean Sea. The tectonics of the Rhodes area is rather complex, involving both strike-slip and dip-slip (mainly thrust) processes. Tsunami catalogues (e.g. Papadopoulos et al., 2007) show the relatively high frequency of tsunamis in this area, some of them destructive, particularly between the coasts of Rhodes and Turkey. In this part of the island lies the town of Rhodes, the capital and also the largest and most populated city. Rhodes is historically famous for the Colossus of Rhodes, which collapsed following an earthquake, and is nowadays a popular tourist destination. This work focuses on hazard assessment, with research performed in the frame of the European project NearToWarn. The hazard is assessed using the worst-credible-case scenario method, introduced and used to study local tsunami hazard in coastal towns such as Catania, Italy, and Alexandria, Egypt (Tinti et al., 2012). Three tsunami sources were chosen for building scenarios: two local sources located in the sea area in front of the Turkish coast, where events are most frequent, and one distant source. The first source is taken from Ebeling et al. (2012), modified by UNIBO, and models the earthquake and small tsunami that occurred on 25 April 1957. The second source is a landslide derived from the TRANSFER Project "Database of Tsunamigenic Non-Seismic Sources", the so-called "Northern Rhodes Slide", possibly responsible for the 24 March 2002 tsunami. The last source is the fault located close to the island of Crete believed to be responsible for the tsunami of 1303, which was reported to have caused damage in the city of Rhodes. The simulations are carried out using the finite difference code UBO-TSUFD that

  12. Multi-disciplinary Hazard Reduction from Earthquakes and Volcanoes in Indonesia - International Research Cooperation Program

    NASA Astrophysics Data System (ADS)

    Kato, Teruyuki

    2010-05-01

    Indonesian and Japanese researchers started a three-year (2009-2011) multi-disciplinary cooperative research project as part of the "Science and Technology Research Partnership for Sustainable Development" supported by the Japanese government. The ultimate goal of this project is to reduce disasters from earthquakes, tsunamis and volcanoes by enhancing the capability to forecast hazards, reducing social vulnerability, and conducting education and outreach activities on research outcomes. We plan to provide a platform for collaboration among researchers in the natural sciences, engineering and the social sciences, as well as officials in national and local governments. Research activities are grouped into: (1) geological and geophysical surveys of past earthquakes, monitoring of current crustal activity, and simulation of future ground motion or tsunamis; (2) short-term and long-term prediction of volcanic eruptions by monitoring Semeru, Guntur and other volcanoes, and development of their evaluation methods; (3) studies to establish social infrastructure based on engineering technologies and hazard maps; (4) social, cultural and religious studies to reduce the vulnerability of local communities; and (5) studies on education and outreach for disaster reduction and restoration of communities. In addition, to coordinate these research activities and to utilize the research results, (6) application of the research and establishment of a collaboration mechanism between researchers and government officials is planned. In addition to mutual visits and collaborative field studies, annual joint seminars are planned (in Indonesia in 2009 and 2011, in Japan in 2010) that will be broadcast over the internet. Meetings of the Joint Coordinating Committee, composed of representatives of relevant Indonesian ministries and institutions as well as project members, will be held annually to oversee the activities. The kick-off workshop was held in Bandung in April 2009, and the research plans from 22 different

  13. Shallow slip amplification and enhanced tsunami hazard unravelled by dynamic simulations of mega-thrust earthquakes

    PubMed Central

    Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Festa, G.; Trasatti, E.; Tonini, R.; Romano, F.; Molinari, I.; Nielsen, S.

    2016-01-01

    The 2011 Tohoku earthquake produced an unexpectedly large amount of shallow slip, contributing greatly to the ensuing tsunami. How frequent are such events? How can they be efficiently modelled for tsunami hazard? Stochastic slip models, which can be computed rapidly, are used to explore the natural slip variability; however, they generally do not deal specifically with shallow slip features. We study the systematic depth dependence of slip along a thrust fault with a number of 2D dynamic simulations using stochastic shear stress distributions and a geometry based on the cross-section of the Tohoku fault. We obtain a probability density for the slip distribution, which varies with depth, earthquake size and whether the rupture breaks the surface. We propose a method to modify stochastic slip distributions according to this dynamically derived probability distribution. This method may be efficiently applied to produce large numbers of heterogeneous slip distributions for probabilistic tsunami hazard analysis. Using numerous M9 earthquake scenarios, we demonstrate that incorporating the dynamically derived probability distribution does enhance the conditional probability of exceedance of maximum estimated tsunami wave heights along the Japanese coast. This technique for integrating dynamic features in stochastic models can be extended to any subduction zone and faulting style. PMID:27725733
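    The reweighting idea, applying a depth-dependent correction to a stochastic slip profile while preserving the total moment, can be sketched as follows. The shallow amplification curve here is an invented placeholder; the actual study derives its depth-dependent probability distribution from 2D dynamic rupture simulations.

```python
import random

# Toy depth-dependent reweighting of a stochastic slip profile. The linear
# shallow-boost taper is an assumption for illustration only.

def reweight_slip(slip, depths_km, surface_boost=2.0, taper_km=20.0):
    """Boost shallow slip, then rescale so total slip (moment) is unchanged."""
    weights = [1.0 + (surface_boost - 1.0) * max(0.0, 1.0 - d / taper_km)
               for d in depths_km]
    boosted = [s * w for s, w in zip(slip, weights)]
    scale = sum(slip) / sum(boosted)   # renormalise to preserve total slip
    return [b * scale for b in boosted]

random.seed(1)
depths = [5, 15, 25, 35]                        # km down-dip
slip = [random.uniform(5.0, 15.0) for _ in depths]  # metres, stochastic draw
new = reweight_slip(slip, depths)
assert abs(sum(new) - sum(slip)) < 1e-9         # moment preserved
```

Preserving total moment while redistributing slip toward the trench is what lets such modified scenarios raise near-coast tsunami heights without changing the earthquake magnitude.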

  14. Tsunami Hazards along the Eastern Australian Coast from Potential Earthquakes: Results from Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Ding, R. W.; Yuen, D. A.

    2015-08-01

    Australia is surrounded by the Pacific and Indian Oceans and thus may suffer from tsunamis generated by subduction earthquakes around the boundary of the Australian Plate. Potential tsunami risks along the eastern coast, where more and more people now live, are numerically investigated through a scenario-based method to provide an estimate of the tsunami hazard in this region. We calculated the tsunami waves generated at the New Hebrides Trench and the Puysegur Trench, and further investigated the resulting tsunami hazards along the eastern coast and their sensitivity to sea floor friction and earthquake parameters (the strike, dip and slip angles and the earthquake magnitude/rupture length). The results indicate that the Puysegur Trench poses a seismic threat capable of causing wave amplitudes over 1.5 m along the coast of Tasmania, Victoria, and New South Wales, reaching over 2.6 m near Sydney, Maria Island, and Gabo Island in a certain worst case, while the cities along the coast of Queensland are potentially less vulnerable than those on the southeastern Australian coast.

  17. Risk assessment of people trapped in earthquake based on km grid: a case study of the 2014 Ludian earthquake

    NASA Astrophysics Data System (ADS)

    Wei, Ben-Yong; Nie, Gao-Zhong; Su, Gui-Wu; Sun, Lei

    2017-04-01

    China is one of the most earthquake-prone countries in the world. The priority during earthquake emergency response is saving lives and minimizing casualties. Rapid judgment of where people are trapped is an important basis for the government to reasonably arrange emergency rescue forces and resources after an earthquake. By analyzing the key factors that result in people being trapped, we constructed an assessment model of people trapped (PTED) in buildings collapsed by an earthquake disaster. Taking the 2014 Ludian earthquake as a case study, we then evaluated the distribution of trapped people during this earthquake using the assessment model based on km-grid data. Results showed that there are two prerequisites for people to be trapped by building collapse in an earthquake: the earthquake causes buildings to collapse, and people are inside the buildings when they collapse. The PTED model is suitable for assessing the number of people trapped in buildings collapsed by an earthquake: the distribution of trapped people in the Ludian earthquake assessed by the model is basically the same as that obtained by the actual survey. Assessment of people trapped in an earthquake based on a km grid can meet the requirements of search-and-rescue zone identification and rescue force allocation in the early stage of earthquake emergency response. In the future, as the basic data become more complete, km-grid-based assessment of trapped people should provide more accurate and valid suggestions for earthquake emergency search and rescue.
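    A grid-based trapped-population estimate in the spirit of the two prerequisites above (collapse occurs, and people are inside at the time) can be sketched per square-kilometre cell. The coefficients below are invented placeholders, not the calibrated values of the PTED model.

```python
# Per-cell trapped estimate: population x fraction indoors at quake time
# x building collapse ratio x fraction of occupants entrapped on collapse.
# All coefficients are illustrative assumptions.

def trapped_per_cell(population, indoor_fraction, collapse_ratio,
                     entrapment_rate=0.3):
    return population * indoor_fraction * collapse_ratio * entrapment_rate

grid = [  # (cell_id, population, indoor_fraction, collapse_ratio)
    ("A1", 1200, 0.8, 0.30),   # rural cell, fragile masonry
    ("A2", 3000, 0.8, 0.05),   # town cell, better construction
]
estimates = {cid: round(trapped_per_cell(p, f, c), 1)
             for cid, p, f, c in grid}
print(estimates)
```

Ranking cells by such an estimate is what supports the search-and-rescue zoning and force-allocation decisions mentioned above: the smaller but more fragile cell can dominate the larger one.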

  18. Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.

    2012-12-01

    An integrated risk assessment includes the analysis of all components of the individual constituents of risk, such as a baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from the multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Thus, risk is the estimated impact that a hazard event would have on people, services, facilities, structures and assets in a community. In this work we assess the risk from natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological location, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analyses of seismic microzonation, landslide and flood susceptibility, and volcanic impact using standard methodologies. Social vulnerability was quantified from data obtained in interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was determined by considering a statistically significant sample. The families that were interviewed were selected using the simple random sampling technique with replacement. With these
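    One simple way to combine the five household variables listed above into a composite social-vulnerability index is a weighted sum of 0-1 scores. The equal weights and scoring convention below are assumptions made for the sketch; the study's actual aggregation scheme is not specified here.

```python
# Composite social-vulnerability index from five household variables,
# each scored 0 (least vulnerable) to 1 (most vulnerable). Equal weights
# are an assumption for illustration.

WEIGHTS = {
    "structure_quality":   0.2,  # 1 = poor construction quality/design
    "public_services":     0.2,  # 1 = lacking basic public services
    "economic_conditions": 0.2,  # 1 = precarious family economy
    "preparedness":        0.2,  # 1 = no family disaster plan
    "risk_perception":     0.2,  # 1 = low awareness of local hazards
}

def vulnerability_index(household):
    """Weighted sum of the five scores; result is in [0, 1]."""
    return sum(WEIGHTS[k] * household[k] for k in WEIGHTS)

household = {"structure_quality": 1.0, "public_services": 0.5,
             "economic_conditions": 1.0, "preparedness": 1.0,
             "risk_perception": 0.5}
print(vulnerability_index(household))  # ≈ 0.8, a highly vulnerable household
```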

  19. Near-Field ETAS Constraints and Applications to Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Yoder, Mark R.; Rundle, John B.; Glasscoe, Margaret T.

    2015-08-01

    The epidemic type aftershock sequence (ETAS) statistical model of aftershock seismicity combines various earthquake scaling relations to produce synthetic earthquake catalogs, or estimates of aftershock seismicity rates, based on recent earthquake activity. One challenge to ETAS-based hazard assessment is the large number of free parameters involved. In this paper, we introduce an approach to constrain this parameter space from canonical scaling relations, empirical observations, and fundamental physics. We show that ETAS parameters can be estimated as a function of an earthquake's magnitude m based on the finite temporal and spatial extents of the rupture area. This approach facilitates fast ETAS-based estimates of seismicity from large "seed" catalogs, and it is particularly well suited to web-based deployment and otherwise automated implementations. It constitutes a significant improvement over contemporary ETAS by mitigating variability related to instrumentation and subjective catalog selection.
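    The core ingredient of an ETAS rate estimate is an Omori-Utsu aftershock decay whose productivity scales exponentially with mainshock magnitude. A minimal sketch of that ingredient is below; the parameter values (k0, alpha, c, p) are generic textbook-style choices, not the magnitude-constrained values derived in the paper.

```python
# Omori-Utsu aftershock rate with magnitude-dependent productivity:
#   lambda(t) = k0 * 10^(alpha * (m - m_ref)) / (t + c)^p
# Parameter values are illustrative assumptions.

def etas_rate(t_days, mainshock_m, ref_m=4.5,
              k0=0.5, alpha=1.0, c=0.05, p=1.1):
    """Expected aftershock rate (events/day) t days after a mainshock."""
    productivity = k0 * 10.0 ** (alpha * (mainshock_m - ref_m))
    return productivity / (t_days + c) ** p

# The rate decays roughly as 1/t, and a larger mainshock is more productive:
r1, r10 = etas_rate(1.0, 7.0), etas_rate(10.0, 7.0)
assert r1 > r10
assert etas_rate(1.0, 8.0) > etas_rate(1.0, 7.0)
```

Constraining k0, alpha, c, and p as functions of magnitude, rather than fitting them freely per catalog, is the kind of parameter-space reduction the paper argues for.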

  20. Tsunami hazard and risk assessment in El Salvador

    NASA Astrophysics Data System (ADS)

    González, M.; González-Riancho, P.; Gutiérrez, O. Q.; García-Aguilar, O.; Aniel-Quiroga, I.; Aguirre, I.; Alvarez, J. A.; Gavidia, F.; Jaimes, I.; Larreynaga, J. A.

    2012-04-01

    Tsunamis are relatively infrequent phenomena that can represent a greater threat than earthquakes, hurricanes and tornadoes, causing the loss of thousands of human lives and extensive damage to coastal infrastructure around the world. Several works have attempted to study these phenomena in order to understand their origin, causes, evolution, consequences, and the magnitude of their damage, and finally to propose mechanisms to protect coastal societies. Advances in the understanding and prediction of tsunami impacts allow the development of adaptation and mitigation strategies to reduce risk in coastal areas. This work, Tsunami Hazard and Risk Assessment in El Salvador, funded by AECID during the period 2009-12, examines the state of the art and presents a comprehensive methodology for assessing tsunami risk at any coastal area worldwide, applying it to the coast of El Salvador. The conceptual framework is based on the definition of risk as the probability of harmful consequences or expected losses resulting from a given hazard to a given element at danger or peril, over a specified time period (European Commission, Schneiderbauer et al., 2004). The HAZARD assessment (Phase I of the project) is based on propagation models for earthquake-generated tsunamis, developed through the characterization of tsunamigenic sources (seismotectonic faults) and other dynamics under study (tsunami waves, sea level, etc.). The study area lies in a region of high seismic activity and was hit by 11 tsunamis between 1859 and 1997, nine of them recorded in the twentieth century and all generated by earthquakes. Simulations of historical and potential tsunamis with greater or lesser impact on the country's coast have been performed, including distant, intermediate and near sources. Deterministic analyses of the threat under study (coastal flooding) have been carried out, resulting in different hazard maps (maximum wave height elevation, maximum water depth, minimum tsunami

  1. Inundation Mapping and Hazard Assessment of Tectonic and Landslide Tsunamis in Southeast Alaska

    NASA Astrophysics Data System (ADS)

    Suleimani, E.; Nicolsky, D.; Koehler, R. D., III

    2014-12-01

    The Alaska Earthquake Center conducts tsunami inundation mapping for coastal communities in Alaska and is currently focused on the southeastern region and the communities of Yakutat, Elfin Cove, Gustavus and Hoonah. This activity provides local emergency officials with tsunami hazard assessment, planning, and mitigation tools. At-risk communities are distributed along several segments of the Alaska coastline, each having a unique seismic history and potential tsunami hazard. Thus, a critical component of our project is the accurate identification and characterization of potential tectonic and landslide tsunami sources. The primary tectonic element of Southeast Alaska is the Fairweather - Queen Charlotte fault system, which has ruptured in five large strike-slip earthquakes in the past 100 years. The 1958 "Lituya Bay" earthquake triggered a large landslide into Lituya Bay that generated a 540-m-high wave. The M7.7 Haida Gwaii earthquake of October 28, 2012 occurred along the same fault but was associated with dominantly vertical motion, generating a local tsunami. Communities in Southeast Alaska are also vulnerable to hazards from locally generated waves, owing to their proximity to landslide-prone fjords and to frequent earthquakes. The primary mechanisms for local tsunami generation are the failure of steep rock slopes due to relaxation of internal stresses after deglaciation, and the failure of thick unconsolidated sediments accumulated on underwater delta fronts at river mouths. We numerically model potential tsunami waves and the inundation extent that may result from future hypothetical far- and near-field earthquakes and landslides. We perform simulations for each source scenario using the Alaska Tsunami Model, which is validated through a set of analytical benchmarks and tested against laboratory and field data. Results of numerical modeling combined with historical observations are compiled on inundation maps and used for site-specific tsunami hazard assessment by

  2. Some Factors Controlling the Seismic Hazard due to Earthquakes Induced by Fluid Injection at Depth

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2012-12-01

    The maximum seismic moment (or moment magnitude) is an important measure of the seismic hazard associated with earthquakes induced by deep fluid injection. Although it would be advantageous to be able to predict the induced earthquake outcome, including the maximum seismic moment, of a specified fluid injection project in advance, this capability has, to date, proved to be elusive because the geomechanical and hydrological factors that control the seismic response to injection are too poorly understood. Fortunately, the vast majority of activities involving the injection of fluids into deep aquifers do not cause earthquakes that are large enough to be of any consequence. There have been, however, significant exceptions during the past 50 years, starting with the earthquakes induced by injection of wastewater at the Rocky Mountain Arsenal Well, during the 1960s, that caused extensive damage in the Denver, CO, area. Results from numerous case histories of earthquakes induced by injection activities, including wastewater disposal at depth and the development of enhanced geothermal systems, suggest that it may be feasible to estimate bounds on maximum magnitudes based on the volume of injected liquid. For these cases, volumes of injected liquid ranged from approximately 11.5 thousand to 5 million cubic meters and resulted in main shock moment magnitudes from 3.4 to 5.3. Because the maximum seismic moment appears to be linearly proportional to the total volume of injected fluid, this upper bound is expected to increase with time as long as a given injection well remains active. For example, in the Raton Basin, southern Colorado and northern New Mexico, natural gas is produced from an extensive coal bed methane field. The deep injection of wastewater associated with this gas production has induced a sequence of earthquakes starting in August 2001, shortly after the beginning of major injection activities. 
Most of this seismicity defines a northeast-striking plane dipping
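The linear relation between maximum seismic moment and injected volume described above can be sketched numerically. A minimal sketch, assuming the volume-based upper bound M0_max = G·ΔV (shear modulus times total injected volume) with a typical crustal shear modulus G ≈ 3×10^10 Pa (an assumed value, not stated in the abstract):

```python
import math

def max_moment_magnitude(injected_volume_m3, shear_modulus_pa=3.0e10):
    """Upper-bound moment magnitude for injection-induced seismicity.

    Implements the volume-based bound M0_max = G * dV; the shear
    modulus value is an assumption typical for crustal rock.
    """
    m0 = shear_modulus_pa * injected_volume_m3   # seismic moment, N*m
    return (math.log10(m0) - 9.1) / 1.5          # standard Mw definition

# The abstract's range: ~11.5 thousand to 5 million cubic meters
print(round(max_moment_magnitude(11.5e3), 1))  # ~3.6, close to the observed Mw 3.4
print(round(max_moment_magnitude(5.0e6), 1))   # ~5.4, close to the observed Mw 5.3
```

The bound grows only logarithmically in magnitude with injected volume, which is why the upper bound "is expected to increase with time as long as a given injection well remains active", but slowly.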

  3. Innovation in earthquake and natural hazards research: determining soil liquefaction potential

    SciTech Connect

    Moore, G.B.; Yin, R.K.

    1984-11-01

    This case study analyzes how an innovation in earthquake and natural hazards research was used for practical and policy purposes, why utilization occurred, and what policy implications can be drawn. The innovation was the dynamic analysis method, used to identify soils that are likely to liquefy during earthquakes. The research was designed and undertaken by H. Bolton Seed at the University of California at Berkeley during the 1960s. The work was a major breakthrough in engineering research: liquefaction had never before been reproduced in a laboratory. It yielded quantitative information about the conditions under which liquefaction occurs. These data were then used to develop procedures for predicting liquefaction; eventually, the need to test soil samples in the laboratory was eliminated.

  4. Evaluation of Tsunami Hazards in Kuwait from Possible Earthquake and Landslide Sources considering Effect of Natural Tide

    NASA Astrophysics Data System (ADS)

    Latcharote, P.

    2016-12-01

    Kuwait is one of the world's most important oil producers, and most of its population and many vital facilities are located along its coasts. Even where tsunami risk is low or unknown, it is important to investigate tsunami hazards to ensure safety of life and sustain the global economy. This study aimed to evaluate tsunami hazards along the coastal areas of Kuwait from both earthquake and landslide sources using numerical modeling. Tsunami generation and propagation were simulated using the two-layer model and the TUNAMI model. Four earthquake scenarios along the Makran Subduction Zone (MSZ), based on historical events and possible worst cases, were used to simulate tsunami propagation to the coastal areas of the Arabian Gulf. Case 1 (Mw 8.3) and Case 2 (Mw 8.3) are replications of the 1945 Makran earthquake, whereas Case 3 (Mw 8.6) and Case 4 (Mw 9.0) are worst-case scenarios. The tsunami numerical simulation used a mesh size of 30 arc-seconds, with bathymetry and topography data from GEBCO. Preliminary results suggest that tsunamis generated by Case 1 and Case 2 would have very small effects on Kuwait (< 0.1 m), while Case 3 and Case 4 could generate maximum tsunami amplitudes of 0.3-1.0 m 12 hours after the earthquake. In addition, this study considered tsunamis generated by landslides along the Iranian coast opposite Kuwait Bay. To preliminarily assess tsunami hazards, coastal landslides with volumes of 1.0-2.0 km3 were assumed to occur at three locations identified from their topographic features. The preliminary results revealed that tsunamis generated by coastal landslides could have a significant impact on Kuwait, with maximum tsunami amplitudes of about 0.5-1.1 m at Falika Island in front of Kuwait Bay and at the Azzour power and desalination plant, depending on landslide volume and energy dissipation. Future work will include improving the accuracy of the tsunami numerical simulation with
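The roughly 12-hour arrival time reported above can be checked against the long-wave (shallow-water) phase speed c = sqrt(g·h). The distance and mean depth below are assumed round numbers for a Makran-to-Kuwait path across the shallow Arabian Gulf, not values from the study:

```python
import math

def tsunami_travel_time_h(distance_km, mean_depth_m, g=9.81):
    """Travel time of a long (shallow-water) wave: t = L / sqrt(g*h)."""
    speed = math.sqrt(g * mean_depth_m)           # phase speed, m/s
    return distance_km * 1000.0 / speed / 3600.0  # hours

# Assumed values: ~850 km path at ~40 m mean depth across the Gulf
print(round(tsunami_travel_time_h(850, 40), 1))  # ~11.9 h, of the order of the 12 h reported
```

The strong depth dependence of the wave speed is also why the 30 arc-second GEBCO bathymetry matters: errors in shelf depth translate directly into arrival-time and amplitude errors.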

  5. Assessment of a Tsunami Hazard for Mediterranean Coast of Egypt

    NASA Astrophysics Data System (ADS)

    Zaytsev, Andrey; Babeyko, Andrey; Yalciner, Ahmet; Pelinovsky, Efim

    2017-04-01

    An analysis of tsunami hazard for Egypt based on historical data and numerical modelling of historical and prognostic events is given. There are 13 historical events over 4000 years, including one instrumental record (1956). The tsunami database includes 12 earthquake tsunamis and 1 event of volcanic origin (the Santorini eruption). The tsunami intensity of the events of 365, 881, 1303 and 1870 is estimated as I = 3, corresponding to tsunami wave heights of more than 6 m. Numerical simulation of some possible scenarios of tsunamis of seismic and landslide origin is performed using the NAMI-DANCE software, which solves the shallow-water equations. The PTHA (Probabilistic Tsunami Hazard Assessment) method for the Mediterranean Sea developed in (Sorensen M.B., Spada M., Babeyko A., Wiemer S., Grunthal G. Probabilistic tsunami hazard in the Mediterranean Sea. J. Geophysical Research, 2012, vol. 117, B01305) is used to evaluate the probability of tsunami occurrence on the Egyptian coast. The synthetic catalogue of prognostic tsunamis of seismic origin with magnitude greater than 6.5 includes 84,920 events over 100,000 years. For wave heights of more than 1 m, the exceedance-probability-versus-tsunami-height curve can be approximated by a Gumbel function with two parameters, which are determined for each coastal location in Egypt (24 points in total). Extreme prognostic events with probability less than 10-4 (approximately 10 events) do not fit the Gumbel function and require special analysis. Acknowledgements: This work was supported by the EU FP7 ASTARTE Project [603839] and, for EP, by NS6637.2016.5.
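The two-parameter Gumbel approximation of the exceedance-probability-versus-height curve described above can be sketched as follows; the location and scale values are illustrative assumptions for one hypothetical coastal point, not fitted parameters from the study:

```python
import math

def exceedance_probability(h, mu, beta):
    """P(H > h) for a Gumbel distribution with location mu and scale beta."""
    return 1.0 - math.exp(-math.exp(-(h - mu) / beta))

def height_for_exceedance(p, mu, beta):
    """Invert the hazard curve: wave height whose exceedance probability is p."""
    return mu - beta * math.log(-math.log(1.0 - p))

# Illustrative (assumed) parameters for one coastal location
mu, beta = 0.5, 0.8
h = height_for_exceedance(1e-4, mu, beta)   # design height for P = 1e-4
print(round(h, 2))                          # ~7.87 m for these assumed parameters
print(exceedance_probability(h, mu, beta))  # recovers ~1e-4
```

In a real application the two parameters would be fitted per location from the synthetic catalogue; as the abstract notes, the rarest events (P < 10-4) fall off this curve and need separate treatment.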

  6. Contributions to Earthquake Hazard Characterization in Canada from Precision GPS Data

    NASA Astrophysics Data System (ADS)

    Dragert, H.; Hyndman, R. D.; Mazzotti, S.; Wang, K.

    2004-05-01

    In the active seismic regions of Canada, the hazard posed by the recurrence of potentially devastating (M >7) earthquakes is not well defined due to the brevity of the instrumental and historical records, the lack of clear paleoseismic evidence for past large events, and the inexact nature of extrapolating the rate of occurrence of frequent small events to the occurrence of rare large events. This serious shortcoming of probabilistic seismic hazard estimation can be addressed through high-precision GPS measurements which can monitor crustal motions and regional crustal strain associated with the build-up of stress before a large earthquake. In southwestern British Columbia, over a decade of observations of motions of GPS sites of the Western Canada Deformation Array (WCDA) and GPS campaign sites have led to improved models of the locked plate interface on the Cascadia Subduction Zone and better estimates of the landward extent for the next megathrust (M~9) rupture. Regional strain rates based on continuous GPS data from the WCDA and PANGA (Pacific Northwest Geodetic Array) show that the recurrence interval for M7 crustal earthquakes is of the order of 400 years, not several decades as once estimated. Continuous GPS data from these arrays have also led to the discovery of "silent slip" or "slow earthquakes" on the deeper plate interface which do not generate impulsive seismic waves but relieve stress over periods of one to two weeks. For southern Vancouver Island and northwestern Washington State, thes
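The recurrence-interval reasoning above (balancing geodetically observed moment accumulation against the moment released in one large earthquake) can be sketched with assumed numbers; the moment-accumulation rate below is an illustrative value, not one from the abstract:

```python
def seismic_moment(mw):
    """Seismic moment in N*m from moment magnitude, Mw = (log10 M0 - 9.1) / 1.5."""
    return 10.0 ** (1.5 * mw + 9.1)

def recurrence_interval_yr(mw, geodetic_moment_rate):
    """Years needed to accumulate the moment of one Mw event at the given rate (N*m/yr)."""
    return seismic_moment(mw) / geodetic_moment_rate

# Assumed geodetic moment accumulation rate of 1e17 N*m/yr for a crustal fault zone
print(round(recurrence_interval_yr(7.0, 1.0e17)))  # ~398 yr, of the order of the ~400 yr cited
```

This is the sense in which GPS-derived strain rates constrain recurrence: a slower accumulation rate stretches the interval proportionally, which is how decades-scale estimates were ruled out.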