Science.gov

Sample records for assessing earthquake hazards

  1. Earthquake Hazard and Risk Assessment for Turkey

    NASA Astrophysics Data System (ADS)

    Betul Demircioglu, Mine; Sesetyan, Karin; Erdik, Mustafa

    2010-05-01

    Using a GIS environment to present the results, seismic risk analysis is a helpful tool to support decision making for planning and prioritizing large-scale seismic retrofit intervention programs. The main ingredients of seismic risk analysis are seismic hazard, a regional inventory of buildings, and vulnerability analysis. In this study, the national earthquake hazard has been assessed using the NGA ground motion prediction models, and the results have been compared with those of previous models. An evaluation of seismic risk based on probabilistic intensity ground motion prediction for Turkey has been carried out. Following the macroseismic approach of Giovinazzi and Lagomarsino (2005), two alternative vulnerability models have been used to estimate building damage, with the vulnerability and ductility indices for Turkey taken from the study of Giovinazzi (2005). These two vulnerability models have been compared with the observed earthquake damage database, and a good agreement between the curves is clearly observed. In addition to building damage, casualty estimates based on three different methods, for each return period and each vulnerability model, are presented to evaluate earthquake loss. Using three different models of building replacement cost, the average annual loss (AAL) and probable maximum loss ratio (PMLR) due to regional earthquake hazard are provided to form a basis for improving the parametric insurance model and determining premium rates for the compulsory earthquake insurance in Turkey.
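As a hedged illustration of how an AAL figure like the one above is typically derived, the sketch below integrates a loss exceedance curve over annual frequency. The return periods and loss ratios are invented placeholders, not values from the study.

```python
# Sketch: average annual loss (AAL) from a loss exceedance curve.
# The (return period, loss ratio) points below are hypothetical.

def average_annual_loss(losses, freqs):
    """AAL = area under the exceedance-frequency vs. loss curve,
    approximated with the trapezoidal rule. `losses` must be ascending."""
    aal = 0.0
    for i in range(len(losses) - 1):
        aal += (losses[i + 1] - losses[i]) * (freqs[i] + freqs[i + 1]) / 2.0
    return aal

# Loss ratios (fraction of replacement cost) at 100-, 500-, 2500-year return periods
loss_ratios = [0.02, 0.10, 0.30]
annual_freqs = [1 / 100, 1 / 500, 1 / 2500]

print(average_annual_loss(loss_ratios, annual_freqs))  # 0.00072 of replacement cost per year
```

Multiplying the result by a building replacement cost gives an annual premium baseline, which is the role the AAL plays in the parametric insurance model mentioned in the abstract.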

  2. Probabilistic earthquake hazard assessment for Peninsular India

    NASA Astrophysics Data System (ADS)

    Ashish; Lindholm, C.; Parvez, I. A.; Kühn, D.

    2016-04-01

    In this paper, a new probabilistic seismic hazard assessment (PSHA) is presented for Peninsular India. The PSHA has been performed using three different recurrence models: a classical seismic zonation model, a fault model, and a grid model. The grid model is based on a non-parameterized recurrence model using an adaptation of the Kernel-based method, which has not been applied to this region before. The results obtained from the three models have been combined in a logic tree structure in order to investigate the impact of different model weights. Three suitable attenuation relations have been considered in terms of spectral acceleration for the stable continental crust as well as for the active crust within the Gujarat region. While Peninsular India has experienced large earthquakes, e.g., Latur and Jabalpur, it represents in general a stable continental region with little earthquake activity, as also confirmed by our hazard results. On the other hand, our study demonstrates that both the Gujarat and Koyna regions are exposed to a high seismic hazard. The peak ground acceleration for 10% probability of exceedance in 50 years reaches 0.4 g in Koyna and up to 0.3 g in the Kutch region of Gujarat. With respect to spectral acceleration at 1 Hz, estimated ground motion amplitudes are higher in Gujarat than in the Koyna region due to the higher frequency of occurrence of larger earthquakes. We discuss the higher PGA levels for Koyna compared to Gujarat and do not accept them uncritically.
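The logic-tree combination described above amounts to a weighted average of per-model hazard curves. The sketch below shows the mechanics; the exceedance rates and branch weights are invented placeholders, not the study's values.

```python
# Sketch: combining annual exceedance rates from three recurrence models
# (zonation, fault, grid) with logic-tree branch weights. All numbers are
# illustrative, not from the study.

pga_levels = [0.05, 0.10, 0.20, 0.40]          # PGA in g
rates = {
    "zonation": [0.050, 0.020, 0.0050, 0.0010],
    "fault":    [0.040, 0.015, 0.0060, 0.0015],
    "grid":     [0.060, 0.025, 0.0040, 0.0008],
}
weights = {"zonation": 0.4, "fault": 0.3, "grid": 0.3}  # branches must sum to 1

combined = [
    sum(weights[m] * rates[m][i] for m in rates)
    for i in range(len(pga_levels))
]
print(combined[0])  # 0.4*0.05 + 0.3*0.04 + 0.3*0.06 = 0.05
```

Varying the weight vector and re-running the combination is exactly the "impact of different model weights" experiment the abstract refers to.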

  3. Assessing the earthquake hazards in urban areas

    USGS Publications Warehouse

    Hays, W.W.; Gori, P.L.; Kockelman, W.J.

    1988-01-01

    Major urban areas in widely scattered geographic locations across the United States are at varying degrees of risk from earthquakes. These urban areas include Charleston, South Carolina; Memphis, Tennessee; St. Louis, Missouri; Salt Lake City, Utah; Seattle-Tacoma, Washington; Portland, Oregon; and Anchorage, Alaska; even Boston, Massachusetts, and Buffalo, New York, have a history of large earthquakes. Cooperative research during the past decade has focused on assessing the nature and degree of the risk, or seismic hazard, in the broad geographic regions around each urban area. The strategy since the 1970s has been to bring together local, State, and Federal resources to solve the problem of assessing seismic risk. Successful cooperative programs have been launched in the San Francisco Bay and Los Angeles regions in California and the Wasatch Front region in Utah.

  4. Earthquake Hazard Assessment: Basics of Evaluation

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or centuries in just a few rare cases). Evidently, all this complicates a reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic within the most popular objectivist view of probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management. 
These basics of SHA evaluation are briefly exemplified with a few examples, which are analysed in more detail in a poster of
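A minimal sketch of one point on a Molchan-style error diagram, under the simplifying assumptions that the hazard map is a score per grid cell and that target events are assigned to cells; the scores, event locations, and function name are illustrative.

```python
# Sketch: one (alerted fraction, miss rate) point of a Molchan error diagram.
# Alert the top-tau fraction of cells by hazard score and count target
# earthquakes that fall outside the alerted set.

def molchan_point(cell_scores, event_cells, tau):
    """Return (tau, miss_rate) for alert fraction tau in (0, 1]."""
    n_alert = max(1, round(tau * len(cell_scores)))
    ranked = sorted(range(len(cell_scores)),
                    key=lambda i: cell_scores[i], reverse=True)
    alerted = set(ranked[:n_alert])
    missed = sum(1 for c in event_cells if c not in alerted)
    return tau, missed / len(event_cells)

scores = [0.9, 0.1, 0.8, 0.2, 0.05]   # hypothetical hazard scores per cell
events = [0, 2, 4]                     # cells where target events occurred
print(molchan_point(scores, events, 0.4))  # one of three target events missed
```

The Seismic Roulette null hypothesis supplies the benchmark: random alerting of a fraction tau of the space misses, on average, a fraction 1 − tau of the events, so points well below that diagonal indicate genuine skill.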

  5. Earthquake Hazard Assessment: an Independent Review

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".

  6. Spatial earthquake hazard assessment of Evansville, Indiana

    USGS Publications Warehouse

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.

    1997-01-01

    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area in particular is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding groundwater elevations made in previous studies are also considered to have had a significant effect on the results.

  7. Shaky grounds of earthquake hazard assessment, forecasting, and prediction

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2012-12-01

    The quality of the fit of a trivial or, conversely, delicately designed model to observed natural phenomena is the cornerstone of any forecasting, including seismic hazard assessment, earthquake forecasting, and prediction. Using precise mathematical and logical systems outside their range of applicability can lead to scientifically groundless conclusions, whose unwise application can be extremely dangerous in assessing expected risk and losses. Are the relationships commonly used to assess seismic hazard valid enough to qualify as useful laws describing earthquake sequences? Seismic evidence accumulated to date demonstrates clearly that most of the empirical statistical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. The time span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site. Seismic events, including mega-earthquakes, are clustered, displaying behaviors that are far from independent. Their distribution in space is possibly fractal and definitely far from uniform, even in a single fault zone. Evidently, such a situation complicates the design of reliable methodologies for earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. The situation is not hopeless, thanks to available geological evidence and deterministic pattern recognition approaches, specifically when intending to predict the predictable, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized block-and-fault systems has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades.

  8. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used in analytically tractable models and computer simulations, and it complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can lead to scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention. 
The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a
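A hedged sketch of the kind of map-versus-observation comparison invoked above: tally how often observed peak accelerations exceed the mapped values at the same sites. The site names and all accelerations are hypothetical placeholders, not GSHAP data.

```python
# Sketch: tally exceedances of mapped PGA by observed strong-motion values.
# All sites and accelerations (in g) are hypothetical placeholders.

map_pga_g = {"site_A": 0.24, "site_B": 0.16, "site_C": 0.32}   # mapped design values
observed_g = [("site_A", 0.41), ("site_B", 0.09), ("site_C", 0.55)]  # recorded peaks

exceedances = [(s, obs, map_pga_g[s]) for s, obs in observed_g if obs > map_pga_g[s]]
print(len(exceedances))  # 2 of the 3 observations exceed the mapped value
```

A systematic version of this tally, accumulated over decades of strong earthquakes, is the kind of evidence the abstract cites against the GSHAP product.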

  9. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: State of Gujarat, India

    NASA Astrophysics Data System (ADS)

    Nekrasova, Anastasia; Kossobokov, Vladimir; Parvez, Imtiyaz

    2016-04-01

    The Gujarat state of India is one of the most seismically active intercontinental regions of the world. Historically, it has experienced many damaging earthquakes, including the devastating 1819 Rann of Kutch and 2001 Bhuj earthquakes. The effect of the latter is grossly underestimated by the Global Seismic Hazard Assessment Program (GSHAP). To assess a more adequate earthquake hazard for the state of Gujarat, we apply the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter recurrence relation by taking into account the naturally fractal distribution of earthquake loci. USLE has evident implications, since any estimate of seismic hazard depends on the size of the territory considered and, therefore, may differ dramatically from the actual one when scaled down to the proportion of the area of interest (e.g., of a city) from the enveloping area of investigation. We cross-compare the seismic hazard maps compiled for the same standard regular 0.2°×0.2° grid (i) in terms of design ground acceleration (DGA) based on the neo-deterministic approach, (ii) in terms of probabilistic exceedance of peak ground acceleration (PGA) by GSHAP, and (iii) the one resulting from the USLE application. Finally, we present maps of seismic risk for the state of Gujarat integrating the obtained seismic hazard, population density based on 2011 census data, and a few model assumptions of vulnerability.
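The USLE generalizes the Gutenberg-Richter relation to log10 N(M, L) = A + B·(5 − M) + C·log10 L, where L is the linear size of the territory. The sketch below shows the territory-size dependence the abstract warns about; the coefficients A, B, C are illustrative, not fitted values for Gujarat.

```python
import math

# Sketch of the Unified Scaling Law for Earthquakes (USLE):
#   log10 N(M, L) = A + B*(5 - M) + C*log10(L)
# N: annual rate of events with magnitude >= M in an area of linear size L (km).
# The coefficients below are illustrative, not fitted values.

def usle_rate(magnitude, size_km, A=-1.0, B=1.0, C=1.2):
    return 10 ** (A + B * (5.0 - magnitude) + C * math.log10(size_km))

# Scaling an estimate from a 100-km region down to a 10-km city changes the
# rate by a factor of 10**C, not 10 as a uniform (C = 2 area-proportional or
# C = 1 linear) assumption would suggest:
print(usle_rate(5.0, 100.0) / usle_rate(5.0, 10.0))  # 10**1.2, about 15.85
```

This is why a hazard value quoted for an enveloping region cannot simply be re-used for a city inside it: the fractal exponent C controls how the rate rescales.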

  10. Remote sensing hazard monitoring and assessment in Yushu earthquake disaster

    NASA Astrophysics Data System (ADS)

    Wen, Qi; Xu, Feng; Chen, Shirong

    2011-12-01

    The 2010 Yushu earthquake, of magnitude 7.1, brought a huge loss of life and property to China. The National Disaster Reduction Center of China carried out the disaster assessment using remote sensing images and field investigation. A preliminary judgment of the disaster scope and damage extent was acquired by change detection. The built-up area of the hard-hit Jiegu town was partitioned into three-level grids in airborne remote sensing images by street, type of use, and structure, and about 685 grids were numbered. Hazard assessment expert groups were sent to carry out field investigations for each grid. The scope of housing damage and the extent of loss were then refined by integrating the field investigation data with information reported by the local government. Though remote sensing technology has played an important role in monitoring and assessing major disasters, the automation of the disaster information extraction workflow, the three-dimensional disaster monitoring mode, and the bidirectional feedback mechanism between products and services should still be further improved.

  11. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  12. Seismic hazard assessment for Myanmar: Earthquake model database, ground-motion scenarios, and probabilistic assessments

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Thant, M.; Maung Maung, P.; Sieh, K.

    2015-12-01

    We have constructed an earthquake and fault database, conducted a series of ground-shaking scenarios, and proposed seismic hazard maps for all of Myanmar as well as hazard curves for selected cities. Our earthquake database integrates the ISC, ISC-GEM and global ANSS Comprehensive Catalogues, and includes harmonized magnitude scales without duplicate events. Our active fault database includes active fault data from previous studies. Using the parameters from these updated databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and the elapsed time since the last events), we have determined the earthquake recurrence models of the seismogenic sources. To evaluate ground shaking behaviour in different tectonic regimes, we conducted a series of tests matching modelled ground motions to the felt intensities of earthquakes. Through the case of the 1975 Bagan earthquake, we determined that the ground motion prediction equations (GMPEs) of Atkinson and Boore (2003) fit the behaviour of subduction events best. Likewise, the 2011 Tarlay and 2012 Thabeikkyin events suggested that the GMPEs of Akkar and Cagnan (2010) fit crustal earthquakes best. We thus incorporated the best-fitting GMPEs and site conditions based on Vs30 (the average shear-wave velocity down to 30 m depth), derived from analysis of topographic slope and microtremor array measurements, to assess seismic hazard. The hazard is highest in regions close to the Sagaing Fault and along the western coast of Myanmar, since the seismic sources there produce earthquakes at short intervals and/or their last events occurred long ago. The hazard curves for the cities of Bago, Mandalay, Sagaing, Taungoo and Yangon show higher hazard for sites close to an active fault or with a low Vs30, e.g., downtown Sagaing and the Shwemawdaw Pagoda in Bago.

  13. Stability assessment of structures under earthquake hazard through GRID technology

    NASA Astrophysics Data System (ADS)

    Prieto Castrillo, F.; Boton Fernandez, M.

    2009-04-01

    This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity, and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements or changes to the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
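The per-job response computation can be sketched, in Python rather than the Java classes the abstract describes, as numerical integration of a damped linear oscillator driven by a ground acceleration record. The natural frequency, damping ratio, and toy record are illustrative assumptions.

```python
import math

# Sketch: maximum relative displacement of a linear single-degree-of-freedom
# oscillator under a ground acceleration record, via semi-implicit Euler.
# Frequency, damping, and the toy record below are illustrative.

def max_displacement(accel, dt, freq_hz=1.0, damping=0.05):
    wn = 2.0 * math.pi * freq_hz           # natural circular frequency (rad/s)
    u = v = umax = 0.0
    for ag in accel:
        # u'' + 2*zeta*wn*u' + wn^2*u = -ag   (relative coordinates)
        a = -ag - 2.0 * damping * wn * v - wn * wn * u
        v += a * dt                          # update velocity first (semi-implicit)
        u += v * dt
        umax = max(umax, abs(u))
    return umax

# Toy 1-s record: a 2 Hz sine pulse sampled at 100 Hz, peak 0.5 m/s^2
record = [0.5 * math.sin(2.0 * math.pi * 2.0 * k / 100.0) for k in range(100)]
print(max_displacement(record, 0.01))
```

In the GRID workflow, one such integration would run per selected accelerogram, with only the maximum displacement written back to the Storage Element.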

  14. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    NASA Astrophysics Data System (ADS)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively conveys information from scientists to the public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). 
Prospect

  15. International Collaboration for Strengthening Capacity to Assess Earthquake Hazard in Indonesia

    NASA Astrophysics Data System (ADS)

    Cummins, P. R.; Hidayati, S.; Suhardjono, S.; Meilano, I.; Natawidjaja, D.

    2012-12-01

    Indonesia has experienced a dramatic increase in earthquake risk due to rapid population growth in the 20th century, much of it occurring in areas near the subduction-zone plate boundaries that are prone to earthquake occurrence. While recent seismic hazard assessments have resulted in better building codes that can inform safer building practices, many of the fundamental parameters controlling earthquake occurrence and ground shaking - e.g., fault slip rates, earthquake scaling relations, ground motion prediction equations, and site response - could still be better constrained. In recognition of the need to improve the level of information on which seismic hazard assessments are based, the Australian Agency for International Development (AusAID) and Indonesia's National Agency for Disaster Management (BNPB), through the Australia-Indonesia Facility for Disaster Reduction, have initiated a 4-year project designed to strengthen the Government of Indonesia's capacity to reliably assess earthquake hazard. This project is a collaboration of Australian institutions, including Geoscience Australia and the Australian National University, with Indonesian government agencies and universities, including the Agency for Meteorology, Climatology and Geophysics, the Geological Agency, the Indonesian Institute of Sciences, and the Bandung Institute of Technology. Effective earthquake hazard assessment requires input from many different types of research, ranging from geological studies of active faults, through seismological studies of crustal structure, earthquake sources, and ground motion, to PSHA methodology and geodetic studies of crustal strain rates. The project is a large and diverse one that spans all these components, and they will be briefly reviewed in this presentation.

  16. Improving earthquake hazard assessments in Italy: An alternative to “Texas sharpshooting”

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Panza, Giuliano F.

    2012-12-01

    The 20 May 2012 M = 6.1 earthquake that struck the Emilia region of northern Italy illustrates a common problem afflicting earthquake hazard assessment. It occurred in an area classified as "low seismic hazard" on the current national seismic hazard map (Gruppo di Lavoro, Redazione della mappa di pericolosità sismica, rapporto conclusivo, 2004, http://zonesismiche.mi.ingv.it/mappa_ps_apr04/italia.html), adopted in 2006. That revision of the seismic code was motivated by the 2002 M = 5.7 earthquake that struck S. Giuliano di Puglia in central Italy, also a previously classified low-hazard area, resulting in damage and casualties. The previous code had been updated in 1981-1984, after earlier maps missed the 1980 M = 6.5 Irpinia earthquake.

  17. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and carry out a statistical treatment of the seismicity catalogue of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach, using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters for a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the earthquake hazard evaluation procedure developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
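The probability-of-exceedance language used in Cornell-style PSHA rests on the Poisson model, P = 1 − exp(−t/T_R), which ties a design probability over an exposure time to a return period. A minimal sketch:

```python
import math

# Sketch: Poisson relation between exceedance probability and return period.
#   P(at least one exceedance in t years) = 1 - exp(-t / T_R)
# Solving for T_R gives the return period behind a design probability.

def return_period(p_exceed, t_years):
    return -t_years / math.log(1.0 - p_exceed)

# The standard design level "10% in 50 years" corresponds to ~475 years:
print(round(return_period(0.10, 50.0)))  # 475
```

This is the conversion behind the familiar "475-year" hazard maps; a characteristic-earthquake scenario replaces the Poisson recurrence assumption for the fault itself, but the site-level exceedance bookkeeping stays the same.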

  18. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas, as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides the data needed to acquire earthquake insurance and so assist effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe and which system components are most susceptible to failure, and allow evaluating the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. 
Economic collapse may ensue if damaged workplaces, disruption of utilities, and
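The hazard × vulnerability × exposure product defined above can be sketched as an expected annual loss summed over scenarios; the probabilities, damage ratios, and exposure value below are invented for illustration.

```python
# Sketch: risk (expected annual loss) = sum over scenarios of
#   annual probability x mean damage ratio (vulnerability) x exposure value.
# All figures are hypothetical.

scenarios = [          # (annual probability, mean damage ratio)
    (0.0100, 0.05),    # frequent shaking, light damage
    (0.0020, 0.25),    # rare shaking, moderate damage
    (0.0004, 0.60),    # very rare shaking, severe damage
]
exposure_usd = 1.0e9   # replacement value of the exposed portfolio

expected_annual_loss = sum(p * dr * exposure_usd for p, dr in scenarios)
print(expected_annual_loss)  # 1.24e6 USD per year
```

Decision makers can then compare this figure against the annualized cost of retrofitting or insurance, which is the mitigation trade-off the abstract describes.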

  19. Earthquake Cluster Analysis for Turkey and its Application for Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake clusters are an important element of general seismology and of its application in seismic hazard assessment. In probabilistic seismic hazard assessment, the occurrence of earthquakes is often modelled as an independent Monte Carlo process following a stationary Poisson model. But earthquakes are dependent and constrained, especially in terms of earthquake swarms, fore- and aftershocks, or even larger sequences such as the Landers sequence in California or the Darfield-Christchurch sequence in New Zealand. For earthquake catalogues, declustering is an important step to capture earthquake frequencies while avoiding a bias towards small magnitudes due to aftershocks. On the other hand, declustered catalogues of independent probabilistic seismic activity will underestimate the total number of earthquakes by neglecting dependent seismicity. In this study, the effect of clusters on probabilistic seismic hazard assessment is investigated in detail. To capture the features of earthquake clusters, a uniform framework for earthquake cluster analysis is introduced using methodologies from geostatistics and machine learning. These features represent important cluster characteristics such as cluster b-values, temporal decay, and rupture orientations, among others. Cluster parameters are mapped in space using kriging. Furthermore, a detailed data analysis is undertaken to provide magnitude-dependent relations for various cluster parameters. The acquired features are used to introduce dependent seismicity within stochastic earthquake catalogues. In addition, the development of smooth seismicity maps based on historic databases is in general biased toward the more complete recent decades. A filling methodology is introduced which adds dependent seismicity to catalogues where none has been recorded, to avoid the above-mentioned bias. As a case study, Turkey has been chosen due to its inherent seismic activity and well-recorded data coverage. Clustering
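A hedged sketch of window declustering in the spirit described above: drop events that fall inside a magnitude-dependent time window after a larger shock. The window function is illustrative, not the published Gardner-Knopoff values, and the spatial criterion is omitted for brevity.

```python
# Sketch: time-window declustering of a time-sorted catalogue of
# (time_days, magnitude) events. The window length is illustrative; a real
# implementation would also apply a magnitude-dependent distance window.

def aftershock_window_days(magnitude):
    return 10 ** (0.5 * magnitude - 0.5)   # illustrative, not Gardner-Knopoff

def decluster(events):
    mainshocks = []
    for t, m in events:
        dependent = any(
            0 < t - t0 <= aftershock_window_days(m0) and m0 >= m
            for t0, m0 in mainshocks
        )
        if not dependent:
            mainshocks.append((t, m))
    return mainshocks

catalogue = [(0.0, 6.0), (1.0, 4.0), (400.0, 5.0)]   # toy catalogue
print(decluster(catalogue))  # the M4 aftershock is dropped; the M5 is kept
```

Comparing frequency-magnitude statistics before and after such a step exposes the aftershock bias the abstract describes, and the dropped events are exactly the dependent seismicity the study reintroduces into its stochastic catalogues.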

  20. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1,300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2x10^5 m3), produced major damage to buildings and infrastructure and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7x10^5 m3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km3 and 12 km3 produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most occurred in pyroclastic deposits, with volumes less than 1x10^3 m3. The present work aims to define the relationship between the above-described earthquake intensities and the size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provide useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  1. Long aftershock sequences within continents and implications for earthquake hazard assessment.

    PubMed

    Stein, Seth; Liu, Mian

    2009-11-01

    One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere. PMID:19890328
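The paper's central scaling argument, that aftershock-sequence duration varies inversely with the rate at which a fault is loaded, can be expressed as a one-line model. The reference calibration (a roughly decade-long sequence at a ~50 mm/yr plate boundary) and the slow interior loading rate are illustrative assumptions consistent with the abstract's qualitative claim, not values taken from the paper.

```python
# Sketch of the inverse scaling between aftershock-sequence duration and
# fault loading rate. All numbers are illustrative assumptions.

def aftershock_duration_years(loading_rate_mm_yr,
                              reference_rate_mm_yr=50.0,
                              reference_duration_yr=10.0):
    """Duration scales as (reference rate / rate) times a reference duration.

    Calibrated so that a fast plate boundary (~50 mm/yr) yields the
    ~decade-long sequences typically observed there.
    """
    return reference_duration_yr * reference_rate_mm_yr / loading_rate_mm_yr

plate_boundary = aftershock_duration_years(50.0)        # rapidly loaded boundary
continental_interior = aftershock_duration_years(0.5)   # slowly deforming interior

print(plate_boundary, continental_interior)  # decades vs. centuries
```

The two-orders-of-magnitude gap in loading rate translates directly into sequences lasting centuries in continental interiors, which is why treating such events as steady-state seismicity misplaces the hazard.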

  2. Recent destructive earthquakes and international collaboration for seismic hazard assessment in the East Asia region

    NASA Astrophysics Data System (ADS)

    Hao, K.; Fujiwara, H.

    2013-12-01

    Recent destructive earthquakes in East Asia have claimed a third of a million lives. People learn from these lessons, but the lessons are forgotten after generations, even those carved in stone. Probabilistic seismic hazard assessment (SHA) is considered a scientific way to define earthquake zones and to guide urban planning and construction. NIED has promoted SHA as a national mission of Japan for over 10 years and, since the 2008 Wenchuan earthquake, as an international cooperation with neighboring countries. We initiated a China-Japan-Korea SHA strategic cooperative program for the next-generation map, supported by MOST-JST-NRF, in 2010. We also initiated a cooperative program with the Taiwan Earthquake Model in 2012, as well as with many other parties in the world. Consequently, NIED joined the Global Earthquake Model (GEM), as its SHA methodologies and technologies were highly valued. As a representative of Japan, NIED will continue to work closely with all members of GEM, not only on the GEM global components but also on its regional programs. Seismic hazard assessment has to be carried out using existing information with epistemic uncertainty. We routinely improve the existing models to carefully treat active faults, earthquake records, and magnitudes, using the newest authorized information provided by the Earthquake Research Committee, Headquarters for Earthquake Research Promotion. After the 2011 Tohoku earthquake, we have been re-considering the national SHA maps even for long-term and low-probability events. We have set up a platform at http://www.j-shis.bosai.go.jp/en to exchange SHA information and share our experiences, lessons, and knowledge internationally. Some probabilistic SHA concepts and seismic risk mitigation issues need to be promoted internationally through outreach and media. [Figure: major earthquakes in the East Asian region which claimed a third of a million lives; slab depth with contours (Hayes et al., 2011).]

  3. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

    Advances in understanding the earthquake cycle and in assessing earthquake hazards are topics of great importance. Dynamic earthquake hazard assessments, resolved over a range of spatial and time scales, will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  4. Field-based assessment of landslide hazards resulting from the 2015 Gorkha, Nepal earthquake sequence

    NASA Astrophysics Data System (ADS)

    Collins, B. D.; Jibson, R.

    2015-12-01

    The M7.8 2015 Gorkha, Nepal earthquake sequence caused thousands of fatalities, destroyed entire villages, and displaced millions of residents. The earthquake sequence also triggered thousands of landslides in the steep Himalayan topography of Nepal and China; these landslides were responsible for hundreds of fatalities and blocked vital roads, trails, and rivers. With the support of USAID's Office of Foreign Disaster Assistance, the U.S. Geological Survey responded to this crisis by providing landslide-hazard expertise to Nepalese agencies and affected villages. Assessments of landslide hazards following earthquakes are essential to identify vulnerable populations and infrastructure, and inform government agencies working on rebuilding and mitigation efforts. However, assessing landslide hazards over an entire earthquake-affected region (in Nepal, estimated to be ~30,000 km2), and in exceedingly steep, inaccessible topography, presents a number of logistical challenges. We focused the scope of our assessment by conducting helicopter- and ground-based landslide assessments in 12 priority areas in central Nepal identified a priori from satellite photo interpretation performed in conjunction with an international consortium of remote sensing experts. Our reconnaissance covered 3,200 km of helicopter flight path, extending over an approximate area of 8,000 km2. During our field work, we made 17 site-specific assessments and provided landslide hazard information to both villages and in-country agencies. Upon returning from the field, we compiled our observations and further identified and assessed 74 river-blocking landslide dams, 12% of which formed impoundments larger than 1,000 m2 in surface area. These assessments, along with more than 11 hours of helicopter-based video, and an overview of hazards expected during the 2015 summer monsoon, have been publicly released (http://dx.doi.org/10.3133/ofr20151142) for use by in-country and international agencies.

  5. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    NASA Astrophysics Data System (ADS)

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf

    2016-04-01

    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 with magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used EasyFit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We conclude that the Weibull distribution is more suitable than the other distributions in this region.
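A minimal sketch of this kind of distribution comparison, using SciPy in place of EasyFit/Matlab and synthetic inter-event times instead of the actual catalogue: `weibull_min` with a fixed location plays the role of the two-parameter Weibull, `invweibull` the Frechet, and `weibull_min` with a free location the three-parameter Weibull, each scored with the K-S statistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic inter-event times (years) standing in for the M >= 6.0 catalogue;
# drawn from a Weibull law so the example is self-contained.
data = stats.weibull_min.rvs(1.5, scale=8.0, size=60, random_state=rng)

candidates = {
    "weibull_2p": stats.weibull_min,  # 2-parameter: location fixed at 0
    "frechet":    stats.invweibull,   # Frechet (inverse Weibull)
    "weibull_3p": stats.weibull_min,  # 3-parameter: location free
}

results = {}
for name, dist in candidates.items():
    if name == "weibull_2p":
        params = dist.fit(data, floc=0.0)  # constrain location to zero
    else:
        params = dist.fit(data)
    # Kolmogorov-Smirnov goodness of fit against the fitted distribution
    ks = stats.kstest(data, dist.cdf, args=params)
    results[name] = ks.statistic

best = min(results, key=results.get)  # smallest K-S statistic fits best
print(best, results)
```

Strictly, the K-S critical values are optimistic when parameters are fitted from the same data, so a study like this would complement the test with other goodness-of-fit measures.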

  6. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, but not when, or within what time intervals, the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  7. Hazard Assessment and Early Warning of Tsunamis: Lessons from the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2012-12-01

    Tsunami hazard assessments or long-term forecasts of earthquakes have not considered such a triggering or simultaneous occurrence of different types of earthquakes. The large tsunami at the Fukushima nuclear power station was due to the combination of the deep and shallow slip. Disaster prevention for low-frequency but large-scale hazards must be considered. The Japanese government established a general policy for two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with a low frequency of occurrence, but they cause devastating disasters once they occur. For such events, saving people's lives is the first priority, and soft measures such as tsunami hazard maps, evacuation facilities, and disaster education will be prepared. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared to protect the lives and properties of residents as well as economic and industrial activities.

  8. Seismic hazard assessment in the Tibet-Himalayan region based on observed and modeled extreme earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Sokolov, V. Y.

    2013-12-01

    Ground shaking due to recent catastrophic earthquakes is estimated to be significantly higher than that predicted by probabilistic seismic hazard analysis (PSHA). A reason is that extreme (large-magnitude and rare) seismic events are not accounted for in PSHA in most cases, due to the lack of information and the unknown recurrence time of the extremes. We present a new approach to the assessment of regional seismic hazard, which incorporates observed (recorded and historic) seismicity and modeled extreme events. We apply this approach to PSHA of the Tibet-Himalayan region. The large-magnitude events, simulated for several thousand years in models of lithospheric block-and-fault dynamics and consistent with the regional geophysical and geodetic data, are employed together with the observed earthquakes for the Monte Carlo PSHA. Earthquake scenarios are generated stochastically to sample the magnitude and spatial distribution of seismicity (observed and modeled) as well as the distribution of ground motion for each seismic event. The peak ground acceleration (PGA) values (that is, ground shaking at a site), which are expected to be exceeded at least once in 50 years with a probability of 10%, are mapped and compared to those PGA values observed and predicted earlier. The results show that the PGA values predicted by our assessment fit the observed ground shaking due to the 2008 Wenchuan earthquake much better than those predicted by conventional PSHA. Our approach to seismic hazard assessment provides a better understanding of ground shaking due to possible large-magnitude events and could be useful for risk assessment, earthquake engineering purposes, and emergency planning.
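The exceedance criterion quoted above (10% in 50 years) maps to a fixed return period under a Poisson occurrence model; a short check of that standard conversion:

```python
import math

# "Exceeded at least once in 50 years with probability 10%" under a Poisson
# occurrence model implies an annual exceedance rate lam with
#   1 - exp(-lam * 50) = 0.10
p, t = 0.10, 50.0
annual_rate = -math.log(1.0 - p) / t
return_period = 1.0 / annual_rate
print(round(return_period, 1))  # ~475 years, the conventional design level
```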

  9. Assessment of the Relative Largest Earthquake Hazard Level in the NW Himalaya and its Adjacent Region

    NASA Astrophysics Data System (ADS)

    Tsapanos, Theodoros M.; Yadav, R. B. S.; Olasoglou, Efthalia M.; Singh, Mayshree

    2016-04-01

    In the present study, the level of the largest earthquake hazard is assessed in 28 seismic zones of the NW Himalaya and its vicinity, which is a highly seismically active region of the world. Gumbel's third asymptotic distribution (hereafter GIII) is adopted for the evaluation of the largest earthquake magnitudes in these seismic zones. Instead of taking into account any type of Mmax, in the present study we consider the ω value, which is the largest earthquake magnitude that a region can experience according to the GIII statistics. A function of the form Θ(ω, RP6.0) thus provides a relative largest-earthquake hazard scale, defined by the letter K (K index). The return periods for earthquake magnitudes 6.0 or larger (RP6.0) are also calculated. According to this index, the investigated seismic zones are classified into five groups, and it is shown that seismic zones 3 (Quetta, Pakistan), 11 (Hindukush), 15 (northern Pamirs), and 23 (Kangra, Himachal Pradesh, India) correspond to a "very high" K index of 6.
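A sketch of how GIII statistics yield RP6.0, the return period entering the K-index classification. The distribution parameters (omega, u, k) below are illustrative assumptions, not the paper's fitted values.

```python
import math

# Gumbel's third asymptotic distribution (GIII) for the annual maximum
# magnitude has a bounded upper tail:
#   F(m) = exp(-((omega - m) / (omega - u)) ** (1/k)),  m <= omega
# where omega is the largest magnitude the region can experience.
# Parameter values are illustrative assumptions for one hypothetical zone.
omega, u, k = 7.5, 5.0, 0.3

def giii_cdf(m):
    """Probability that the annual maximum magnitude does not exceed m."""
    if m >= omega:
        return 1.0
    return math.exp(-((omega - m) / (omega - u)) ** (1.0 / k))

# Return period (years) of an annual maximum reaching at least M 6.0:
rp6 = 1.0 / (1.0 - giii_cdf(6.0))
print(round(rp6, 2))
```

A short RP6.0 (frequent M >= 6 events) combined with a high omega is what pushes a zone toward the "very high" end of the K scale.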

  10. From Earthquake Prediction Research to Time-Variable Seismic Hazard Assessment Applications

    NASA Astrophysics Data System (ADS)

    Bormann, Peter

    2011-01-01

    The first part of the paper defines the terms and classifications common in earthquake prediction research and applications. This is followed by short reviews of major earthquake prediction programs initiated since World War II in several countries, for example the former USSR, China, Japan, the United States, and several European countries. It outlines the underlying expectations, concepts, and hypotheses, introduces the technologies and methodologies applied and some of the results obtained, which include both partial successes and failures. Emphasis is laid on discussing the scientific reasons why earthquake prediction research is so difficult and demanding and why the prospects are still so vague, at least as far as short-term and imminent predictions are concerned. However, classical probabilistic seismic hazard assessments, widely applied during the last few decades, have also clearly revealed their limitations. In their simple form, they are time-independent earthquake rupture forecasts based on the assumption of stable long-term recurrence of earthquakes in the seismotectonic areas under consideration. Therefore, during the last decade, earthquake prediction research and pilot applications have focused mainly on the development and rigorous testing of long and medium-term rupture forecast models in which event probabilities are conditioned by the occurrence of previous earthquakes, and on their integration into neo-deterministic approaches for improved time-variable seismic hazard assessment. The latter uses stress-renewal models that are calibrated for variations in the earthquake cycle as assessed on the basis of historical, paleoseismic, and other data, often complemented by multi-scale seismicity models, the use of pattern-recognition algorithms, and site-dependent strong-motion scenario modeling. International partnerships and a global infrastructure for comparative testing have recently been developed, for example the Collaboratory for the Study of

  11. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers, and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes of Exposure and Vulnerability inflicted by the growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A - B•(M-6) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of the USLE are used to estimate, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground shaking parameters, including macroseismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., the historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on the USLE is illustrated by application to the seismic region of the Greater Caucasus.
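The USLE formula quoted above can be evaluated directly; the coefficients A, B, and C below are illustrative assumptions, not values estimated for the Greater Caucasus.

```python
import math

# USLE as given above: log10 N(M, L) = A - B*(M - 6) + C*log10 L,
# with N the expected annual number of events of magnitude M within an
# area of linear dimension L (km). Coefficients are illustrative assumptions.
A, B, C = -2.0, 0.9, 1.0

def usle_annual_number(magnitude, length_km):
    return 10.0 ** (A - B * (magnitude - 6.0) + C * math.log10(length_km))

# Expected annual number of M 6 events in a 100 km cell:
n6 = usle_annual_number(6.0, 100.0)
# One magnitude unit higher is about a factor 10**B rarer:
n7 = usle_annual_number(7.0, 100.0)
print(n6, n6 / n7)
```

The C term is the distinctive part of the USLE: unlike plain Gutenberg-Richter, it makes the expected event count scale explicitly with the linear size of the area considered.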

  12. Magnitude Problems in Historical Earthquake Catalogs and Their Impact on Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Mahdyiar, M.; Shen-Tu, B.; Shabestari, K.; Guin, J.

    2010-12-01

    A reliable historical earthquake catalog is a critical component of any regional seismic hazard analysis. In Europe, a number of historical earthquake catalogs have been compiled and used in constructing national or regional seismic hazard maps, for instance, the Switzerland ECOS catalog by the Swiss Seismological Service (2002), the Italy CPTI catalog by the CPTI Working Group (2004), the Greece catalog by Papazachos et al. (2007), the CENEC (central, northern and northwestern Europe) catalog by Grünthal et al. (2009), the Turkey catalog by Kalafat et al. (2007), and the GSHAP catalog by the Global Seismic Hazard Assessment Program (1999). These catalogs spatially overlap with each other to a large extent and employ a uniform magnitude scale (Mw). A careful review of these catalogs has revealed significant magnitude problems which can substantially impact regional seismic hazard assessment: 1) Magnitudes for the same earthquakes in different catalogs are discrepant. Such discrepancies are mainly driven by the different regression relationships used to convert other magnitude scales or intensity into Mw. One of the consequences is that the magnitudes of many events in one catalog are systematically biased higher or lower with respect to those in another catalog. For example, the magnitudes of large historical earthquakes in the Italy CPTI catalog are systematically higher than those in the Switzerland ECOS catalog. 2) An abnormally high frequency of large-magnitude events is observed for some time periods in which intensities are the main available data. This phenomenon is observed in the Italy CPTI catalog for the period 1870 to 1930 and may be due to biased conversion from intensity to magnitude. 3) A systematic bias in magnitude results in biased estimates of the a- and b-values of the Gutenberg-Richter magnitude-frequency relationship. It also affects the determination of upper-bound magnitudes for various seismic source zones. All of these issues can lead to skewed seismic hazard results, or inconsistent

  13. Impact of Three-Parameter Weibull Models in Probabilistic Assessment of Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2014-07-01

    This paper investigates the suitability of a three-parameter (scale, shape, and location) Weibull distribution in the probabilistic assessment of earthquake hazards. Its performance is also compared with two other popular models from the same Weibull family, namely the two-parameter Weibull model and the inverse Weibull model. A complete and homogeneous earthquake catalog (Yadav et al. in Pure Appl Geophys 167:1331-1342, 2010) of 20 events (M ≥ 7.0), spanning the period 1846 to 1995 from north-east India and its surrounding region (20°-32°N and 87°-100°E), is used to perform this study. The model parameters are initially estimated from graphical plots and later confirmed by statistical estimation, namely maximum likelihood estimation (MLE) and the method of moments (MoM). The asymptotic variance-covariance matrix for the MLE-estimated parameters is further calculated on the basis of the Fisher information matrix (FIM). The model suitability is appraised using different statistical goodness-of-fit tests. For the study area, the estimated conditional probability of an earthquake within a decade comes out to be very high (≥0.90) for an elapsed time of 18 years (i.e., 2013). The study also reveals that the location parameter provides more flexibility to the three-parameter Weibull model in comparison with the two-parameter Weibull model. Therefore, it is suggested that the three-parameter Weibull model is highly relevant for the empirical modeling of earthquake recurrence and seismic hazard assessment.
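The conditional-probability statement above follows from the fitted distribution via the standard renewal formula; a sketch with assumed (not the paper's) three-parameter Weibull values:

```python
from scipy import stats

# Conditional probability of an earthquake within the next delta years given
# that t years have elapsed since the last event:
#   P(T <= t + delta | T > t) = (F(t + delta) - F(t)) / (1 - F(t))
# Parameters (shape, location, scale) are illustrative assumptions standing
# in for MLE estimates; they are not the paper's values.
shape, loc, scale = 1.8, 2.0, 9.0   # three-parameter Weibull, in years
dist = stats.weibull_min(shape, loc=loc, scale=scale)

def conditional_prob(t_elapsed, delta):
    f_t = dist.cdf(t_elapsed)
    return (dist.cdf(t_elapsed + delta) - f_t) / (1.0 - f_t)

# Probability of an M >= 7.0 event within a decade, 18 years after the last:
p = conditional_prob(18.0, 10.0)
print(round(p, 3))
```

With recurrence-scale parameters like these, 18 elapsed years already sits far into the distribution's tail, which is how a conditional decadal probability above 0.90 arises.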

  14. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    NASA Astrophysics Data System (ADS)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation, and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff, meant to assess the utility of many of these products and this information. We have conducted focus-group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities, and participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and the information is widely used by the media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey the impact of earthquakes more clearly, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  15. Sustainable Urban Planning and Risk Assessment of Earthquake Hazards in Turkey

    NASA Astrophysics Data System (ADS)

    Tarhan, C.; Deniz, D.

    2013-05-01

    Cities in the developing world are facing increased risk of disasters, and the potential for economic and human losses from natural hazards is being exacerbated by the rate of unplanned urban expansion and influenced by the quality of urban management. Risk assessment has come to be regarded by many analysts as a critical part of the development of sustainable communities. The risk assessment function has been linked to issues such as environmental stewardship and community planning. The crucial point is the linkage between hazard mitigation efforts and urban planning in the context of building sustainable communities. But this conceptual linkage has been difficult to implement in practice. The resolution of this difficulty and a clarification of the essential linkage of hazard mitigation to urban planning will require a broader definition and a reformulation of the risk assessment function. Turkey is one of the countries that support international sustainability efforts. However, urban planning is hardly related to sustainability in Turkey. At this point, this paper aims to introduce the integration of sustainability and risk assessment in Turkey. The components of sustainable communities are discussed, and earthquake risk in Turkey is explained with recent examples. At the end of the study, the relationship between risk assessment and sustainable urban planning in Turkey is examined in terms of the Turkish urban planning system.

  16. Probabilistic Tsunami Hazard Assessment along the Nankai Trough (1): An assessment based on the information on the forthcoming earthquake evaluated by the Earthquake Research Committee (2013)

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised its long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated to be of M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess the tsunami hazard (maximum coastal tsunami heights) in the near future, in terms of a probabilistic approach, from the next earthquake along the Nankai Trough, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSA) that ERC (2013) identified. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1,441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving the nonlinear, finite-amplitude, long-wave equations, with advection and bottom friction terms, by a finite-difference method. Run-up computation on land is included. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T=88.2 years (ERC, 2013). We fix P30 = 67% by applying the renewal process based on the BPT distribution with T and alpha=0.24 as its aperiodicity. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSA, by following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in that sub-group. Note that such a re-distribution concept for the probability is tentative, because present seismology cannot provide knowledge deep enough to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coasts by integrating the information about 30-year occurrence
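Step (3)'s renewal computation can be sketched with SciPy's inverse Gaussian, which coincides with the BPT distribution. T and alpha are taken from the abstract; the elapsed time and the parameter mapping are assumptions of this sketch, not the study's exact procedure.

```python
from scipy import stats

# BPT (Brownian Passage Time) renewal model: equivalent to an inverse
# Gaussian with mean T and aperiodicity alpha (coefficient of variation).
T, alpha = 88.2, 0.24   # recurrence interval (yr) and aperiodicity, per ERC

# scipy's invgauss(mu, scale=s) has mean mu*s and shape parameter s, so
# mu = alpha**2 and scale = T / alpha**2 reproduces mean T and CoV alpha.
dist = stats.invgauss(alpha ** 2, scale=T / alpha ** 2)

def p30(t_elapsed):
    """Conditional probability of rupture in the next 30 years,
    given t_elapsed years since the last event."""
    f_t = dist.cdf(t_elapsed)
    return (dist.cdf(t_elapsed + 30.0) - f_t) / (1.0 - f_t)

# Assumed elapsed time for illustration, roughly the interval since the
# 1944/1946 Nankai-Tonankai events as of 2013:
p = p30(67.0)
print(round(p, 2))
```

The conditional probability rises steeply with elapsed time under a low-aperiodicity BPT model, which is why the committee's 30-year estimate sits in the 60-70% range late in the cycle.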

  17. The 1843 earthquake: a maximising scenario for tsunami hazard assessment in the Northern Lesser Antilles?

    NASA Astrophysics Data System (ADS)

    Roger, Jean; Zahibo, Narcisse; Dudon, Bernard; Krien, Yann

    2013-04-01

    The French Caribbean Islands are located over the Lesser Antilles active subduction zone, where a number of earthquakes have historically reached magnitude Mw=6.0 and more. According to available catalogs, these earthquakes have sometimes been able to trigger devastating local or regional tsunamis, either directly by the shaking or indirectly by induced landslides. For example, these islands suffered severely during the Mw~7.5 Virgin Islands earthquake (1867), which triggered waves several meters high throughout the Lesser Antilles Arc, and, more recently, during the Mw=6.3 Les Saintes earthquake (2004), which was followed by a local 1 m high tsunami. However, in 1839 a Mw~7.5 subduction earthquake occurred offshore Martinique, followed a few years later by the more famous 1843 Mw~8.5 megathrust event, with an epicenter located approximately between Guadeloupe and Antigua, but in both cases no catastrophic tsunami was reported. In this study we discuss the potential impact of a maximum credible scenario of tsunami generation by such a Mw=8.5 rupture at the subduction interface, using available geological information, numerical modeling of tsunami generation and propagation, and high-resolution bathymetric data, within the framework of tsunami hazard assessment for the French West Indies. Although the mystery remains unresolved concerning the lack of historical tsunami data, especially for the 1843 event, modeling results show that the tsunami impact is not uniformly distributed over the whole archipelago and could show important heterogeneities in terms of maximum wave heights at specific places. This is easily explained by the bathymetry and the presence of several islands around the mainland leading to resonance phenomena, and by the existence of a fringing coral reef partially surrounding those islands.

  18. Historical Earthquake Records and their Application for Seismic Hazard and Risk Assessment in Tianshui, Gansu Province, Northwestern China

    NASA Astrophysics Data System (ADS)

    Wang, L.; Wang, Z.

    2009-12-01

    Tianshui, located in southeastern Gansu Province of northwestern China, was a center of early Chinese civilization and the birthplace of “Ba Gua,” or the “eight symbols.” It has a long history of earthquakes, and many strong and large earthquakes have occurred there. Earthquakes, ancient and modern, have not only been well recorded but have also left marks on many historical landmarks and buildings that can still be seen today. For example, major damage from the 1654 Tianshui earthquake (M8.0) and some minor damage from the 2008 Wenchuan earthquake can be seen in the Maiji Grotto. A new effort to investigate and reexamine the historical macroseismic records is under way, with the aim of better seismic hazard and risk assessment for the Tianshui area. Seismic hazard and risk will be assessed for the area using the 2,500 years of intensity observations (records). The results will be used by local governments and communities to develop more effective mitigation policies in the aftermath of the 2008 Wenchuan earthquake. They will also be compared with hazard and risk assessments derived from other approaches, such as probabilistic and deterministic seismic hazard analyses.

  19. Earthquake Hazard Assessment Based on Geological Data: An approach from Crystalline Terrain of Peninsular India

    NASA Astrophysics Data System (ADS)

    John, B.

    2009-04-01

    Peninsular India was long considered seismically stable, but the recent earthquakes at Latur (1993), Jabalpur (1997), and Bhuj (2001) suggest that the region is one of the active Stable Continental Regions (SCRs) of the world, where recurrence intervals are of the order of tens of thousands of years. In such areas earthquakes may happen at unexpected locations devoid of any previous seismicity or dramatic geomorphic features, and even moderate earthquakes can lead to heavy loss of life and property in the present scenario. It is therefore imperative to map suspect areas to identify active faults and evaluate their activity, a vital input to seismic hazard assessment of SCR areas. The region around Wadakkanchery, Kerala, South India, has experienced microseismic activity since 1989. Subsequent studies by the author identified a 30 km long, WNW-ESE trending reverse fault, dipping south (45°), that has influenced the drainage system of the area. Macroscopic and microscopic studies of the fault rocks from the exposures near Desamangalam show an episodic nature of faulting. Dislocations of pegmatitic veins across the fault indicate a cumulative dip displacement of 2.1 m in the reverse direction. A minimum of four episodes of faulting were identified along this fault from the cross-cutting relations of different structural elements and from the mineralogic changes of different generations of gouge zones, suggesting an average displacement of about 52 cm per event. A cyclic pattern of faulting is identified in this fault zone, in which the inter-seismic period is characterized by gouge induration and fracture sealing aided by the prevailing fluids. Available empirical relations connecting magnitude with displacement and rupture

  20. Metrics, Bayes, and BOGSAT: Recognizing and Assessing Uncertainties in Earthquake Hazard Maps

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Brooks, E. M.; Spencer, B. D.

    2015-12-01

    Recent damaging earthquakes in areas predicted to be relatively safe illustrate the need to assess how seismic hazard maps perform. At present, there is no agreed way of assessing how well a map performed. The metric implicit in current maps, that during a time interval predicted shaking will be exceeded only at a specific fraction of sites, is useful but permits maps to be nominally successful although they significantly underpredict or overpredict shaking, or nominally unsuccessful but predict shaking well. We explore metrics that measure the effects of overprediction and underprediction. Although no single metric fully characterizes map behavior, using several metrics can provide useful insight for comparing and improving maps. A related question is whether to regard larger-than-expected shaking as a low-probability event allowed by a map, or to revise the map to show increased hazard. Whether and how much to revise a map is complicated, because a new map that better describes the past may or may not better predict the future. The issue is like deciding after a coin has come up heads a number of times whether to continue assuming that the coin is fair and the run is a low-probability event, or to change to a model in which the coin is assumed to be biased. This decision can be addressed using Bayes' Rule, so that how much to change depends on the degree of one's belief in the prior model. Uncertainties are difficult to assess for hazard maps, which require subjective assessments and choices among many poorly known or unknown parameters. However, even rough uncertainty measures for estimates/predictions from such models, sometimes termed BOGSATs (Bunch Of Guys Sitting Around Table) by risk analysts, can give users useful information to make better decisions. We explore the extent of uncertainty via sensitivity experiments on how the predicted hazard depends on model parameters.
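The coin-flip analogy above lends itself to a short worked example. The sketch below is illustrative only (the 90% prior on the fair-coin model and the 0.9 heads probability of the biased model are assumptions, not numbers from the abstract): it shows how Bayes' Rule makes the amount of revision depend on the degree of belief in the prior model.

```python
# Bayes' Rule for the coin analogy: after n heads in a row, how much
# should belief shift from a "fair coin" model to a "biased coin" one?
# prior_fair and p_biased are illustrative assumptions, not from the paper.

def posterior_fair(n_heads, prior_fair=0.9, p_biased=0.9):
    """Posterior probability that the coin is fair after n_heads heads in a row."""
    like_fair = 0.5 ** n_heads           # P(data | fair coin)
    like_biased = p_biased ** n_heads    # P(data | biased coin)
    num = like_fair * prior_fair
    return num / (num + like_biased * (1.0 - prior_fair))

for n in (1, 3, 5, 8):
    print(n, round(posterior_fair(n), 3))
```

How quickly belief in the "fair" model erodes is governed entirely by the prior, which is exactly the abstract's point about how much a hazard map should change after larger-than-expected shaking.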

  1. Assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L., (Edited By); Hays, Walter W.

    2000-01-01

    This report--the second of two volumes--represents an ongoing effort by the U.S. Geological Survey to transfer accurate Earth science information about earthquake hazards along Utah's Wasatch Front to researchers, public officials, design professionals, land-use planners, and emergency managers in an effort to mitigate the effects of these hazards. This volume contains eight chapters on ground-shaking hazards and aspects of loss estimation.

  2. Earthquake risk reduction in the United States: An assessment of selected user needs and recommendations for the National Earthquake Hazards Reduction Program

    SciTech Connect

    1994-12-31

    This Assessment was conducted to improve the National Earthquake Hazards Reduction Program (NEHRP) by providing NEHRP agencies with information that supports their user-oriented setting of crosscutting priorities in the NEHRP strategic planning process. The primary objective of this Assessment was to take a "snapshot" evaluation of the needs of selected users throughout the major program elements of NEHRP. Secondary objectives were to conduct an assessment of the knowledge that exists (or is being developed by NEHRP) to support earthquake risk reduction, and to begin a process of evaluating how NEHRP is meeting user needs. An identification of NEHRP's strengths also resulted from the effort, since those strengths demonstrate successful methods that may be useful to NEHRP in the future. These strengths are identified in the text, and many of them represent important achievements since the Earthquake Hazards Reduction Act was passed in 1977.

  3. Some Contributions of the Neo-Deterministic Seismic Hazard Assessment Approach to Earthquake Risk Assessment for the City of Sofia

    NASA Astrophysics Data System (ADS)

    Paskaleva, Ivanka; Kouteva-Guentcheva, Mihaela; Vaccari, Franco; Panza, Giuliano F.

    2011-03-01

    This paper describes the outcome of advanced seismic hazard and seismic risk estimates recently performed for the city of Sofia, based on the state of the art of knowledge for this site. Some major results of the neo-deterministic, scenario-based seismic hazard assessment approach (NDSHA) applied to the earthquake hazard assessment for the city of Sofia are considered. Further validations of the recently constructed synthetic strong motion database, containing site- and seismic source-specific ground motion time histories, are performed and discussed. Displacement and acceleration response spectra are considered. The elastic displacement response spectra and displacement demand are discussed with regard to earthquake magnitude, seismic source-to-site distance, seismic source mechanism, and local geological site conditions. The elastic response design spectrum, converted from the standard pseudo-acceleration versus natural period (Tn) format to a capacity diagram in Sa-Sd format, is discussed in the perspective of the Eurocode 8 provisions. A brief overview of the engineering applications of the seismic demand obtained using the NDSHA is supplied, and some applications of the outcome of the NDSHA procedure for engineering purposes are shown. The obtained database of ground shaking waveforms and time histories computed for the city of Sofia is used to: (1) extract maximum particle velocities; (2) calculate the space distribution of the horizontal strain factor log10 ε; (3) estimate liquefaction susceptibility in terms of standard penetration test N values and initial overburden stress; (4) estimate damage index distribution; and (5) map the distribution of the expected pipe breaks and red-tagged buildings for given scenario earthquakes. The theoretically obtained database, based on the simultaneous treatment of data from many disciplines, contains data fully suitable for practical use. The proper use of this database can lead to a significant seismic

  4. Assessment of the 1988 Saguenay earthquake: Implications on attenuation functions for seismic hazard analysis

    SciTech Connect

    Toro, G.R.; McGuire, R.K. )

    1991-09-01

    This study investigates the earthquake records from the 1988 Saguenay earthquake and examines their implications for ground-motion models used in seismic-hazard studies in eastern North America (ENA), specifically, to what extent the ground motions from this earthquake support or reject the various attenuation functions used in the EPRI and LLNL seismic-hazard calculations. Section 2 provides a brief description of the EPRI and LLNL attenuation functions for peak acceleration and for spectral velocities. Section 3 compares these attenuation functions with the ground motions from the Saguenay earthquake and from other relevant earthquakes. Section 4 reviews available seismological studies of the Saguenay earthquake, in order to understand its seismological characteristics and why some observations may differ from predictions. Section 5 examines the assumptions and methodology used in the development of the attenuation functions selected by LLNL ground-motion expert 5. Finally, Section 6 draws conclusions about the validity of the various sets of attenuation functions in light of the Saguenay data and of other evidence presented here. 50 refs., 37 figs., 7 tabs.

  5. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  6. Can Apparent Stress be Used to Time-Dependent Seismic Hazard Assessment or Earthquake Forecast? An Ongoing Approach in China

    NASA Astrophysics Data System (ADS)

    Wu, Zhongliang; Jiang, Changsheng; Zhang, Shengfeng

    2016-08-01

    The approach taken in China over the last decade and a half to using apparent stress in time-dependent seismic hazard assessment and earthquake forecasting is summarized. Retrospective case studies show that apparent stress exhibits a short-term increase, on a time scale of several months, before moderate to strong earthquakes over a large area surrounding the 'target earthquake'. Apparent stress is also used to estimate the tendency of aftershock activity, and the concept relating apparent stress indirectly to stress level is used to interpret the properties of some 'precursory' anomalies. Meanwhile, differing opinions have been reported, and problems in the calculation have arisen in some cases. Moreover, retrospective studies are limited in significance compared with forward forecast tests. Nevertheless, this approach, seemingly carried out on a large scale only in mainland China, provides earthquake catalogs for the predictive analysis of seismicity with an additional degree of freedom, and deserves a systematic review and reflection.

  7. A probabilistic approach for earthquake hazard assessment of the Province of Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Orhan, A.; Seyrek, E.; Tosun, H.

    2007-10-01

    The city of Eskişehir in inner-western Turkey experienced a destructive earthquake of Ms=6.4 in 1956, in addition to many events with magnitudes greater than 5. It is located in a wide basin of young sedimentary units and thick alluvial soils, which include liquefiable sand layers. An active fault passes beneath the city center, and the groundwater level is very close to the ground surface. Approximately 600,000 people live in the province of Eskişehir; the city and its vicinity therefore carry high risk where earthquake hazard is concerned. This paper summarizes the probabilistic seismic hazard analysis (PSHA) performed for the province of Eskişehir and introduces seismic hazard maps produced by considering earthquakes of magnitude Ms≥4.0 that occurred during the last 100 years and a seismic model composed of four seismic sources. The results of the PSHA show that the average peak ground acceleration (PGA) for the city center is 0.40 g for a 10 percent probability of exceedance in 50 years, for rock site conditions. The seismic hazard maps were produced with a Geographic Information System (GIS) program.
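The quoted design level, a 10 percent probability of exceedance in 50 years, corresponds to a mean return period of roughly 475 years under the usual Poisson assumption. A minimal sketch of that conversion (not code from the study):

```python
import math

def return_period(p_exceed, t_years):
    """Mean return period implied by an exceedance probability over t_years,
    assuming Poisson (memoryless) occurrence: p = 1 - exp(-t / T)."""
    annual_rate = -math.log(1.0 - p_exceed) / t_years
    return 1.0 / annual_rate

print(round(return_period(0.10, 50)))  # ~475 years
```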

  8. New seafloor map of the Puerto Rico Trench helps assess earthquake and tsunami hazards

    USGS Publications Warehouse

    ten Brink, Uri S.; Danforth, William; Polloni, Christopher; Andrews, Brian D.; Llanes Estrada, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-01-01

    The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands [McCann et al., 2004]. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918 [Mercado and McCann, 1998]. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico [Mercado et al., 2002; Schwab et al., 1991], although their ages are unknown.

  9. New seafloor map of the Puerto Rico trench helps assess earthquake and tsunami hazards

    NASA Astrophysics Data System (ADS)

    Brink, Uri ten; Danforth, William; Polloni, Christopher; Andrews, Brian; Llanes, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-09-01

    The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands [McCann et al., 2004]. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918 [Mercado and McCann, 1998]. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico [Mercado et al., 2002; Schwab et al., 1991], although their ages are unknown.

  10. The Effects on Tsunami Hazard Assessment in Chile of Assuming Earthquake Scenarios with Spatially Uniform Slip

    NASA Astrophysics Data System (ADS)

    Carvajal, Matías; Gubler, Alejandra

    2016-06-01

    We investigated the effect that along-dip slip distribution has on near-shore tsunami amplitudes and on coastal land-level changes in the region of central Chile (29°-37°S). Here, and all along the Chilean megathrust, the seismogenic zone extends beneath dry land, so tsunami generation and propagation are limited to its seaward portion, where the sensitivity of the initial tsunami waveform to dislocation model inputs, such as slip distribution, is greater. We considered four distributions of earthquake slip in the dip direction: a spatially uniform slip source and three others with typical bell-shaped slip patterns that differ in the depth range of slip concentration. We found that a uniform slip scenario predicts much lower tsunami amplitudes, and generally less coastal subsidence, than scenarios that assume bell-shaped distributions of slip. Although the finding that uniform slip scenarios underestimate tsunami amplitudes is not new, it has been largely ignored in tsunami hazard assessment for Chile. Our simulation results also suggest that uniform slip scenarios tend to predict later arrival times of the leading wave than bell-shaped sources. The timing of the largest wave at a specific site also depends on how the slip is distributed in the dip direction, although other factors, such as local bathymetric configurations and standing edge waves, are expected to play a role as well. Arrival time differences are especially critical in Chile, where tsunamis arrive earlier than elsewhere. We believe that the results of this study will be useful to both public and private organizations for mapping tsunami hazard in coastal areas along the Chilean coast and will therefore help reduce the risk of loss and damage caused by future tsunamis.

  11. Rapid field-based landslide hazard assessment in response to post-earthquake emergency

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Gambini, Stefano; Cancelliere, Giorgio

    2016-04-01

    On April 25, 2015 a Mw 7.8 earthquake occurred 80 km to the northwest of Kathmandu (Nepal). The largest aftershock, which occurred on May 12, 2015, was the Mw 7.3 Nepal earthquake (SE of Zham, China), 80 km to the east of Kathmandu. The earthquakes killed ~9000 people and severely damaged a 10,000 sq km region in Nepal and neighboring countries. Several thousand landslides were triggered during the events, causing widespread damage to mountain villages and the evacuation of thousands of people. Rasuwa was one of the most damaged districts. This contribution describes a landslide hazard analysis of the Saramthali, Yarsa and Bhorle VDCs (122 km2, Rasuwa district). Hazard is expressed in qualitative classes (low, medium, high) through a simple matrix approach that combines frequency classes and magnitude classes. The hazard analysis is based primarily on the experience gained during a field survey conducted in September 2015. During the survey, local knowledge was systematically exploited through interviews with local people who had experienced the earthquake and the coseismic landslides. People helped us recognize fractures and active deformations and allowed us to reconstruct a correct chronicle of landslide events, in order to assign each landslide to the first shock, the second shock, or the post-earthquake 2015 monsoon. The field experience was complemented with a standard analysis of the relationship between potential controlling factors and the distribution of landslides reported in Kargel et al. (2016), which allowed us to recognize the most important controlling factors. This information was integrated with the field observations to verify the mapped units and to complete the mapping in areas not accessible for field activity. Finally, the work was completed with the analysis and use of a detailed landslide inventory produced by the University of Milano Bicocca that covers most of the area affected by coseismic landslides in

  12. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    USGS Publications Warehouse

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 per cent (1 s.d.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock, from less than 1 per cent at M ≥ 3 to 6.5 ± 2.5 per cent (1 s.d.) at M ≥ 5. The main shock is most likely to occur in the first hour after the foreshock, and the probability of the main shock occurring decreases with elapsed time from the possible foreshock approximately as the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California raises the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.
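The inverse-of-time decay described in the abstract can be sketched numerically. The 6 per cent total and the 5-day window come from the abstract; the hour-by-hour discretization and the exact 1/t weighting are illustrative assumptions for this sketch only.

```python
# Distribute the ~6% foreshock probability over a 5-day window with
# weights proportional to 1/t (t = hours since the candidate foreshock),
# matching the abstract's "inverse of time" decay. Discretization by
# whole hours is an assumption made for this sketch.

TOTAL_P = 0.06        # P(an M>=3 event turns out to be a foreshock)
WINDOW_H = 5 * 24     # 5-day window, in hours

norm = sum(1.0 / t for t in range(1, WINDOW_H + 1))

def p_mainshock_in_hour(t):
    """Probability that the main shock falls in hour t of the window."""
    return TOTAL_P / (t * norm)

# The first hour carries by far the largest share of the probability:
print(round(p_mainshock_in_hour(1), 4), round(p_mainshock_in_hour(24), 5))
```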

  13. Earthquake hazards: a national threat

    USGS Publications Warehouse

    U.S. Geological Survey

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  14. Assessment of existing and potential landslide hazards resulting from the April 25, 2015 Gorkha, Nepal earthquake sequence

    USGS Publications Warehouse

    Collins, Brian D.; Jibson, Randall W.

    2015-01-01

    This report provides a detailed account of assessments performed in May and June 2015 and focuses on valley-blocking landslides because they have the potential to pose considerable hazard to many villages in Nepal. First, we provide a seismological background of Nepal and then detail the methods used for both external and in-country data collection and interpretation. Our results consist of an overview of landsliding extent, a characterization of all valley-blocking landslides identified during our work, and a description of video resources that provide high resolution coverage of approximately 1,000 kilometers (km) of river valleys and surrounding terrain affected by the Gorkha earthquake sequence. This is followed by a description of site-specific landslide-hazard assessments conducted while in Nepal and includes detailed descriptions of five noteworthy case studies. Finally, we assess the expectation for additional landslide hazards during the 2015 summer monsoon season.

  15. Hazard assessment of long-period ground motions for the Nankai Trough earthquakes

    NASA Astrophysics Data System (ADS)

    Maeda, T.; Morikawa, N.; Aoi, S.; Fujiwara, H.

    2013-12-01

    We evaluate the seismic hazard for long-period ground motions associated with the Nankai Trough earthquakes (M8~9) in southwest Japan. Large interplate earthquakes occurring around the Nankai Trough have caused serious damage through strong ground motions and tsunami; the most recent events were in 1944 and 1946. Such large interplate earthquakes can also damage high-rise and large-scale structures through long-period ground motions (e.g., the 1985 Michoacan earthquake in Mexico and the 2003 Tokachi-oki earthquake in Japan), which are amplified particularly in basins. Because the major cities along the Nankai Trough have developed on alluvial plains, it is important to evaluate long-period ground motions as well as strong motions and tsunami for the anticipated Nankai Trough earthquakes. The long-period ground motions are evaluated by the finite difference method (FDM) using 'characterized source models' and a 3-D underground structure model. A 'characterized source model' is a source model that includes the source parameters necessary for reproducing the strong ground motions; the parameters are determined following a 'recipe' for predicting strong ground motion (Earthquake Research Committee (ERC), 2009). We construct various source models (~100 scenarios) covering different choices of source parameters such as source region, asperity configuration, and hypocenter location. Each source region is determined from 'the long-term evaluation of earthquakes in the Nankai Trough' published by the ERC. The asperity configuration and hypocenter location control the rupture directivity effects; these parameters are important because our preliminary simulations are strongly affected by rupture directivity. For simulating seismic wave propagation we apply the GMS (Ground Motion Simulator) system, based on a 3-D FDM scheme using discontinuous grids (Aoi and Fujiwara, 1999). The grid spacing for the shallow region is 200 m and

  16. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    NASA Astrophysics Data System (ADS)

    Türker, Tuǧba; Bayrak, Yusuf

    2016-04-01

    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and lies in a region of very high seismic activity; very large earthquakes have been observed on the NAFZ from the past to the present. The aim of this study is to estimate the key parameters of the Gutenberg-Richter relationship (the a and b values) and, taking these parameters into account, to examine the earthquakes of 1900-2015 in 10 different seismic source regions of the NAFZ. Occurrence probabilities and return periods of future earthquakes in the fault zone are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period; no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, MS=7.3 and 1897, MS=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 seismic source regions, the a and b parameters are estimated from the cumulative number-magnitude relationship LogN=a-bM of Gutenberg-Richter. A homogeneous earthquake catalog of MS magnitude 4.0 or larger is used for the period between 1900 and 2015. The catalog was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI): data from 1900 to 1974 were obtained from KOERI and ISC, and data from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 seismic source regions. The highest occurrence probability among the regions in the coming years is estimated for the Tokat-Erzincan region (Region 9), at 99% for an earthquake
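The two ingredients the abstract combines, the Gutenberg-Richter relation LogN=a-bM and the Poisson occurrence model, can be sketched together as follows. The a and b values below are placeholders for illustration, not the paper's estimates for any NAFZ region.

```python
import math

def gr_annual_rate(m, a, b):
    """Annual rate of events with magnitude >= m, from log10 N = a - b*M."""
    return 10.0 ** (a - b * m)

def poisson_occurrence_prob(m, t_years, a, b):
    """Probability of at least one M >= m event in t_years (Poisson model)."""
    return 1.0 - math.exp(-gr_annual_rate(m, a, b) * t_years)

def return_period(m, a, b):
    """Mean return period of M >= m events, the reciprocal of the annual rate."""
    return 1.0 / gr_annual_rate(m, a, b)

# Placeholder parameters, for illustration only:
a, b = 4.5, 0.9
for t in (10, 50, 100):
    print(t, round(poisson_occurrence_prob(7.0, t, a, b), 2))
```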

  17. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    NASA Astrophysics Data System (ADS)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project "Network of Research Infrastructures for European Seismology (NERIES)". The approach consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters through rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. Using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time loss estimation is capable of incorporating regional variability and the sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships.

  18. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    NASA Astrophysics Data System (ADS)

    Wyss, Max

    2013-04-01

    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven who had participated in an emergency meeting on March 30 assessing the probability that a major event would follow the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of pension, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists who were not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone with gross misunderstandings of seismology; they must be carefully prepared by experts. The more significant lesson is that the approach of calming the population, and the standard probabilistic hazard and risk assessment as practiced by GSHAP, are misleading. The latter has been criticized as

  19. Unified Scaling Law for Earthquakes: Seismic hazard and risk assessment for Himalayas, Lake Baikal, and Central China regions

    NASA Astrophysics Data System (ADS)

    Nekrasova, Anastasia; Kossobokov, Vladimir; Parvez, Imtiyaz; Tao, Xiaxin

    2015-04-01

The Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter recurrence relation, has evident implications, since any estimate of seismic hazard depends on the size of the territory that is used for investigation, averaging, and extrapolation into the future. Therefore, the hazard may differ dramatically when scaled down to the proportion of the area of interest (e.g. the territory occupied by a city) from the enveloping area of investigation. In fact, given the observed patterns of distributed seismic activity, the results of the multi-scale analysis embedded in the USLE approach demonstrate that traditional estimates of seismic hazard and risk for cities and urban agglomerations are usually too low. Moreover, the USLE approach provides a significant improvement when compared to the results of probabilistic seismic hazard analysis, e.g. the maps resulting from the Global Seismic Hazard Assessment Project (GSHAP). We apply the USLE approach to evaluating seismic hazard and risks to the population of three territories of different size, representing a sub-continental and two different regional scales of analysis: the Himalayas and surroundings, the Lake Baikal region, and Central China.
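The scale dependence described in this abstract can be illustrated with the usual USLE form, log10 N(M, L) = A + B·(5 − M) + C·log10 L, where N is the expected annual number of events of magnitude ≥ M in an area of linear dimension L km. The sketch below uses hypothetical coefficient values (A, B, C are illustrative, not values from the paper):

```python
import math

def usle_annual_rate(magnitude, linear_size_km, A, B, C):
    """Expected annual number of earthquakes with magnitude >= `magnitude`
    in an area of linear dimension `linear_size_km`, using the USLE form
    log10 N = A + B*(5 - M) + C*log10 L. Coefficients are placeholders."""
    return 10 ** (A + B * (5.0 - magnitude) + C * math.log10(linear_size_km))

# Illustrative (hypothetical) coefficients: A=-1.0, B=0.9, C=1.2.
rate_city = usle_annual_rate(6.0, 30.0, A=-1.0, B=0.9, C=1.2)     # city scale
rate_region = usle_annual_rate(6.0, 300.0, A=-1.0, B=0.9, C=1.2)  # regional scale
# Downscaling a regional rate to a city by the area ratio L**2 alone would
# misstate the city-scale hazard, because seismicity scales as L**C with C != 2.
```

Because C generally differs from 2, the city-scale rate is not simply the regional rate divided by the area ratio, which is the source of the underestimation the authors describe.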

  20. Probabilistic Seismic Hazard Assessment for Iraq Using Complete Earthquake Catalogue Files

    NASA Astrophysics Data System (ADS)

    Ameer, A. S.; Sharma, M. L.; Wason, H. R.; Alsinawi, S. A.

    2005-05-01

Probabilistic seismic hazard analysis (PSHA) has been carried out for Iraq. The earthquake catalogue used in the present study covers an area between latitudes 29° and 38.5° N and longitudes 39° and 50° E, containing more than a thousand events for the period 1905-2000. The entire Iraq region has been divided into thirteen seismogenic sources based on their seismic characteristics, geological setting and tectonic framework. The completeness of the seismicity catalogue has been checked using the method proposed by Stepp (1972). The analysis of completeness shows that the earthquake catalogue is not complete below Ms=4.8 for all of Iraq and for seismic source zones S1, S4, S5, and S8, while it varies for the other seismic zones. A statistical treatment of completeness of the data file was carried out in each of the magnitude classes. The Frequency Magnitude Distributions (FMD) for the study area, including all seismic source zones, were established, and the minimum magnitudes of complete reporting (Mc) were then estimated. For the whole of Iraq the Mc was estimated to be about Ms=4.0, while S11 shows the lowest Mc of about Ms=3.5 and the highest Mc of about Ms=4.2 was observed for S4. The earthquake activity parameters (activity rate λ, b value, maximum regional magnitude mmax) as well as the mean return period (R) for events of magnitude m ≥ mmin, along with their probability of occurrence, have been determined for all thirteen seismic source zones of Iraq. The maximum regional magnitude mmax was estimated as 7.87 ± 0.86 for the whole of Iraq. The return period for magnitude 6.0 is largest for source zone S3, where it is estimated to be 705 years, while the smallest value, 9.9 years, is obtained for Iraq as a whole.
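The activity parameters and return periods described above follow from the Gutenberg-Richter relation, log10 N(m) = a − b·m, where N(m) is the annual rate of events of magnitude ≥ m and the mean return period is its reciprocal. A minimal sketch, with a and b chosen purely for illustration (they are not the paper's estimates):

```python
def gr_annual_rate(m, a, b):
    """Annual rate of events with magnitude >= m from the
    Gutenberg-Richter relation log10 N(m) = a - b*m."""
    return 10 ** (a - b * m)

def mean_return_period(m, a, b):
    """Mean return period in years: the reciprocal of the annual rate."""
    return 1.0 / gr_annual_rate(m, a, b)

# Hypothetical a/b values chosen so that M>=6.0 recurs roughly every 10 years,
# the order of magnitude reported here for the Iraq catalogue as a whole.
a, b = 5.0, 1.0
T6 = mean_return_period(6.0, a, b)
```

With these placeholder values, T6 is 10 years; fitting a and b per source zone (above the zone's Mc) is what produces the zone-by-zone return periods reported in the abstract.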

  1. New Seafloor Map of the Puerto Rico Trench Helps Assess Earthquake and Tsunami Hazards

    NASA Astrophysics Data System (ADS)

    ten Brink, Uri; Danforth, William; Polloni, Christopher; Andrews, Brian; Llanes, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-09-01

The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola in 1787 is thought to be the result of a magnitude 8 earthquake north of the islands. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico, although their ages are unknown. The Puerto Rico Trench is atypical of oceanic trenches. Subduction is highly oblique (10°-20°) to the trench axis with a large component of left-lateral strike-slip motion. Similar convergence geometry is observed at the Challenger Deep in the Mariana Trench, the deepest point on Earth. In addition to its extremely deep seafloor, the Puerto Rico Trench is also characterized by the most negative free-air gravity anomaly on Earth, -380 mGal, located 50 km south of the trench, where the water depth is 7950 m (Figure 2). A tilted carbonate platform provides evidence for extreme vertical tectonism in the region. This platform was horizontally deposited over Cretaceous to Paleocene arc rocks starting in the Late Oligocene. Then, at 3.5 Ma, the carbonate platform was tilted by 4° toward the trench over a time period of less than 40 kyr, such that its northern edge is at a depth of 4000 m and its reconstructed elevation on land in Puerto Rico is at +1300 m (Figures 1 and 2).

  2. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto

    2016-04-01

The characteristic earthquake hypothesis is not strongly supported by observational data because of the relatively short duration of historical and even paleoseismological records. For instance, for the Calabria (Southern Italy) region, historical information on strong earthquakes exists for at least two thousand years, but it can be considered complete for M > 6.0 only for the latest few centuries. As a consequence, characteristic earthquakes are seldom reported for individual fault segments, and hazard cannot be reliably estimated by means of only the minor seismicity reported in the historical catalogs. Even if they cannot substitute for the information contained in a good historical catalog, physics-based earthquake simulators have become popular in the recent literature, and their application has been justified for a number of reasons. In particular, earthquake simulators can provide interesting information on which renewal models best describe the recurrence statistics, and how this is affected by features such as local fault geometry and kinematics. The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 100,000 events of magnitude ≥ 4.5. The algorithm on which this simulator is based is constrained by several physical elements, such as an average slip rate due to tectonic loading for every single segment in the investigated fault system, the process of rupture growth and termination, and interaction between earthquake sources, including small magnitude events. Events nucleated in one segment are allowed to expand into neighboring segments if they are separated by less than a given maximum distance. The application of our simulation algorithm to the Calabria region reproduces typical features of the seismicity in time, space and magnitude, which can be compared with those of the real observations. These features include long-term periodicity of strong earthquakes, short

  3. A new earthquake catalogue for seismic hazard assessment of the NPP (Nuclear Power Plant) Jaslovske Bohunice, Slovakia, site

    NASA Astrophysics Data System (ADS)

    Kysel, Robert; Kristek, Jozef; Moczo, Peter; Csicsay, Kristian; Cipciar, Andrej; Srbecky, Miroslav

    2014-05-01

According to the IAEA (International Atomic Energy Agency) Safety Guide No. SSG-9, an earthquake catalogue should comprise all information on pre-historical, historical and seismometrically recorded earthquakes in the region, which should cover a geographic area not smaller than a circle with a radius of 300 km around the site. Jaslovske Bohunice is an important economic site. Several nuclear facilities are located in Jaslovske Bohunice - either in operation (NPP V2, the national radioactive waste repository) or in decommissioning (NPP A1, NPP V1). Moreover, a new reactor unit is being planned for the site. The Jaslovske Bohunice site is not far from the Dobra Voda seismic source zone, which has been the most active seismic zone in the territory of Slovakia since the beginning of the 20th century. Relatively small distances to Austria, Hungary, the Czech Republic and the Slovak capital Bratislava make the site a prominent priority in terms of seismic hazard assessment. We compiled a new earthquake catalogue for the NPP Jaslovske Bohunice region following the recommendations of the IAEA Safety Guide. The region includes parts of the territories of Slovakia, Hungary, Austria, the Czech Republic and Poland, and it partly extends into Germany, Slovenia, Croatia and Serbia. The catalogue is based on data from six national earthquake catalogues, two regional earthquake catalogues (ACORN, CENEC) and a catalogue from the local NPP network. The primarily compiled catalogue for the time period 350-2011 consists of 9,142 events. We then homogenized and declustered the catalogue. Eventually we checked the catalogue for time completeness. For homogenization, we divided the catalogue into preseismometric (350-1900) and seismometric (1901-2011) periods. For earthquakes characterized by the epicentral intensity and local magnitude we adopted relations proposed for homogenization of the CENEC catalogue (Grünthal et al. 2009). Instead of assuming the equivalency between local magnitudes reported by the

  4. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Whether and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's would be prohibitively expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. In the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Society thus needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the
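The optimization described in this abstract can be sketched as minimizing total cost Q(n) = C(n) + E[D(n)] over mitigation levels n, where C is the mitigation cost and E[D] the expected damage. The cost functions below are toy illustrations, not the authors' models:

```python
# Toy sketch: choose the mitigation level n (e.g. seawall height, arbitrary
# units) that minimizes total cost Q(n) = C(n) + E[D(n)].
# Both functions below are hypothetical illustrations.

def mitigation_cost(n):
    return 2.0 * n            # cost rises linearly with the level of defense

def expected_damage(n):
    return 100.0 * 0.5 ** n   # expected damage halves per extra unit of defense

def optimal_mitigation(levels):
    """Return the level with the smallest total cost."""
    return min(levels, key=lambda n: mitigation_cost(n) + expected_damage(n))

best = optimal_mitigation(range(0, 20))  # -> 5 for these toy cost curves
```

The interesting policy questions arise when, as the abstract stresses, E[D(n)] is itself highly uncertain, so the optimum must be chosen robustly rather than from a single hazard estimate.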

  5. Implications for prediction and hazard assessment from the 2004 Parkfield earthquake.

    PubMed

    Bakun, W H; Aagaard, B; Dost, B; Ellsworth, W L; Hardebeck, J L; Harris, R A; Ji, C; Johnston, M J S; Langbein, J; Lienkaemper, J J; Michael, A J; Murray, J R; Nadeau, R M; Reasenberg, P A; Reichle, M S; Roeloffs, E A; Shakal, A; Simpson, R W; Waldhauser, F

    2005-10-13

    Obtaining high-quality measurements close to a large earthquake is not easy: one has to be in the right place at the right time with the right instruments. Such a convergence happened, for the first time, when the 28 September 2004 Parkfield, California, earthquake occurred on the San Andreas fault in the middle of a dense network of instruments designed to record it. The resulting data reveal aspects of the earthquake process never before seen. Here we show what these data, when combined with data from earlier Parkfield earthquakes, tell us about earthquake physics and earthquake prediction. The 2004 Parkfield earthquake, with its lack of obvious precursors, demonstrates that reliable short-term earthquake prediction still is not achievable. To reduce the societal impact of earthquakes now, we should focus on developing the next generation of models that can provide better predictions of the strength and location of damaging ground shaking. PMID:16222291

  6. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  7. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  8. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  9. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  10. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  11. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  12. Foreshocks and short-term hazard assessment of large earthquakes using complex networks: the case of the 2009 L'Aquila earthquake

    NASA Astrophysics Data System (ADS)

    Daskalaki, Eleni; Spiliotis, Konstantinos; Siettos, Constantinos; Minadakis, Georgios; Papadopoulos, Gerassimos A.

    2016-08-01

The monitoring of statistical network properties could be useful for the short-term hazard assessment of the occurrence of mainshocks in the presence of foreshocks. Using successive connections between events acquired from the earthquake catalog of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) for the case of the L'Aquila (Italy) mainshock (Mw = 6.3) of 6 April 2009, we provide evidence that network measures, both global (average clustering coefficient, small-world index) and local (betweenness centrality), could potentially be exploited for forecasting purposes in both time and space. Our results reveal statistically significant increases in the topological measures and a nucleation of the betweenness centrality around the location of the epicenter about 2 months before the mainshock. The results of the analysis are robust even for spatial windows that are large or off-centered with respect to the main event.
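One of the global measures mentioned above, the average clustering coefficient, is straightforward to compute on a graph whose nodes are (for instance) discretized event locations linked by successive events. A minimal sketch in pure Python (the graph and linking rule are hypothetical, not the paper's construction):

```python
def average_clustering(adj):
    """Average clustering coefficient of an undirected graph.
    adj: dict mapping node -> set of neighbor nodes."""
    total = 0.0
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue  # coefficient taken as 0 for nodes of degree < 2
        # count edges among the node's neighbors (each pair once)
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

# A triangle (nodes 1-2-3) plus a pendant node 4 attached to node 3.
graph = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
c = average_clustering(graph)  # (1 + 1 + 1/3 + 0) / 4 = 7/12
```

In a forecasting setting one would track how this value evolves as each new catalog event adds edges to the network.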

  13. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life, so earthquake risk assessment for Cairo has great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic-tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern districts (e.g., El Nozha) and the lowest values in the northern and western districts (e.g., El Sharabiya and El Khalifa).
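Return periods like those used for the Cairo maps are conventionally related to exceedance probabilities over an exposure time by assuming Poissonian occurrence, P = 1 − exp(−t/T). A small sketch (the 50-year exposure time is the usual engineering convention, assumed here rather than stated in the abstract):

```python
import math

def exceedance_probability(return_period_yr, exposure_yr=50.0):
    """Probability of at least one exceedance in `exposure_yr` years,
    assuming Poissonian occurrence with the given mean return period."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

# The four return periods used for the Cairo hazard maps:
probs = {T: exceedance_probability(T) for T in (224, 615, 1230, 4745)}
# e.g. a 224-year return period corresponds to roughly a 20% chance of
# exceedance in 50 years; 4745 years to roughly 1%.
```

This is why hazard maps for several return periods are produced: each corresponds to a different design probability over the structure's lifetime.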

  14. Multicomponent Body and Surface Wave Seismic Analysis using an Urban Land Streamer System: An Integrative Earthquake Hazards Assessment Approach

    NASA Astrophysics Data System (ADS)

    Gribler, G.; Liberty, L. M.

    2014-12-01

We present earthquake site response results from a 48-channel multicomponent seismic land streamer and large weight-drop system. We acquired data along a grid of city streets in western Idaho at a rate of a few km per day, deriving shear wave velocity profiles to a depth of 40-50 m by incorporating vertical and radial geophone signals to capture the complete elliptical Rayleigh wave motion. We also obtained robust p-wave reflection and refraction results by capturing the returned signals that arrive at non-vertical incidence angles as a result of the high-velocity road surface layer. By integrating the derived shear wave velocity profiles with p-wave reflection results, we include depositional and tectonic boundaries from the upper few hundred meters in our analysis to help assess whether ground motions may be amplified by shallow bedrock. By including p-wave refraction information in the analysis, we can identify zones of high liquefaction potential by comparing shear wave and p-wave velocity (Vp/Vs) measurements relative to refraction-derived water table depths. The use of multicomponent land streamer data improves signal-to-noise levels over single-component data with no additional field effort. The added multicomponent data processing step can be as simple as calculating the magnitude of the vector for surface wave and refraction arrivals, or rotating the reflected signals to the maximum emergence angle based on near-surface p-wave velocity information. We show example data from a number of Idaho communities where historical earthquakes have been recorded. We also present numerical models and systematic field tests that show the effects of a high-velocity road surface layer on surface and body wave measurements. We conclude that multicomponent seismic information derived from seismic land streamers can provide a significant improvement in earthquake hazard assessment over a standard single-component approach with only a small addition in

  15. What to do given that earthquake hazard maps often fail

    NASA Astrophysics Data System (ADS)

    Stein, S.; Geller, R.; Liu, M.

    2012-04-01

    The 2011 Tohoku earthquake is another striking example - after the 2008 Wenchuan and 2010 Haiti earthquakes - of highly destructive earthquakes that occurred in areas predicted by earthquake hazard maps to have significantly lower hazard than nearby supposedly high-risk areas which have been essentially quiescent. Given the limited seismic record available and limited understanding of earthquake mechanics, hazard maps have to depend heavily on poorly constrained parameters and the mapmakers' preconceptions. These preconceptions are often incorrect. The Tohoku earthquake and its tsunami were much larger than "expected" by the mappers because of the presumed absence of such large earthquakes in the seismological record. This assumption seemed consistent with a model based on the convergence rate and age of the subducting lithosphere, which predicted at most a low M 8 earthquake. Although this model was invalidated by the 2004 Sumatra earthquake, and paleotsunami deposits showed evidence of three large past earthquakes in the Tohoku region in the past 3000 years, these facts were not incorporated in the hazard mapping. The failure to anticipate the Tohoku and other recent large earthquakes suggests two changes to current hazard mapping practices. First, the uncertainties in hazard map predictions should be assessed and communicated clearly to potential users. Communication of uncertainties would make the maps more useful by letting users decide how much credence to place in the maps. Second, hazard maps should undergo objective testing to compare their predictions to those of null hypotheses based on random regional seismicity. Such testing, which is common and useful in other fields, will hopefully produce measurable improvements. There are likely, however, to be limits on how well hazard maps can ever be made due to the intrinsic variability of earthquake processes.

  16. Earthquake Hazard Mitigation Strategy in Indonesia

    NASA Astrophysics Data System (ADS)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.

    2008-05-01

Because of the active tectonic setting of the region, the risks of geological hazards are inevitably increasing in the Indonesian Archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the geological conditions will be the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancing existing land-use management, establishing public education strategies and methods, strengthening linkages among stakeholders of disaster mitigation institutions, and establishing continuous public consultation are the main strategic programs for community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts will also be discussed as lessons learned. A new approach to developing earthquake hazard maps, initiated by mapping the psychological aspects of the people living in vulnerable areas, will be addressed as well.

  17. 75 FR 18787 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  18. 75 FR 75457 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-03

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  19. 77 FR 18792 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  20. 76 FR 64325 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... Directive/PPD-8: National Preparedness to National Earthquake Hazards Reduction Program (NEHRP)...

  1. 77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... needs for existing buildings, to review the National Earthquake Hazards Reduction Program (NEHRP)...

  2. 76 FR 18165 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. ] SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... be sent to National Earthquake Hazards Reduction Program Director, National Institute of...

  3. 77 FR 27439 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  4. 75 FR 8042 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-23

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a.... Jack Hayes, National Earthquake Hazards Reduction Program Director, National Institute of Standards...

  5. 77 FR 19224 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  6. EARTHQUAKE HAZARDS IN THE OFFSHORE ENVIRONMENT.

    USGS Publications Warehouse

    Page, Robert A.; Basham, Peter W.

    1985-01-01

    This report discusses earthquake effects and potential hazards in the marine environment, describes and illustrates methods for the evaluation of earthquake hazards, and briefly reviews strategies for mitigating hazards. The report is broadly directed toward engineers, scientists, and others engaged in developing offshore resources. The continental shelves have become a major frontier in the search for new petroleum resources. Much of the current exploration is in areas of moderate to high earthquake activity. If the resources in these areas are to be developed economically and safely, potential earthquake hazards must be identified and mitigated both in planning and regulating activities and in designing, constructing, and operating facilities. Geologic earthquake effects that can be hazardous to marine facilities and operations include surface faulting, tectonic uplift and subsidence, seismic shaking, sea-floor failures, turbidity currents, and tsunamis.

  7. Stochastic ground-motion simulation of two Himalayan earthquakes: seismic hazard assessment perspective

    NASA Astrophysics Data System (ADS)

    Harbindu, Ashish; Sharma, Mukat Lal; Kamal

    2012-04-01

The earthquakes in Uttarkashi (October 20, 1991, Mw 6.8) and Chamoli (March 8, 1999, Mw 6.4) are among the recent well-documented earthquakes that occurred in the Garhwal region of India and that caused extensive damage as well as loss of life. Using strong-motion data of these two earthquakes, we estimate their source, path, and site parameters. The quality factor (Qβ) as a function of frequency is derived as Qβ(f) = 140 f^1.018. The site amplification functions are evaluated using the horizontal-to-vertical spectral ratio technique. The ground motions of the Uttarkashi and Chamoli earthquakes are simulated using the stochastic method of Boore (Bull Seismol Soc Am 73:1865-1894, 1983). The estimated source, path, and site parameters are used as input for the simulation. The simulated time histories are generated for a few stations and compared with the observed data. The simulated response spectra at 5% damping are in fair agreement with the observed response spectra for most of the stations over a wide range of frequencies, and residual trends confirm the close match between observed and simulated response spectra. The synthetic data are in rough agreement with the ground-motion attenuation equation available for the Himalayas (Sharma, Bull Seismol Soc Am 98:1063-1069, 1998).
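The stochastic method of Boore (1983) used here shapes windowed white noise in the frequency domain by a source spectrum and path attenuation. A minimal sketch of that idea, using the Q(f) form derived in the abstract but with all other numeric parameters (corner frequency, distance, shear velocity, duration) chosen purely for illustration:

```python
import numpy as np

def stochastic_ground_motion(duration_s=10.0, dt=0.01, f_corner=0.5,
                             q0=140.0, q_exp=1.018, r_km=50.0,
                             beta_kms=3.5, seed=0):
    """Sketch of the Boore (1983) stochastic method: windowed white noise
    is shaped in the frequency domain by an omega-squared source spectrum
    and path attenuation with Q(f) = q0 * f**q_exp, then transformed back.
    All parameter values here are illustrative, not the paper's."""
    rng = np.random.default_rng(seed)
    n = int(duration_s / dt)
    noise = rng.standard_normal(n) * np.hanning(n)   # windowed white noise
    spec = np.fft.rfft(noise)
    f = np.fft.rfftfreq(n, dt)
    f[0] = 1e-6                                      # avoid zero frequency
    source = f**2 / (1.0 + (f / f_corner) ** 2)      # omega-squared shape
    path = np.exp(-np.pi * f * r_km / (q0 * f**q_exp * beta_kms))
    accel = np.fft.irfft(spec * source * path, n)
    return accel / np.max(np.abs(accel))             # normalized trace

acc = stochastic_ground_motion()
```

A full implementation would also scale the spectrum to the seismic moment, apply site amplification and a high-frequency diminution filter, and average over many noise realizations; this sketch keeps only the core shaping step.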

  8. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    SciTech Connect

    Wong, I.G.; Green, R.K.; Sun, J.I.; Pezzopane, S.K.; Abrahamson, N.A.; Quittmeyer, R.C.

    1996-12-31

As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (M{sub w}) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to M{sub w} 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to M{sub w} 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than the five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and the path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.

  9. Preliminary Earthquake Hazard Map of Afghanistan

    USGS Publications Warehouse

    Boyd, Oliver S.; Mueller, Charles S.; Rukstales, Kenneth S.

    2007-01-01

    Introduction Earthquakes represent a serious threat to the people and institutions of Afghanistan. As part of a United States Agency for International Development (USAID) effort to assess the resource potential and seismic hazards of Afghanistan, the Seismic Hazard Mapping group of the United States Geological Survey (USGS) has prepared a series of probabilistic seismic hazard maps that help quantify the expected frequency and strength of ground shaking nationwide. To construct the maps, we do a complete hazard analysis for each of ~35,000 sites in the study area. We use a probabilistic methodology that accounts for all potential seismic sources and their rates of earthquake activity, and we incorporate modeling uncertainty by using logic trees for source and ground-motion parameters. See the Appendix for an explanation of probabilistic seismic hazard analysis and discussion of seismic risk. Afghanistan occupies a southward-projecting, relatively stable promontory of the Eurasian tectonic plate (Ambraseys and Bilham, 2003; Wheeler and others, 2005). Active plate boundaries, however, surround Afghanistan on the west, south, and east. To the west, the Arabian plate moves northward relative to Eurasia at about 3 cm/yr. The active plate boundary trends northwestward through the Zagros region of southwestern Iran. Deformation is accommodated throughout the territory of Iran; major structures include several north-south-trending, right-lateral strike-slip fault systems in the east and, farther to the north, a series of east-west-trending reverse- and strike-slip faults. This deformation apparently does not cross the border into relatively stable western Afghanistan. In the east, the Indian plate moves northward relative to Eurasia at a rate of about 4 cm/yr. A broad, transpressional plate-boundary zone extends into eastern Afghanistan, trending southwestward from the Hindu Kush in northeast Afghanistan, through Kabul, and along the Afghanistan-Pakistan border

  10. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky, and Portsmouth, Ohio

    SciTech Connect

    Not Available

    1992-03-01

    Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, and general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering respectively, and also the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites, and performed alternative calculations to determine sensitivities of seismic hazard results to various assumptions and models in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate values for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah.

  11. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky and Portsmouth, Ohio

    SciTech Connect

    1997-03-01

    Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, and general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering respectively, and also the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites, and performed alternative calculations to determine sensitivities of seismic hazard results to various assumptions and models in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate values for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah.

  12. Quantifying the Seismic Hazard From Natural and Induced Earthquakes (Invited)

    NASA Astrophysics Data System (ADS)

    Rubinstein, J. L.; Llenos, A. L.; Ellsworth, W. L.; McGarr, A.; Michael, A. J.; Mueller, C. S.; Petersen, M. D.

    2013-12-01

    In the past 12 years, seismicity rates in portions of the central and eastern United States (CEUS) have increased. In 2011, the year of peak activity, three M ≥ 5 earthquakes occurred, causing millions of dollars in damage. Much of the increase in seismicity is believed to have been induced by wastewater from oil and gas activity that is injected deep underground. This includes damaging earthquakes in southern Colorado, central Arkansas, and central Oklahoma in 2011. Earthquakes related to oil and gas activities contribute significantly to the total seismic hazard in some areas of the CEUS, but most of the tens of thousands of wastewater disposal wells in the CEUS do not cause damaging earthquakes. The challenge is to better understand this contribution to the hazard in a realistic way for those wells that are inducing earthquakes or wells that may induce earthquakes in the future. We propose a logic-tree approach to estimate the hazard posed by the change in seismicity that deemphasizes the need to evaluate whether the seismicity is natural or man-made. We first compile a list of areas of increased seismicity, including areas of known induced earthquakes. Using areas of increased seismicity (instead of just induced earthquakes) allows us to assess the hazard over a broader region, avoiding the often-difficult task of judging whether an earthquake sequence is induced. With the zones of increased seismicity defined, we then estimate the earthquake hazard for each zone using a four-branch logic tree: (1) The increased seismicity rate is natural, short-term variation within the longer-term background seismicity rate. Thus, these earthquakes would be added to the catalog when computing the background seismicity rate. (2) The increased seismicity rate represents a new and permanent addition to the background seismicity. In this branch, a new background seismicity rate begins at the time of the change in earthquake rate. (3) Induced earthquakes account for the
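The four-branch logic tree described above combines alternative interpretations of a seismicity-rate change by weighting each branch. The sketch below shows only the weighted-sum mechanics of standard logic-tree practice; the branch weights and rates are entirely hypothetical:

```python
# Hedged sketch of logic-tree averaging for a zone of increased seismicity.
# Branch labels paraphrase the abstract; weights and annual rates of M>=3
# events are invented for illustration only.
branches = [
    # (interpretation, weight, annual rate of M>=3 earthquakes in the zone)
    ("natural short-term variation in background rate", 0.2, 0.5),
    ("permanent new background rate",                   0.3, 2.0),
    ("induced, persists while injection continues",     0.4, 2.0),
    ("induced, ceases when injection stops",            0.1, 0.2),
]

total_weight = sum(w for _, w, _ in branches)
assert abs(total_weight - 1.0) < 1e-9  # logic-tree weights must sum to 1

# Weighted mean rate feeds the hazard calculation for this zone.
mean_rate = sum(w * r for _, w, r in branches)
print(f"weighted mean annual rate of M>=3: {mean_rate:.2f}")
```

Each branch propagates to a full hazard curve in practice; the weighted mean of those curves, not just of the rates, is the final product.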

  13. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy of countermeasures against earthquake disasters, including earthquake hazard evaluations. Before March 11, Japanese earthquake hazard evaluation was based on the history of repeatedly occurring earthquakes and the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using the renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of damages by a hypothetical Mw9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with the current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using the present techniques, based on the diversity of earthquake phenomena. These reports created sensations throughout the country and local governments are struggling to prepare countermeasures. These reports commented on large uncertainty in their evaluation near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including authors, are involved in

  14. Landslide Hazards After the 2005 Kashmir Earthquake

    NASA Astrophysics Data System (ADS)

    Bulmer, Mark; Farquhar, Tony; Roshan, Masud; Akhtar, Sadar Saeed; Wahla, Sajjad Karamat

    2007-01-01

    The 8 October 2005 Kashmir earthquake killed 87,300 people and disrupted the lives of several million more. By current estimates, 30,000 still live in camps sited more in accordance with short-term expedience than with freedom from risk of natural hazards. In December 2006, the international aid community expressed fears that 50,000 people in Northwest Frontier Province may leave their mountain homes this winter as landslides and avalanches block access roads. As the focus of humanitarian assistance shifts toward restoration of Kashmir's infrastructure, it is important that the persistent hazard of landslides within the earthquake-affected region be understood and recognized.

  15. Earthquake hazards on the cascadia subduction zone

    SciTech Connect

    Heaton, T.H.; Hartzell, S.H.

    1987-04-10

    Large subduction earthquakes on the Cascadia subduction zone pose a potential seismic hazard. Very young oceanic lithosphere (10 million years old) is being subducted beneath North America at a rate of approximately 4 centimeters per year. The Cascadia subduction zone shares many characteristics with subduction zones in southern Chile, southwestern Japan, and Colombia, where comparably young oceanic lithosphere is also subducting. Very large subduction earthquakes, ranging in energy magnitude (M/sub w/) between 8 and 9.5, have occurred along these other subduction zones. If the Cascadia subduction zone is also storing elastic energy, a sequence of several great earthquakes (M/sub w/ 8) or a giant earthquake (M/sub w/ 9) would be necessary to fill this 1200-kilometer gap. The nature of strong ground motions recorded during subduction earthquakes of M/sub w/ less than 8.2 is discussed. Strong ground motions from even larger earthquakes (M/sub w/ up to 9.5) are estimated by simple simulations. If large subduction earthquakes occur in the Pacific Northwest, relatively strong shaking can be expected over a large region. Such earthquakes may also be accompanied by large local tsunamis. 35 references, 6 figures.

  16. Earthquake hazards on the cascadia subduction zone.

    PubMed

    Heaton, T H; Hartzell, S H

    1987-04-10

    Large subduction earthquakes on the Cascadia subduction zone pose a potential seismic hazard. Very young oceanic lithosphere (10 million years old) is being subducted beneath North America at a rate of approximately 4 centimeters per year. The Cascadia subduction zone shares many characteristics with subduction zones in southern Chile, southwestern Japan, and Colombia, where comparably young oceanic lithosphere is also subducting. Very large subduction earthquakes, ranging in energy magnitude (M(w)) between 8 and 9.5, have occurred along these other subduction zones. If the Cascadia subduction zone is also storing elastic energy, a sequence of several great earthquakes (M(w) 8) or a giant earthquake (M(w) 9) would be necessary to fill this 1200-kilometer gap. The nature of strong ground motions recorded during subduction earthquakes of M(w) less than 8.2 is discussed. Strong ground motions from even larger earthquakes (M(w) up to 9.5) are estimated by simple simulations. If large subduction earthquakes occur in the Pacific Northwest, relatively strong shaking can be expected over a large region. Such earthquakes may also be accompanied by large local tsunamis. PMID:17789780

  17. Seismic hazard and seismic risk assessment based on the unified scaling law for earthquakes: Himalayas and adjacent regions

    NASA Astrophysics Data System (ADS)

    Nekrasova, A. K.; Kossobokov, V. G.; Parvez, I. A.

    2015-03-01

    For the Himalayas and neighboring regions, the maps of seismic hazard and seismic risk are constructed with the use of the estimates for the parameters of the unified scaling law for earthquakes (USLE), in which the Gutenberg-Richter law for magnitude distribution of seismic events within a given area is applied in the modified version with allowance for linear dimensions of the area, namely, log N(M, L) = A + B(5 - M) + C log L, where N(M, L) is the expected annual number of the earthquakes with magnitude M in the area with linear dimension L. The spatial variations in the parameters A, B, and C for the Himalayas and adjacent regions are studied on two time intervals from 1965 to 2011 and from 1980 to 2011. The difference in A, B, and C between these two time intervals indicates that seismic activity experiences significant variations on a scale of a few decades. With a global consideration of the seismic belts of the Earth overall, the estimates of coefficient A, which determines the logarithm of the annual average frequency of the earthquakes with a magnitude of 5.0 and higher in the zone with a linear dimension of 1 degree of the Earth's meridian, differ by a factor of 30 and more and mainly fall in the interval from -1.1 to 0.5. The values of coefficient B, which describes the balance between the number of earthquakes with different magnitudes, gravitate to 0.9 and range from less than 0.6 to 1.1 and higher. The values of coefficient C, which estimates the fractal dimension of the local distribution of epicenters, vary from 0.5 to 1.4 and higher. In the Himalayas and neighboring regions, the USLE coefficients mainly fall in the intervals of -1.1 to 0.3 for A, 0.8 to 1.3 for B, and 1.0 to 1.4 for C. The calculations of the local value of the expected peak ground acceleration (PGA) from the maximal expected magnitude provided the necessary basis for mapping the seismic hazards in the studied region. When doing this, we used the local estimates of the
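The USLE relation quoted in this abstract can be evaluated directly. A minimal sketch; the specific coefficient values below are hypothetical, chosen only to fall inside the Himalayan ranges the abstract reports:

```python
import math

def usle_annual_count(M: float, L: float, A: float, B: float, C: float) -> float:
    """Expected annual number of magnitude-M earthquakes in an area of linear
    dimension L (degrees), from the USLE: log10 N = A + B*(5 - M) + C*log10 L."""
    return 10.0 ** (A + B * (5.0 - M) + C * math.log10(L))

# Illustrative coefficients within the quoted ranges (A in [-1.1, 0.3],
# B in [0.8, 1.3], C in [1.0, 1.4]); these are not fitted values.
N = usle_annual_count(M=5.0, L=1.0, A=-0.5, B=1.0, C=1.2)
print(N)  # at M = 5 and L = 1 degree the B and C terms vanish, leaving 10**A
```

This makes the role of A concrete: it is the log of the annual rate of M 5.0+ events in a 1-degree zone, exactly as the abstract states.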

  18. 76 FR 8712 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... Effectiveness of the National Earthquake Hazards Reduction Program (NEHRP). The agenda may change to...

  19. 76 FR 72905 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... should be sent to National Earthquake Hazards Reduction Program Director, National Institute of...

  20. Increased Earthquake Rates in the Central and Eastern US Portend Higher Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Rubinstein, J. L.; Ellsworth, W. L.; Mueller, C. S.; Michael, A. J.; McGarr, A.; Petersen, M. D.; Weingarten, M.; Holland, A. A.

    2014-12-01

    Since 2009 the central and eastern United States has experienced an unprecedented increase in the rate of M≥3 earthquakes that is unlikely to be due to natural variation. Where the rates have increased, so has the seismic hazard, making it important to understand these changes. Areas with significant seismicity increases are limited to areas where oil and gas production takes place. By far the largest contributor to the seismicity increase is Oklahoma, where recent studies suggest that these rate changes may be due to fluid injection (e.g., Keranen et al., Geology, 2013; Science, 2014). Moreover, the area of increased seismicity in northern Oklahoma that began in 2013 coincides with the Mississippi Lime play, where well completions greatly increased the year before the seismicity increase. This suggests a link to oil and gas production either directly or from the disposal of significant amounts of produced water within the play. For the purpose of assessing the hazard due to these earthquakes, should they be treated differently from natural earthquakes? Previous studies suggest that induced seismicity may differ from natural seismicity in clustering characteristics or frequency-magnitude distributions (e.g., Bachmann et al., GJI, 2011; Llenos and Michael, BSSA, 2013). These differences could affect time-independent hazard computations, which typically assume that clustering and size distribution remain constant. In Oklahoma, as well as other areas of suspected induced seismicity, we find that earthquakes since 2009 tend to be considerably more clustered in space and time than before 2009. However, differences between various regional and national catalogs leave unclear whether there are significant changes in magnitude distribution. Whether they are due to natural or industrial causes, the increased earthquake rates in these areas could increase the hazard in ways that are not accounted for in current hazard assessment practice. Clearly the possibility of induced
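Comparing frequency-magnitude distributions, as this abstract discusses, usually comes down to estimating the Gutenberg-Richter b-value. A minimal sketch using the standard Aki/Utsu maximum-likelihood estimator (the catalog slice below is hypothetical, not from any of the studies cited):

```python
import math

def b_value_mle(mags, m_min, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for a catalog complete above m_min,
    with magnitudes binned at interval dm:
        b = log10(e) / (mean(M) - (m_min - dm/2))."""
    sample = [m for m in mags if m >= m_min]
    mean_m = sum(sample) / len(sample)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

# Hypothetical catalog slice, complete above M 3.0:
catalog = [3.0, 3.1, 3.0, 3.4, 3.2, 3.8, 3.0, 3.3, 3.1, 4.1]
print(f"b = {b_value_mle(catalog, m_min=3.0):.2f}")
```

A systematic b-value shift between pre-2009 and post-2009 subsets would be one signature of a changed magnitude distribution, though the abstract notes that catalog differences make such comparisons ambiguous.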

  1. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale

    NASA Astrophysics Data System (ADS)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.

    2015-10-01

    The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In this case and in unpopulated areas, ESI offers a unique way for assessing a reliable earthquake intensity. Finally, yet importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  2. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale

    NASA Astrophysics Data System (ADS)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.

    2016-05-01

    The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In this case and in unpopulated areas, ESI offers a unique way for assessing a reliable earthquake intensity. Finally, yet importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  3. Tsunami Hazards From Strike-Slip Earthquakes

    NASA Astrophysics Data System (ADS)

    Legg, M. R.; Borrero, J. C.; Synolakis, C. E.

    2003-12-01

    Strike-slip faulting is often considered unfavorable for tsunami generation during large earthquakes. Although large strike-slip earthquakes triggering landslides and then generating substantial tsunamis are now recognized hazards, many continue to ignore the threat from submarine tectonic displacement during strike-slip earthquakes. Historical data record the occurrence of tsunamis from strike-slip earthquakes, for example, 1906 San Francisco, California, 1994 Mindoro, Philippines, and 1999 Izmit, Turkey. Recognizing that strike-slip fault zones are often curved and comprise numerous en echelon step-overs, we model tsunami generation from realistic strike-slip faulting scenarios. We find that tectonic seafloor uplift, at a restraining bend or "pop-up" structure, provides an efficient mechanism to generate destructive local tsunamis; likewise for subsidence at divergent pull-apart basin structures. Large earthquakes on complex strike-slip fault systems may involve both types of structures. The California Continental Borderland is a high-relief submarine part of the active Pacific-North America transform plate boundary. Natural harbors and bays created by long-term vertical motion associated with strike-slip structural irregularities are now sites of burgeoning population and major coastal infrastructure. Significant local tsunamis generated by large strike-slip earthquakes pose a serious and previously unrecognized threat. We model several restraining bend pop-up structures offshore southern California to quantify the local tsunami hazard. Maximum runup derived in our scenarios ranges from one to several meters, similar to runup observed from the 1994 Mindoro, Philippines, (M=7.1) earthquake. The runup pattern is highly variable, with local extremes along the coast. We only model the static displacement field for the strike-slip earthquake source; dynamic effects of moving large islands or submerged banks laterally during strike-slip events remain to be examined.

  4. Assessing earthquake hazards with fault trench and LiDAR maps in the Puget Lowland, Washington, USA (Invited)

    NASA Astrophysics Data System (ADS)

    Nelson, A. R.; Bradley, L.; Personius, S. F.; Johnson, S. Y.

    2010-12-01

    Deciphering the earthquake histories of faults over the past few thousands of years in tectonically complex forearc regions relies on detailed site-specific as well as regional geologic maps. Here we present examples of site-specific USGS maps used to reconstruct earthquake histories for faults in the Puget Lowland. Near-surface faults and folds in the Puget Lowland accommodate 4-7 mm/yr of north-south shortening resulting from northward migration of forearc blocks along the Cascadia convergent margin. The shortening has produced east-trending uplifts, basins, and associated reverse faults that traverse urban areas. Near the eastern and northern flanks of the Olympic Mountains, complex interactions between north-south shortening and mountain uplift are reflected by normal, oblique-slip, and reverse surface faults. Holocene oblique-slip movement has also been mapped on Whidbey Island and on faults in the foothills of the Cascade Mountains in the northeastern lowland. The close proximity of lowland faults to urban areas may pose a greater earthquake hazard there than do much longer but more distant plate-boundary faults. LiDAR imagery of the densely forested lowland flown over the past 12 years revealed many previously unknown 0.5-m to 6-m-high scarps showing Holocene movement on upper-plate faults. This imagery uses two-way traveltimes of laser light pulses to detect as little as 0.2 m of relative relief on the forest floor. The returns of laser pulses with the longest travel times yield digital elevation models of the ground surface, which we vertically exaggerate and digitally shade from multiple directions at variable transparencies to enhance identification of scarps. Our maps include imagery at scales of 1:40,000 to 1:2500 with contour spacings of 100 m to 0.5 m. Maps of the vertical walls of fault-scarp trenches show complex stratigraphies and structural relations used to decipher the histories of large surface-rupturing earthquakes. 
These logs (field mapping

  5. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    NASA Astrophysics Data System (ADS)

    Takarada, S.

    2012-12-01

    The first Workshop of Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations like enhancing collaboration, sharing of resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked to make the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which only represent a subset of possible future scenarios. 
Hence, different distributions from the previous deposits are mainly observed due to the differences in

  6. National Earthquake Hazards Program at a Crossroads

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    The U.S. National Earthquake Hazards Reduction Program, which turns 25 years old on 1 October 2003, is passing through two major transitions, which experts said either could weaken or strengthen the program. On 1 March, a federal government reorganization placed NEHRP's lead agency, the Federal Emergency Management Agency (FEMA), within the new Department of Homeland Security (DHS). A number of earthquake scientists and engineers expressed concern that NEHRP, which already faces budgetary and organizational challenges, and lacks visibility, could end up being marginalized in the bureaucratic shuffle. Some experts, though, as well as agency officials, said they hope DHS will recognize synergies between dealing with earthquakes and terrorist attacks.

  7. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L., (Edited By)

    1993-01-01

    INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation (likelihood of occurrence, location, and severity of potential hazards) and the three elements needed for effective transfer (delivery, assistance, and encouragement) are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and

  8. Earthquakes Pose a Serious Hazard in Afghanistan

    USGS Publications Warehouse

    Crone, Anthony J.

    2007-01-01

    This report is USGS Afghanistan Project No. 155. This study was funded by an Interagency Agreement between the U.S. Agency for International Development (USAID) and the U.S. Geological Survey. Afghanistan is located in a geologically active part of the world, where the northward-moving Indian plate is colliding with the southern part of the Eurasian plate at a rate of about 1.7 inches per year. This collision has created the world's highest mountains and causes slip on major faults that generates large, often devastating earthquakes. Every few years a powerful earthquake causes significant damage or fatalities. New construction needs to be designed to accommodate the hazards posed by strong earthquakes. The U.S. Geological Survey has developed a preliminary seismic-hazard map of Afghanistan. Although the map is generalized, it provides government officials, engineers, and private companies interested in participating in Afghanistan's growth with crucial information about the location and nature of seismic hazards.

  9. Evaluating fault rupture hazard for strike-slip earthquakes

    USGS Publications Warehouse

    Petersen, M.; Cao, T.; Dawson, Tim; Frankel, A.; Wills, C.; Schwartz, D.

    2004-01-01

    We present fault displacement data, regressions, and a methodology to calculate, in both a probabilistic and a deterministic framework, the fault rupture hazard for strike-slip faults. To assess this hazard we consider: (1) the size of the earthquake and the probability that it will rupture to the surface, (2) the rate of all potential earthquakes on the fault, (3) the distance of the site along and from the mapped fault, (4) the complexity of the fault and the quality of the fault mapping, (5) the size of the structure that will be placed at the site, and (6) the potential and size of displacements along or near the fault. Probabilistic fault rupture hazard analysis should be an important consideration in the design of structures or lifelines that are located within about 50 m of well-mapped active faults.
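    In the probabilistic framework, factors like those listed above combine multiplicatively into an annual rate of displacement exceedance. A minimal sketch, where every rate and probability below is a hypothetical illustration rather than a value from the paper:

```python
import math

def rupture_hazard_rate(eq_rate, p_surface_rupture, p_site_intersected, p_disp_exceeds):
    """Annual rate that fault displacement at the site exceeds a design value,
    as a product of an earthquake rate and conditional probabilities."""
    return eq_rate * p_surface_rupture * p_site_intersected * p_disp_exceeds

# Hypothetical inputs: one M~7 event every 250 yr, 80% of which rupture the
# surface; 10% chance the trace crosses the site footprint; 50% chance the
# displacement there exceeds the design threshold.
rate = rupture_hazard_rate(1 / 250, 0.8, 0.1, 0.5)
prob_50yr = 1 - math.exp(-rate * 50)  # Poisson probability over a 50-yr design life
print(rate, prob_50yr)
```

    The Poisson conversion at the end is the usual way an annual exceedance rate is expressed as a probability over a structure's design life.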

  10. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-17

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet....m. The primary purpose of this meeting is to receive information on NEHRP earthquake...

  11. Earthquake Hazard and Risk in Alaska

    NASA Astrophysics Data System (ADS)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including: extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 in the 2015 model, including the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously, the subduction zone was modeled at a uniform depth; in this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as strongly to the overall risk. We will review these recurrence rates and present the results and their impact on Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and the drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the

  12. Assessment of tsunami hazard to the U.S. East Coast using relationships between submarine landslides and earthquakes

    USGS Publications Warehouse

    ten Brink, U.S.; Lee, H.J.; Geist, E.L.; Twichell, D.

    2009-01-01

    Submarine landslides along the continental slope of the U.S. Atlantic margin are potential sources of tsunamis along the U.S. East coast. The magnitude of potential tsunamis depends on the volume and location of the landslides, and tsunami frequency depends on their recurrence interval. However, the size and recurrence interval of submarine landslides along the U.S. Atlantic margin are poorly known. Well-studied landslide-generated tsunamis in other parts of the world have been shown to be associated with earthquakes. Because the size distribution and recurrence interval of earthquakes are generally better known than those of submarine landslides, we propose here to estimate the size and recurrence interval of submarine landslides from the size and recurrence interval of earthquakes in the near vicinity of the said landslides. To do so, we calculate the maximum expected landslide size for a given earthquake magnitude, use the recurrence interval of earthquakes to estimate the recurrence interval of landslides, and assume a threshold landslide size that can generate a destructive tsunami. The maximum expected landslide size for a given earthquake magnitude is calculated in three ways: by slope stability analysis for catastrophic slope failure on the Atlantic continental margin, by using a land-based compilation of maximum observed distances from earthquake to liquefaction, and by using a land-based compilation of maximum observed areas of earthquake-induced landslides. We find that the calculated distances and failure areas from the slope stability analysis are similar to, or slightly smaller than, the maximum triggering distances and failure areas in subaerial observations. The results from all three methods compare well with the slope failure observations of the Mw = 7.2, 1929 Grand Banks earthquake, the only historical tsunamigenic earthquake along the North American Atlantic margin. The results further suggest that a Mw = 7.5 earthquake (the largest expected earthquake in the eastern U
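    The substitution the abstract proposes, using earthquake recurrence as a proxy for landslide recurrence, can be sketched with a cumulative Gutenberg-Richter relation. The a-value, b-value, and triggering magnitude threshold below are illustrative assumptions, not the study's fitted values:

```python
def annual_rate_at_least(m, a, b):
    """Cumulative Gutenberg-Richter annual rate of earthquakes with magnitude >= m:
    N(>=m) = 10**(a - b*m)."""
    return 10 ** (a - b * m)

# Hypothetical G-R parameters for the margin.
a, b = 3.0, 1.0

# Assume, as the abstract's logic requires, that only earthquakes above some
# magnitude can trigger a landslide large enough to be tsunamigenic.
m_threshold = 6.0
landslide_rate = annual_rate_at_least(m_threshold, a, b)   # proxy landslide rate
recurrence_interval = 1 / landslide_rate                   # years between candidates
print(recurrence_interval)
```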

  13. A Procedure for Rapid Localized Earthquake Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Holliday, J. R.; Rundle, J. B.

    2010-12-01

    In this presentation, we introduce various ground shaking and building response models. We then discuss the forecasting capabilities of different models submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) and show how they can be used as inputs for these models. Finally, we discuss how outputs from such multi-tiered calculations would prove invaluable for real-time and scenario-based hazard assessment and for cost-benefit analysis of possible mitigation actions.

  14. Late Holocene liquefaction features in the Dominican Republic: A powerful tool for earthquake hazard assessment in the northeastern Caribbean

    USGS Publications Warehouse

    Tuttle, M.P.; Prentice, C.S.; Dyer-Williams, K.; Pena, L.R.; Burr, G.

    2003-01-01

    Several generations of sand blows and sand dikes, indicative of significant and recurrent liquefaction, are preserved in the late Holocene alluvial deposits of the Cibao Valley in northern Dominican Republic. The Cibao Valley is structurally controlled by the Septentrional fault, an onshore section of the North American-Caribbean strike-slip plate boundary. The Septentrional fault was previously studied in the central part of the valley, where it sinistrally offsets Holocene terrace risers and soil horizons. In the eastern and western parts of the valley, the Septentrional fault is buried by Holocene alluvial deposits, making direct study of the structure difficult. Liquefaction features that formed in these Holocene deposits as a result of strong ground shaking provide a record of earthquakes in these areas. Liquefaction features in the eastern Cibao Valley indicate that at least one historic earthquake, probably the moment magnitude, M 8, 4 August 1946 event, and two to four prehistoric earthquakes of M 7 to 8 struck this area during the past 1100 yr. The prehistoric earthquakes appear to cluster in time and could have resulted from rupture of the central and eastern sections of the Septentrional fault circa A.D. 1200. Liquefaction features in the western Cibao Valley indicate that one historic earthquake, probably the M 8, 7 May 1842 event, and two prehistoric earthquakes of M 7-8 struck this area during the past 1600 yr. Our findings suggest that rupture of the Septentrional fault circa A.D. 1200 may have extended beyond the central Cibao Valley and generated an earthquake of M 8. Additional information regarding the age and size distribution of liquefaction features is needed to reconstruct the prehistoric earthquake history of Hispaniola and to define the long-term behavior and earthquake potential of faults associated with the North American-Caribbean plate boundary.

  15. Earthquake Hazard and Risk in New Zealand

    NASA Astrophysics Data System (ADS)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand, we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), updates several key source parameters. These updates include: implementation of a new set of crustal faults, including multi-segment ruptures; updated subduction zone geometry and recurrence rates; and new background rates with a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model; the 2012 model now includes over 500 individual fault sources, among them many offshore faults in the northern, east-central, and southwest regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard changes in our updated model with those from the previous version and discuss the changes as well as their drivers. We examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the country's risk exposure (Auckland) lies in the region of lowest hazard, where little is known about the locations of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus small changes to the background rates
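    The two risk metrics named here, average annual loss and the loss exceedance probability curve, can both be derived from an event loss table. A minimal sketch, where the event rates and losses are made-up illustrations rather than model output:

```python
import math

# Hypothetical event loss table: (annual occurrence rate, loss in $M) per event.
event_loss_table = [(0.01, 500.0), (0.002, 2000.0), (0.0005, 8000.0)]

# Average annual loss: the rate-weighted sum of event losses.
aal = sum(rate * loss for rate, loss in event_loss_table)

def exceedance_prob(threshold, elt):
    """Annual probability that some event with loss above `threshold` occurs,
    treating events as independent Poisson processes."""
    total_rate = sum(rate for rate, loss in elt if loss > threshold)
    return 1 - math.exp(-total_rate)

print(aal)                                      # $M per year
print(exceedance_prob(1000.0, event_loss_table))  # one point on the EP curve
```

    Evaluating `exceedance_prob` over a range of thresholds traces out the loss exceedance probability curve insurers use for solvency analysis.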

  16. Earthquake Hazard for Aswan High Dam Area

    NASA Astrophysics Data System (ADS)

    Ismail, Awad

    2016-04-01

    Earthquake activity and seismic hazard analysis are important components of the seismic aspects of very essential structures such as major dams. The Aswan High Dam (AHD) created the second man-made reservoir in the world (Lake Nasser) and, being constructed near urban areas, poses a high-risk potential for downstream life and property. The dam area is one of the seismically active regions in Egypt and is occupied by several cross faults, dominantly trending east-west and north-south. Epicenters were found to cluster around active faults in the northern part of the lake and the AHD location. The space-time distribution of the seismicity and its relation to lake water level fluctuations were studied. The Aswan seismicity separates into shallow and deep seismic zones, at depths of 0-14 km and 14-30 km, respectively. These two seismic zones behave differently over time, as indicated by the seismicity rate, lateral extent, b-value, and spatial clustering. The activity is characterized by earthquake swarm sequences showing activation of clustered events over time and space. The effect of the North African drought (1982 to present) is clearly seen in the reservoir water level. As the level decreased and left the most active fault segments uncovered, the shallow activity was found to be more sensitive to rapid discharging than to filling. This study indicates that geology, topography, lineations in seismicity, offsets in the faults, changes in fault trends, and focal mechanisms are closely related. No relation was found between earthquake activity and either ground-water table fluctuations or water temperatures measured in wells located around the Kalabsha area. The peak ground acceleration at the dam site is estimated based on strong ground motion simulation. These seismic hazard analyses indicate that the AHD is stable under the present seismicity. Earthquake epicenters have recently occurred approximately 5 km west of the AHD structure. This suggests that the AHD must be
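    One of the quantities used above to contrast the shallow and deep zones is the b-value. A standard way to estimate it from a catalog is Aki's (1965) maximum-likelihood formula with Utsu's bin-width correction; the catalog below is synthetic, not the Aswan data:

```python
import math

def b_value_aki(magnitudes, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood b-value, with Utsu's correction for
    magnitudes binned at width dm: b = log10(e) / (mean(M) - (m_min - dm/2))."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2))

# Synthetic catalog of magnitudes above the completeness threshold.
catalog = [2.1, 2.3, 2.2, 2.8, 3.0, 2.5, 2.4, 2.9, 3.4, 2.6]
print(b_value_aki(catalog, m_min=2.0))
```

    Comparing b-values estimated this way for the two depth zones over successive time windows is one way to quantify the differing behavior the abstract describes.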

  17. 77 FR 75610 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... Director. Any draft meeting materials will be posted prior to the meeting on the National...

  18. 78 FR 8109 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... Director. Any draft meeting materials will be posted prior to the meeting on the National...

  19. Bad Assumptions or Bad Luck: Tohoku's embarrassing lessons for earthquake hazard mapping

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Geller, R. J.; Liu, M.

    2011-12-01

    Important tools in preparation for natural disasters include long-term forecasts, short-term predictions, and real-time warnings. How well these can be done for different disasters - earthquakes, tsunamis, volcanoes, storms, and floods - differs dramatically, calling for careful analysis. The challenge is illustrated by the 2011 Tohoku earthquake. This was another striking example - after the 2008 Wenchuan and 2010 Haiti earthquakes - of a destructive earthquake that occurred in an area predicted by hazard maps to have significantly lower hazard than nearby supposedly high-risk areas, which have been essentially quiescent. Given the limited seismic record available and limited understanding of earthquake mechanics, hazard maps often have to depend heavily on poorly constrained parameters and the mapmakers' preconceptions. When these prove incorrect, maps do poorly. The Tohoku earthquake and tsunami were much larger than "expected" by mappers because of the presumed absence of such large earthquakes in the seismological record. This assumption seemed consistent with a model based on the convergence rate and age of the subducting lithosphere, which predicted at most a low M 8 earthquake. Although this model was invalidated by the 2004 Sumatra earthquake, and paleotsunami deposits showed evidence of three large past earthquakes in the Tohoku region in the past 3000 years, these facts were not incorporated in the hazard mapping. The failure to anticipate the Tohoku and other recent large earthquakes suggests two changes to current hazard mapping practices. First, uncertainties in hazard map predictions should be assessed and communicated clearly to users. Communication of uncertainties would make the maps more useful by letting users decide how much credence to place in them. Second, maps should undergo objective testing to compare their predictions to those of null hypotheses based on random regional seismicity. Such testing, which is common and useful in other fields

  20. Comprehensive Seismic Monitoring for Emergency Response and Hazards Assessment: Recent Developments at the USGS National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Buland, R. P.; Guy, M.; Kragness, D.; Patton, J.; Erickson, B.; Morrison, M.; Bryon, C.; Ketchum, D.; Benz, H.

    2009-12-01

    The USGS National Earthquake Information Center (NEIC) has put into operation a new generation of seismic acquisition, processing and distribution subsystems that seamlessly integrate regional, national and global seismic network data for routine monitoring of earthquake activity and response to large, damaging earthquakes. The system, Bulletin Hydra, was designed to meet Advanced National Seismic System (ANSS) design goals to handle thousands of channels of real-time seismic data, compute and distribute time-critical seismic information for emergency response applications, and manage the integration of contributed earthquake products and information, arriving from near-real-time up to six weeks after an event. Bulletin Hydra is able to meet these goals due to a modular, scalable, and flexible architecture that supports on-the-fly consumption of new data, readily allows for the addition of new scientific processing modules, and provides distributed client workflow management displays. Through the Edge subsystem, Bulletin Hydra accepts waveforms in half a dozen formats. In addition, Bulletin Hydra accepts contributed seismic information including hypocenters, magnitudes, moment tensors, unassociated and associated picks, and amplitudes in a variety of formats including earthworm import/export pairs and EIDS. Bulletin Hydra has state-driven algorithms for computing all IASPEI standard magnitudes (e.g., mb, mb_BB, ML, mb_LG, Ms_20, and Ms_BB) as well as Md and Ms(VMAX), moment tensor algorithms for modeling different portions of the wave field at different distances (e.g., teleseismic body-wave, centroid, and regional moment tensors), and broadband depth. All contributed and derived data are centrally managed in an Oracle database. To improve on single-station observations, Bulletin Hydra also performs continuous real-time beam forming of high-frequency arrays. Finally, workflow management displays are used to assist NEIC analysts in their day-to-day duties. All combined

  1. Active Fault Mapping of Naga-Disang Thrust (Belt of Schuppen) for Assessing Future Earthquake Hazards in NE India

    NASA Astrophysics Data System (ADS)

    Kumar, A.

    2014-12-01

    We present a geodynamic appraisal of the Naga-Disang Thrust, Northeast India. The Disang thrust extends NE-SW over a length of 480 km and defines the eastern margin of the Neogene basin. It branches out from the Haflong-Naga thrust and, in the NE at Bulbulia on the right bank of the Noa Dihing River, is terminated by the Mishmi thrust, which extends into Myanmar as the 'Sagaing fault'; these thrusts dip generally towards the SE. The belt extends between the Dauki fault in the SW and the Mishmi thrust in the NE. When the SW end of the 'Belt of Schuppen' moved upwards and towards the east along the Dauki fault, the NE end moved downwards and towards the west along the Mishmi thrust, causing its 'S'-shaped bending. An SRTM-generated DEM is used to map the topographic expression of the schuppen belt, where these thrusts are significantly marked by topographic breaks. Satellite imagery also shows the presence of lineaments, supporting post-tectonic activity along the Naga-Disang thrusts. The southern part of the 'Belt of Schuppen' extends along the sheared western limb of the southerly plunging Kohima synform, a part of the Indo-Burma Ranges (IBR), and is seismically active. The crustal velocity SE of the Schuppen belt is 39.90 mm/yr with an azimuth of 70.78° at Lumami, 38.84 mm/yr (azimuth 54.09°) at Senapati, and 36.85 mm/yr (azimuth 54.09°) at Imphal. The crustal velocity NW of the Schuppen belt is 52.67 mm/yr (azimuth 57.66°) near the Dauki fault in Meghalaya. It becomes 43.60 mm/yr (azimuth 76.50°) to 44.25 mm/yr (azimuth 73.27°) at Tiding and Kamlang Nagar around the Mishmi thrust. The presence of the Schuppen belt is marked by a change from high crustal velocity on the Indian plate to low crustal velocity in the Mishmi Suture as well as the Indo-Burma Ranges. The difference in crustal velocities results in a buildup of strain along the Schuppen belt, which may trigger a large earthquake in NE India in the future. The Belt of Schuppen appears to be seismically active; however, not enough large earthquakes have been recorded. These observations are significant on Naga

  2. Tsunami Hazards Along the Chinese Coast from Potential Earthquakes

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Santos, A.; Shi, Y.; Wang, M.; Yuen, D. A.

    2006-12-01

    The recent Indonesian earthquake has awakened great concern about destructive hazards along the Chinese coast. Scientists have provided a clear record of past tsunamis along East China that clearly indicates the potential for future tsunami damage to China. In this work we assess the probability of tsunami waves hitting the Chinese coast in the next century from large earthquakes originating at the neighboring subducting plate boundaries. This analysis is important because of the sharp increase in coastal population density in China, the intense development of harbors, and the exploitation of mineral resources in coastal areas, ranging from Xiamen in the north to Hainan in the south. The probabilistic seismic studies for the South China Sea and adjacent region were based on the Gutenberg-Richter (G-R) relationship between the number of local earthquakes and magnitude. We studied the earthquakes of the global subduction belts and found that they follow the G-R relationship. The plate boundary model came from P. Bird (2002). According to the historical earthquakes of the South China Sea and adjacent region (from NEIC), and the tectonics and focal mechanisms (HCMT), the study region is divided into two partitions. The latitude range of the first partition is N 12-19 deg.; the second is N 19-23 deg. There are twelve large earthquakes in the two partitions, all with magnitudes greater than 6. The probabilities of earthquakes in the South China Sea are computed from the local G-R relationship; these determine the seismically induced tsunami probability. In our study the linear shallow water equation is used for integrating the twelve earthquake-induced tsunamis. The numerical scheme for the linear equations is the staggered leap-frog method. The code has been provided by Dr. Fumihiko Imamura, Tohoku University, Japan. We combined the probability of three segments of wave height, 2.0 to 1.0 meter, 1.0 to 0.5 meters
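    Turning a local G-R fit into a next-century probability is a short calculation if occurrence is assumed Poissonian. A minimal sketch; the a and b values below are hypothetical, not the fitted South China Sea values:

```python
import math

def prob_at_least_one(annual_rate, years):
    """Poisson probability of at least one event in the given time window."""
    return 1 - math.exp(-annual_rate * years)

# Hypothetical G-R fit for one source partition: log10 N(>=M) = a - b*M.
a, b = 4.2, 1.0
rate_m7 = 10 ** (a - b * 7.0)            # annual rate of M >= 7 events
print(prob_at_least_one(rate_m7, 100))   # chance of one or more in a century
```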

  3. Prediction of earthquake hazard by hidden Markov model (around Bilecik, NW Turkey)

    NASA Astrophysics Data System (ADS)

    Can, Ceren; Ergun, Gul; Gokceoglu, Candan

    2014-09-01

    Earthquakes are one of the most important natural hazards to be evaluated carefully in engineering projects, due to their severely damaging effects on human life and human-made structures. The hazard of an earthquake is defined by several approaches, and consequently earthquake parameters such as the peak ground acceleration occurring in the focused area can be determined. In an earthquake-prone area, the identification of seismicity patterns is an important task to assess seismic activity and evaluate the risk of damage and loss accompanying an earthquake occurrence. As a powerful and flexible framework to characterize temporal seismicity changes and reveal unexpected patterns, the Poisson hidden Markov model provides a better understanding of the nature of earthquakes. In this paper, a Poisson hidden Markov model is used to predict the earthquake hazard in Bilecik (NW Turkey), chosen for its important geographic location. Bilecik is in close proximity to the North Anatolian Fault Zone and is situated between Ankara and Istanbul, the two biggest cities of Turkey. Consequently, major highways, railroads, and many engineering structures are being constructed in this area. The annual frequencies of earthquakes occurring within a 100 km radius centered on Bilecik, from January 1900 to December 2012, with magnitudes (M) of at least 4.0, are modeled using a Poisson-HMM. The hazards for the next 35 years, from 2013 to 2047, are obtained from the model by forecasting the annual frequencies of M ≥ 4 earthquakes.
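    A Poisson HMM treats each year's earthquake count as drawn from a Poisson rate that depends on a hidden seismicity regime. As a rough illustration of the machinery (a hypothetical two-regime model, not the paper's fitted one), the forward algorithm below evaluates the log-likelihood of an annual count series:

```python
import math

def poisson_logpmf(k, lam):
    """Log of the Poisson probability mass function."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def hmm_forward_loglik(counts, trans, rates, init):
    """Log-likelihood of a count sequence under a Poisson HMM, computed with
    the forward algorithm in log space for numerical stability."""
    n = len(rates)
    log_alpha = [math.log(init[i]) + poisson_logpmf(counts[0], rates[i])
                 for i in range(n)]
    for k in counts[1:]:
        new = []
        for j in range(n):
            terms = [log_alpha[i] + math.log(trans[i][j]) for i in range(n)]
            m = max(terms)  # log-sum-exp over predecessor states
            new.append(m + math.log(sum(math.exp(t - m) for t in terms))
                       + poisson_logpmf(k, rates[j]))
        log_alpha = new
    m = max(log_alpha)
    return m + math.log(sum(math.exp(t - m) for t in log_alpha))

# Two hypothetical regimes: quiet (~2 events/yr) and active (~8 events/yr).
counts = [1, 3, 2, 9, 7, 8, 2, 1]
trans = [[0.9, 0.1], [0.2, 0.8]]   # sticky regimes
loglik = hmm_forward_loglik(counts, trans, rates=[2.0, 8.0], init=[0.5, 0.5])
print(loglik)
```

    In practice the transition matrix and rates would be estimated from the catalog by expectation-maximization (Baum-Welch) rather than fixed by hand, and forecasts come from propagating the filtered regime probabilities forward.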


  5. Nationwide tsunami hazard assessment project in Japan

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2014-12-01

    In 2012, we began a project of nationwide Probabilistic Tsunami Hazard Assessment (PTHA) in Japan to support various measures (Fujiwara et al., 2013, JpGU; Hirata et al., 2014, AOGS). The most important strategy in the nationwide PTHA is that aleatory uncertainty predominates in the assessment while the use of epistemic uncertainty is kept to a minimum, because the number of possible combinations of epistemic uncertainties diverges quickly as the number of epistemic uncertainties in the assessment increases; we consider only the type of earthquake occurrence probability distribution as an epistemic uncertainty. We briefly outline the nationwide PTHA as follows: (i) we consider all possible earthquakes in the future, including those that the Headquarters for Earthquake Research Promotion (HERP) of the Japanese Government has already assessed. (ii) We construct a set of simplified earthquake fault models, called "Characterized Earthquake Fault Models (CEFMs)", for all of the earthquakes by following prescribed rules (Toyama et al., 2014, JpGU; Korenaga et al., 2014, JpGU). (iii) For all of the initial water surface distributions caused by the CEFMs, we calculate tsunamis by solving a nonlinear long-wave equation using FDM, including runup calculation, over a nested grid system with a minimum grid size of 50 meters. (iv) Finally, we integrate information about the tsunamis calculated from the numerous CEFMs to obtain nationwide tsunami hazard assessments. One of the most popular representations of the integrated information is a tsunami hazard curve for coastal tsunami heights, incorporating uncertainties inherent in tsunami simulation and earthquake fault slip heterogeneity (Abe et al., 2014, JpGU). We will show a PTHA along the eastern coast of Honshu, Japan, based on approximately 1,800 tsunami sources located within the subduction zone along the Japan Trench, as a prototype of the nationwide PTHA. This study is supported by part of the research
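    Step (iv), integrating many sources into a coastal hazard curve, can be sketched as follows. The source rates and single deterministic heights below are stand-ins for the full per-CEFM height distributions used in the real assessment:

```python
import math

# Hypothetical tsunami sources: (annual occurrence rate, coastal height in m).
# In the real assessment each CEFM contributes a distribution of heights;
# each source is reduced here to a single deterministic height for brevity.
sources = [(0.01, 0.8), (0.004, 2.5), (0.001, 6.0), (0.0005, 12.0)]

def hazard_curve_point(height, sources):
    """Annual probability that the coastal tsunami height exceeds `height`,
    treating sources as independent Poisson processes."""
    rate = sum(r for r, h in sources if h > height)
    return 1 - math.exp(-rate)

# Evaluating over a range of heights traces out the tsunami hazard curve.
for h in (0.5, 2.0, 5.0, 10.0):
    print(h, hazard_curve_point(h, sources))
```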

  6. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10⁻⁹/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than loading by plate-tectonic forces. The latter model generally underlies basic assumptions made in earthquake hazard assessment: that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.

  7. Fault Imaging with High-Resolution Seismic Reflection for Earthquake Hazard and Geothermal Resource Assessment in Reno, Nevada

    SciTech Connect

    Frary, Roxanna

    2012-05-05

    The Truckee Meadows basin is situated adjacent to the Sierra Nevada microplate, on the western boundary of the Walker Lane. Being in the transition zone between a range-front normal fault on the west and northwest-striking right-lateral strike-slip faults to the east, there is no absence of faulting in this basin. The Reno-Sparks metropolitan area is located in this basin, and with a significant population living here, it is important to know where these faults are. High-resolution seismic reflection surveys are used for the imaging of these faults along the Truckee River, across which only one fault was previously mapped, and in southern Reno near and along Manzanita Lane, where a swarm of short faults has been mapped. The reflection profiles constrain the geometries of these faults, and suggest additional faults not seen before. Used in conjunction with depth-to-bedrock calculations and gravity measurements, the seismic reflection surveys provide definitive locations of faults, as well as their orientations. Offsets on these faults indicate how active they are, and this in turn has implications for seismic hazard in the area. In addition to seismic hazard, the faults imaged here tell us something about the conduits for geothermal fluid resources in Reno.

  8. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury sequence of earthquakes, beginning with the magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued with events in February and June 2011 causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures are discussed, including the use of rapid response teams, the selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area, and the process of demolition. Through the post-event safety assessment program that operated throughout the Canterbury sequence of earthquakes, many important lessons can be learned that will benefit future responses to natural hazards with the potential to damage structures.

  9. California Earthquakes: Science, Risks, and the Politics of Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Shedlock, Kaye M.

    "Politics" should be the lead word in the sub-title of this engrossing study of the emergence and growth of the California and federal earthquake hazard reduction infrastructures. Beginning primarily with the 1906 San Francisco earthquake, scientists, engineers, and other professionals cooperated and clashed with state and federal officials, the business community, "boosters," and the general public to create programs, agencies, and commissions to support earthquake research and hazards mitigation. Moreover, they created a "regulatory-state" apparatus that governs human behavior without sustained public support for its creation. The public readily accepts that earthquake research and mitigation are government responsibilities. The government employs or funds the scientists, engineers, emergency response personnel, safety officials, building inspectors, and others who are instrumental in reducing earthquake hazards. This book clearly illustrates how, and why, all of this came to pass.

  10. Workshop on evaluation of earthquake hazards and risk in the Puget Sound and Portland areas

    SciTech Connect

    Hays, W.W.; Kitzmiller, C.

    1988-01-01

    Three tasks were undertaken in the forum provided by the workshop: (1) assessing the present state-of-knowledge of earthquake hazards in Washington and Oregon including scientific, engineering, and hazard-reduction components; (2) determining the need for additional scientific, engineering, and societal response information to implement an effective earthquake-hazard reduction program; and (3) developing a strategy for implementing programs to reduce potential earthquake losses and to foster preparedness and mitigation. Thirty-five papers were given at the workshop and each of these has been abstracted for the U.S. Department of Energy's Energy Data Base (EDB). In addition, the volume includes a glossary of technical terms used in earthquake engineering in Appendix A.

  11. Seismic hazard from instrumentally recorded, historical and simulated earthquakes: Application to the Tibet-Himalayan region

    NASA Astrophysics Data System (ADS)

    Sokolov, Vladimir; Ismail-Zadeh, Alik

    2015-08-01

    We present a new approach to the assessment of regional seismic hazard, which accounts for observed (instrumentally recorded and historic) earthquakes as well as for seismic events simulated over a significantly longer period of time than that of observation. We apply this approach to probabilistic seismic hazard analysis (PSHA) for the Tibet-Himalayan region. Large-magnitude synthetic events, consistent with the geophysical and geodetic data, are employed together with the observed earthquakes in a Monte Carlo PSHA. Earthquake scenarios for hazard assessment are generated stochastically to sample the magnitude and spatial distribution of seismicity, as well as the distribution of ground motion for each seismic event. The peak ground acceleration values estimated for a return period of 475 yr show that the hazard level associated with large events in the Tibet-Himalayan region increases significantly if the long record of simulated seismicity is considered in the PSHA. The magnitude and source location of the 2008 Wenchuan M = 7.9 earthquake are within the range described by the seismic source model adopted in our analysis. We analyze the relationship between the ground motion data obtained in the earthquake's epicentral area and our PSHA estimates using a deaggregation technique. The proposed approach provides a better understanding of ground shaking due to possible large-magnitude events and could be useful for risk assessment, earthquake engineering purposes, and emergency planning.
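    The Monte Carlo PSHA workflow the abstract describes (sample a synthetic catalog, sample ground motion per event, read the hazard curve at a 475-yr return period) can be sketched minimally. Everything below is invented for illustration: the Gutenberg-Richter parameters, the toy ground-motion function, and the site-to-source distances are not from the study, and the ground-motion relation is NOT a published GMPE.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Hypothetical Gutenberg-Richter source (a- and b-values are assumptions) ---
a_val, b_val = 4.0, 1.0
m_min, m_max = 5.0, 8.5
annual_rate = 10 ** (a_val - b_val * m_min) - 10 ** (a_val - b_val * m_max)

def sample_magnitudes(n):
    """Inverse-CDF sampling of a doubly truncated Gutenberg-Richter distribution."""
    beta = b_val * np.log(10.0)
    u = rng.random(n)
    return m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

def toy_median_pga(m, r_km):
    """Toy ground-motion scaling (NOT a published GMPE): median PGA in g."""
    return np.exp(-1.0 + 1.2 * (m - 6.0) - 1.3 * np.log(r_km + 10.0))

years = 100_000                                   # length of the simulated catalog
n_events = rng.poisson(annual_rate * years)
mags = sample_magnitudes(n_events)
dists = rng.uniform(10.0, 100.0, n_events)        # assumed site-to-source distances
# Aleatory ground-motion variability: lognormal scatter with sigma_ln = 0.6.
pga = toy_median_pga(mags, dists) * np.exp(0.6 * rng.standard_normal(n_events))

# Hazard curve: annual exceedance rate at each level; read off the 475-yr value,
# i.e. the first level whose exceedance rate drops to 1/475 per year or below.
levels = np.logspace(-3, 0.5, 200)
exceed_rate = np.array([(pga > x).sum() / years for x in levels])
pga_475 = levels[np.argmax(exceed_rate <= 1.0 / 475.0)]
print(f"PGA with 475-yr return period: {pga_475:.3f} g")
```

The abstract's point, that adding a long simulated catalog changes the hazard estimate, corresponds here to lengthening and enriching the sampled catalog before the hazard curve is built.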

  12. Guide and Checklist for Nonstructural Earthquake Hazards in California Schools.

    ERIC Educational Resources Information Center

    2003

    The recommendations included in this document are intended to reduce seismic hazards associated with the non-structural components of schools buildings, including mechanical systems, ceiling systems, partitions, light fixtures, furnishings, and other building contents. It identifies potential earthquake hazards and provides recommendations for…

  13. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems for contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated at about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g., Poisson, periodic) or, conversely, delicately designed (e.g., STEP, ETAS) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology of seismic hazard assessment. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance).
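    The abstract's "once per hundred years" example can be made concrete under one particular (and, as the author stresses, arbitrary) model choice. Assuming homogeneous Poisson arrivals, a sketch of how strongly the quoted probability depends on the chosen time window:

```python
import math

def p_at_least_one(rate_per_year, window_years):
    """P(at least one event in the window) under a homogeneous Poisson model."""
    return 1.0 - math.exp(-rate_per_year * window_years)

rate = 1.0 / 100.0  # one expected event per hundred years

print(p_at_least_one(rate, 1.0 / 365.25))  # daily probability, ~2.7e-5
print(p_at_least_one(rate, 100.0))         # over the full mean recurrence, ~0.63
```

Under this model the "daily probability" is minuscule while the century-scale probability is about 63%; a different model (periodic, clustered, or a different probability space) would give entirely different numbers, which is the abstract's point.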

  14. Tank farms hazards assessment

    SciTech Connect

    Broz, R.E.

    1994-09-30

    Hanford contractors are writing new facility-specific emergency procedures in response to new and revised US Department of Energy (DOE) Orders on emergency preparedness. Emergency procedures are required for each Hanford facility that has the potential to exceed the criteria for the lowest-level emergency, an Alert. The set includes: (1) a facility-specific procedure on Recognition and Classification of Emergencies, (2) area procedures on Initial Emergency Response, and (3) an area procedure on Protective Action Guidance. The first steps in developing these procedures are to identify the hazards at each facility, identify the conditions that could release the hazardous material, and calculate the consequences of the releases. These steps are called a Hazards Assessment. The final product is a document that is similar in some respects to a Safety Analysis Report (SAR). The document could be produced in a month for a simple facility but could take much longer for a complex facility; Hanford has both types. A strategy has been adopted to permit completion of the first version of the new emergency procedures before all the facility hazards assessments are complete. The procedures will initially be based on input from a task group for each facility. This strategy will put improved emergency procedures in place sooner and therefore enhance Hanford emergency preparedness. The purpose of this document is to summarize the applicable information contained within the Waste Tank Facility "Interim Safety Basis Document," WHC-SD-WM-ISB-001, as a resource, since the SARs covering Waste Tank Operations are not current in all cases. This hazards assessment serves to collect, organize, document, and present the information utilized during the determination process.

  15. Development of Earthquake Hazard Maps in Managua, Nicaragua

    NASA Astrophysics Data System (ADS)

    Nishii, O.; Katayama, I.; Strauch, W.; Guzman, C.; Chávez, G.

    2007-05-01

    We developed 1:50,000-scale earthquake hazard maps of Managua using both deterministic and probabilistic approaches, compiling the available data. This work is part of the results of the technical cooperation project "The Study for Establishment of Base Maps for GIS in the Republic of Nicaragua," executed from 2004 to 2006 by the Japan International Cooperation Agency with INETER as the counterpart agency, at the request of the government of the Republic of Nicaragua. We first collected and studied the available earthquake catalogues. Among these, the historical earthquake catalogue by INETER (1505-1992) and the instrumental earthquake catalogue by INETER (1993-2001) are the most comprehensive; these were therefore selected as the base catalogues and were corrected and improved using the other catalogues. Finally, the catalogues were unified and then separated into two new catalogues, a volcanic catalogue and a non-volcanic catalogue. We then considered three types of scenario earthquakes. For the active-fault scenarios, we used the Aeropuerto Fault and the Cofradia Fault; the location and magnitude of each fault were determined using the USGS fault map and an empirical formula relating fault length to magnitude. For the volcanic-earthquake scenarios, we used an earthquake from Masaya volcano (M = 6.0) and one from Apoyeque volcano (M = 6.0), with magnitudes estimated from past hazard reports. For the probabilistic approach, a hazard-curve analysis based on the newly improved non-volcanic catalogue was performed at the center of Managua. As a result, the 100-year-return-period ground motion is 110 gal, with a standard deviation of 28 gal. For ground-motion attenuation, three attenuation laws were tested to estimate maximum accelerations and MM intensities at Managua from major earthquakes. As a result, we found that a combination of the laws of Joyner-Boore (1981) and Youngs et al. (1997) is appropriately applicable to the
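    The abstract cites an empirical formula relating fault length to magnitude without naming it. The regression of Wells and Coppersmith (1994) for surface rupture length (all slip types) is a common choice; a minimal sketch, with hypothetical fault lengths since the study does not report the values it used:

```python
import math

def mw_from_surface_rupture_length(length_km):
    """Wells & Coppersmith (1994), all slip types: Mw = 5.08 + 1.16 * log10(SRL)."""
    return 5.08 + 1.16 * math.log10(length_km)

# Hypothetical lengths for illustration only; not the study's measured values.
for name, length_km in [("Aeropuerto", 20.0), ("Cofradia", 30.0)]:
    mw = mw_from_surface_rupture_length(length_km)
    print(f"{name} fault, assumed {length_km:.0f} km rupture: Mw ~ {mw:.1f}")
```

A 20-30 km rupture maps to roughly Mw 6.6-6.8 under this regression, which is the kind of scenario magnitude such fault-based assessments typically produce.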

  16. Reinvestigating the Mission Creek Fault: Holocene slip rates in the northern Coachella Valley and implications for southern California earthquake hazard assessment

    NASA Astrophysics Data System (ADS)

    Wersan, Louis Samuel

    Coachella Valley. Constraining active slip on the Mission Creek fault has significant implications for southern California fault modeling and earthquake hazard assessment, and allows quantification of maximum strain transfer in the Coachella Valley from the Mission Creek fault to the Eastern California Shear Zone (˜9 mm/yr).

  17. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

    Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school building in the event of an earthquake. The PTA in…

  18. Deterministic and Nondeterministic Behavior of Earthquakes and Hazard Mitigation Strategy

    NASA Astrophysics Data System (ADS)

    Kanamori, H.

    2014-12-01

    Earthquakes exhibit both deterministic and nondeterministic behavior. Deterministic behavior is controlled by length and time scales such as the dimensions of seismogenic zones and plate-motion speed. Nondeterministic behavior is controlled by the interaction of many elements, such as asperities, in the system. Some subduction zones have strong deterministic elements that allow forecasts of future seismicity; the forecasts of the 2010 Mw=8.8 Maule, Chile, earthquake and the 2012 Mw=7.6 Costa Rica earthquake are good examples in which useful forecasts were made within a solid scientific framework using GPS. However, even in these cases, uncertainties are difficult to quantify because of the nondeterministic elements. In some subduction zones, nondeterministic behavior dominates because of complex plate-boundary structures and defies useful forecasts. The 2011 Mw=9.0 Tohoku-Oki earthquake may be an example in which the physical framework was reasonably well understood, but complex interactions of asperities and insufficient knowledge of the subduction-zone structure led to unexpected tragic consequences. Despite these difficulties, broadband seismology, GPS, and rapid data-processing and telemetry technology can contribute to effective hazard mitigation through a scenario-earthquake approach and real-time warning. A scale-independent relation between seismic moment M0 and source duration t can be used for the design of average scenario earthquakes. However, outliers caused by variations in stress drop, radiation efficiency, and aspect ratio of the rupture plane are often the most hazardous and need to be included in scenario earthquakes. Recent developments in real-time technology will help seismologists cope with, and prepare for, devastating tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard-mitigation practices.
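    The scale-independent moment-duration relation mentioned in the abstract is a cube-root scaling, t proportional to M0^(1/3), which follows from an approximately constant stress drop. A minimal sketch; the proportionality constant c = 5e-6 below is a rough assumed value for illustration, not a figure from the abstract:

```python
import math

def moment_from_mw(mw):
    """Seismic moment M0 in N*m from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10 ** (1.5 * mw + 9.1)

def source_duration_s(m0_newton_m, c=5e-6):
    """Scale-independent t ~ c * M0^(1/3); c is an assumed rough constant."""
    return c * m0_newton_m ** (1.0 / 3.0)

for mw in (7.0, 8.0, 9.0):
    m0 = moment_from_mw(mw)
    print(f"Mw {mw}: M0 = {m0:.1e} N*m, duration ~ {source_duration_s(m0):.0f} s")
```

With this assumed constant the relation gives durations of roughly tens of seconds for Mw 7 up to a few minutes for Mw 9, the right order of magnitude for average scenario earthquakes; the abstract's point is that outliers from this average scaling are often the most hazardous cases.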

  19. PUREX facility hazards assessment

    SciTech Connect

    Sutton, L.N.

    1994-09-23

    This report documents the hazards assessment for the Plutonium Uranium Extraction Plant (PUREX) located on the US Department of Energy (DOE) Hanford Site. Operation of PUREX is the responsibility of Westinghouse Hanford Company (WHC). This hazards assessment was conducted to provide the emergency planning technical basis for PUREX. DOE Order 5500.3A requires an emergency planning hazards assessment for each facility that has the potential to reach or exceed the lowest level emergency classification. In October of 1990, WHC was directed to place PUREX in standby. In December of 1992 the DOE Assistant Secretary for Environmental Restoration and Waste Management authorized the termination of PUREX and directed DOE-RL to proceed with shutdown planning and terminal clean out activities. Prior to this action, its mission was to reprocess irradiated fuels for the recovery of uranium and plutonium. The present mission is to establish a passively safe and environmentally secure configuration at the PUREX facility and to preserve that condition for 10 years. The ten year time frame represents the typical duration expended to define, authorize and initiate follow-on decommissioning and decontamination activities.

  20. Roaming earthquakes in China highlight midcontinental hazards

    NASA Astrophysics Data System (ADS)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  1. Seismic survey probes urban earthquake hazards in Pacific Northwest

    USGS Publications Warehouse

    Fisher, M.A.; Brocher, T.M.; Hyndman, R.D.; Trehu, A.M.; Weaver, C.S.; Creager, K.C.; Crosson, R.S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B.C.; Hammer, P.T.; Childs, J. R.; Cochrane, G.R.; Chopra, S.; Walia, R.

    1999-01-01

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  2. Seismic survey probes urban earthquake hazards in Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Fisher, M. A.; Brocher, T. M.; Hyndman, R. D.; Trehu, A. M.; Weaver, C. S.; Creager, K. C.; Crosson, R. S.; Parsons, T.; Cooper, A. K.; Mosher, D.; Spence, G.; Zelt, B. C.; Hammer, P. T.; ten Brink, U.; Pratt, T. L.; Miller, K. C.; Childs, J. R.; Cochrane, G. R.; Chopra, S.; Walia, R.

    A multidisciplinary seismic survey earlier this year in the Pacific Northwest is expected to reveal much new information about the earthquake threat to U.S. and Canadian urban areas there. A disastrous earthquake is a very real possibility in the region. The survey, known as the Seismic Hazards Investigation in Puget Sound (SHIPS), engendered close cooperation among geologists, biologists, environmental groups, and government agencies. It also succeeded in striking a fine balance between the need to prepare for a great earthquake and the requirement to protect a coveted marine environment while operating a large airgun array.

  3. The Wenchuan, China M8.0 Earthquake: A Lesson and Implication for Seismic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2008-12-01

    The Wenchuan, China M8.0 earthquake caused great damage and enormous casualties: 69,197 people were killed, 374,176 were injured, and 18,341 are still missing. The estimated direct economic loss is about 126 billion U.S. dollars. The Wenchuan earthquake again demonstrated that earthquakes do not kill people; the built environment and induced hazards, landslides in particular, do. Therefore, it is critical to strengthen the built environment, such as buildings and bridges, and to mitigate the induced hazards in order to avoid such disasters. As part of the so-called North-South Seismic Zone in China, the Wenchuan earthquake occurred along the Longmen Shan thrust belt, which forms the boundary between the Qinghai-Tibet Plateau and the Sichuan basin, and there is a long history (~4,000 years) of seismicity in the area. The historical records show that the area experienced high intensity (i.e., greater than IX) in the past several thousand years. In other words, the area is well known to have high seismic hazard because of its tectonic setting and seismicity. However, only intensity VII (0.1 to 0.15 g PGA) was considered in seismic design for the built environment in the area. This was one of the main reasons that so many buildings, particularly school buildings, collapsed during the Wenchuan earthquake. It is clear that the seismic design (i.e., the design ground motion or intensity) was not adequate in the stricken area. Lessons can be learned from the Wenchuan earthquake on seismic hazard and risk assessment, and on seismic hazard mitigation and seismic risk reduction.

  4. Comprehensive baseline hazard assessments

    SciTech Connect

    Warren, S.B.; Amundson, T.M.

    1994-10-01

    Westinghouse Hanford Company (WHC) has developed and implemented a cost-effective, value-added program/process that assists in fulfilling key elements of the Occupational Safety and Health Administration's (OSHA) Voluntary Protection Program (VPP) requirements. WHC is the prime contractor for the US Department of Energy (US DOE) at the Hanford site, located in Richland, Washington. The site covers over 560 square miles, contains over 1100 facilities, and has an employment of approximately 18,000. WHC is currently in the application review phase for the US DOE equivalent of OSHA-VPP "merit" program status. The program involves setting up a team consisting of industrial safety and health (industrial hygiene) professionals, members of the maintenance and operations work force, and facility management. This team performs a workplace hazard characterization/analysis and then applies a risk-assessment approach to prioritize observed and potential hazards in need of abatement. The process uses checklists that serve as a guide for evaluation/inspection criteria. Forms are used to document meetings, field observations, instrument calibration, and performance testing. Survey maps are generated to document quality records of measurement results. A risk assessment code matrix with a keyword index was developed to facilitate consistency. The end product is useful in communicating hazards to facility management, health and safety professionals, audit/appraisal groups, and most importantly, facility workers.

  5. Central US earthquake catalog for hazard maps of Memphis, Tennessee

    USGS Publications Warehouse

    Wheeler, R.L.; Mueller, C.S.

    2001-01-01

    An updated version of the catalog that was used for the current national probabilistic seismic-hazard maps would suffice for production of large-scale hazard maps of the Memphis urban area. Deaggregation maps provide guidance as to the area that a catalog for calculating Memphis hazard should cover. For the future, the Nuttli and local network catalogs could be examined for earthquakes not presently included in the catalog. Additional work on aftershock removal might reduce hazard uncertainty. Graphs of decadal and annual earthquake rates suggest completeness at and above magnitude 3 for the last three or four decades. Any additional work on completeness should consider the effects of rapid, local population changes during the Nation's westward expansion. © 2001 Elsevier Science B.V. All rights reserved.

  6. Seismic hazard assessments at Islamic Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Khalil, A. E.; Deif, A.; Abdel Hafiez, H. E.

    2015-12-01

    Islamic Cairo is one of the important Islamic monumental complexes in Egypt, near the center of present-day metropolitan Cairo, and its buildings are up to one thousand years old. Unfortunately, many of the buildings have suffered severe neglect that may lead to extensive damage. Many buildings and masjids partially or totally collapsed in the 12 October 1992 Cairo earthquake, which took place some 25 km from the study area with a magnitude Mw = 5.8. Hence, potential damage assessments there are essential. Deterministic and probabilistic techniques were used to predict the expected strong-motion characteristics of future large earthquakes in the study area. The study began by compiling the available studies on the distribution of seismogenic sources and the earthquake catalogs. The deterministic method provides a description of the effect of the largest earthquake on the area of interest, while the probabilistic method defines uniform hazard curves for three return periods: 475, 950, and 2475 years. Both deterministic and probabilistic results were obtained for bedrock conditions, and the resulting hazard levels were deaggregated to identify the contribution of each seismic source to the total hazard. The results show that the expected seismic activity, combined with the present condition of the buildings, calls for urgent action to protect both the cultural heritage and human lives.

  7. Earthquake induced landslide hazard field observatory in the Avcilar peninsula

    NASA Astrophysics Data System (ADS)

    Bigarre, Pascal; Coccia, Stella; Theoleyre, Fiona; Ergintav, Semih; Özel, Oguz; Yalçinkaya, Esref; Lenti, Luca; Martino, Salvatore; Gamba, Paolo; Zucca, Francesco; Moro, Marco

    2015-04-01

    SAR temporal series analysis has been undertaken, providing global yet accurate identification and characterization of the gravitational phenomena covering the area. The resolution of spaceborne multispectral/hyperspectral image data has been evaluated for identifying landslide-hazard-related features. Advantage has been taken of a vast drilling and geological-geotechnical survey program undertaken by the Istanbul Metropolitan Area to obtain important data completing the geological model of the landslide, as well as one deep borehole to set up permanent instrumentation on a fairly large, slow landslide fully encircled by a dense built environment. The selected landslide was instrumented in 2014 with a real-time observational system including GPS, rainfall, piezometer, and seismic monitoring. The objective of this permanent monitoring system is threefold: first, to detect and quantify the interaction between seismic motion, rainfall, and mass movement, building a database open to the scientific community in the future; second, to help calibrate dynamic numerical geomechanical simulations intended to study sensitivity to seismic loading; and, last but not least, to complement the important geophysical field work conducted to assess seismic site effects already noticed during the 1999 earthquake. Data, metadata, and main results are now progressively being compiled and formatted for integration in the cloud monitoring infrastructure for data sharing.

  8. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
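    The abstract quotes results both as return periods and as annual probabilities. Under the Poisson-arrival assumption standard in PTHA/PSHA these are interchangeable; a short sketch of the conversion (the probability bands below are taken from the abstract, the implied return periods are derived):

```python
import math

def return_period_from_annual_prob(p_annual):
    """Return period (years) implied by an annual exceedance probability,
    assuming Poisson arrivals: p = 1 - exp(-1/T), so T = -1 / ln(1 - p)."""
    return -1.0 / math.log(1.0 - p_annual)

# Annual-probability bands quoted in the abstract, converted to return periods.
for p in (0.10, 0.01, 0.001):
    t = return_period_from_annual_prob(p)
    print(f"annual probability {p:.1%} -> return period ~ {t:.0f} yr")
```

For example, the ">10% annual probability" band for a 0.5 m tsunami corresponds to a return period shorter than about 9.5 years, while a 0.1% annual probability corresponds to roughly a 1000-year return period.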

  9. 2016 one-year seismic hazard forecast for the Central and Eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a 1-year seismic hazard forecast for 2016 for the Central and Eastern United States (CEUS) that includes contributions from both induced and natural earthquakes. The model assumes that earthquake rates calculated from several different time windows will remain relatively stationary and can be used to forecast earthquake hazard and damage intensity for the year 2016. This assessment is the first step in developing an operational earthquake forecast for the CEUS, and the analysis could be revised with updated seismicity and model parameters. Consensus input models consider alternative earthquake catalog durations, smoothing parameters, maximum magnitudes, and ground motion estimates, and represent uncertainties in earthquake occurrence and diversity of opinion in the science community. Ground shaking seismic hazard for 1-percent probability of exceedance in 1 year reaches 0.6 g (as a fraction of standard gravity [g]) in northern Oklahoma and southern Kansas, and about 0.2 g in the Raton Basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas. Near some areas of active induced earthquakes, hazard is higher than in the 2014 USGS National Seismic Hazard Model (NSHM) by more than a factor of 3; the 2014 NSHM did not consider induced earthquakes. In some areas, previously observed induced earthquakes have stopped, so the seismic hazard reverts back to the 2014 NSHM. Increased seismic activity, whether defined as induced or natural, produces high hazard. Conversion of ground shaking to seismic intensity indicates that some places in Oklahoma, Kansas, Colorado, New Mexico, Texas, and Arkansas may experience damage if the induced seismicity continues unabated. The chance of having Modified Mercalli Intensity (MMI) VI or greater (damaging earthquake shaking) is 5–12 percent per year in north-central Oklahoma and southern Kansas, similar to the chance of damage caused by natural earthquakes

  10. 2016 One-Year Seismic Hazard Forecast for the Central and Eastern United States from Induced and Natural Earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a 1-year seismic hazard forecast for 2016 for the Central and Eastern United States (CEUS) that includes contributions from both induced and natural earthquakes. The model assumes that earthquake rates calculated from several different time windows will remain relatively stationary and can be used to forecast earthquake hazard and damage intensity for the year 2016. This assessment is the first step in developing an operational earthquake forecast for the CEUS, and the analysis could be revised with updated seismicity and model parameters. Consensus input models consider alternative earthquake catalog durations, smoothing parameters, maximum magnitudes, and ground motion estimates, and represent uncertainties in earthquake occurrence and diversity of opinion in the science community. Ground shaking seismic hazard for 1-percent probability of exceedance in 1 year reaches 0.6 g (as a fraction of standard gravity [g]) in northern Oklahoma and southern Kansas, and about 0.2 g in the Raton Basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas. Near some areas of active induced earthquakes, hazard is higher than in the 2014 USGS National Seismic Hazard Model (NSHM) by more than a factor of 3; the 2014 NSHM did not consider induced earthquakes. In some areas, previously observed induced earthquakes have stopped, so the seismic hazard reverts back to the 2014 NSHM. Increased seismic activity, whether defined as induced or natural, produces high hazard. Conversion of ground shaking to seismic intensity indicates that some places in Oklahoma, Kansas, Colorado, New Mexico, Texas, and Arkansas may experience damage if the induced seismicity continues unabated. The chance of having Modified Mercalli Intensity (MMI) VI or greater (damaging earthquake shaking) is 5–12 percent per year in north-central Oklahoma and southern Kansas, similar to the chance of damage caused by natural earthquakes
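
    Under the forecast's stationary-rate (Poisson) assumption, the quoted 1-percent-in-1-year exceedance probability converts directly to an annual rate and mean return period; a minimal sketch (the function names are mine, not USGS code):

```python
import math

def exceedance_prob(annual_rate: float, years: float = 1.0) -> float:
    """Poisson probability of at least one exceedance in `years`."""
    return 1.0 - math.exp(-annual_rate * years)

def rate_from_prob(prob: float, years: float = 1.0) -> float:
    """Annual rate implied by an exceedance probability over `years`."""
    return -math.log(1.0 - prob) / years

# A 1-percent chance of exceedance in 1 year corresponds to a mean
# return period of roughly 100 years:
rate = rate_from_prob(0.01, 1.0)
return_period = 1.0 / rate
print(round(return_period, 1))  # ~99.5
```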

  11. Community Exposure and Sensitivity to Earthquake Hazards in Washington State

    NASA Astrophysics Data System (ADS)

    Ratliff, J.; Wood, N. J.; Weaver, C. S.

    2011-12-01

    Communities in Washington State are potentially threatened by earthquakes from many sources, including the Cascadia subduction zone and myriad inland faults (the Seattle fault, the Tacoma fault, etc.). The USGS Western Geographic Science Center, in collaboration with the State of Washington Military Department Emergency Management Division, has been working to identify Washington communities' vulnerability to twenty-one earthquake scenarios in order to provide assistance for mitigation, preparedness, and outreach. We calculate community earthquake exposure and sensitivity by overlaying demographic and economic data with the peak ground acceleration values of each scenario in a geographic information system. To assist emergency managers, we summarize community and county earthquake vulnerability by the number of earthquake scenarios affecting each area, as well as by the number of residents, occupied households, businesses (individual and by sector), and employees at each predicted Modified Mercalli Intensity value (ranging from V to IX). Percentages based on community, county, and scenario totals also give emergency managers insight into community sensitivity to the earthquake scenarios. Results indicate significant spatial and temporal residential variations, as well as spatial economic variations, in exposure and sensitivity to earthquake hazards in the State of Washington, especially for communities west of the Cascade Range.

  12. An intelligent simulation system for earthquake disaster assessment

    NASA Astrophysics Data System (ADS)

    Tang, Aiping; Wen, Aihua

    2009-05-01

    This paper presents an intelligent simulation system for earthquake disaster assessment built on a Geographic Information System (GIS) and Artificial Intelligence (AI) development platform. The system is designed to identify weaknesses of structures and infrastructure before an earthquake, quickly assess earthquake damage, and support an intelligent emergency response for the public and government during and after an earthquake. Its functions include intelligent seismic hazard assessment, earthquake damage and loss evaluation, optimization of emergency response, and post-earthquake recovery planning. The principle, design criteria, structure, functions and test results of the system are described in this paper. Functionally, the system is composed of four parts: an information database, analytical modules, an intelligent decision-making sub-system and a friendly user interface; the information database and analytical modules comprise 132 coverages and 78 analytical modules. With this system, seismic disaster mitigation strategies can be verified before an earthquake and executed during and after one, and the earthquake-resisting capacity of an entire city and all of its communities can be greatly enhanced. To check its reliability and efficiency, the system was tested on a scenario earthquake in one city, and the related results are also given in this paper. At present, the system is installed and used in Daqing City, China. After running for almost 10 years, it has been used successfully in rehearsals of seismic disaster mitigation and post-earthquake emergency response, and an optimized aseismic retrofitting plan for Daqing City has been executed based on its results.

  13. Seismic Hazard Assessment of the Sheki-Ismayilli Region, Azerbaijan

    SciTech Connect

    Ayyubova, Leyla J.

    2006-03-23

    Seismic hazard assessment is an important factor in disaster management of the Azerbaijan Republic. The Sheki-Ismayilli region is one of the most earthquake-prone areas in Azerbaijan; according to the seismic zoning map, it lies in an intensity IX zone. The seismic activity of the region is studied using macroseismic and instrumental data covering the period between 1250 and 2003. Several principal earthquake parameters are analyzed: maximum magnitude, energy class, intensity, hypocentral depth, and occurrence. The geological structures prone to large earthquakes are determined, and the dependence of magnitude on fault length is shown; large earthquakes take place mainly along the active faults. A map of earthquake intensity has been developed for the region, and its potential seismic activity has been estimated.
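
    The magnitude-fault length dependence mentioned above is commonly expressed as a log-linear regression. As an illustration, here is the widely used Wells and Coppersmith (1994) all-slip-types surface-rupture-length relation (not necessarily the regression derived in this study):

```python
import math

def magnitude_from_rupture_length(L_km: float) -> float:
    # Wells & Coppersmith (1994), all slip types:
    # M = 5.08 + 1.16 * log10(L), with L the surface rupture length in km
    return 5.08 + 1.16 * math.log10(L_km)

# A 50-km-long surface rupture corresponds to roughly M 7:
print(round(magnitude_from_rupture_length(50.0), 2))  # ~7.05
```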

  14. Geophysical variables and behavior: LX. Lonquimay and Alhué, Chile: tension from volcanic and earthquake hazard.

    PubMed

    Larraín, P; Simpson-Housley, P

    1990-02-01

    This study assesses the effect of trait anxiety scores on subjects' responses to volcanic eruption hazard and earthquake hazard in Lonquimay and Alhué, respectively. Lonquimay is located in the southern Chilean Andes and Alhué in central Chile in the Coastal Range. The former was afflicted by a volcanic eruption which commenced on Christmas Day 1988 and the latter by an earthquake on March 3, 1985. Expectations of high damage and fear arising from a radio hazard prediction were associated with high trait-anxiety scores in the Alhué sample, while positive adjustments to extenuate the hazard effect reached significance for the Lonquimay sample. PMID: 2326130

  15. Ice Mass Fluctuations and Earthquake Hazard

    NASA Technical Reports Server (NTRS)

    Sauber, J.

    2006-01-01

    In south central Alaska, tectonic strain rates are high in a region that includes large glaciers undergoing ice wastage over the last 100-150 years [Sauber et al., 2000; Sauber and Molnia, 2004]. In this study we focus on the region referred to as the Yakataga segment of the Pacific-North American plate boundary zone in Alaska. In this region, the Bering and Malaspina glacier ablation zones have average ice elevation decreases of 1-3 meters/year (see summary and references in Molnia, 2005). The elastic response of the solid Earth to this ice mass decrease alone would cause several mm/yr of horizontal motion and uplift rates of up to 10-12 mm/yr. In this same region, observed horizontal rates of tectonic deformation range from 10 to 40 mm/yr to the north-northwest, and the predicted tectonic uplift rates range from -2 mm/yr near the Gulf of Alaska coast to 12 mm/yr farther inland [Savage and Lisowski, 1988; Ma et al., 1990; Sauber et al., 1997, 2000, 2004; Elliot et al., 2005]. The large ice mass changes associated with glacial wastage and surges perturb the tectonic rate of deformation at a variety of temporal and spatial scales. The associated incremental stress change may enhance or inhibit earthquake occurrence. We report recent (seasonal to decadal) ice elevation changes derived from NASA's ICESat satellite laser altimeter, combined with earlier DEMs as a reference surface, to illustrate the characteristics of short-term ice elevation changes [Sauber et al., 2005; Muskett et al., 2005]. Since we are interested in evaluating the effect of ice changes on faulting potential, we calculated the predicted surface displacement changes and incremental stresses over a specified time interval and calculated the change in the fault stability margin using the approach given by Wu and Hasegawa [1996]. Additionally, we explored the possibility that these ice mass fluctuations altered the rate of background seismicity. Although we primarily focus on
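
    The fault stability margin approach of Wu and Hasegawa [1996] weighs shear stress change against friction times normal stress change; the closely related Coulomb failure stress change can be sketched as follows (the friction coefficient and stress values are illustrative, not results from this study):

```python
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Coulomb failure stress change (MPa); positive promotes failure.
    d_tau: shear stress change resolved in the slip direction;
    d_sigma_n: normal stress change (positive = unclamping);
    mu_eff: effective friction coefficient (an assumed value)."""
    return d_tau + mu_eff * d_sigma_n

# Ice unloading that adds 0.05 MPa of shear stress and unclamps the
# fault by 0.10 MPa moves it closer to failure:
dcfs = coulomb_stress_change(0.05, 0.10)
print(round(dcfs, 3))  # 0.09
```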

  16. Maximum Earthquake Magnitude Assessments by Japanese Government Committees (Invited)

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2013-12-01

    The 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and such a gigantic earthquake was not foreseen around Japan. After the 2011 disaster, various government committees in Japan have discussed and assessed the maximum credible earthquake size around Japan, but their values vary without definite consensus. I will review them, taking earthquakes along the Nankai Trough as an example. The Central Disaster Management Council (CDMC), under the Cabinet Office, set up a policy for future tsunami disaster mitigation. Possible future tsunamis are classified into two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with low frequency of occurrence, for which saving people's lives is the first priority, with soft measures such as tsunami hazard maps, evacuation facilities or disaster education. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared. The assessments of L1 and L2 events are left to local governments. The CDMC also assigned M 9.1 as the maximum size of an earthquake along the Nankai Trough, then computed the ground shaking and tsunami inundation for several scenario earthquakes. The estimated loss is about ten times that of the 2011 disaster, with maximum casualties of 320,000 and economic loss of 2 trillion dollars. The Headquarters for Earthquake Research Promotion (HERP), under MEXT, was set up after the 1995 Kobe earthquake and has made long-term forecasts of large earthquakes and published national seismic hazard maps. The future probability of earthquake occurrence, for example in the next 30 years, was calculated from past data on large earthquakes, on the basis of the characteristic earthquake model. The HERP recently revised the long-term forecast of the Nankai Trough earthquake; while the 30-year probability (60 - 70 %) is similar to the previous estimate, they noted the size can be M 8 to 9, considering the variability of past
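
    The 30-year probability quoted by HERP comes from a renewal model. A sketch of the conditional-probability calculation with a Brownian Passage Time (BPT) distribution follows; the recurrence parameters below are purely illustrative, not HERP's values:

```python
import math

def bpt_pdf(t, mean, alpha):
    """Brownian Passage Time density with mean recurrence `mean`
    and aperiodicity `alpha`."""
    if t <= 0:
        return 0.0
    return (math.sqrt(mean / (2 * math.pi * alpha**2 * t**3))
            * math.exp(-(t - mean)**2 / (2 * mean * alpha**2 * t)))

def bpt_cdf(t, mean, alpha, n=20000):
    # simple trapezoidal integration of the density from 0 to t
    h = t / n
    s = 0.5 * (bpt_pdf(0.0, mean, alpha) + bpt_pdf(t, mean, alpha))
    s += sum(bpt_pdf(i * h, mean, alpha) for i in range(1, n))
    return s * h

def conditional_prob(elapsed, window, mean, alpha):
    """P(event within `window` years | quiet for `elapsed` years)."""
    num = bpt_cdf(elapsed + window, mean, alpha) - bpt_cdf(elapsed, mean, alpha)
    den = 1.0 - bpt_cdf(elapsed, mean, alpha)
    return num / den

# e.g. mean recurrence 110 yr, aperiodicity 0.25, 70 yr already elapsed:
p30 = conditional_prob(70.0, 30.0, 110.0, 0.25)
print(round(p30, 2))
```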

  17. Integrating Real-time Earthquakes into Natural Hazard Courses

    NASA Astrophysics Data System (ADS)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.

    2001-12-01

    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made implementing such real-time exercises in the classroom problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response is lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). These students are responsible for reacting to the alarm

  18. Mitigation of earthquake hazards using seismic base isolation systems

    SciTech Connect

    Wang, C.Y.

    1994-06-01

    This paper deals with mitigation of earthquake hazards using seismic base-isolation systems. A numerical algorithm is described for system response analysis of isolated structures with laminated elastomer bearings. The focus of this paper is on the adaptation of a nonlinear constitutive equation for the isolation bearing, and the treatment of foundation embedment for the soil-structure-interaction analysis. Sample problems are presented to illustrate the mitigating effect of using base-isolation systems.
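
    A nonlinear constitutive equation for a laminated elastomer bearing is often idealized as bilinear hysteresis. The sketch below uses that common simplification with illustrative stiffness and yield values; it is not necessarily the formulation adopted in the paper:

```python
import math

def bilinear_bearing_force(x, x_prev, f_prev, k1, k2, fy):
    """One step of a bilinear hysteretic force model for an isolation
    bearing. k1: initial stiffness, k2: post-yield stiffness,
    fy: yield force (all in consistent units)."""
    f_trial = f_prev + k1 * (x - x_prev)   # elastic predictor
    upper = fy + k2 * x                    # post-yield envelopes
    lower = -fy + k2 * x
    return min(max(f_trial, lower), upper)

# Drive the bearing through one displacement cycle of amplitude 0.1 m:
xs = [0.1 * math.sin(2 * math.pi * i / 100) for i in range(101)]
f, forces = 0.0, []
for i in range(1, len(xs)):
    f = bilinear_bearing_force(xs[i], xs[i - 1], f, k1=1000.0, k2=100.0, fy=20.0)
    forces.append(f)
print(round(max(forces), 1))  # peak force capped by the yield envelope: 30.0
```

The clipping against the post-yield envelopes is what dissipates energy over a loading cycle, which is the mitigating effect the paper quantifies.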

  19. Probabilistic Tsunami Hazard Assessment for Nuclear Power Plants in Japan

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2012-12-01

    Tsunami hazard assessments for nuclear power stations (NPS) in Japan had been conducted by a deterministic method, but probabilistic methods are being adopted following the accident at the Fukushima Daiichi NPS. The deterministic tsunami hazard assessment (DTHA), proposed by the Japan Society of Civil Engineers in 2002 (Yanagisawa et al., 2007, Pageoph), considers various uncertainties through parameter studies. The design tsunami height at the Fukushima NPS was set as 6.1 m, based on parameter studies varying the location, depth, and strike, dip and slip angles of the 1938 off-Fukushima earthquake (M 7.4). The maximum tsunami height for a hypothetical "tsunami earthquake" off Fukushima, similar to the 1896 Sanriku earthquake (Mt 8.2), and that for the 869 Jogan earthquake model (Mw 8.4), were estimated as 15.7 m and 8.9 m, respectively, before the 2011 accident (TEPCO report, 2012). The actual tsunami height at the Fukushima NPS on March 11, 2011 was 12 to 16 m. A probabilistic tsunami hazard assessment (PTHA) has also been proposed by JSCE (An'naka et al., 2007, Pageoph), and was recently adopted in the "Implementation Standard of Tsunami Probabilistic Risk Assessment (PRA) of NPPs" published in 2012 by the Atomic Energy Society of Japan. In PTHA, tsunami hazard curves, or probabilities of exceedance for tsunami heights, are constructed by integrating over aleatory uncertainties. The epistemic uncertainties are treated as branches of logic trees. The logic-tree branches for the earthquake source include the earthquake type, magnitude range, recurrence interval and the parameters of the BPT distribution for the recurrent earthquakes. Because no "tsunami earthquake" was recorded off the Fukushima NPS, whether or not a "tsunami earthquake" occurs along the Japan Trench off Fukushima was one of the logic-tree branches, and the weight was determined by experts' opinions. Possibilities for multi-segment earthquakes are now added as logic-tree branches, after the 2011 Tohoku earthquake, which is considered as
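
    The logic-tree treatment described above combines branch results with expert-assigned weights into a mean hazard estimate; a minimal sketch (the weights and exceedance probabilities are invented for illustration, not the values elicited for the Fukushima assessment):

```python
# Each logic-tree branch carries a weight (weights sum to 1) and a
# hazard result, here an annual exceedance probability for one
# tsunami height; the mean hazard is the weight-averaged result.
branches = [
    (0.5, 1.0e-4),  # branch: a "tsunami earthquake" can occur off Fukushima
    (0.5, 1.0e-5),  # branch: it cannot
]
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12
mean_annual_exceedance = sum(w * p for w, p in branches)
print(mean_annual_exceedance)
```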

  20. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic data base and basic source parameters and, if and when possible, by the estimation of fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimation of the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and using shear wave velocity distributions (Shake Mapping). 4. Incorporation of strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimation of the losses (damage, casualty and economic) at different levels of sophistication (0, 1 and 2) commensurate with the availability of an inventory of the human-built environment (Loss Mapping). Level 2 analysis of the ELER Software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties and macro-economic loss quantifiers) in urban areas. The basic Shake Mapping is similar to the Level 0 and Level 1 analysis; however, options are available for more sophisticated treatment of site response through externally entered data and improvement of the shake map through incorporation
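
    Step 2 above, mapping ground motion through an attenuation relationship, can be sketched with a toy GMPE; the functional form and coefficients below are invented for illustration and are not one of the relationships used by ELER:

```python
import math

def toy_gmpe_pga(magnitude, r_km, vs30=760.0):
    """Illustrative attenuation relation (NOT a published GMPE):
    ln PGA[g] = a + b*M - c*ln(R + d) + site term."""
    a, b, c, d = -3.5, 0.9, 1.2, 10.0
    site = 0.3 if vs30 < 360.0 else 0.0   # crude soft-soil amplification
    return math.exp(a + b * magnitude - c * math.log(r_km + d) + site)

# PGA decays with distance from a M 7.0 source (values in g):
pgas = [toy_gmpe_pga(7.0, r) for r in (5.0, 20.0, 50.0)]
print([round(p, 3) for p in pgas])
```

Evaluating such a relation on a grid of distances (plus site corrections from shear wave velocity) is, in essence, the Shake Mapping step.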

  1. Awareness and understanding of earthquake hazards at school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology Department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of the education activities we performed during the last years are presented here. We describe our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, and we also illustrate some teaching interventions for high school students. During the past years we lectured classes, led laboratory and field activities, and organized summer stages for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts in safety procedures. To combine the objective of disseminating earthquake culture, including knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab through low-cost tools and instruments. At selected schools we provided the low-cost seismometers of the QuakeCatcher Network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  2. Earthquake Hazard in the New Madrid Seismic Zone Remains a Concern

    USGS Publications Warehouse

    Frankel, A.D.; Applegate, D.; Tuttle, M.P.; Williams, R.A.

    2009-01-01

    There is broad agreement in the scientific community that a continuing concern exists for a major destructive earthquake in the New Madrid seismic zone. Many structures in Memphis, Tenn., St. Louis, Mo., and other communities in the central Mississippi River Valley region are vulnerable and at risk from severe ground shaking. This assessment is based on decades of research on New Madrid earthquakes and related phenomena by dozens of Federal, university, State, and consulting earth scientists. Considerable interest has developed recently from media reports that the New Madrid seismic zone may be shutting down. These reports stem from published research based on geodetic measurements of crustal strain made with global positioning system (GPS) instruments. Because of a lack of measurable strain at the surface in some areas of the seismic zone over the past 14 years, arguments have been advanced that there is no buildup of stress at depth within the New Madrid seismic zone and that the zone may no longer pose a significant hazard. As part of the consensus-building process used to develop the national seismic hazard maps, the U.S. Geological Survey (USGS) convened a workshop of experts in 2006 to evaluate the latest findings in earthquake hazards in the Eastern United States. These experts considered the GPS data from New Madrid available at that time, which also showed little to no ground movement at the surface. The experts did not find the GPS data to be a convincing reason to lower the assessment of earthquake hazard in the New Madrid region, especially in light of the many other types of data that are used to construct the hazard assessment, several of which are described here.

  3. Seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir; Nekrasova, Anastasia

    2014-05-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes of Exposures and their Vulnerability inflicted by a growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. Any kind of risk estimate R(g) at location g results from a convolution of the natural hazard H(g) with the exposed object under consideration O(g) along with its vulnerability V(O(g)). Note that g could be a point, a line, or a cell on or under the Earth's surface, and that the distribution of hazards, as well as the objects of concern and their vulnerability, could be time-dependent. There exist many different risk estimates even if the same object of risk and the same hazard are involved. This may result from different laws of convolution, as well as from different kinds of vulnerability of an object of risk under specific environments and conditions. Both conceptual issues must be resolved in multidisciplinary problem-oriented research performed by specialists in the fields of hazard, objects of risk, and object vulnerability, i.e. specialists in earthquake engineering, social sciences and economics. To illustrate this general concept, we first construct seismic hazard assessment maps based on the Unified Scaling Law for Earthquakes (USLE). The parameters A, B, and C of USLE, i.e. log N(M,L) = A - B•(M-6) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within an area of linear size L, are used to estimate the expected maximum
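
    The USLE rate formula quoted above can be evaluated directly; a small sketch (the coefficients A, B, C below are placeholders, since they are estimated region by region):

```python
import math

def usle_annual_number(M, L_km, A, B, C):
    """Expected annual number of earthquakes of magnitude M within an
    area of linear size L km, from log N(M, L) = A - B*(M - 6) + C*log L."""
    return 10.0 ** (A - B * (M - 6.0) + C * math.log10(L_km))

# Placeholder coefficients for illustration only:
n = usle_annual_number(M=6.0, L_km=100.0, A=-2.5, B=0.9, C=1.2)
print(round(n, 3))  # ~0.794 magnitude-6 events per year in a 100-km cell
```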

  4. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks, which are generally omitted from hazard assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates make them difficult to forecast even on timescales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations and changes in rate can be detected by applying change point analysis in ETAS transformed time with methods already developed for Poisson processes.
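
    The ETAS conditional intensity referenced above (Ogata, 1988) can be sketched as follows; the parameter values here are placeholders, not the estimates optimized in this study:

```python
import math

def etas_rate(t, mu, events, K=0.05, c=0.01, p=1.1, alpha=1.0, m0=3.0):
    """ETAS conditional intensity:
    lambda(t) = mu + sum_i K * exp(alpha*(m_i - m0)) / (t - t_i + c)**p
    where mu is the background rate and events are (time, magnitude)
    pairs. All parameter defaults are illustrative."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# Rate one day after an M 4.5 event, above a 0.2 events/day background:
catalog = [(0.0, 4.5)]
print(round(etas_rate(1.0, 0.2, catalog), 3))  # ~0.422
```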

  5. Seismic hazard assessment in Aswan, Egypt

    NASA Astrophysics Data System (ADS)

    Deif, A.; Hamed, H.; Ibrahim, H. A.; Abou Elenean, K.; El-Amin, E.

    2011-12-01

    The study of earthquake activity and seismic hazard assessment around Aswan is very important due to the proximity of the Aswan High Dam. The Aswan High Dam is founded on hard Precambrian bedrock and is considered to be the most important project in Egypt from the social, agricultural and electrical energy production points of view. The seismotectonic settings around Aswan strongly suggest that medium to large earthquakes are possible, particularly along the Kalabsha, Seiyal and Khor El-Ramla faults. The seismic hazard for Aswan is calculated utilizing the probabilistic approach within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for eight ground motion spectral periods and for a return period of 475 years, which is deemed appropriate for structural design standards in the Egyptian building codes. The results were also displayed in terms of uniform hazard spectra for rock sites at the Aswan High Dam for return periods of 475 and 2475 years. In addition, the ground-motion levels are also deaggregated at the dam site, in order to provide insight into which events are the most important for hazard estimation. The peak ground acceleration ranges between 36 and 152 cm s-2 for a return period of 475 years (equivalent to 90% probability of non-exceedance in 50 years). Spectral hazard values clearly indicate that, compared with countries of high seismic risk, the seismicity in the Aswan region can be described as low at most sites to moderate in the area between the Kalabsha and Seiyal faults.
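
    The equivalence stated above, a 475-year return period corresponding to 90% probability of non-exceedance in 50 years, follows from the Poisson assumption:

```python
import math

def return_period(p_exceed, t_years):
    """Mean return period implied by a probability of exceedance
    p_exceed over t_years, assuming Poisson occurrence:
    p_exceed = 1 - exp(-t / T)  =>  T = -t / ln(1 - p_exceed)."""
    return -t_years / math.log(1.0 - p_exceed)

# 10% probability of exceedance (90% non-exceedance) in 50 years:
print(round(return_period(0.10, 50.0)))  # 475
```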

  6. Nationwide Assessment of Seismic Hazard for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.; Mumladze, T.

    2014-12-01

    This work presents a framework for the assessment of seismic hazard at the national level for Georgia. A historical review of the compilation of seismic hazard zoning maps for Georgia made it evident that there were gaps in seismic hazard assessment and that the present normative seismic hazard map needed a careful recalculation. The methodology for the probabilistic assessment of seismic hazard used here includes the following steps: produce a comprehensive catalogue of historical earthquakes (up to 1900) and of the period of instrumental observations, with a uniform magnitude scale; produce models of seismic source zones (SSZ) and their parameterization; develop appropriate ground motion prediction equation (GMPE) models; and develop seismic hazard curves for spectral amplitudes at each period and maps in digital format. Firstly, a new seismic catalog of Georgia was created, with 1700 earthquakes of Mw ≥ 4.0 from ancient times to 2012. Secondly, seismic source zones (SSZ) were delineated. The identification of area SSZ was based on structural geology, parameters of seismicity and seismotectonics. In constructing the SSZ, the slope of the appropriate active fault plane, the width of the dynamic influence of the fault, and the thickness of the seismically active layer are taken into account. Each SSZ was then defined by the following parameters: the geometry, the percentage of focal mechanisms, predominant azimuth and dip angle values, activity rates, maximum magnitude, hypocenter depth distribution, and lower and upper seismogenic depth values. Thirdly, seismic hazard maps were calculated based on a modern approach to selecting and ranking global and regional ground motion prediction equations for the region. Finally, probabilistic seismic hazard in terms of ground acceleration was calculated for the territory of Georgia. On the basis of the obtained area seismic sources, probabilistic seismic hazard maps were calculated showing peak ground acceleration (PGA) and spectral accelerations (SA) at

  7. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
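
    The empirical model's core idea, a country-calibrated casualty rate applied to shaking exposure, can be sketched as follows. The lognormal-CDF form follows the published PAGER empirical approach, but theta, beta and the exposure counts here are placeholders, not real calibrated coefficients:

```python
import math

def fatality_rate(mmi, theta=12.0, beta=0.2):
    """Lognormal-CDF form of the empirical casualty rate,
    nu(S) = Phi(ln(S/theta)/beta); theta and beta stand in for the
    country-specific coefficients PAGER actually calibrates."""
    z = math.log(mmi / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical ShakeMap exposure: population counts per intensity bin
exposure = {6: 500_000, 7: 200_000, 8: 50_000, 9: 5_000}
expected_fatalities = sum(pop * fatality_rate(mmi)
                          for mmi, pop in exposure.items())
```

    Summing rate times exposed population over intensity bins is what turns a ShakeMap into a loss estimate within minutes of an event.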

  8. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country started in 1968 with the installation of the first seismographic station in Tirana, and monitoring became more effective after the Albanian Seismological Network began operating in 1976. There is rich evidence that over the past two thousand years Albania has been struck by many disastrous earthquakes; the highest estimated magnitude is 7.2. After the end of the Communist era and the opening of the country, a construction boom began in Albania and continues even now, making accurate seismic hazard maps all the more indispensable for preventing damage from probable future earthquakes. Some efforts in seismic hazard assessment have already been made (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique was used in a joint project between the Seismological Institute of Tirana, Albania, and the Department of Geophysics of the Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically compiled for this seismic hazard analysis and contains 530 events with magnitude M > 4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, assigning to each the most representative fault characteristics. The computer code used for the hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were those of Ambraseys et al. (1996), Sabetta and Pugliese (1996) and Margaris et al. (2001). Hazard maps were obtained for return periods of 100, 475, 2375 and 4746 years for rock site conditions. Analyzing the map of PGA values for a return period of 475 years, five zones with different ranges of PGA values can be distinguished: 1) the zone with PGA of 0.20 - 0.24 g, covering 1.8 percent of Albanian territory; 2) the zone with PGA of 0.16 - 0.20 g, covering 22.6 percent of Albanian territory; 3) the
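
    The return periods quoted above map directly to exceedance probabilities under the usual Poisson assumption; a minimal check:

```python
import math

def return_period(p_exceed, t_years):
    """Poisson occurrence: T = -t / ln(1 - p), where p is the
    probability of at least one exceedance in t years."""
    return -t_years / math.log(1.0 - p_exceed)

# The conventional design level: 10% chance in 50 years
t475 = return_period(0.10, 50)    # approx. 474.6 years
```

    The same relation gives the 2475-year period often used for "maximum considered" ground motion (2% in 50 years).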

  9. Volcano and earthquake hazards in the Crater Lake region, Oregon

    USGS Publications Warehouse

    Bacon, Charles R.; Mastin, Larry G.; Scott, Kevin M.; Nathenson, Manuel

    1997-01-01

    Crater Lake lies in a basin, or caldera, formed by collapse of the Cascade volcano known as Mount Mazama during a violent, climactic eruption about 7,700 years ago. This event dramatically changed the character of the volcano so that many potential types of future events have no precedent there. This potentially active volcanic center is contained within Crater Lake National Park, visited by 500,000 people per year, and is adjacent to the main transportation corridor east of the Cascade Range. Because a lake is now present within the most likely site of future volcanic activity, many of the hazards at Crater Lake are different from those at most other Cascade volcanoes. Also significant are many faults near Crater Lake that clearly have been active in the recent past. These faults, and historic seismicity, indicate that damaging earthquakes can occur there in the future. This report describes the various types of volcano and earthquake hazards in the Crater Lake area, estimates of the likelihood of future events, recommendations for mitigation, and a map of hazard zones. The main conclusions are summarized below.

  10. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    SciTech Connect

    Blanchard, A.

    2000-02-28

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program.

  11. Lessons on Seismic Hazard Estimation from the 2003 Bingol, Turkey Earthquake

    NASA Astrophysics Data System (ADS)

    Nalbant, S. S.; Steacy, S.; McCloskey, J.

    2003-12-01

    In a 2002 paper the stress state along the East Anatolian Fault Zone (EAFZ) was estimated by adding long-term tectonic loading to the static stressing effect of a series of large historical earthquakes. The results clearly indicated two areas of particular concern. The first extended along the EAFZ between the cities of Kahraman Maras and Malatya, and the second along the trend of the EAFZ between the cities of Elazig and Bingol. The Bingol (M6.4, 1 May 2003) earthquake occurred within this second area with a focal mechanism consistent with left-lateral rupture of a buried segment of the EAFZ, prompting suggestions that this represented a success for the idea of using Coulomb stress modelling to assess seismic hazard. This success, however, depended on confirmation of the orientation of the earthquake fault; in the event, and in the absence of surface ruptures, aftershock distributions unambiguously showed that the event was a right-lateral failure on an unmapped structure conjugate to the EAFZ. The Bingol earthquake was, therefore, not promoted by the stress field modelled in the 2002 study. Here we reflect on the lessons learned from this case. We identify three possible reasons for the discrepancy between the calculations and the occurrence of the Bingol earthquake. Firstly, historical earthquakes used in the 2002 study may have been incorrectly modelled in either size or location. Secondly, earthquakes not included in the study, due to either their size or occurrence time, may have had a significant effect on the stress field. Or, finally, the secular stress used to load the faults was inappropriate. We argue that it is through a combination of historical seismology, guided and constrained by structural geology, together with directed paleoseismology and stress modelling informed by detailed GPS data, that an integrated seismic hazard program might have the best chance of success.
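
    The quantity at the heart of this kind of study is the Coulomb failure stress change resolved on a receiver fault; a minimal sketch with hypothetical stress values and effective friction:

```python
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """dCFS = d_tau + mu' * d_sigma_n (MPa), with the convention that
    positive d_sigma_n unclamps the fault. Positive dCFS brings the
    receiver fault closer to failure. All values are hypothetical."""
    return d_tau + mu_eff * d_sigma_n

# E.g. 0.1 MPa of shear loading partly offset by 0.05 MPa of clamping:
dcfs = coulomb_stress_change(0.1, -0.05)   # 0.1 + 0.4*(-0.05) = 0.08 MPa
```

    The sign of dCFS depends entirely on the assumed receiver-fault orientation, which is exactly why the unexpected conjugate, right-lateral geometry of the Bingol rupture invalidated the forecast.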

  12. Assessing Earthquake Risks in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2014-10-01

    Megaquakes, which are subduction earthquakes with magnitudes of 8 or greater, occur about every 500 years on average along the Cascadia Subduction Zone in the Pacific Northwest. The earthquakes and related tsunamis can cause enormous damage. However, they may not be the most urgent seismic threat in the region, according to John Clague, a professor and expert on natural hazards at Simon Fraser University (SFU) in British Columbia.
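
    The quoted ~500-year average recurrence translates into a rough time-window probability if one assumes Poisson occurrence (a simplification; Cascadia recurrence is not strictly Poissonian):

```python
import math

def prob_at_least_one(t_years, mean_recurrence=500.0):
    """P(at least one event in t years) = 1 - exp(-t/T), using the
    ~500-year average recurrence quoted above."""
    return 1.0 - math.exp(-t_years / mean_recurrence)

p50 = prob_at_least_one(50.0)   # roughly a 1-in-10 chance in 50 years
```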

  13. How Can Museum Exhibits Enhance Earthquake and Tsunami Hazard Resiliency?

    NASA Astrophysics Data System (ADS)

    Olds, S. E.

    2015-12-01

    Creating a natural disaster-ready community requires interoperating scientific, technical, and social systems. In addition to the technical elements that need to be in place, communities and individuals need to be prepared to react when a natural hazard event occurs. Natural hazard awareness and preparedness training and education often take place through informal learning at science centers and formal K-12 education programs, as well as through awareness raising via strategically placed informational tsunami warning signs and placards. Museums and science centers are influential in raising science literacy within a community; however, can science centers enhance earthquake and tsunami resiliency by providing hazard science content and preparedness exhibits? Museum docents and informal educators are uniquely situated within the community. They are transmitters and translators of science information to broad audiences. Through interaction with the public, docents are well positioned to be informants of the knowledge, beliefs, and feelings of science center visitors. They themselves are life-long learners, both constantly learning from the museum content around them and sharing this content with visitors. They are also members of the communities where they live. In-depth interviews with museum informal educators and docents were conducted at a science center in the coastal Pacific Northwest. This region has the potential to be struck by a great 9+ Mw earthquake and subsequent tsunami. During the interviews, docents described how they applied learning from natural hazard exhibits at the science center to their daily lives. During the individual interviews, the museum docents described their awareness (knowledge, attitudes, and behaviors) of natural hazards where they live and work, the feelings evoked as they learned about their hazard vulnerability, the extent to which they applied this learning and awareness to their lives, such as creating an evacuation plan, whether

  14. Using a physics-based earthquake simulator to evaluate seismic hazard in NW Iran

    NASA Astrophysics Data System (ADS)

    Khodaverdian, A.; Zafarani, H.; Rahimian, M.

    2016-04-01

    NW Iran is a region of active deformation in the Eurasia-Arabia collision zone. This high strain field has caused intensive faulting accompanied by several major (M > 6.5) earthquakes, as is evident from historical records. Whereas seismic data (i.e. instrumental and historical catalogs) are either short, or inaccurate and inhomogeneous, physics-based long-term simulations are beneficial to better assess seismic hazard. In the present study, a deterministic seismicity model, which consists of major active faults, is first constructed, and used to generate a synthetic catalog of large-magnitude (M > 5.5) earthquakes. The frequency-magnitude distribution of the synthetic earthquake catalog, which is based on the physical characteristics and slip rates of the mapped faults, is consistent with the empirical distribution evaluated using records of instrumental and historical events. The obtained results are also in accordance with paleoseismic studies and other independent kinematic deformation models of the Iranian Plateau. Using the synthetic catalog, the characteristic magnitude for all 16 active faults in the study area is determined. The magnitudes and epicenters of these earthquakes are comparable with the historical records. Large earthquake recurrence times and their variations are evaluated, either for an individual fault or for the region as a whole. Goodness-of-fit tests revealed that recurrence times can be well described by the Weibull distribution. Time-dependent conditional probabilities for large earthquakes in the study area are also estimated for different time intervals. The resulting synthetic catalog can be utilized as a useful dataset for hazard and risk assessment instead of the short, incomplete, and inhomogeneous available catalogs.

  15. Using a physics-based earthquake simulator to evaluate seismic hazard in NW Iran

    NASA Astrophysics Data System (ADS)

    Khodaverdian, A.; Zafarani, H.; Rahimian, M.

    2016-07-01

    NW Iran is a region of active deformation in the Eurasia-Arabia collision zone. This high strain field has caused intensive faulting accompanied by several major (M > 6.5) earthquakes, as is evident from historical records. Whereas seismic data (i.e. instrumental and historical catalogues) are either short, or inaccurate and inhomogeneous, physics-based long-term simulations are beneficial to better assess seismic hazard. In this study, a deterministic seismicity model, which consists of major active faults, is first constructed, and used to generate a synthetic catalogue of large-magnitude (M > 5.5) earthquakes. The frequency-magnitude distribution of the synthetic earthquake catalogue, which is based on the physical characteristics and slip rates of the mapped faults, is consistent with the empirical distribution evaluated using records of instrumental and historical events. The obtained results are also in accordance with palaeoseismic studies and other independent kinematic deformation models of the Iranian Plateau. Using the synthetic catalogue, the characteristic magnitude for all 16 active faults in the study area is determined. The magnitudes and epicentres of these earthquakes are comparable with the historical records. Large earthquake recurrence times and their variations are evaluated, either for an individual fault or for the region as a whole. Goodness-of-fit tests revealed that recurrence times can be well described by the Weibull distribution. Time-dependent conditional probabilities for large earthquakes in the study area are also estimated for different time intervals. The resulting synthetic catalogue can be utilized as a useful data set for hazard and risk assessment instead of the short, incomplete and inhomogeneous available catalogues.
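
    The Weibull renewal model named above leads directly to time-dependent conditional probabilities; a sketch with placeholder shape and scale parameters (in a real application these would be fitted to the synthetic catalogue's recurrence times):

```python
import math

def weibull_cdf(t, k, eta):
    """Weibull CDF: F(t) = 1 - exp(-(t/eta)^k)."""
    return 1.0 - math.exp(-((t / eta) ** k))

def conditional_prob(t_elapsed, dt, k, eta):
    """P(event in next dt | quiet for t_elapsed) under a Weibull
    renewal model: [F(t+dt) - F(t)] / [1 - F(t)]."""
    F = weibull_cdf
    return (F(t_elapsed + dt, k, eta) - F(t_elapsed, k, eta)) \
           / (1.0 - F(t_elapsed, k, eta))

# Placeholder parameters: scale eta = 300 yr, shape k = 2 (quasi-periodic)
p30 = conditional_prob(150.0, 30.0, k=2.0, eta=300.0)
```

    For shape k > 1 the hazard rate grows with elapsed quiet time, which is what makes the model quasi-periodic rather than Poissonian.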

  16. Cruise report for 01-99-SC: southern California earthquake hazards project

    USGS Publications Warehouse

    Normark, William R.; Reid, Jane A.; Sliter, Ray W.; Holton, David; Gutmacher, Christina E.; Fisher, Michael A.; Childs, Jonathan R.

    1999-01-01

    The focus of the Southern California Earthquake Hazards project is to identify the landslide and earthquake hazards and related ground-deformation processes occurring in the offshore areas that have significant potential to impact the inhabitants of the Southern California coastal region. The project activity is supported through the Coastal and Marine Geology Program of the Geologic Division of the U. S. Geological Survey (USGS) and is a component of the Geologic Division's Science Strategy under Goal 1—Conduct Geologic Hazard Assessments for Mitigation Planning (Bohlen et al., 1998). The project research is specifically stated under Activity 1.1.2 of the Science Strategy: Earthquake Hazard Assessments and Loss Reduction Products in Urban Regions. This activity involves "research, seismic and geodetic monitoring, field studies, geologic mapping, and analyses needed to provide seismic hazard assessments of major urban centers in earthquake-prone regions including adjoining coastal and offshore areas." The southern California urban areas, which form the most populated urban corridor along the U.S. Pacific margin, are among a few specifically designated for special emphasis under the Division's science strategy (Bohlen et al., 1998). The primary objective of the project is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this objective, we are conducting field investigations to observe the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (Fig. 1). In addition, acoustic imaging should help determine the subsurface dimensions of the faults and identify the size and frequency of submarine landslides, both of which are necessary for evaluating the potential for

  17. Numerical earthquake model of the 20 April 2015 southern Ryukyu subduction zone M6.4 event and its impact on seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong

    2015-10-01

    The M6.4 earthquake that took place on the 20 April 2015 off the shore of eastern Taiwan was the largest event in the vicinity of Taiwan during 2015. The mainshock was located in the southern Ryukyu subduction zone, which is the interface between the Philippine Sea Plate and the Eurasian Plate. People in Taipei experienced strong ground shaking for more than 40 s, even though the epicenter was located more than 150 km away. In order to understand the origin of ground motions from this earthquake and how it caused such strong shaking in Taipei, a numerical earthquake model is analyzed, including models of source rupture and wave propagation. First, a joint source inversion was performed using teleseismic body wave and local ground motion data. Source inversion results show that a large slip occurred near the hypocenter, which rapidly released seismic energy in the first 2 s. Then, the rupture propagated toward the shallow fault plane. A large amount of seismic energy was released during this rupture stage that slipped for more than 8 s before the end of the rupture. The estimated stress drop is 2.48 MPa, which is consistent with values for subduction zone earthquakes. Forward simulation using this inverted source rupture model and a 3D seismic velocity model based on the spectral-element method was then performed. Results indicate that the strong ground motion in Taipei resulted from two factors: (1) the Taipei basin amplification effect and (2) the specific source radiation pattern. The results of this numerical earthquake model imply that future subduction zone events that occur in offshore eastern Taiwan are likely to cause relatively strong ground shaking in northern Taiwan, especially in the Taipei metropolitan area.
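
    The reported 2.48 MPa stress drop is the order of value one obtains from the standard circular-crack relation; the rupture radius below is our assumption, chosen to reproduce that magnitude of stress drop, not a result from the study:

```python
def moment_from_mw(mw):
    """Hanks & Kanamori (1979): M0 [N*m] = 10**(1.5*Mw + 9.05)."""
    return 10 ** (1.5 * mw + 9.05)

def stress_drop_circular(m0, radius_m):
    """Eshelby circular-crack estimate: dsigma = (7/16) * M0 / r^3 (Pa)."""
    return (7.0 / 16.0) * m0 / radius_m ** 3

m0 = moment_from_mw(6.4)
# An assumed rupture radius of ~9.2 km gives a stress drop near the
# ~2.5 MPa order reported for this event.
dsigma_mpa = stress_drop_circular(m0, 9.2e3) / 1e6
```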

  18. Lower-bound magnitude for probabilistic seismic hazard assessment

    SciTech Connect

    McCann, M.W. Jr.; Reed, J.W. and Associates, Inc., Mountain View, CA

    1989-10-01

    This report provides technical information to determine the lower-bound earthquake magnitude (LBM) for use in probabilistic seismic hazard (PSH) computations that are applied to nuclear plant applications. The evaluations consider the seismologic characteristics of earthquake experience at similar facilities and insights from probabilistic risk analysis. The recommendations for LBM satisfy the two basic precepts: (1) there is a reasonable engineering assurance that the likelihood of damage due to earthquakes smaller than the LBM is negligible, and (2) any small risk due to earthquakes smaller than the LBM is compensated by conservatisms in PSH results for larger earthquakes. Theoretical and empirical ground motion studies demonstrate that ground shaking duration and spectral shape are a strong function of earthquake magnitude. Small earthquakes have short duration and spectral shapes centered at high frequencies as compared to nuclear power plant design spectra which are typical of moderate and large earthquakes. Analysis of earthquake experience data shows damage to heavy industrial facilities, taken as analogs to nuclear plant structures and components, occurs for earthquakes having moment magnitude M larger than 5.1. Probabilistic seismic risk and margins studies show nuclear plant structures and adequately anchored ductile components to be rugged for moderate-size earthquakes with broad design-type spectral shapes. They may, therefore, be considered rugged for small earthquakes. Finally, nonlinear analysis of the damage effectiveness of strong-motion recordings shows that potential damage does not occur for earthquakes smaller than about M5.6. These results support a conservative LBM of M5.0 for application to nuclear power plant PSH assessments. 144 refs., 78 figs., 34 tabs.
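
    The practical effect of the recommended lower-bound magnitude can be illustrated with a Gutenberg-Richter recurrence (the a and b values here are hypothetical, not from the report):

```python
def annual_rate_above(m, a=4.0, b=1.0):
    """Gutenberg-Richter rate N(>= m) = 10**(a - b*m); a, b illustrative."""
    return 10 ** (a - b * m)

# Raising the PSH lower-bound magnitude from M4.0 to the recommended
# M5.0 removes the numerous small, non-damaging events from the
# hazard integral:
n4, n5 = annual_rate_above(4.0), annual_rate_above(5.0)
ratio = n4 / n5   # a factor of 10 for b = 1
```

    The report's argument is that this order-of-magnitude excess of small events inflates PSH results without representing any real damage potential.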

  19. Salient beliefs about earthquake hazards and household preparedness.

    PubMed

    Becker, Julia S; Paton, Douglas; Johnston, David M; Ronan, Kevin R

    2013-09-01

    Prior research has found little or no direct link between beliefs about earthquake risk and household preparedness. Furthermore, only limited work has been conducted on how people's beliefs influence the nature and number of preparedness measures adopted. To address this gap, 48 qualitative interviews were undertaken with residents in three urban locations in New Zealand subject to seismic risk. The study aimed to identify the diverse hazard and preparedness-related beliefs people hold and to articulate how these are influenced by public education to encourage preparedness. The study also explored how beliefs and competencies at personal, social, and environmental levels interact to influence people's risk management choices. Three main categories of beliefs were found: hazard beliefs; preparedness beliefs; and personal beliefs. Several salient beliefs found previously to influence the preparedness process were confirmed by this study, including beliefs related to earthquakes being an inevitable and imminent threat, self-efficacy, outcome expectancy, personal responsibility, responsibility for others, and beliefs related to denial, fatalism, normalization bias, and optimistic bias. New salient beliefs were also identified (e.g., preparedness being a "way of life"), as well as insight into how some of these beliefs interact within the wider informational and societal context. PMID:23339741

  20. Earthquake Scaling and Development of Ground Motion Prediction for Earthquake Hazard Mitigation in Taiwan

    NASA Astrophysics Data System (ADS)

    Ma, K.; Yen, Y.

    2011-12-01

    For earthquake hazard mitigation toward risk management, an integrated study from source model development to ground motion prediction is crucial. The simulation of the high frequency components (> 1 Hz) of strong ground motions in the near field has not been well resolved due to insufficient resolution in the velocity structure. Using small events as Green's functions (i.e. the empirical Green's function (EGF) method) can compensate for the lack of a precise velocity structure in evaluating the path effect. If an EGF is not available, a stochastic Green's function (SGF) method can be employed. By characterizing the slip models derived from waveform inversion, we directly extract the parameters needed for ground motion prediction with the EGF or SGF method. The slip models were derived from Taiwan's dense strong motion network and global teleseismic data. In addition, the low frequency part (< 1 Hz) can be obtained numerically by the frequency-wavenumber (FK) method. Thus, broadband strong ground motion can be calculated by a hybrid method that combines a deterministic FK method for the low frequency simulation with the EGF or SGF method for the high frequency simulation. Characterizing the definitive source parameters from empirical scaling studies feeds directly into the ground motion simulation. To provide ground motion predictions for scenario earthquakes, we compiled earthquake scaling relationships from the inverted finite-fault models of moderate to large earthquakes in Taiwan. The studies show that the seismogenic depth significantly controls the development of rupture width. In addition, several earthquakes on blind faults show distinctly large stress drops, which yield regionally high PGA. Based on the developed scaling relationships and the possible high stress drops for earthquakes on blind faults, we further deploy the hybrid method mentioned above to simulate the strong motion in
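
    The hybrid combination step (deterministic below ~1 Hz, EGF/SGF above) can be sketched as a frequency-domain merge. This brick-wall version is a simplification; production codes use matched filters with smooth crossovers:

```python
import numpy as np

def hybrid_broadband(low_freq_tr, high_freq_tr, dt, fc=1.0):
    """Merge a deterministic (< fc) and a stochastic (> fc) synthetic
    into one broadband trace using complementary frequency-domain
    weights. A sketch of the matching step only."""
    n = len(low_freq_tr)
    freqs = np.fft.rfftfreq(n, dt)
    lo = np.fft.rfft(low_freq_tr)
    hi = np.fft.rfft(high_freq_tr)
    w = (freqs <= fc).astype(float)    # brick-wall crossover at fc
    return np.fft.irfft(w * lo + (1.0 - w) * hi, n)

# Toy check with bin-centered sines: a 0.5 Hz "deterministic" signal
# and a 5 Hz "stochastic" signal pass through essentially unchanged.
dt, n = 0.01, 2000
t = np.arange(n) * dt
lf = np.sin(2 * np.pi * 0.5 * t)
hf = 0.3 * np.sin(2 * np.pi * 5.0 * t)
bb = hybrid_broadband(lf, hf, dt)
```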

  1. Evansville Area Earthquake Hazards Mapping Project (EAEHMP) - Progress Report, 2008

    USGS Publications Warehouse

    Boyd, Oliver S.; Haase, Jennifer L.; Moore, David W.

    2009-01-01

    Maps of surficial geology, deterministic and probabilistic seismic hazard, and liquefaction potential index have been prepared by various members of the Evansville Area Earthquake Hazard Mapping Project for seven quadrangles in the Evansville, Indiana, and Henderson, Kentucky, metropolitan areas. The surficial geologic maps feature 23 types of surficial geologic deposits, artificial fill, and undifferentiated bedrock outcrop and include alluvial and lake deposits of the Ohio River valley. Probabilistic and deterministic seismic hazard and liquefaction hazard mapping is made possible by drawing on a wealth of information including surficial geologic maps, water well logs, and in-situ testing profiles using the cone penetration test, standard penetration test, down-hole shear wave velocity tests, and seismic refraction tests. These data were compiled and collected with contributions from the Indiana Geological Survey, Kentucky Geological Survey, Illinois State Geological Survey, United States Geological Survey, and Purdue University. Hazard map products are in progress and are expected to be completed by the end of 2009, with a public roll out in early 2010. Preliminary results suggest that there is a 2 percent probability that peak ground accelerations of about 0.3 g will be exceeded in much of the study area within 50 years, which is similar to the 2002 USGS National Seismic Hazard Maps for a firm rock site value. Accelerations as high as 0.4-0.5 g may be exceeded along the edge of the Ohio River basin. Most of the region outside of the river basin has a low liquefaction potential index (LPI), where the probability that LPI is greater than 5 (that is, there is a high potential for liquefaction) for a M7.7 New Madrid type event is only 20-30 percent. Within the river basin, most of the region has high LPI, where the probability that LPI is greater than 5 for a New Madrid type event is 80-100 percent.
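
    The liquefaction potential index mentioned above is commonly computed with the Iwasaki weighting scheme; a sketch with an invented factor-of-safety profile (LPI > 5 is the high-potential threshold used in the report):

```python
def liquefaction_potential_index(fs_profile, dz=1.0):
    """Iwasaki-style LPI over the top 20 m:
    LPI = integral of F(z) * w(z) dz, with F = 1 - FS where FS < 1
    (else 0) and depth weight w(z) = 10 - 0.5*z.
    fs_profile gives factor-of-safety values at depths dz/2, 3dz/2, ...
    The profile used below is invented for illustration."""
    lpi = 0.0
    for i, fs in enumerate(fs_profile):
        z = (i + 0.5) * dz           # mid-depth of layer i (m)
        if z > 20.0:
            break
        severity = max(0.0, 1.0 - fs)
        lpi += severity * (10.0 - 0.5 * z) * dz
    return lpi

# Loose, shallow layer with FS ~ 0.6 over 0-5 m, safe material below:
profile = [0.6] * 5 + [1.5] * 15
lpi = liquefaction_potential_index(profile)   # > 5 => high potential
```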

  2. Errors in Seismic Hazard Assessment are Creating Huge Human Losses

    NASA Astrophysics Data System (ADS)

    Bela, J.

    2015-12-01

    The current practice of representing earthquake hazards to the public based upon their perceived likelihood or probability of occurrence is now proven by the global record of actual earthquakes to be not only erroneous and unreliable, but also too deadly! Earthquake occurrence is sporadic, and therefore assumptions of earthquake frequency and return period are not only misleading, but also categorically false. More than 700,000 people have now lost their lives (2000-2011), wherein 11 of the world's deadliest earthquakes have occurred in locations where probability-based seismic hazard assessments had predicted only low seismic hazard. Unless seismic hazard assessment and the setting of minimum earthquake design safety standards for buildings and bridges are based on a more realistic deterministic recognition of "what can happen" rather than on what mathematical models suggest is "most likely to happen," such future huge human losses can only be expected to continue! The actual earthquake events that did occur were at or near the maximum potential-size event that either had already occurred in the past or was geologically known to be possible. Haiti's M7 earthquake, 2010 (with > 222,000 fatalities) meant the dead could not even be buried with dignity. Japan's catastrophic Tohoku earthquake, 2011, a M9 megathrust earthquake, unleashed a tsunami that not only obliterated coastal communities along the northern Japanese coast, but also claimed > 20,000 lives. This tsunami flooded nuclear reactors at Fukushima, causing 4 explosions and 3 reactors to melt down. But while this history of huge human losses due to erroneous and misleading seismic hazard estimates, despite its wrenching pain, cannot be unlived, if faced with courage and a more realistic deterministic estimate of "what is possible," it need not be lived again. An objective testing of the results of global probability-based seismic hazard maps against real occurrences has never been done by the

  3. Secondary impact hazard assessment

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A series of light gas gun shots (4 to 7 km/sec) were performed with 5 mg nylon and aluminum projectiles to determine the size, mass, velocity, and spatial distribution of spall and ejecta from a number of graphite/epoxy targets. Similar determinations were also performed on a few aluminum targets. Target thickness and material were chosen to be representative of proposed Space Station structure. The data from these shots and other information were used to predict the hazard to Space Station elements from secondary particles resulting from impacts of micrometeoroids and orbital debris on the Space Station. This hazard was quantified as an additional flux over and above the primary micrometeoroid and orbital debris flux that must be considered in the design process. In order to simplify the calculations, ejecta and spall mass were assumed to scale directly with the energy of the projectile. Other scaling systems may be closer to reality. The secondary particles considered are only those particles that may impact other structure immediately after the primary impact. The addition to the orbital debris problem from these primary impacts was not addressed. Data from this study should be fed into the orbital debris model to see if Space Station secondaries make a significant contribution to orbital debris. The hazard to a Space Station element from secondary particles above and beyond the micrometeoroid and orbital debris hazard is categorized in terms of two factors: (1) the 'view factor' of the element to other Space Station structure or the geometry of placement of the element, and (2) the sensitivity to damage, stated in terms of energy. Several example cases were chosen: the Space Station module windows, windows of a Shuttle docked to the Space Station, the habitat module walls, and the photovoltaic solar cell arrays. For the examples chosen the secondary flux contributed no more than 10 percent to the total flux (primary and secondary) above a given calculated

  4. Secondary impact hazard assessment

    NASA Astrophysics Data System (ADS)

    1986-06-01

    A series of light gas gun shots (4 to 7 km/sec) were performed with 5 mg nylon and aluminum projectiles to determine the size, mass, velocity, and spatial distribution of spall and ejecta from a number of graphite/epoxy targets. Similar determinations were also performed on a few aluminum targets. Target thickness and material were chosen to be representative of proposed Space Station structure. The data from these shots and other information were used to predict the hazard to Space Station elements from secondary particles resulting from impacts of micrometeoroids and orbital debris on the Space Station. This hazard was quantified as an additional flux over and above the primary micrometeoroid and orbital debris flux that must be considered in the design process. In order to simplify the calculations, ejecta and spall mass were assumed to scale directly with the energy of the projectile. Other scaling systems may be closer to reality. The secondary particles considered are only those particles that may impact other structure immediately after the primary impact. The addition to the orbital debris problem from these primary impacts was not addressed. Data from this study should be fed into the orbital debris model to see if Space Station secondaries make a significant contribution to orbital debris. The hazard to a Space Station element from secondary particles above and beyond the micrometeoroid and orbital debris hazard is categorized in terms of two factors: (1) the 'view factor' of the element to other Space Station structure or the geometry of placement of the element, and (2) the sensitivity to damage, stated in terms of energy. Several example cases were chosen: the Space Station module windows, windows of a Shuttle docked to the Space Station, the habitat module walls, and the photovoltaic solar cell arrays. For the examples chosen the secondary flux contributed no more than 10 percent to the total flux (primary and secondary) above a given calculated

  5. Transparent Global Seismic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen

    2013-04-01

    Vulnerability to earthquakes is increasing, yet advanced reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, which integrates all of the above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and data on their vulnerability for loss assessment around the globe.
Furthermore, for a true integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits

  6. Earthquake Hazard Mitigation and Real-Time Warnings of Tsunamis and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2015-09-01

    With better understanding of earthquake physics and the advent of broadband seismology and GPS, seismologists can forecast the future activity of large earthquakes on a sound scientific basis. Such forecasts are critically important for long-term hazard mitigation, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainties, and unexpected events will inevitably occur. Recent developments in real-time seismology help seismologists cope with and prepare for such unexpected events, including tsunamis and earthquakes. For a tsunami warning, the required warning time is fairly long (usually 5 min or longer), which enables use of a rigorous method for this purpose; significant advances have already been made. In contrast, early warning of earthquakes is far more challenging because the required warning time is very short (as short as three seconds). Despite this difficulty, the methods used for regional warnings have advanced substantially, and several systems have already been developed and implemented. A future strategy for the more challenging, rapid (a few seconds) warnings, which are critically important for saving property and lives, is discussed.

  7. Probabilistic Seismic Hazard Assessment from Incomplete and Uncertain Data

    NASA Astrophysics Data System (ADS)

    Smit, Ansie; Kijko, Andrzej

    2016-04-01

    A question that frequently arises in seismic hazard assessment is why our assessments are so poor. Often the answer is that the standard applied methodologies do not take into account the nature of seismic event catalogues. In reality these catalogues are incomplete, with uncertain magnitude estimates and a significant discrepancy between the empirical data and the applied occurrence model. Most probabilistic seismic hazard analysis procedures require knowledge of at least three seismic source parameters: the mean seismic activity rate λ, the Gutenberg-Richter b-value, and the area-characteristic (seismogenic source) maximum possible earthquake magnitude Mmax. Almost all currently used seismic hazard assessment procedures utilizing these three parameters explicitly assume that all three remain constant over a specified time and space. However, closer examination of most earthquake catalogues indicates that there are significant spatial and temporal variations in the seismic activity rate λ as well as in the Gutenberg-Richter b-value. In the proposed methodology, the maximum likelihood estimation of these earthquake hazard parameters takes into account the incompleteness of catalogues, the uncertainty in earthquake magnitude determination, and the uncertainty associated with the applied earthquake occurrence models. The uncertainty in the earthquake occurrence models is introduced by assuming that both the mean seismic activity rate λ and the Gutenberg-Richter b-value are random variables, each described by the Gamma distribution. The approach results in the extension of the classic frequency-magnitude Gutenberg-Richter relation and the Poisson distribution of the number of earthquakes with their compounded counterparts. The proposed procedure is applied in the estimation of the seismic parameters for the area of Ceres-Tulbagh, South Africa, which experienced the strongest earthquake in the country's recorded history. In this example it is
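The "compounded counterpart" of the Poisson count distribution mentioned above can be illustrated with closed-form moments: if the activity rate λ is Gamma-distributed, the marginal count distribution is negative binomial, whose variance exceeds its mean (overdispersion). The shape and scale values below are illustrative only, not estimates from the Ceres-Tulbagh study.

```python
def compound_poisson_moments(shape_k, scale_theta):
    """Moments of N, where N | lam ~ Poisson(lam) and lam ~ Gamma(k, theta).

    Marginally N is negative binomial:
      E[N]   = k * theta
      Var[N] = k * theta * (1 + theta)   # exceeds the mean: overdispersion
    """
    mean = shape_k * scale_theta
    var = shape_k * scale_theta * (1.0 + scale_theta)
    return mean, var

# Illustrative activity-rate uncertainty: lam ~ Gamma(k=4, theta=0.5)
m, v = compound_poisson_moments(4.0, 0.5)  # mean 2.0, variance 3.0
```

With a fixed λ the variance would equal the mean; the extra `k * theta**2` term is exactly the contribution of the uncertainty in λ, which is the point of the compounded model.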

  8. Large Historical Earthquakes and Tsunami Hazards in the Western Mediterranean: Source Characteristics and Modelling

    NASA Astrophysics Data System (ADS)

    Harbi, Assia; Meghraoui, Mustapha; Belabbes, Samir; Maouche, Said

    2010-05-01

    The western Mediterranean region was the site of numerous large earthquakes in the past. Most of these earthquakes are located at the East-West trending Africa-Eurasia plate boundary and along the coastline of North Africa. The most recent recorded tsunamigenic earthquake occurred in 2003 at Zemmouri-Boumerdes (Mw 6.8) and generated a ~2-m-high tsunami wave. The destructive wave affected the Balearic Islands and Almeria in southern Spain and Carloforte in southern Sardinia (Italy). The earthquake provided a unique opportunity to gather instrumental records of seismic waves and tide gauges in the western Mediterranean. A database that includes a historical catalogue of main events, seismic sources and related fault parameters was prepared in order to assess the tsunami hazard of this region. In addition to the analysis of the 2003 records, we study the 1790 Oran and 1856 Jijel historical tsunamigenic earthquakes (Io = IX and X, respectively), which provide detailed observations on the heights and extension of past tsunamis and damage in coastal zones. We performed the modelling of wave propagation using the NAMI-DANCE code and tested different fault sources against synthetic tide gauges. We observe that the characteristics of the seismic sources control the size and directivity of tsunami wave propagation on both the northern and southern coasts of the western Mediterranean.

  9. Earthquake and Flood Risk Assessments for Europe and Central Asia

    NASA Astrophysics Data System (ADS)

    Murnane, R. J.; Daniell, J. E.; Ward, P.; Winsemius, H.; Tijssen, A.; Toro, J.

    2015-12-01

    We report on a flood and earthquake risk assessment for 32 countries in Europe and Central Asia with a focus on how current flood and earthquake risk might evolve in the future due to changes in climate, population, and GDP. The future hazard and exposure conditions used for the risk assessment are consistent with selected IPCC AR5 Representative Concentration Pathways (RCPs) and Shared Socioeconomic Pathways (SSPs). Estimates of 2030 and 2080 population and GDP are derived using the IMAGE model forced by the socioeconomic conditions associated with the SSPs. Flood risk is modeled using the probabilistic GLOFRIS global flood risk modeling cascade, which starts with meteorological fields derived from reanalysis data or climate models. For 2030 and 2080 climate conditions, the meteorological fields are generated from five climate models forced by the RCP4.5 and RCP8.5 scenarios. Future flood risk is estimated using population and GDP exposures consistent with the SSP2 and SSP3 scenarios. Population and GDP are defined as being affected by a flood when a grid cell receives any depth of flood inundation. The earthquake hazard is quantified using a 10,000-year stochastic catalog of over 15.8 million synthetic earthquake events of at least magnitude 5. Ground motion prediction and estimates of local site conditions are used to determine PGA. Future earthquake risk is estimated using population and GDP exposures consistent with all five SSPs. Population and GDP are defined as being affected by an earthquake when a grid cell experiences ground motion equaling or exceeding MMI VI. For most countries, changes in exposure alter flood risk to a greater extent than changes in climate. For both flood and earthquake, the spread in risk grows over time. The methodology involves large uncertainties, and the results are not meant to be definitive; instead, they will be used to initiate discussions with governments regarding efforts to manage disaster risk.
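The grid-cell counting rule described above (population in a cell counts as affected when shaking meets or exceeds MMI VI) can be sketched as a simple threshold sum. The PGA threshold below is an illustrative stand-in for MMI VI; actual PGA-to-intensity conversions vary by study and region, and the study's own conversion is not given in the abstract.

```python
def affected_population(pop_grid, pga_grid, pga_threshold_g=0.12):
    """Sum population in grid cells whose PGA meets or exceeds a threshold.

    The 0.12 g default is only an illustrative proxy for MMI VI.
    """
    total = 0.0
    for pop, pga in zip(pop_grid, pga_grid):
        if pga >= pga_threshold_g:
            total += pop
    return total

# Three cells: only the last two reach the threshold.
n = affected_population([1000, 2000, 500], [0.05, 0.20, 0.15])  # 2500.0
```

The flood counterpart is the same loop with the condition "any nonzero inundation depth" in place of the PGA threshold.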

  10. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    USGS Publications Warehouse

    Thenhaus, P.C.

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-1812 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in U.S. history. The large population currently living within the damage area of these earthquakes means that widespread destruction and loss of life is likely if the sequence were repeated. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern U.S. (in 1755 (Cape Ann, Mass.), 1811-12 (New Madrid, Mo.), 1886 (Charleston, S.C.), and 1897 (Giles County, Va.)) are generally regarded as only historical phenomena (fig. 1). The social memory of these earthquakes no longer exists. A fundamental problem in the Eastern U.S., therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the "tools" that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term "earthquake risk," on the other hand, refers to aspects of the expected damage to manmade structures and to lifelines as a result of the earthquake hazard.

  11. Hazard maps of earthquake induced permanent displacements validated by site numerical simulation

    NASA Astrophysics Data System (ADS)

    Vessia, Giovanna; Pisano, Luca; Parise, Mario; Tromba, Giuseppe

    2016-04-01

    Hazard maps of seismically induced instability at the urban scale can be drawn by means of GIS spatial interpolation tools starting from (1) a digital terrain model (DTM) and (2) geological and geotechnical hydro-mechanical site characterization. These maps are commonly related to a fixed return period of the natural phenomenon under study, or to a particular hazard scenario from the most significant past events. The maps can be used to guide planning activity as well as emergency actions, but their main limit is that typically no reliability analysis is performed. Spatial variability and uncertainties in subsoil properties, poor description of geomorphological evidence of active instability, and geometrical approximations and simplifications in DTMs, among others, can all make the maps inaccurate. In this study, a possible method is proposed to control and increase the overall reliability of a hazard scenario map for earthquake-induced slope instability. The procedure can be summarized as follows: (1) GIS statistical tools are used to improve the spatial distribution of the hydro-mechanical properties of the surface lithologies; (2) hazard maps are drawn from the preceding information layer on both groundwater and mechanical properties of surficial deposits, combined with seismic parameters propagated by means of ground motion prediction equations; (3) point numerical stability analyses carried out by means of the finite element method (e.g. Geostudio 2004) are performed to anchor the hazard map predictions to point quantitative analyses. These numerical analyses are used to generate a conversion scale from urban-scale to point estimates in terms of permanent displacements. Although this conversion scale differs from case to case, it can be suggested as a general method to convert the results of large-scale map analyses to site hazard assessment.
In this study, the procedure is applied to the urban area of Castelfranci (Avellino province
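Point estimates of earthquake-induced permanent displacement of the kind discussed above are conventionally rooted in a Newmark rigid-block calculation. The sketch below is a minimal illustration driven by a synthetic square acceleration pulse, not the study's finite element analysis; the pulse amplitude and critical acceleration are illustrative numbers.

```python
def newmark_displacement(accel_g, dt, a_crit_g, g=9.81):
    """Simplified Newmark rigid-block sliding displacement (metres).

    The block starts sliding when ground acceleration exceeds the
    critical acceleration a_crit_g, and while sliding its relative
    velocity changes at (a - a_crit_g) * g.
    """
    v = 0.0  # relative sliding velocity (m/s)
    d = 0.0  # accumulated permanent displacement (m)
    for a in accel_g:
        if a > a_crit_g or v > 0.0:
            v += (a - a_crit_g) * g * dt
            v = max(v, 0.0)  # the block never slides backwards
            d += v * dt
    return d

# Synthetic square pulse: 0.4 g for 0.5 s against a 0.2 g critical acceleration.
pulse = [0.4] * 500
d = newmark_displacement(pulse, 0.001, 0.2)  # about 0.25 m
```

For a constant pulse this matches the analytic result d = (a - ac) * g * t**2 / 2, which is a convenient check before feeding in a real accelerogram.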

  12. Multi-hazards risk assessment at different levels

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2012-04-01

    Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to these events increase annually, which is clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of integrated natural and technological risk assessment and mapping at different levels [1, 2]. At the country level, the most hazardous natural processes which may result in fatalities, injuries and economic loss in the Russian Federation are considered: earthquakes, landslides, mud flows, floods, storms and avalanches. A special GIS environment for the country's territory was developed which includes information about hazard levels and recurrence, impact databases for the last 20 years, as well as models for estimating damage and casualties caused by these hazards. Federal maps of seismic individual and collective risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment taking into account secondary accidents at fire, explosion and chemical hazardous facilities, and of regional integrated risk assessment, are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations due to scenario earthquakes taking into account accidents triggered by strong events at critical facilities: fire and chemical hazardous facilities, including oil pipeline routes located in earthquake-prone areas. The estimates of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures aimed at saving lives and protecting property against future disastrous events. The

  13. Earthquake Hazard and Risk in Sub-Saharan Africa: current status of the Global Earthquake model (GEM) initiative in the region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay; Midzi, Vunganai; Ateba, Bekoa; Mulabisana, Thifhelimbilu; Marimira, Kwangwari; Hlatywayo, Dumisani J.; Akpan, Ofonime; Amponsah, Paulina; Georges, Tuluka M.; Durrheim, Ray

    2013-04-01

    Large magnitude earthquakes have been observed in Sub-Saharan Africa in the recent past, such as the Machaze event of 2006 (Mw 7.0) in Mozambique and the 2009 Karonga earthquake (Mw 6.2) in Malawi. The December 13, 1910 earthquake (Ms = 7.3) in the Rukwa rift (Tanzania) is the largest of all instrumentally recorded events known to have occurred in East Africa. The overall earthquake hazard in the region is on the lower side compared to other earthquake-prone areas of the globe. However, the risk level is high enough to deserve the attention of African governments and the donor community. The latest earthquake hazard map for sub-Saharan Africa was produced in 1999 and updating is long overdue, as development activity in the construction industry is booming all over sub-Saharan Africa. To this effect, regional seismologists are working together under the GEM (Global Earthquake Model) framework to improve incomplete, inhomogeneous and uncertain catalogues. The working group is also contributing to the UNESCO-IGCP (SIDA) 601 project and assessing all possible sources of data for the catalogue, as well as for the seismotectonic characteristics that will help to develop a reasonable hazard model for the region. From the current progress, it is noted that the region is more seismically active than previously thought. This demands a coordinated effort by the regional experts to systematically compile all available information for a better output, so as to mitigate earthquake risk in sub-Saharan Africa.

  14. Regional liquefaction hazard evaluation following the 2010-2011 Christchurch (New Zealand) earthquake sequence

    NASA Astrophysics Data System (ADS)

    Begg, John; Brackley, Hannah; Irwin, Marion; Grant, Helen; Berryman, Kelvin; Dellow, Grant; Scott, David; Jones, Katie; Barrell, David; Lee, Julie; Townsend, Dougal; Jacka, Mike; Harwood, Nick; McCahon, Ian; Christensen, Steve

    2013-04-01

    Following the damaging 4 Sept 2010 Mw7.1 Darfield Earthquake, the 22 Feb 2011 Christchurch Earthquake and subsequent damaging aftershocks, we completed a liquefaction hazard evaluation for c. 2700 km2 of the coastal Canterbury region. Its purpose was to distinguish at a regional scale areas of land that, in the event of strong ground shaking, may be susceptible to damaging liquefaction from areas where damaging liquefaction is unlikely. This information will be used by local government for defining liquefaction-related geotechnical investigation requirements for consent applications. Following a review of historic records of liquefaction and existing liquefaction assessment maps, we undertook comprehensive new work that included: a geologic context from existing geologic maps; geomorphic mapping using LiDAR and integrating existing soil map data; compilation of lithological data for the surficial 10 m from an extensive drillhole database; modelling of depth to unconfined groundwater from existing subsurface and surface water data. Integrating and honouring all these sources of information, we mapped areas underlain by materials susceptible to liquefaction (liquefaction-prone lithologies present, or likely, in the near-surface, with shallow unconfined groundwater) from areas unlikely to suffer widespread liquefaction damage. Comparison of this work with more detailed liquefaction susceptibility assessment based on closely spaced geotechnical probes in Christchurch City provides a level of confidence in these results. We tested our susceptibility map by assigning a matrix of liquefaction susceptibility rankings to lithologies recorded in drillhole logs and local groundwater depths, then applying peak ground accelerations for four earthquake scenarios from the regional probabilistic seismic hazard model (25 year return = 0.13g; 100 year return = 0.22g; 500 year return = 0.38g and 2500 year return = 0.6g). 
Our mapped boundary between liquefaction-prone areas and areas
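The scenario test described above (susceptibility rankings checked against scenario peak ground accelerations) can be sketched as a simple lookup. The susceptibility ranks and the PGA triggering thresholds below are hypothetical placeholders, not the values used in the Canterbury study; only the four scenario PGAs are taken from the abstract.

```python
# Scenario PGAs (g) by return period (years), from the regional
# probabilistic seismic hazard model quoted in the abstract.
SCENARIO_PGA_G = {25: 0.13, 100: 0.22, 500: 0.38, 2500: 0.60}

# Hypothetical minimum PGA (g) needed to trigger liquefaction for each
# susceptibility rank (higher susceptibility = lower trigger).
TRIGGER_PGA_G = {"low": 0.50, "moderate": 0.25, "high": 0.12}

def liquefaction_scenarios(susceptibility):
    """Return the scenario return periods (years) for which a site of
    the given susceptibility rank is predicted to liquefy."""
    threshold = TRIGGER_PGA_G[susceptibility]
    return sorted(rp for rp, pga in SCENARIO_PGA_G.items() if pga >= threshold)

periods = liquefaction_scenarios("high")  # [25, 100, 500, 2500]
```

In the real assessment the rank itself comes from drillhole lithology and local groundwater depth rather than a single label, but the scenario comparison has this shape.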

  15. Comprehensive seismic hazard assessment of Tripura and Mizoram states

    NASA Astrophysics Data System (ADS)

    Sitharam, T. G.; Sil, Arjun

    2014-06-01

    Northeast India is one of the most seismically active regions in the world, with on average more than seven earthquakes of magnitude 5.0 and above per year. Reliable seismic hazard assessment could provide the necessary design inputs for earthquake-resistant design of structures in this region. In this study, deterministic as well as probabilistic methods have been attempted for seismic hazard assessment of Tripura and Mizoram states at bedrock level condition. An updated earthquake catalogue was collected from various national and international seismological agencies for the period from 1731 to 2011. The homogenization, declustering and data completeness analysis of events were carried out before hazard evaluation. Seismicity parameters were estimated using the G-R relationship for each source zone. Based on the seismicity, tectonic features and fault rupture mechanism, this region was divided into six major subzones. Region-specific correlations were used for magnitude conversion for homogenization of earthquake size. Ground motion equations (Atkinson and Boore 2003; Gupta 2010) were validated against observed PGA (peak ground acceleration) values before use in the hazard evaluation. In this study, the hazard is estimated using linear sources identified in and around the study area. Results are presented in the form of PGA using both DSHA (deterministic seismic hazard analysis) and PSHA (probabilistic seismic hazard analysis) with 2 and 10% probability of exceedance in 50 years, and spectral acceleration (T = 0.2 s, 1.0 s) for both states (2% probability of exceedance in 50 years). The results provide inputs for planning risk reduction strategies, developing risk acceptance criteria and financial analysis of possible damage in the study area, with a comprehensive analysis and higher-resolution hazard mapping.
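Seismicity parameters from the G-R (Gutenberg-Richter) relationship, as estimated per source zone above, are commonly obtained with the Aki/Utsu maximum-likelihood b-value formula. The sketch below runs it on a synthetic catalogue (not this study's data); the completeness magnitude and true b-value are illustrative.

```python
import math
import random

def gr_b_value(mags, m_min, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for magnitudes >= m_min.

    dm is the magnitude binning width; dm/2 corrects for binning
    (pass dm=0.0 for continuous magnitudes).
    """
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

# Synthetic catalogue: exponentially distributed magnitudes above
# completeness magnitude 4.0, with a true b-value of 1.0.
random.seed(42)
beta = 1.0 * math.log(10.0)
catalogue = [4.0 + random.expovariate(beta) for _ in range(5000)]
b_hat = gr_b_value(catalogue, 4.0, dm=0.0)  # close to 1.0
```

The same per-zone b and activity-rate estimates are the inputs the abstract refers to when it says seismicity parameters were estimated for each source zone before hazard evaluation.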

  16. A New Proposal for Tsunami Hazard Map Explicitly Indicating Uncertainty of Tsunami Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.; Suppasri, A.; Imamura, F.

    2014-12-01

    The tsunami caused by the 2011 Great East Japan Earthquake inundated the Tohoku coastal areas, in most cases exceeding the inundation areas specified in tsunami hazard maps. A report by the Japanese government for the IAEA Ministerial Conference on Nuclear Safety clearly stated that it is difficult to quantitatively assess natural disaster risk associated with a rare event such as a tsunami because of uncertainty, and that sufficient efforts have not been made so far to enhance public confidence in the risk assessment by explicitly indicating the uncertainty of the assessment. Based on that statement, we propose a new method for explicitly indicating the uncertainty of tsunami hazard assessment in the tsunami hazard map. Firstly, we estimated stochastic wave heights along the Tohoku coastal areas using a method for probabilistic tsunami hazard assessment in order to quantitatively assess the uncertainty of coastal wave heights. We selected eleven earthquake source areas along the Japan trench as the areas that could generate tsunamis. Secondly, in order to calculate the tsunami inundation area due to the average coastal wave height for one return period, we identified the earthquake fault that generates the target wave height and conducted numerical simulation using non-linear long-wave equations with its fault parameters as input. On the other hand, in order to calculate the tsunami inundation area due to a fractile coastal wave height that accounts for the uncertainty of the assessment, we generated a hypothetical earthquake fault whose dislocation was uniformly increased or decreased by a constant factor according to the change of each fractile wave height, and conducted numerical simulation in the same way. As a result, there were large differences among the tsunami inundation areas due to the 0.05 fractile, simple average and 0.95 fractile wave heights at coastal points, even though the assumed wave heights correspond to a single target return period.
A preliminary
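The uniform slip scaling described above amounts to multiplying the fault dislocation by the ratio of the fractile wave height to the average height. A minimal sketch, with illustrative numbers and assuming wave height scales roughly linearly with slip (the justification for uniform rescaling):

```python
def scale_fault_slip(base_slip_m, mean_height_m, fractile_height_m):
    """Uniformly rescale fault dislocation so the simulated coastal
    wave height matches a target fractile height, assuming wave
    height scales roughly linearly with slip."""
    factor = fractile_height_m / mean_height_m
    return base_slip_m * factor

# Illustrative: 4 m of slip producing a 3 m mean coastal height,
# rescaled to a hypothetical 0.95-fractile height of 4.5 m.
slip_95 = scale_fault_slip(4.0, 3.0, 4.5)  # 6.0 m
```

Each rescaled fault is then re-run through the non-linear long-wave simulation, so the linearity assumption only sets the slip, not the resulting inundation.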

  17. Tsunami hazard warning and risk prediction based on inaccurate earthquake source parameters

    NASA Astrophysics Data System (ADS)

    Goda, Katsuichiro; Abilova, Kamilla

    2016-02-01

    This study investigates the issues related to underestimation of the earthquake source parameters in the context of tsunami early warning and tsunami risk assessment. The magnitude of a very large event may be underestimated significantly during the early stage of the disaster, resulting in the issuance of incorrect tsunami warnings. Tsunamigenic events in the Tohoku region of Japan, where the 2011 tsunami occurred, are focused on as a case study to illustrate the significance of the problems. The effects of biases in the estimated earthquake magnitude on tsunami loss are investigated using a rigorous probabilistic tsunami loss calculation tool that can be applied to a range of earthquake magnitudes by accounting for uncertainties of earthquake source parameters (e.g., geometry, mean slip, and spatial slip distribution). The quantitative tsunami loss results provide valuable insights regarding the importance of deriving accurate seismic information as well as the potential biases of the anticipated tsunami consequences. Finally, the usefulness of rigorous tsunami risk assessment is discussed in defining critical hazard scenarios based on the potential consequences due to tsunami disasters.

  18. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    U.S. Geological Survey

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  19. Tsunami Hazard Assessment in Guam

    NASA Astrophysics Data System (ADS)

    Arcas, D.; Uslu, B.; Titov, V.; Chamberlin, C.

    2008-12-01

    The island of Guam is located approximately 1500 miles south of Japan, in the vicinity of the Mariana Trench. It is surrounded in close proximity by three subduction zones (Nankai-Taiwan, East Philippines and Mariana Trench) that pose a considerable near- to intermediate-field tsunami threat. Tsunami catalogues list 14 tsunamigenic earthquakes with Mw ≥ 8.0 in this region alone since 1900 (Soloviev and Go, 1974; Lander, 1993; Iida, 1984; Lander and Lowell, 2002); however, the island has not been significantly affected by some of the largest far-field events of the past century, such as the 1952 Kamchatka, 1960 Chile, and 1964 Great Alaska earthquakes. An assessment of the tsunami threat to the island from both near- and far-field sources, using forecast tools originally developed at NOAA's Pacific Marine Environmental Laboratory (PMEL) for real-time forecasting of tsunamis, is presented here. Tide gauge records from the 1952 Kamchatka, 1964 Alaska, and 1960 Chile earthquakes at Apra Harbor are used to validate our model setup and to explain the limited impact of these historical events on Guam. Identification of worst-case scenarios and determination of tsunamigenic effective source regions are presented for five vulnerable locations on the island via a tsunami sensitivity study. Apra Harbor is the site of a National Ocean Service (NOS) tide gauge and the biggest harbor on the island. Tumon Bay, Pago Bay, Agana Bay and Inarajan Bay are densely populated areas that require careful investigation. The sensitivity study shows that earthquakes from the Eastern Philippines present a major threat to west-facing sites, whereas the Mariana Trench poses the biggest concern to east-facing sites.

  20. A Quantitative Appraisal of Earthquake Hazard Parameters Evaluated from Bayesian Approach for Different Regions in Iranian Plateau

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Türker, Tügba; Bayrak, Yusuf

    2016-06-01

    In this study, we used the program for Bayesian estimation of seismic hazard elaborated by Alexey Lyubushin. Our study is the next in a sequence of applications of this software to seismic hazard assessment in different regions of the world. The earthquake hazard parameters of maximum regional magnitude (Mmax), β value and seismic activity rate or intensity (λ), together with their uncertainties, have been evaluated for 15 different source regions in the Iranian Plateau with the help of a complete and homogeneous earthquake catalogue covering the period 1900-2014 with Mw ≥ 4.0. The estimated Mmax values vary between 6.25 and 8.37; the lowest value is observed in the Zagros foredeep, whereas the highest is observed in the Makran. It is also observed that there is a strong relationship between the maximum earthquake magnitudes estimated by the Bayesian approach and the maximum observed magnitudes. Moreover, quantiles of the distributions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50 and 100 years are calculated with confidence limits for probability levels of 50, 70 and 90 % in the 15 source regions. The computed earthquake hazard parameters identify the most seismically active regions of the Iranian Plateau. The Makran and East Iran show earthquake magnitudes greater than 8.0 in the next 100 years at the 90 % probability level, which indicates that these regions are more susceptible to the occurrence of large earthquakes than the others. The outcomes obtained in this study may have useful implications for probabilistic seismic hazard studies of the Iranian Plateau.

  2. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-06-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. The region has experienced several moderate-to-large earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013), with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variations in tectonics and seismicity characteristics. The seismicity parameters estimated for each of these source zones are the input variables for the seismic hazard estimation. The seismic hazard analysis of the study region has been performed by dividing the area into grid cells of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level for probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at bedrock level, and it is observed that the seismic hazard estimates show significant local variation, in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
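
    The correspondence between probability of exceedance in 50 years and return period quoted above follows from the Poisson assumption, T = -t / ln(1 - PE); a minimal check (the 0.5% level evaluates to about 9975 years, conventionally rounded to 10,000):

```python
import math

def return_period(pe, t_years=50.0):
    """Return period (years) of the ground-motion level that has
    probability of exceedance `pe` over `t_years`, assuming Poisson
    occurrence: T = -t / ln(1 - PE)."""
    return -t_years / math.log(1.0 - pe)

print(round(return_period(0.10)))   # 475
print(round(return_period(0.02)))   # 2475
print(round(return_period(0.005)))  # 9975
```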

  4. Susceptibility assessment of earthquake-triggered landslide in Wenchuan

    NASA Astrophysics Data System (ADS)

    Tao, Shu; Hu, Deyong; Zhao, Wenji

    2010-11-01

    The Ms 8.0 Wenchuan earthquake, which occurred on 12 May 2008 in Sichuan Province, collapsed a great many houses and injured hundreds of thousands of people. Secondary earthquake-triggered landslides can be expected to draw much attention for a long time after the earthquake because of the severe geological hazard they pose. In order to mitigate the threat from these secondary disasters effectively, this study used remote sensing and GIS techniques to generate susceptibility maps, taking Wenchuan County as a case study. Seven factors controlling landslide occurrence have been taken into account in the susceptibility assessment: elevation, slope, aspect, lithology, seismic intensity, and distance to faults and rivers. Based on the probability of landslide occurrence, calculated separately with the information value method and with logistic regression, the study zone was ultimately categorized into five classes: "extremely low", "low", "moderate", "high" and "very high". These results have been shown to closely reflect the spatial distribution of landslides in the study area.
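
    The information value method mentioned above scores each factor class by comparing its landslide density with the study-area average; a minimal sketch with invented counts (not the Wenchuan data):

```python
import math

def information_value(landslide_cells, total_cells,
                      landslide_in_class, class_cells):
    """Information value of one factor class: log of the ratio between
    the landslide density inside the class and the density over the
    whole study area. Positive values mark landslide-prone classes."""
    class_density = landslide_in_class / class_cells
    overall_density = landslide_cells / total_cells
    return math.log(class_density / overall_density)

# Invented counts for a single slope class (illustration only):
iv = information_value(landslide_cells=500, total_cells=100_000,
                       landslide_in_class=120, class_cells=8_000)
```

    Summing the information values of a cell's classes across all seven factors gives a per-cell susceptibility score, which can then be binned (e.g. by quantiles) into the five classes.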

  6. Tsunami Hazard in Crescent City, California from Kuril Islands earthquakes

    NASA Astrophysics Data System (ADS)

    Dengler, L.; Uslu, B.; Barberopoulou, A.

    2007-12-01

    On November 15, 2006, Crescent City in Del Norte County, California was hit by a series of tsunami surges generated by the M = 8.3 Kuril Islands earthquake, causing an estimated 9.7 million US dollars in damage to the small boat basin. This was the first significant tsunami loss on US territory since the 1964 Alaska tsunami. The damage occurred nearly 8 hours after the official tsunami alert bulletins had been cancelled. The tsunami caused no flooding and did not exceed the ambient high tide level. All of the damage was caused by strong currents, estimated at 12 to 15 knots, which pinned the floating docks against the pilings and forced water to flow over them. The event highlighted problems in warning criteria and communications for a marginal event with the potential for only localized impacts, the vulnerability of harbors to even a relatively modest tsunami, and the particular exposure of the Crescent City harbor area to tsunamis. It also illustrated local officials' poor understanding of the duration of tsunami hazard. As a result of the November tsunami, interim changes were made by WCATWC to address localized hazards in areas like Crescent City. On January 13, 2007, when a M = 8.1 earthquake occurred in the Kuril Islands, a formal procedure was in place for hourly conference calls between WCATWC, California State Office of Emergency Services officials, local Weather Service offices and local emergency officials, significantly improving the decision making and communication among federal, state and local officials. Kuril Islands tsunamis are relatively common at Crescent City: since 1963, five tsunamis generated by Kuril Islands earthquakes have been recorded on the Crescent City tide gauge, two with amplitudes greater than 0.5 m. We use the MOST model to simulate the 2006, 2007 and 1994 events and to examine the difference between damaging and non-damaging events at Crescent City. Small changes in the angle of the rupture zone can result

  7. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    NASA Astrophysics Data System (ADS)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, and journalists. This initiative, organized in a few weeks, received a very large response, partly because the media had highlighted the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction versus probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take the day off and leave town or stay in public parks), we contributed to reducing this feeling, and therefore the social cost, of this strange Roman day. Another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach Laboratory, the Graphics Laboratory and SissaMedialab. P.S. No large earthquake happened.

  8. The OPAL Project: Open source Procedure for Assessment of Loss using Global Earthquake Modelling software

    NASA Astrophysics Data System (ADS)

    Daniell, James

    2010-05-01

    This paper provides a comparison between Earthquake Loss Estimation (ELE) software packages and their application using an "Open source Procedure for Assessment of Loss using Global Earthquake Modelling software" (OPAL). The OPAL procedure has been developed to provide a framework for optimisation of a Global Earthquake Modelling process through: 1) an overview of current and new components of earthquake loss assessment (vulnerability, hazard, exposure, specific cost and technology); 2) preliminary research, acquisition of, and familiarisation with, all available ELE software packages; 3) assessment of these 30+ software packages in order to identify the advantages and disadvantages of the ELE methods used; and 4) loss analysis for a deterministic earthquake (Mw 7.2) for the Zeytinburnu district, Istanbul, Turkey, applying 3 software packages (2 new and 1 existing): a modified displacement-based method based on DBELA (Displacement-Based Earthquake Loss Assessment), the capacity-spectrum-based method HAZUS (HAZards United States), and the Norwegian HAZUS-based SELENA (SEismic Loss EstimatioN using a logic tree Approach) software, which was adapted in order to compare the different processes needed for the production of damage, economic and social loss estimates. The modified DBELA procedure was found to be more computationally expensive yet had less variability, indicating the need for multi-tier approaches to global earthquake loss estimation. Similar systems planning, and ELE software produced through the OPAL procedure, can be applied worldwide, given exposure data. Keywords: OPAL, displacement-based, DBELA, earthquake loss estimation, earthquake loss assessment, open source, HAZUS

  9. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of new open-source seismic hazard and risk calculation software called the OpenQuake-engine (http://globalquakemodel.org). In this communication we will provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of ongoing initiatives like the creation of a suite of tools for the preparation of PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.

  10. Initial steps to the early warning systems in Bulgaria - earthquakes, tsunamis, marine hazards

    NASA Astrophysics Data System (ADS)

    Ranguelov, Boyko

    2013-04-01

    Several projects on early warning systems (EWS) in Bulgaria are presented, some in the execution phase and some in preparation and assessment. These projects are being executed in Bulgaria with wide participation of international teams. The projects' parameters (partners involved, coordinators, main tasks and objectives, duration, intended equipment, specific objectives and beneficiaries) are discussed, and the progress of the projects is presented. The projects, by acronym, are: MARINEGEOHAZARDS (mainly focused on marine hazards in the Black Sea: earthquakes and tsunamis); DACEA (on the possibility of warning Bulgaria and Romania of earthquakes from the Vrancea seismic source); ESNET (on supporting decision makers in case of earthquakes and other coastal hazards); and SIMORA (on a local strong-ground-motion monitoring system and its relevance to EWS). A comparative study of the level of reliability and security, as well as some legislative issues, is under investigation. The websites and other dissemination tools (newsletters, etc.) are also presented. The goal is to show the specific objectives, their effective execution, and the research support provided to society and decision makers.

  11. Earthquake and tsunami hazard in West Sumatra: integrating science, outreach, and local stakeholder needs

    NASA Astrophysics Data System (ADS)

    McCaughey, J.; Lubis, A. M.; Huang, Z.; Yao, Y.; Hill, E. M.; Eriksson, S.; Sieh, K.

    2012-04-01

    The Earth Observatory of Singapore (EOS) is building partnerships with local-to-provincial government agencies, NGOs, and educators in West Sumatra to inform their policymaking, disaster-risk-reduction, and education efforts. Geodetic and paleoseismic studies show that an earthquake as large as M 8.8 is likely sometime in the coming decades on the Mentawai patch of the Sunda megathrust. This earthquake and its tsunami would be devastating for the Mentawai Islands and neighboring areas of the western Sumatra coast. The low-lying coastal Sumatran city of Padang (pop. ~800,000) has been the object of many research and outreach efforts, especially since 2004. Padang experienced deadly earthquakes in 2007 and 2009 that, though tragedies in their own right, also served as wake-up calls for a larger earthquake to come. However, significant barriers to linking science to policy remain: extant hazard information is sometimes contradictory or confusing for non-scientists, while turnover of agency leadership and staff means that, in the words of one local advocate, "we keep having to start from zero." Both better hazard knowledge and major infrastructure changes are necessary for risk reduction in Padang. In contrast, the small, isolated villages of the outlying Mentawai Islands have received relatively few outreach efforts, yet many villages have the potential for timely evacuation with existing infrastructure; there, knowledge alone can go far toward risk reduction. The tragic October 2010 Mentawai tsunami has inspired further disaster-risk-reduction work by local stakeholders. In both locations, we are engaging policymakers and local NGOs, providing science to help inform their work. Through these outreach contacts, the Mentawai government requested that we produce the first-ever tsunami hazard map for their islands, which aligns well with scientific interests at EOS. We will work with the Mentawai government on the presentation and explanation of the hazard map, as

  12. Bayesian network learning for natural hazard assessments

    NASA Astrophysics Data System (ADS)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) proceedings. Within the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks to diverse natural hazard and vulnerability studies. The great potential of Bayesian networks has already been shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow us to learn about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
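
    As a toy illustration of the inference described above (e.g. the effect of precautionary measures on damage), consider a three-node discrete network with invented probabilities; the conditional of interest is obtained by summing out the remaining variable:

```python
# Tiny discrete Bayesian network: Hazard -> Damage <- Precaution.
# All probabilities are invented for illustration.
p_hazard = {True: 0.1, False: 0.9}   # P(hazard occurs in a year)
p_damage = {                         # P(damage | hazard, precaution)
    (True, True): 0.3, (True, False): 0.8,
    (False, True): 0.01, (False, False): 0.05,
}

def p_damage_given(precaution):
    """P(damage | precaution), inferred by summing out the hazard node."""
    return sum(p_hazard[h] * p_damage[(h, precaution)]
               for h in (True, False))

# Damage reduction attributable to the precautionary measure:
reduction = p_damage_given(False) - p_damage_given(True)
```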

  13. Challenges in assessing seismic hazard in intraplate Europe

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Liu, Mian; Camelbeeck, Thierry; Merino, Miguel; Landgraf, Angela; Hintersberger, Esther; Kübler, Simon

    2016-04-01

    Intraplate seismicity is often characterized by episodic, clustered and migrating earthquakes and extended aftershock sequences. Can these observations - primarily from North America, China and Australia - usefully be applied to seismic hazard assessment for intraplate Europe? Existing assessments are based on instrumental and historical seismicity of the past c. 1000 years, as well as some data for active faults. This time span probably fails to capture typical large-event recurrence intervals, which are of the order of tens of thousands of years. Palaeoseismology helps to lengthen the observation window, but preferentially produces data in regions already suspected to be seismically active. Thus the expected maximum magnitudes of future earthquakes are fairly uncertain, possibly underestimated, and earthquakes are likely to occur in unexpected locations. These issues particularly arise in considering the hazards posed by low-probability events to both heavily populated areas and critical facilities. For example, are the variations in seismicity (and thus assumed seismic hazard) along the Rhine Graben a result of short sampling, or are they real? In addition to better assessment of hazards with new data and models, it is important to recognize and communicate the uncertainties in hazard estimates. The more users know about how much confidence to place in hazard maps, the more effectively the maps can be used.
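
    The sampling problem described above can be quantified: under a simple Poisson model, a fault whose large events recur on average every 10,000 years has a high chance of showing no event at all in a c. 1000-year catalogue (the numbers are illustrative):

```python
import math

def p_quiet(t_window, t_recurrence):
    """Poisson probability of observing zero events during t_window
    years on a source with mean recurrence interval t_recurrence."""
    return math.exp(-t_window / t_recurrence)

# A 1000-year catalogue versus a 10,000-year recurrence interval:
print(round(p_quiet(1000, 10_000), 2))  # 0.9
```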

  14. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    SciTech Connect

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper, a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is presented. Tehran is the capital and most populous city of Iran and, from economic, political and social points of view, its most significant city. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of the city in terms of Arias intensity is useful. Iso-intensity contour maps of Tehran, based on different attenuation relationships and for different return periods, are plotted. Maps of iso-intensity points in the Tehran region are presented using attenuation relationships appropriate for rock and soil sites, for the two hazard levels of 10% and 2% probability of exceedance in 50 years. Seismicity parameters are calculated, using two methods, on the basis of historical and instrumental earthquakes for a period extending from the 4th century BC to the present; an earthquake catalogue within a radius of 200 km around Tehran has been used. The SEISRISK III software has been employed. The effects of different parameters, such as the seismicity parameters, fault rupture length relationships and attenuation relationships, are considered using a logic tree.
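
    Arias intensity, the hazard parameter used in this study, is defined as I_a = (π/2g) ∫ a(t)² dt over the duration of the record. A minimal numerical sketch (the constant accelerogram is synthetic, not Tehran data):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def arias_intensity(accel, dt):
    """Arias intensity I_a = pi/(2g) * integral of a(t)^2 dt, with the
    integral evaluated by the trapezoidal rule. `accel` is in m/s^2,
    `dt` in seconds; the result is in m/s."""
    integral = sum((accel[i] ** 2 + accel[i + 1] ** 2) / 2.0 * dt
                   for i in range(len(accel) - 1))
    return math.pi / (2.0 * G) * integral

# Synthetic 10 s record at a constant 1 m/s^2 (illustration only):
ia = arias_intensity([1.0] * 1001, dt=0.01)
```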

  15. Sedimentary Basins: A Deeper Look at Seattle and Portland's Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Thompson, M.; Frankel, A. D.; Wirth, E. A.; Vidale, J. E.; Han, J.

    2015-12-01

    to assess the shaking hazards for Portland due to local earthquakes and great earthquakes on the CSZ.

  16. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    NASA Astrophysics Data System (ADS)

    Nyst, M.

    2012-12-01

    The Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch, New Zealand, earthquakes and the aftershock activity that followed completely changed the existing view of earthquake hazard in the Christchurch area. Not only have several faults been added to the New Zealand fault database; the main shocks were also followed by significant, still ongoing increases in seismicity due to high aftershock activity throughout the Christchurch region. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to examine their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return-period loss levels, and then reserve the amount of money needed to cover that loss level, their so-called capacity. This component of risk management is not very sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because the secondary uncertainties are taken into account when calculating the exceedance probability, even the longer-return-period losses can still be significantly affected by the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate over all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting annual premiums. By annualizing the expected losses
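
    The two metrics discussed, AAL and the loss-exceedance curve, reduce to simple sums over the stochastic event set. A minimal sketch with an invented three-event set (time-dependent aftershock behavior would enter by replacing the long-term annual rates):

```python
# Hypothetical stochastic event set: (annual rate, loss) pairs.
event_set = [(0.01, 5.0e6), (0.004, 2.0e7), (0.0004, 1.5e8)]

def average_annual_loss(events):
    """AAL: sum of rate * loss over the whole event set."""
    return sum(rate * loss for rate, loss in events)

def exceedance_rate(events, threshold):
    """Annual rate of events whose loss exceeds the threshold; the
    return period of that loss level is its reciprocal."""
    return sum(rate for rate, loss in events if loss > threshold)

aal = average_annual_loss(event_set)
rp_100m = 1.0 / exceedance_rate(event_set, 1.0e8)  # return period, years
```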

  17. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layouts of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed, and the effect of differing structural resistances of process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than an order of magnitude higher than that associated with internal failure causes. The critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment with a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes. PMID:18657068
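
    Equipment vulnerability models of the kind compared in this study are commonly expressed as lognormal fragility curves; a sketch of how such a curve combines with the seismic event frequency to give a release frequency (all parameter values are invented, not the paper's fitted models):

```python
import math

def fragility(pga, median_pga, beta):
    """Lognormal fragility curve: probability that the equipment
    reaches the reference damage state at peak ground acceleration
    `pga` (in g), given median capacity `median_pga` and lognormal
    standard deviation `beta`."""
    z = math.log(pga / median_pga) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Frequency of a seismically induced release at one shaking level:
f_event = 1.0e-3                       # seismic events per year
f_release = f_event * fragility(pga=0.3, median_pga=0.5, beta=0.6)
```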

  18. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, have shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the processes of seismic wave generation and propagation, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to PSHA. The advanced approach considered in this study, the neo-deterministic seismic hazard assessment (NDSHA), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full-waveform modelling, based on the ability to efficiently compute synthetic seismograms in complex, laterally heterogeneous, anelastic media. In this way a set of ground motion scenarios can be defined at both the national and local scales, the latter accounting for the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale, even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily

  19. Monogenetic volcanic hazards and assessment

    NASA Astrophysics Data System (ADS)

    Connor, C.; Connor, L. J.; Richardson, J. A.

    2012-12-01

    Many of the Earth's major cities are built on the products of monogenetic volcanic eruptions and within geologically active basaltic volcanic fields. These cities include Mexico City (Mexico), Auckland (New Zealand), Melbourne (Australia), and Portland (USA), to name a few. Volcanic hazards in these areas are complex, and involve the potential formation of new volcanic vents and associated hazards such as lava flows, tephra fallout, and ballistic impacts. Hazard assessment is complicated by the low recurrence rate of volcanism in most volcanic fields. We have developed a two-stage process for probabilistic modeling of monogenetic volcanic hazards. The first stage is an estimation of the possible locations of future eruptive vents, based on kernel density estimation, and of the recurrence rate of volcanism, using Monte Carlo simulation and accounting for uncertainties in age determinations. The second stage is convolution of this spatial density / recurrence rate model with hazard codes that model lava inundation, tephra fallout, and ballistic impacts. A methodology using this two-stage approach is presented to estimate lava flow hazard in several monogenetic volcanic fields, including a nuclear power plant site near the Shamiram Plateau, a Quaternary volcanic field in Armenia. The locations of possible future vents are determined by estimating spatial density from a distribution of 18 mapped vents using a 2-D elliptical Gaussian kernel function. The SAMSE method, a modified asymptotic mean squared error approach, uses the distribution of known eruptive vents to optimally determine a smoothing bandwidth for the Gaussian kernel function. The result is a probability map of vent density. A large random sample (N = 10,000) of vent locations is drawn from this probability map, and for each randomly sampled vent location a lava flow inundation model is executed. Lava flow input parameters (volume and average thickness) are determined from distributions fit to field observations of the low
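
    The first stage described above, turning mapped vents into a spatial density, can be sketched with an isotropic 2-D Gaussian kernel (the paper's SAMSE bandwidth selection and elliptical kernel are not reproduced; the vent coordinates are invented):

```python
import math

def vent_density(x, y, vents, h):
    """Kernel density estimate of vent spatial density at (x, y) from
    mapped vent locations, using an isotropic 2-D Gaussian kernel
    with bandwidth h. Units follow the input coordinates (e.g. km)."""
    norm = 1.0 / (2.0 * math.pi * h * h * len(vents))
    return norm * sum(math.exp(-((x - vx) ** 2 + (y - vy) ** 2) / (2.0 * h * h))
                      for vx, vy in vents)

# Invented vent coordinates in km (not the Shamiram Plateau data):
vents = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.5)]
d = vent_density(1.0, 0.5, vents, h=1.0)
```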

  20. Neo-deterministic seismic hazard assessment in North Africa

    NASA Astrophysics Data System (ADS)

    Mourabit, T.; Abou Elenean, K. M.; Ayadi, A.; Benouar, D.; Ben Suleman, A.; Bezzeghoud, M.; Cheddadi, A.; Chourak, M.; ElGabry, M. N.; Harbi, A.; Hfaiedh, M.; Hussein, H. M.; Kacem, J.; Ksentini, A.; Jabour, N.; Magrin, A.; Maouche, S.; Meghraoui, M.; Ousadou, F.; Panza, G. F.; Peresan, A.; Romdhane, N.; Vaccari, F.; Zuccolo, E.

    2014-04-01

    North Africa is one of the most earthquake-prone areas of the Mediterranean. Many devastating earthquakes, some of them tsunami-triggering, have inflicted heavy loss of life and considerable economic damage on the region. In order to mitigate the destructive impact of earthquakes, the regional seismic hazard in North Africa is assessed using the neo-deterministic, multi-scenario methodology (NDSHA), based on the computation of synthetic seismograms with the modal summation technique at a regular grid of 0.2° × 0.2°. This is the first study aimed at producing NDSHA maps of North Africa covering five countries: Morocco, Algeria, Tunisia, Libya, and Egypt. The key input data for the NDSHA algorithm are earthquake sources, seismotectonic zonation, and structural models. In preparing the input data, it was essential to go beyond national borders and to adopt a coherent strategy over the whole area. Thanks to the collaborative efforts of the teams involved, it was possible to properly merge the earthquake catalogues available for each country and to define, with homogeneous criteria, the seismogenic zones, the characteristic focal mechanism associated with each of them, and the structural models used to model wave propagation from the sources to the sites. As a result, reliable seismic hazard maps are produced in terms of maximum displacement (Dmax), maximum velocity (Vmax), and design ground acceleration.

  1. Earthquake stress triggers, stress shadows, and seismic hazard

    USGS Publications Warehouse

    Harris, R.A.

    2000-01-01

    Many aspects of earthquake mechanics remain an enigma at the beginning of the twenty-first century. One potential bright spot is the realization that simple calculations of stress changes may explain some earthquake interactions, just as previous and ongoing studies of stress changes have begun to explain human-induced seismicity. This paper, an update of an earlier review by Harris, reviews many published works and presents a compilation of quantitative earthquake-interaction studies from a stress-change perspective. This synthesis supplies some clues about certain aspects of earthquake mechanics. It also demonstrates that much work remains to be done before we have a complete story of how earthquakes work.
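
    The stress-change calculations compiled in such studies typically resolve the change in Coulomb failure stress on receiver faults, dCFS = d_tau + mu_eff * d_sigma_n. A minimal sketch of that formula, with purely hypothetical stress values (not taken from the paper):

```python
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Change in Coulomb failure stress on a receiver fault:
        dCFS = d_tau + mu_eff * d_sigma_n
    d_tau: shear stress change in the slip direction (MPa)
    d_sigma_n: normal stress change, unclamping positive (MPa)
    mu_eff: effective friction coefficient (a common choice is ~0.4)
    Positive dCFS brings the fault closer to failure."""
    return d_tau + mu_eff * d_sigma_n

# Hypothetical stress changes resolved on a nearby receiver fault:
dcfs = coulomb_stress_change(d_tau=0.05, d_sigma_n=0.02)
print(f"dCFS = {dcfs:.3f} MPa ({'promotes' if dcfs > 0 else 'inhibits'} failure)")
```

    Regions where dCFS is negative are the "stress shadows" of the title: there, a prior earthquake is expected to delay subsequent events.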

  2. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented become too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at

  3. Earthquake Hazards to Domestic Water Distribution Systems in Salt Lake County, Utah

    USGS Publications Warehouse

    Highland, Lynn M.

    1985-01-01

    A magnitude-7.5 earthquake occurring along the central portion of the Wasatch Fault, Utah, may cause significant damage to Salt Lake County's domestic water system. This system is composed of water treatment plants, aqueducts, distribution mains, and other facilities that are vulnerable to ground shaking, liquefaction, fault movement, and slope failures. Recent investigations into surface faulting, landslide potential, and earthquake intensity provide basic data for evaluating the potential earthquake hazards to water-distribution systems in the event of a large earthquake. Water supply system components may be vulnerable to one or more earthquake-related effects, depending on site geology and topography. Case studies of water-system damage by recent large earthquakes in Utah and in other regions of the United States offer valuable insights in evaluating water system vulnerability to earthquakes.

  4. NGNP Site 2 Hazards Assessment

    SciTech Connect

    Wayne Moe

    2011-10-01

    The Next Generation Nuclear Plant (NGNP) Project, initiated at Idaho National Laboratory (INL) by the U.S. Department of Energy pursuant to the 2005 Energy Policy Act, is based on research and development activities supported by the Generation IV Nuclear Energy Systems Initiative. The principal objective of the NGNP Project is to support commercialization of the high temperature gas-cooled reactor (HTGR) technology. The HTGR is a helium-cooled and graphite-moderated reactor that can operate at temperatures much higher than those of conventional light water reactor (LWR) technologies. Accordingly, it can be applied in many industrial applications as a substitute for burning fossil fuels, such as natural gas, to generate process heat in addition to producing electricity, which is the principal application of current LWRs. Nuclear energy in the form of LWRs has been used in the U.S. and internationally principally for the generation of electricity. However, because the HTGR operates at higher temperatures than LWRs, it can be used to displace the use of fossil fuels in many industrial applications. It also provides a carbon emission-free energy supply. For example, the energy needs for the recovery and refining of petroleum, for the petrochemical industry, and for the production of transportation fuels and feedstocks using coal conversion processes require process heat at temperatures approaching 800 °C. This temperature range is readily achieved by the HTGR technology. This report summarizes a site assessment authorized by INL under the NGNP Project to determine hazards and potential challenges that site owners and HTGR designers need to be aware of when developing the HTGR design for co-location at industrial facilities, and to evaluate the site for suitability considering certain site characteristics.
The objectives of the NGNP site hazard assessments are to do an initial screening of representative sites in order to identify potential challenges and restraints

  5. Hazards assessment for the Hazardous Waste Storage Facility

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency.

  6. ELER software - a new tool for urban earthquake loss assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. 
SELENA v4.0 (Molina et al., 2008) and

  7. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but these efforts have largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional, and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the tsunami hazard is highest along the west coast of Sumatra, the south coast of Java, and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height at the coast of >0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba), and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok, and north Papua, and 0.1-1% for north Sulawesi, Seram, and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
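
    The Monte Carlo workflow this kind of PTHA rests on can be sketched in miniature: sample a long synthetic catalogue from a source recurrence model, convert each event to a coastal tsunami height, and count exceedances of a threshold. The Gutenberg-Richter parameters, the occurrence rate, and the toy height-versus-magnitude relation below are all illustrative assumptions, not values from the study (a real PTHA runs a hydrodynamic model per source):

```python
import math
import random

def sample_magnitude(rng, b=1.0, m_min=7.0, m_max=9.5):
    """Inverse-transform sample from a doubly truncated Gutenberg-Richter law."""
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    return m_min - math.log(1.0 - rng.random() * c) / beta

def coastal_height(magnitude):
    """Toy stand-in for tsunami generation and propagation:
    height simply grows exponentially with magnitude."""
    return 0.05 * math.exp(1.2 * (magnitude - 7.0))

def exceedance_rate(threshold_m, annual_rate=0.1, sim_years=200_000, seed=1):
    """Annual rate of coastal tsunami height > threshold_m, estimated by
    counting exceedances in a long synthetic catalogue (one Bernoulli
    trial per year approximates Poisson occurrence for small rates)."""
    rng = random.Random(seed)
    n_events = sum(1 for _ in range(sim_years) if rng.random() < annual_rate)
    hits = sum(1 for _ in range(n_events)
               if coastal_height(sample_magnitude(rng)) > threshold_m)
    return hits / sim_years

for h in (0.5, 3.0):
    print(f"rate of height > {h} m: {exceedance_rate(h):.2e} per year")
```

    The logic-tree treatment mentioned in the abstract would repeat this simulation over alternative source and propagation models and combine the resulting hazard curves with branch weights.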

  8. Tectonic Origin of the 1899 Yakutat Bay Earthquakes, Alaska, and Insights into Future Hazards

    NASA Astrophysics Data System (ADS)

    Gulick, S. S.; LeVoir, M. A.; Haeussler, P. J.; Saustrup, S.

    2012-12-01

    On September 10, 1899, the largest of four earthquakes (Mw 8.2) that struck southeast Alaska that year produced a 6 m tsunami and may have produced as much as 14 m of co-seismic uplift. This earthquake had an epicenter somewhere near Yakutat or Disenchantment Bays. These bays lie at the transition between the Fairweather Fault (the Pacific-North American strike-slip plate boundary) and the Yakutat Terrane-North American subduction zone. The deformation front of this subduction zone is thought to include the eastern fault in the Pamplona Zone offshore, the Malaspina Fault onshore, and the Esker Creek Fault near Yakutat Bay. The 10 September 1899 event could have taken place on a Yakutat-North American megathrust that daylights in Yakutat or Disenchantment Bay. Alternatively, it could have originated from the Fairweather-Boundary and Yakutat faults, transpressive components of the Fairweather strike-slip system present in the Yakutat Bay region, or from thrusting along the Yakutat and Otemaloi Faults on the southeast flank of Yakutat Bay. Characterizing fault slip during the Alaskan earthquakes of 1899 is vital to assessing both subduction zone structure and seismic hazards in the Yakutat Bay area. Each possible fault model has a different implication for modern hazards. These results will be used to update seismic hazard and fault maps and to assess future risk to Yakutat Bay and the surrounding communities. During August 6-17 we anticipate acquiring high-resolution marine multichannel seismic data aboard the USGS vessel Alaskan Gyre in Yakutat and Disenchantment Bays to search for evidence of recent faulting and directly test these competing theories for the 10 September 1899 event. This survey uses the University of Texas Institute for Geophysics' mini-GI gun, 24-channel seismic streamer, portable seismic compressor system, and associated gun control and data acquisition system to acquire the data. The profiles have a nominal common

  9. A fractal approach to probabilistic seismic hazard assessment

    NASA Technical Reports Server (NTRS)

    Turcotte, D. L.

    1989-01-01

    The definition of a fractal distribution is that the number of objects (events) N with a characteristic size greater than r satisfies the relation N ∝ r^(-D), where D is the fractal dimension. The applicability of a fractal relation implies that the underlying physical process is scale-invariant over the range of applicability of the relation. The empirical frequency-magnitude relation for earthquakes defining a b-value is a fractal relation with D = 2b. Accepting the fractal distribution, the level of regional seismicity can be related to the rate of regional strain and the magnitude of the largest characteristic earthquake. High levels of seismic activity indicate either a large regional strain or a low-magnitude maximum characteristic earthquake (or both). If the regional seismicity has a weak time dependence, the approach can be used to make probabilistic seismic hazard assessments.
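
    The D = 2b correspondence ties the fractal dimension directly to the Gutenberg-Richter b-value of a catalogue. As an illustration, the standard Aki maximum-likelihood estimator (an assumption here, not necessarily the estimator used in the paper) applied to a hypothetical catalogue:

```python
import math

def b_value_mle(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value for a catalogue complete
    above magnitude m_c: b = log10(e) / (mean(M) - m_c)."""
    mags = [m for m in magnitudes if m >= m_c]
    return math.log10(math.e) / (sum(mags) / len(mags) - m_c)

# Hypothetical catalogue of magnitudes above completeness m_c = 3.0:
catalog = [3.0, 3.1, 3.3, 3.2, 3.6, 4.0, 3.4, 3.8, 4.5, 3.1, 3.5, 3.2]
b = b_value_mle(catalog, m_c=3.0)
print(f"b = {b:.2f}, fractal dimension D = 2b = {2 * b:.2f}")
```

    A b-value near 1, typical of tectonic regions, thus corresponds to a fractal dimension near 2 for the rupture-area distribution.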

  10. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  11. Seismic hazard assessment in Central Asia using smoothed seismicity approaches

    NASA Astrophysics Data System (ADS)

    Ullah, Shahid; Bindi, Dino; Zuccolo, Elisa; Mikhailova, Natalia; Danciu, Laurentiu; Parolai, Stefano

    2014-05-01

    Central Asia has a long history of frequent moderate-to-large seismicity and is therefore considered one of the most seismically active, high-hazard regions in the world. In the global-scale hazard map produced by the GSHAP project (Giardini, 1999), Central Asia is characterized by peak ground accelerations for a 475-year return period as high as 4.8 m/s2. Central Asia was therefore selected as a target area for the EMCA project (Earthquake Model Central Asia), a regional project of GEM (Global Earthquake Model). In the framework of EMCA, a new generation of seismic hazard maps is foreseen in terms of macroseismic intensity, in turn to be used to obtain seismic risk maps for the region. To this end, an Intensity Prediction Equation (IPE) has been developed for the region based on the distribution of intensity data for earthquakes that occurred in Central Asia since the end of the 19th century (Bindi et al., 2011). The same observed intensity distribution has been used to assess the seismic hazard following the site approach (Bindi et al., 2012). In this study, we present a probabilistic seismic hazard assessment of Central Asia in terms of MSK-64 intensity based on two kernel estimation methods. We consider the smoothed seismicity approaches of Frankel (1995), modified to use the adaptive kernel proposed by Stock and Smith (2002), and of Woo (1996), modified to consider a grid of sites and to estimate a separate bandwidth for each site. Activity rate maps from the Frankel approach are shown, illustrating the effects of fixed and adaptive kernels. The hazard is estimated for rock site conditions based on a 10% probability of exceedance in 50 years. A maximum intensity of about 9 is observed in the Hindu Kush region.
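
    The fixed-kernel smoothing of Frankel (1995) spreads each grid cell's earthquake count over neighbouring cells with a Gaussian of correlation distance c. A minimal 1-D sketch of the idea (a count-preserving variant; the cell size and correlation distance are hypothetical, and the adaptive kernels of Stock and Smith vary the bandwidth with local event density):

```python
import math

def frankel_smooth(counts, cell_km=10.0, c_km=50.0):
    """Gaussian-kernel smoothing of gridded earthquake counts (1-D,
    count-preserving variant of the Frankel, 1995, scheme): each cell's
    count is redistributed over the grid with weights exp(-d^2 / c^2),
    where d is the inter-cell distance and c the correlation distance."""
    n = len(counts)
    smoothed = [0.0] * n
    for j, n_j in enumerate(counts):
        w = [math.exp(-(((i - j) * cell_km) ** 2) / c_km ** 2) for i in range(n)]
        total = sum(w)
        for i in range(n):
            smoothed[i] += n_j * w[i] / total
    return smoothed

# A cluster of 10 events in one cell of an otherwise quiet profile:
rates = frankel_smooth([0, 0, 0, 10, 0, 0, 0])
print([round(r, 2) for r in rates])
```

    The smoothing leaks activity into quiet cells without changing the total rate, which is exactly what makes the resulting activity-rate maps usable where the catalogue is sparse.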

  12. The 2012 Ferrara seismic sequence: Regional crustal structure, earthquake sources, and seismic hazard

    NASA Astrophysics Data System (ADS)

    Malagnini, Luca; Herrmann, Robert B.; Munafò, Irene; Buttinelli, Mauro; Anselmi, Mario; Akinci, Aybige; Boschi, E.

    2012-10-01

    Inadequate seismic design codes can be dangerous, particularly when they underestimate the true hazard. In this study we use data from a sequence of moderate-sized earthquakes in northeast Italy to validate and test a regional wave propagation model which, in turn, is used to understand some weaknesses of the current design spectra. Our velocity model, while regionalized and somewhat ad hoc, is consistent with geophysical observations and the local geology. In the 0.02-0.1 Hz band, this model is validated by using it to calculate moment tensor solutions of 20 earthquakes (5.6 ≥ MW ≥ 3.2) in the 2012 Ferrara, Italy, seismic sequence. The seismic spectra observed for the relatively small main shock significantly exceeded the design spectra to be used in the area for critical structures. Observations and synthetics reveal that the ground motions are dominated by long-duration surface waves, which, apparently, the design codes do not adequately anticipate. In light of our results, the present seismic hazard assessment in the entire Pianura Padana, including the city of Milan, needs to be re-evaluated.

  13. Role of WEGENER (World Earthquake GEodesy Network for Environmental Hazard Research) in monitoring natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Zerbini, S.; Bastos, M. L.; Becker, M. H.; Meghraoui, M.; Reilinger, R. E.

    2013-12-01

    anthropogenic climate change (sea level, ice degradation). In addition, expanded applications of space geodesy to atmospheric studies will remain a major focus, with emphasis on ionospheric and tropospheric monitoring to support forecasting extreme events. Towards these ends, we will encourage and foster interdisciplinary, integrated initiatives to develop a range of case studies for these critical problems. Geological studies are needed to extend geodetic deformation studies to geologic time scales, and new modeling approaches will facilitate full exploitation of expanding geodetic databases. In light of this new focus, the WEGENER acronym now represents 'World Earthquake GEodesy Network for Environmental Hazard Research'.

  14. Challenges in Assessing Seismic Hazard in Intraplate Europe

    NASA Astrophysics Data System (ADS)

    Hintersberger, E.; Kuebler, S.; Landgraf, A.; Stein, S. A.

    2014-12-01

    Intraplate regions are often characterized by scattered, clustered, and migrating seismicity and by the occurrence of low-strain areas next to high-strain ones. Increasing evidence for large paleoearthquakes in such regions, together with population growth and the development of critical facilities, calls for better assessments of earthquake hazards. Existing seismic hazard assessment for intraplate Europe is based on the instrumental and historical seismicity of the past 1000 years, as well as some active fault data. These observations face important limitations due to the quantity and quality of the available databases. Even considering the long record of historical events in some populated areas of Europe, this time span of a thousand years likely fails to capture some faults' typical large-event recurrence intervals, which are on the order of tens of thousands of years. Paleoseismology helps lengthen the observation window, but only produces point measurements, and preferentially in regions suspected to be seismically active. As a result, the expected maximum magnitudes of future earthquakes are quite uncertain and likely to be underestimated, and earthquakes are likely to occur in unexpected locations. These issues arise in particular in the heavily populated Rhine Graben and Vienna Basin areas, and in considering the hazard posed by low-probability events to critical facilities like nuclear power plants.

  15. Multiple-site estimations in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Sokolov, Vladimir; Ismail-Zadeh, Alik

    2016-04-01

    We analyze specific features of multiple-site probabilistic seismic hazard assessment (PSHA), i.e. the annual rate at which a ground-motion level is exceeded at at least one of several sites of interest located within an area or along an extended linear object. The relation between multiple-site hazard estimates and the strong ground-motion records obtained during the 2008 Wenchuan (China) Mw 7.9 earthquake is discussed. These records may be considered an example of ground motion exceeding the design level estimated using classical point-wise PSHA. We show that the multiple-site hazard (MSH) assessment, when performed for the standard return period of 475 years, provides reasonable estimates of the ground motions that may occur during an earthquake whose parameters are close to the maximum possible events accepted in PSHA for the region. Thus MSH may be useful in estimating the maximum considered earthquake ground motion for a territory, taking its spatial extent into account.
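
    The core multiple-site quantity can be sketched under a strong simplifying assumption: if exceedances at different sites are treated as independent Poisson processes, the annual probability of exceedance at at least one site follows directly from the per-site rates. All numbers are hypothetical, and real MSH analysis must additionally account for the correlation of shaking across sites from a single earthquake, which lowers the combined rate:

```python
import math

def multisite_exceedance_prob(site_rates):
    """Annual probability of exceeding the design ground-motion level at
    AT LEAST one site, treating per-site exceedances as independent
    Poisson processes (an upper-bound simplification: correlated shaking
    from a single event reduces the combined rate)."""
    p_no_exceedance = 1.0
    for lam in site_rates:
        p_no_exceedance *= math.exp(-lam)  # P(no exceedance at this site)
    return 1.0 - p_no_exceedance

# Ten sites along a lifeline, each designed to the 475-year point-wise level:
p = multisite_exceedance_prob([1.0 / 475.0] * 10)
print(f"annual P(exceedance at >= 1 site) = {p:.4f}")
```

    This is why an extended object can see its design level exceeded somewhere far more often than the nominal 475-year return period of each individual site suggests.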

  16. The Diversity of Large Earthquakes and Its Implications for Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2014-05-01

    With the advent of broadband seismology and GPS, significant diversity in the source radiation spectra of large earthquakes has been clearly demonstrated. This diversity requires different approaches to mitigate hazards. In certain tectonic environments, seismologists can forecast the future occurrence of large earthquakes within a solid scientific framework using the results from seismology and GPS. Such forecasts are critically important for long-term hazard mitigation practices, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainty, and unexpected events will continue to surprise seismologists. Recent developments in real-time seismology will help seismologists to cope with and prepare for tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard mitigation practices.

  17. Earthquake Damage Assessment Using Very High Resolution Satelliteimagery

    NASA Astrophysics Data System (ADS)

    Chiroiu, L.; André, G.; Bahoken, F.; Guillande, R.

    Various studies using satellite imagery have been carried out in recent years to assess natural hazard damage, most of them analyzing floods, hurricanes, or landslides. For earthquakes, the medium or small spatial resolution data available in the recent past did not allow reliable identification of damage, because the elements of interest (e.g. buildings or other structures) are too small compared with the pixel size. Recent progress in remote sensing, in terms of spatial resolution and data processing, makes reliable detection of damage to the elements at risk possible. Remote sensing techniques applied to IKONOS (1 m resolution) and IRS (5 m resolution) imagery were used to evaluate seismic vulnerability and post-earthquake damage. A fast estimation of losses was performed using a multidisciplinary approach based on earthquake engineering and geospatial analysis. The results, integrated into a GIS database, could be transferred via satellite networks to rescue teams deployed in the affected zone, in order to better coordinate emergency operations. The methodology was applied to the cities of Bhuj and Anjar after the 2001 Gujarat (India) earthquake.

  18. Probabilistic Seismic Hazard assessment for Sultanate of Oman

    NASA Astrophysics Data System (ADS)

    El Hussain, I. W.; Deif, A.; El-Hady, S.; Toksoz, M. N.; Al-Jabri, K.; Al-Hashmi, S.; Al-Toubi, K. I.; Al-Shijbi, Y.; Al-Saifi, M.

    2010-12-01

    Seismic hazard assessment for Oman is conducted using a probabilistic approach. The Probabilistic Seismic Hazard Assessment (PSHA) has been performed within a logic-tree framework. An earthquake catalogue for Oman was compiled and declustered to include only independent earthquakes. The declustered catalogue was used to define a seismotectonic source model with 26 source zones that characterize earthquakes in the tectonic environments in and around Oman. The recurrence parameters for all seismogenic zones were determined using the doubly bounded exponential distribution, except those of the Makran subduction zone, which were modeled using the characteristic distribution. The maximum earthquakes on known faults were determined geologically; those for the remaining zones were determined statistically from the compiled catalogue. Horizontal ground accelerations in terms of the geometric mean were calculated using ground-motion prediction relationships developed from seismic data obtained from shallow active environments, stable craton environments, and subduction earthquakes. In this analysis, we used alternative seismotectonic source models, maximum magnitudes, and attenuation models and weighted them to account for epistemic uncertainty. The application of this methodology leads to 5%-damped seismic hazard maps at rock sites for 72-, 475-, and 2475-year return periods for spectral accelerations at periods of 0.0 (corresponding to peak ground acceleration), 0.1, 0.2, 0.3, 1.0, and 2.0 sec. Mean and 84th-percentile acceleration contour maps are presented. The results are also displayed as uniform hazard spectra for rock sites in the cities of Khasab, Diba, Sohar, Muscat, Nizwa, Sur, and Salalah in Oman and the cities of Abu Dhabi and Dubai in the UAE. The PGA across Oman ranges from 20 cm/sec2 in the mid-west to 115 cm/sec2 in the northern part for the 475-year return period, and between 40 cm/sec2 and 180 cm/sec2 for 2475 years
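
    The logic-tree handling of epistemic uncertainty described here amounts, at its simplest, to a weighted combination of the hazard curves produced by each branch. A minimal sketch with entirely hypothetical branch rates and weights:

```python
def weighted_hazard_curve(branch_curves, weights):
    """Weighted-mean hazard curve over logic-tree branches.
    branch_curves: per-branch lists of annual exceedance rates at common
    ground-motion levels; weights: branch weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    n_levels = len(branch_curves[0])
    return [sum(w * curve[k] for w, curve in zip(weights, branch_curves))
            for k in range(n_levels)]

# Three hypothetical branches (e.g. alternative attenuation models) giving
# annual rates of exceeding PGA levels of 0.1 g, 0.2 g, and 0.4 g:
branches = [[2e-2, 4e-3, 5e-4],
            [1e-2, 2e-3, 2e-4],
            [3e-2, 6e-3, 8e-4]]
weights = [0.5, 0.3, 0.2]
print(weighted_hazard_curve(branches, weights))
```

    Percentile maps such as the 84th-percentile results mentioned above come from the distribution of the branch curves themselves rather than from this weighted mean.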

  19. Damage-consistent hazard assessment - the revival of intensities

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2016-04-01

    Proposed keynote speech (introduction of session). Current civil engineering standards for residential buildings in many countries are based on (frequently probabilistic) seismic hazard assessments that use ground-motion parameters such as peak ground acceleration or pseudo-displacement as hazard parameters. This approach has its roots in the still widespread force-based design of structures, which uses simplified methods such as linear response spectra in combination with equivalent static force procedures. In engineering practice this has led to problems, because it is not economical to design structures against the maximum forces of earthquakes; furthermore, a completely linear-elastic response of structures is seldom required. Different types of reduction factors (performance-dependent response factors) considering, for example, overstrength, structural redundancy, and structural ductility have been developed in different countries to compensate for the use of simplified and conservative design methods. The practical consequence is that both the methods used in engineering and the outputs of hazard assessment studies are poorly related to the physics of damage. Reliable predictions of the response of structures under earthquake loading are not feasible with such simplified design methods. Depending on the type of structure, damage may be controlled by hazard parameters other than ground acceleration. Furthermore, a realistic risk assessment has to be based on reliable predictions of damage, which is crucial for effective decision-making. This opens the way for a return to intensities as the key output parameter of seismic hazard assessment. Site intensities (e.g. EMS-98) correlate very well with the damage of structures. They can easily be converted into the required set of engineering parameters or even directly into earthquake time histories suitable for structural analysis

  20. Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software (OPAL)

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.

    2011-07-01

    This paper provides a comparison of Earthquake Loss Estimation (ELE) software packages and their application using an "Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software" (OPAL). The OPAL procedure was created to provide a framework for optimisation of a Global Earthquake Modelling process through: 1. an overview of current and new components of earthquake loss assessment (vulnerability, hazard, exposure, specific cost, and technology); 2. preliminary research, acquisition, and familiarisation with available ELE software packages; 3. assessment of these software packages to identify the advantages and disadvantages of the ELE methods used; and 4. loss analysis for a deterministic earthquake (Mw = 7.2) in the Zeytinburnu district, Istanbul, Turkey, applying three software packages (two new and one existing): a modified displacement-based method based on DBELA (Displacement Based Earthquake Loss Assessment, Crowley et al., 2006), the capacity spectrum based method HAZUS (HAZards United States, FEMA, USA, 2003), and the Norwegian HAZUS-based SELENA (SEismic Loss EstimatioN using a logic tree Approach, Lindholm et al., 2007), which was adapted in order to compare the different processes needed to produce damage, economic, and social loss estimates. The modified DBELA procedure was found to be more computationally expensive yet had less variability, indicating the need for multi-tier approaches to global earthquake loss estimation. Given exposure data, similar systems planning and ELE software produced through the OPAL procedure can be applied worldwide.

  1. Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)

    SciTech Connect

    Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.

    2010-09-24

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that existing non-seismic NPH assessments are consistent with current design methodology and site specific data.

  2. Satellite remote sensing of earthquake, volcano, flood, landslide and coastal inundation hazards

    NASA Astrophysics Data System (ADS)

    Tralli, David M.; Blom, Ronald G.; Zlotnicki, Victor; Donnellan, Andrea; Evans, Diane L.

    Satellite remote sensing is providing a systematic, synoptic framework for advancing scientific knowledge of the Earth as a complex system of geophysical phenomena that, directly and through interacting processes, often lead to natural hazards. Improved and integrated measurements along with numerical modeling are enabling a greater understanding of where and when a particular hazard event is most likely to occur and result in significant socioeconomic impact. Geospatial information products derived from this research increasingly are addressing the operational requirements of decision support systems used by policy makers, emergency managers and responders from international and federal to regional, state and local jurisdictions. This forms the basis for comprehensive risk assessments and better-informed mitigation planning, disaster assessment and response prioritization. Space-based geodetic measurements of the solid Earth with the Global Positioning System, for example, combined with ground-based seismological measurements, are yielding the principal data for modeling lithospheric processes and for accurately estimating the distribution of potentially damaging strong ground motions which is critical for earthquake engineering applications. Moreover, integrated with interferometric synthetic aperture radar, these measurements provide spatially continuous observations of deformation with sub-centimeter accuracy. Seismic and in situ monitoring, geodetic measurements, high-resolution digital elevation models (e.g. from InSAR, Lidar and digital photogrammetry) and imaging spectroscopy (e.g. using ASTER, MODIS and Hyperion) are contributing significantly to volcanic hazard risk assessment, with the potential to aid land use planning in developing countries where the impact of volcanic hazards to populations and lifelines is continually increasing. Remotely sensed data play an integral role in reconstructing the recent history of the land surface and in predicting

  3. Assessing volcanic hazards with Vhub

    NASA Astrophysics Data System (ADS)

    Palma, J. L.; Charbonnier, S.; Courtland, L.; Valentine, G.; Connor, C.; Connor, L.

    2012-04-01

    Vhub (online at vhub.org) is a virtual organization and community cyberinfrastructure designed for collaboration in volcanology research, education, and outreach. One of the core objectives of this project is to accelerate the transfer of research tools to organizations and stakeholders charged with volcano hazard and risk mitigation (such as volcano observatories). Vhub offers a clearinghouse for computational models of volcanic processes and data analysis, documentation of those models, and capabilities for online collaborative groups focused on issues such as code development, configuration management, benchmarking, and validation. Vhub supports computer simulations and numerical modeling at two levels: (1) some models can be executed online via Vhub, without needing to download code and compile on the user's local machine; (2) other models are not available for online execution but for offline use on the user's computer. Vhub also has wikis, blogs and group functions around specific topics to encourage collaboration, communication and discussion. Some of the simulation tools currently available to Vhub users are: Energy Cone (rapid delineation of the impact zone by pyroclastic density currents), Tephra2 (tephra dispersion forecast tool), Bent (atmospheric plume analysis), Hazmap (simulate sedimentation of volcanic particles) and TITAN2D (mass flow simulation tool). The list of online simulations available on Vhub is expected to expand considerably as the volcanological community becomes more involved in the project. This presentation focuses on the implementation of online simulation tools, and other Vhub features, for assessing volcanic hazards following approaches similar to those reported in the literature. Attention is drawn to the minimum computational resources needed by the user to carry out such analyses, and to the tools and media provided to facilitate the effective use of Vhub's infrastructure for hazard and risk assessment. Currently the project

  4. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Hazard assessment. 850.21 Section 850.21 Energy DEPARTMENT... assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  5. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Hazard assessment. 850.21 Section 850.21 Energy DEPARTMENT... assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  6. Statistical analysis of time-dependent earthquake occurrence and its impact on hazard in the low seismicity region Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Hainzl, Sebastian; Scherbaum, Frank; Beauval, Céline

    2007-11-01

    The time-dependence of earthquake occurrence is mostly ignored in standard seismic hazard assessment even though earthquake clustering is well known. In this work, we attempt to quantify the impact of more realistic dynamics on seismic hazard estimations. We include the time and space dependences between earthquakes in the hazard analysis via Monte Carlo simulations. Our target region is the Lower Rhine Embayment, a low-seismicity area in Germany. Including aftershock sequences by using the epidemic-type aftershock-sequence (ETAS) model, we find that on average the hypothesis of uncorrelated random earthquake activity underestimates the hazard by 5-10 per cent. Furthermore, we show that aftershock activity of past large earthquakes can locally increase the hazard even centuries later. We also analyse the impact of the so-called long-term behaviour, assuming a quasi-periodic occurrence of main events on a major fault in that region. We find that a significant impact on hazard is expected only in the special case of a very regular recurrence of the main shocks.
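
The ETAS model referenced above defines the earthquake rate as a constant background plus Omori-Utsu-type contributions from every past event. A minimal sketch of the conditional intensity, with illustrative parameter values (not those calibrated for the Lower Rhine Embayment):

```python
import math

def etas_rate(t, history, mu=0.1, K=0.02, alpha=1.5, c=0.01, p=1.1, m0=3.0):
    """ETAS conditional intensity lambda(t): background rate mu plus an
    Omori-Utsu aftershock term triggered by each past event (t_i, m_i).
    Parameter values here are illustrative placeholders."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            # Productivity grows exponentially with magnitude above m0;
            # the temporal decay follows the modified Omori law.
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# Two past events (time in days, magnitude); the rate shortly after the
# M5.5 shock is strongly elevated above the background and then decays.
history = [(0.0, 4.2), (10.0, 5.5)]
rate_soon = etas_rate(10.5, history)
rate_late = etas_rate(200.0, history)
```

In a Monte Carlo hazard run of the kind described in the abstract, synthetic catalogues are drawn from this rate and each catalogue is fed through the ground-motion and exceedance calculation.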

  7. Hazards assessment for the INEL Landfill Complex

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-02-01

    This report documents the hazards assessment for the INEL Landfill Complex (LC) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and the DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes the hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding the LC, the buildings and structures at the LC, and the processes that are used at the LC are described in this report. All hazardous materials, both radiological and nonradiological, at the LC were identified and screened against threshold quantities according to DOE Order 5500.3A guidance. Asbestos at the Asbestos Pit was the only hazardous material that exceeded its specified threshold quantity. However, the type of asbestos received and the packaging practices used are believed to limit the potential for an airborne release of asbestos fibers. Therefore, in accordance with DOE Order 5500.3A guidance, no further hazardous material characterization or analysis was required for this hazards assessment.

  8. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    USGS Publications Warehouse

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. This progress report summarizes the

  9. Natural Hazard Assessment and Communication in the Central United States

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Lynch, M. J.

    2009-12-01

    In the central United States, natural hazards such as floods, tornados, ice storms, droughts, and earthquakes result in significant damage and loss of life every year. For example, the February 5-6, 2008 tornado outbreak touched down in nine states (Alabama, Arkansas, Illinois, Indiana, Kentucky, Mississippi, Missouri, and Tennessee), killing 57, injuring 350, and causing more than $1.0 billion in damages. The January 2009 ice storm struck Arkansas, Illinois, Indiana, Kentucky, Missouri, Ohio, Tennessee, and West Virginia, killing 36 and causing more than $1.0 billion in damages. It is a great challenge for society to develop an effective policy for mitigating these natural hazards in the central United States. The development of an effective policy, however, starts with a good assessment of the natural hazards, and scientists play a key role in that assessment; they therefore also play an important role in the development of effective mitigation policy. It is critical for scientists to clearly define, quantify, and communicate hazard assessments to end-users, including the associated uncertainties, which are a key factor in policy decision making; otherwise, end-users will have difficulty understanding and using the information provided. For example, ground motion hazard maps with 2, 5, and 10 percent probabilities of exceedance (PE) in 50 years in the central United States have been produced for seismic hazard mitigation purposes. End-users have had difficulty understanding and using the maps, however, which has led to either indecision or ineffective policy for seismic hazard mitigation in many communities in the central United States.
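
The 2, 5, and 10 percent probabilities of exceedance (PE) in 50 years cited above translate into mean return periods under the standard Poisson assumption; a short sketch (the function name is mine):

```python
import math

def return_period(pe: float, t_years: float = 50.0) -> float:
    """Mean return period corresponding to a probability of exceedance
    pe over an exposure time t_years, assuming Poisson (memoryless)
    occurrence: pe = 1 - exp(-t / T), so T = -t / ln(1 - pe)."""
    return -t_years / math.log(1.0 - pe)

# The three map levels cited in the abstract:
for pe in (0.02, 0.05, 0.10):
    print(f"{pe:.0%} PE in 50 yr -> ~{return_period(pe):.0f} yr return period")
```

These are the familiar ~2475-, ~975-, and ~475-year return periods, which is one concrete way of restating the map probabilities for end-users.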

  10. The U.S. Geological Survey Earthquake Hazards Program Website: Summary of Recent and Ongoing Developments

    NASA Astrophysics Data System (ADS)

    Wald, L. A.; Zirbes, M.; Robert, S.; Wald, D.; Presgrace, B.; Earle, P.; Schwarz, S.; Haefner, S.; Haller, K.; Rhea, S.

    2003-12-01

    The U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) website (http://earthquake.usgs.gov/) focuses on 1) earthquake reporting for informed decisions after an earthquake, 2) hazards information for informed decisions and planning before an earthquake, and 3) the basics of earthquake science to help the users of the information understand what is presented. The majority of website visitors are looking for information about current earthquakes in the U.S. and around the world, and the second most visited portion of the website is the education-related pages. People are eager for information, and they are most interested in "what's in my backyard?" Recent and future web developments are aimed at answering this question, making the information more relevant to users, and enabling users to more quickly and easily find the information they are looking for. Recent and/or current web developments include the new enhanced Recent Global Earthquakes and U.S. Earthquakes webpages, the Earthquake in the News system, the Rapid Accurate Tectonic Summaries (RATS), online Significant Earthquake Summary Posters (ESP's), and the U.S. Quaternary Fault & Fold Database, the details of which are covered individually in greater detail in this or other sessions. Future planned developments include a consistent look across all EHP webpages, an integrated one-stop-shopping earthquake notification (EQMail) subscription webpage, new navigation tabs, and a backend database allowing the user to search for earthquake information across all the various EHP websites (on different webservers) based on a topic or region. Another goal is to eventually allow users to input their address (Zip Code?) and in return receive all the relevant EHP information (and links to more detailed information), such as the closest fault, the last significant nearby earthquake, a local seismicity map, and a local hazard map, for example. This would essentially be a dynamic report based on the entered location.

  11. Aftereffects of Subduction-Zone Earthquakes: Potential Tsunami Hazards along the Japan Sea Coast.

    PubMed

    Minoura, Koji; Sugawara, Daisuke; Yamanoi, Tohru; Yamada, Tsutomu

    2015-01-01

    The 2011 Tohoku-Oki Earthquake is a typical subduction-zone earthquake and is the 4th largest earthquake since the beginning of instrumental earthquake observation in the 19th century. In fact, the 2011 Tohoku-Oki Earthquake displaced the northeast Japan island arc horizontally and vertically, and the displacement largely changed the tectonic situation of the arc from compressive to tensile. The 9th century in Japan was a period of natural hazards caused by frequent large-scale earthquakes. The aseismic tsunamis that inflicted damage on the Japan Sea coast in the 11th century were related to the occurrence of massive earthquakes that represented the final stage of a period of high seismic activity. Anti-compressive tectonics triggered by the subduction-zone earthquakes induced gravitational instability, which resulted in the generation of tsunamis caused by slope failure at the arc-back-arc boundary. The crustal displacement after the 2011 earthquake implies an increased risk of unexpected local tsunami flooding in the Japan Sea coastal areas. PMID:26399180

  12. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  13. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

    The Chiapas State, in southeastern Mexico, is a very seismically active region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas state. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic tree procedure to consider different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Centre and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps; the SSZ were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were revised, and for each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000-, and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain and in the Motagua and Polochic Fault Zone; intermediate hazard values in the Chiapas Batholith Zone and in the Strike-Slip Faults Province. The hazard decreases
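
Gutenberg-Richter parameters like those computed for each SSZ are commonly estimated from a declustered catalogue with Aki's maximum-likelihood formula (with Utsu's correction for binned magnitudes); a sketch under that assumption, using an illustrative catalogue rather than data from the study:

```python
import math

def gr_b_value(mags, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's correction
    for magnitudes binned at interval dm, above completeness m_min."""
    mags = [m for m in mags if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

def gr_a_value(n_events, b, m_min, t_years):
    """Annualized a-value so that log10 N(>=m) = a - b * m."""
    return math.log10(n_events / t_years) + b * m_min

# Illustrative catalogue: magnitudes above an assumed completeness Mc = 4.0,
# observed over an assumed 30-year window (not the Chiapas catalogue).
mags = [4.0, 4.1, 4.2, 4.3, 4.5, 4.8, 5.0, 5.4, 6.1]
b = gr_b_value(mags, m_min=4.0)
a = gr_a_value(len(mags), b, m_min=4.0, t_years=30.0)
```

With a and b fixed per source zone, the annual rate of events at or above any magnitude follows directly from `10 ** (a - b * m)`, which is the recurrence input the PSHA integration needs.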

  14. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  15. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  16. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  17. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

    In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States' tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment; this report is an interim step towards a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database, which resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the

  18. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    USGS Publications Warehouse

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011. Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  19. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    NASA Astrophysics Data System (ADS)

    Boyd, Oliver Salz; Magistrale, Harold

    2011-09-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011. Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  20. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an Earthquake Investigation Commission composed of 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting and detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.

  1. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor-Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  2. Earthquakes in Hawai‘i—an underappreciated but serious hazard

    USGS Publications Warehouse

    Okubo, Paul G.; Nakata, Jennifer S.

    2011-01-01

    The State of Hawaii has a history of damaging earthquakes. Earthquakes in the State are primarily the result of active volcanism and related geologic processes. It is not a question of "if" a devastating quake will strike Hawai‘i but rather "when." Tsunamis generated by both distant and local quakes are also an associated threat and have caused many deaths in the State. The U.S. Geological Survey (USGS) and its cooperators monitor seismic activity in the State and are providing crucial information needed to help better prepare emergency managers and residents of Hawai‘i for the quakes that are certain to strike in the future.

  3. Understanding Earthquake Hazard & Disaster in Himalaya - A Perspective on Earthquake Forecast in Himalayan Region of South Central Tibet

    NASA Astrophysics Data System (ADS)

    Shanker, D.; Paudyal, ,; Singh, H.

    2010-12-01

    Understanding earthquake hazard and disaster requires not only a basic understanding of the earthquake phenomenon and of the resistance offered by designed structures, but also of socio-economic factors, the engineering properties of indigenous materials, local skills and technology-transfer models. It is important that the engineering aspects of mitigation be made part of public policy documents. Earthquakes, therefore, are and have long been regarded as one of the worst enemies of mankind. Owing to the very nature of the energy release, damage is evident; it will not, however, culminate in a disaster unless it strikes a populated area. Mitigation may be defined as the reduction in severity of something; earthquake disaster mitigation therefore implies measures that help reduce the severity of earthquake damage to life, property and the environment. While “earthquake disaster mitigation” usually refers primarily to interventions to strengthen the built environment, “earthquake protection” is now considered to include the human, social and administrative aspects of reducing earthquake effects. Reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies. Earthquake prediction, however, does not guarantee safety; even a correct prediction leaves damage to life and property on such a large scale that the other aspects of mitigation are still needed, and mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of the occurrence of future seismic activity. In all the cases, the identified episodes of anomalous seismic activity were

  4. Seismic hazard assessment and mitigation in India: an overview

    NASA Astrophysics Data System (ADS)

    Verma, Mithila; Bansal, Brijesh K.

    2013-07-01

    The Indian subcontinent is characterized by various tectonic units viz., Himalayan collision zone in North, Indo-Burmese arc in north-east, failed rift zones in its interior in Peninsular Indian shield and Andaman Sumatra trench in south-east Indian Territory. During the last about 100 years, the country has witnessed four great and several major earthquakes. Soon after the occurrence of the first great earthquake, the Shillong earthquake (Mw 8.1) in 1897, efforts were started to assess the seismic hazard in the country. The first such attempt was made by Geological Survey of India in 1898 and since then considerable progress has been made. The current seismic zonation map prepared and published by Bureau of Indian Standards, broadly places seismic risk in different parts of the country in four major zones. However, this map is not sufficient for the assessment of area-specific seismic risks, necessitating detailed seismic zoning, that is, microzonation for earthquake disaster mitigation and management. Recently, seismic microzonation studies are being introduced in India, and the first level seismic microzonation has already been completed for selected urban centres including, Jabalpur, Guwahati, Delhi, Bangalore, Ahmadabad, Dehradun, etc. The maps prepared for these cities are being further refined on larger scales as per the requirements, and a plan has also been firmed up for taking up microzonation of 30 selected cities, which lie in seismic zones V and IV and have a population density of half a million. The paper highlights the efforts made in India so far towards seismic hazard assessment as well as the future road map for such studies.

  5. Tsunami Forecast Technology for Asteroid Impact Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Moore, C. W.

    2015-12-01

    Over 75% of all historically documented tsunamis have been generated by earthquakes. As a result, all existing tsunami warning and forecast systems focus almost exclusively on detecting, warning of, and forecasting earthquake-generated tsunamis. The sequence of devastating tsunamis across the globe over the past 10 years has significantly heightened awareness and preparation activities associated with these high-impact events. Since the catastrophic 2004 Sumatra tsunami, NOAA has invested significant effort in modernizing the U.S. tsunami warning system. Recent developments in tsunami modeling capability, inundation forecasting, sensing networks, dissemination capability, and local preparation and mitigation activities have gone a long way toward enhancing tsunami resilience within the United States. The remaining quarter of the tsunami hazard problem relates to other mechanisms of tsunami generation that may not have received adequate attention. Among those tsunami sources, asteroid impact may be the most exotic, but possibly one of the most devastating, generation mechanisms. Tsunami forecast capabilities developed for the tsunami warning system can be used to explore both hazard assessment and the forecast of a tsunami generated by an asteroid impact. Existing tsunami flooding forecast technology allows forecasts of non-seismically generated tsunamis (asteroid impacts, meteotsunamis, landslides, etc.), given adequate data on the tsunami source parameters. Problems and opportunities in forecasting tsunamis from asteroid impacts will be discussed, and preliminary results of impact-generated tsunami analysis for forecast and hazard assessment will be presented.

  6. The 2015 Illapel earthquake: a comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Tilmann, Frederik; Zhang, Yong; Moreno, Maros; Saul, Joachim; Eckelmann, Felix; Palo, Mauro; Deng, Zhiguo; Babeyko, Andrey; Chen, Kejie; Baez, Juan-Carlos; Schurr, Bernd; Wang, Rongjiang; Dahm, Torsten

    2016-04-01

    dominate the aftershock sequence, but there are also some thrust events in the forearc crust and some shallow normal-faulting events in the oceanic crust below the trench. In 1943, an earthquake of comparable along-strike extent occurred in the Illapel area. The similar extent of the aftershock zone and the similar tsunami heights make this part of the margin a candidate site for generating characteristic earthquakes, particularly as the 1943 event was itself preceded by an event in 1880 that apparently affected the same part of the margin. The approximate match between peak slip and the slip deficit accumulated in the 72 years since the 1943 event also supports this interpretation. However, the 1943 Illapel event appears to have had a shorter source time function and probably a smaller magnitude than the 2015 event, pointing to differences in the detailed rupture evolution. The interface is close to fully locked in this area, at least along the coastline, but the coseismic rupture is nevertheless associated with a local peak in the locking pattern, whereas a distinct, narrow, partially creeping interseismic zone is found just south of the main rupture. The northern transition to lower locking is more gradual, but here too the rupture can be said to have terminated against a zone of reduced locking. Although the recent Illapel earthquake has locally relieved much of the accumulated stress, the segment immediately to the north remains unbroken since 1922 and presents a serious earthquake and tsunami hazard.

  7. Too generous to a fault? Is reliable earthquake safety a lost art? Errors in expected human losses due to incorrect seismic hazard estimates

    NASA Astrophysics Data System (ADS)

    Bela, James

    2014-11-01

    "One is well advised, when traveling to a new territory, to take a good map and then to check the map with the actual territory during the journey." In just such a reality check, Global Seismic Hazard Assessment Program (GSHAP) maps (prepared using PSHA) portrayed a "low seismic hazard," which was then also assumed to be the "risk to which the populations were exposed." But time-after-time-after-time the actual earthquakes that occurred were not only "surprises" (many times larger than those implied on the maps), but they were often near the maximum potential size (Maximum Credible Earthquake or MCE) that geologically could occur. Given these "errors in expected human losses due to incorrect seismic hazard estimates" revealed globally in these past performances of the GSHAP maps (> 700,000 deaths 2001-2011), we need to ask not only: "Is reliable earthquake safety a lost art?" but also: "Who and what were the `Raiders of the Lost Art?' "

  8. Assessing natural hazard risk using images and data

    NASA Astrophysics Data System (ADS)

    Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.

    2012-12-01

    Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping, and historic records. Instrumental data reveal only a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of an event. Visual data are rapidly increasing as portable high-resolution cameras and video recorders become more widely available. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake, and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs, enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context to the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.

  9. Tsunami hazards along Chinese coast from potential earthquakes in South China Sea

    NASA Astrophysics Data System (ADS)

    Liu, Yingchun; Santos, Angela; Wang, Shuo M.; Shi, Yaolin; Liu, Hailing; Yuen, David A.

    2007-08-01

    The pair of earthquakes off Taiwan on December 26, 2006 and the subsequent disruption of Internet traffic have called attention to the potentially destructive hazards that tsunamis pose along the Chinese coast. Historical records show past tsunami earthquakes in this region. Using GPS data, earthquake focal mechanisms, and geological evolution, we have delineated the dangerous zones in the Philippine Sea plate where major earthquakes may occur. The Manila Trench is identified as the most susceptible to future major earthquakes. We have obtained the local Gutenberg-Richter relationship for five sections along the Philippine Sea plate boundary and use this information to determine the probability distribution for tsunami waves of various heights impinging on various Chinese cities. We devise a new method, called the probabilistic forecast of tsunami hazard (PFTH), which determines this probability distribution by direct numerical simulation of the waves excited by hypothetical earthquakes in these zones. We have employed the linear shallow-water equations over the South China Sea, compared the results with those from the nonlinear version, and found that the linear treatment serves our purpose sufficiently well. In the next century, the probability that a wave with a height of over 2.0 m will strike the coastal waters of Hong Kong and Macau is about 10%. Cities in Taiwan are less vulnerable than those on the mainland coast.
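The quoted century-scale probability can be related to an annual exceedance rate under a Poisson arrival assumption; this is an illustrative sketch only, since the abstract does not state the arrival model used by PFTH.

```python
import math

def exceedance_prob(annual_rate, years):
    """Poisson probability of at least one exceedance in `years` years."""
    return 1.0 - math.exp(-annual_rate * years)

def annual_rate_from_prob(prob, years):
    """Invert the Poisson relation: the annual rate implied by an
    exceedance probability over a given time window."""
    return -math.log(1.0 - prob) / years

# The ~10% chance of a >2.0 m wave over the next century quoted above
# implies an annual exceedance rate of roughly 1/1000:
rate = annual_rate_from_prob(0.10, 100.0)
print(f"annual rate = {rate:.5f}")
```

Under the same assumption, the corresponding 50-year probability would be `exceedance_prob(rate, 50.0)`, about 5%.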

  10. 222-S laboratory complex hazards assessment

    SciTech Connect

    Broz, R.E.

    1994-08-29

    The US Department of Energy (DOE) Order 5500.3A, Emergency Planning and Preparedness for Operational Emergencies, requires that a facility specific hazards assessment be performed to support Emergency Planning activities. The Hazard Assessment establishes the technical basis for the Emergency Action Levels (EALs) and the Emergency Planning Zone (EPZ). Emergency Planning activities are provided under contract to DOE through the Westinghouse Hanford Company (WHC). This document represents the facility specific hazards assessment for the Hanford Site 222-S Laboratories. The primary mission of 222-S is to provide analytic chemistry support to the Waste Management, Chemical Processing, and Environmental programs at the Hanford Site.

  11. KSC VAB Aeroacoustic Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.

    2010-01-01

    NASA Kennedy Space Center (KSC) carried out an analysis of the effects of aeroacoustics produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the source of ignition and plume can be severe; acoustic hazards in the far-field are significantly lower.

  12. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-01-01

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses from potential earthquakes and in preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping (NSHM) Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault east of the study area. Earthquake scenarios are intended to depict the potential consequences of significant earthquakes. They are not necessarily the largest or most damaging earthquakes possible, but they are both large enough and likely enough that emergency planners should consider them in regional emergency response plans. The earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). For the Hilton Creek Fault, two alternative scenarios were developed in addition to the NSHM scenario to account for differing opinions on how far north the fault extends into the Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice

  13. Towards Tsunami Hazard Assessment for the Coasts of Italy

    NASA Astrophysics Data System (ADS)

    Lorito, S.; Romano, F.; Piatanesi, A.; Basili, R.; Kastelic, V.; Tiberti, M.; Valensise, G.; Selva, J.

    2011-12-01

    A reliable Probabilistic Tsunami Hazard Assessment (PTHA) requires an enormous computational effort. We are developing an approach for limiting the computational burden while trying to preserve the variability of the tsunamigenic seismic sources. We split the PTHA into two stages: linear PTHA and nonlinear PTHA. In the first stage, we explore a large variety of seismic sources, representing the most likely complete set of potential sources, and estimate the tsunami propagation in the linear approximation for all the considered target coastlines. We then sample the most hazardous sub-regions of the source parameters/target sites space, by assuming zero probability of hazard threshold exceedance for the remainder. With this subset, which forms the basis for the second stage, we generate probabilistic inundation maps for several damage metrics. We present preliminary results of PTHA for the coasts of Italy. To take into account the different levels of knowledge of potential earthquake sources in different areas of the Mediterranean Sea, we define a logic tree that mainly represents the uncertainties related to seismic source existence. We then use an event tree approach for describing the variability of earthquake parameters.

  14. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. New technologies and improved data availability have recently helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc., and to appreciate the role of uncertainty in seismic hazard analysis. However, how to handle the existing uncertainty remains a significant problem, and the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlap occurs not only at the border between two neighbouring classes but also among more than three classes. Although the analysis starts by classifying sites in geological terms, these site coefficients are not cleanly separated by class. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguities at the border between neighbouring classes can be avoided. Fuzzy set theory is applied to southern California in the conventional way. The standard deviations that show variation within each site class, obtained by fuzzy set theory and by the classical approach, are compared. The results show that, when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
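The border-ambiguity argument can be illustrated with triangular membership functions over a site parameter such as Vs30. The class names, breakpoints, and function shapes below are hypothetical placeholders, not those used in the study.

```python
def tri_membership(x, left, peak, right):
    """Triangular fuzzy membership function: 0 outside (left, right),
    rising linearly to 1 at `peak`."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

# Hypothetical site classes keyed on Vs30 (m/s); the supports overlap, so
# a site near a class border belongs partially to both classes instead of
# being forced into one.
CLASSES = {
    "soft soil":  (0.0, 180.0, 360.0),
    "stiff soil": (180.0, 360.0, 760.0),
    "rock":       (360.0, 760.0, 1500.0),
}

def classify(vs30):
    """Graded membership of a site in every class."""
    return {name: tri_membership(vs30, *abc) for name, abc in CLASSES.items()}

print(classify(300.0))
```

A site with Vs30 = 300 m/s comes out roughly two-thirds "stiff soil" and one-third "soft soil", which is exactly the border ambiguity a crisp classification cannot express.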

  15. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations. © 1990.

  16. Apparent stress, fault maturity and seismic hazard for normal-fault earthquakes at subduction zones

    USGS Publications Warehouse

    Choy, G.L.; Kirby, S.H.

    2004-01-01

    The behavior of apparent stress for normal-fault earthquakes at subduction zones is derived by examining the apparent stress (τa = μEs/M0, where Es is radiated energy and M0 is seismic moment) of all globally distributed shallow (depth < 71 km) earthquakes. Events with high τa (> 1 MPa) are also generally intraslab, but occur where the lithosphere has just begun subduction beneath the overriding plate. They usually occur in cold slabs near trenches where the direction of plate motion across the trench is oblique to the trench axis, or where there are local contortions or geometrical complexities of the plate boundary. Lower τa (< 1 MPa) is associated with events occurring at the outer rise (OR) complex (between the OR and the trench axis), as well as with intracrustal events occurring just landward of the trench. The average apparent stress of intraslab normal-fault earthquakes is considerably higher than the average apparent stress of interplate thrust-fault earthquakes. In turn, the average τa of strike-slip earthquakes in intraoceanic environments is considerably higher than that of intraslab normal-fault earthquakes. The variation of average τa with focal mechanism and tectonic regime suggests that the level of τa is related to fault maturity. Lower stress drops are needed to rupture mature faults such as those found at plate interfaces that have been smoothed by large cumulative displacements (from hundreds to thousands of kilometres). In contrast, immature faults, such as those on which intraslab normal-fault earthquakes generally occur, are found in cold and intact lithosphere in which total fault displacement has been much less (from hundreds of metres to a few kilometres). Also, faults on which high-τa oceanic strike-slip earthquakes occur are predominantly intraplate or at evolving ends of transforms. At subduction zones, earthquakes occurring on immature faults are likely to be more hazardous as they tend to generate higher amounts of radiated energy per unit of moment than
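The apparent-stress definition can be written directly in code. The rigidity and the example energy and moment values below are illustrative assumptions, not data from the paper.

```python
def apparent_stress(radiated_energy_j, seismic_moment_nm, rigidity_pa=3.0e10):
    """Apparent stress tau_a = mu * Es / M0, in Pa.  The default rigidity
    of 30 GPa is a typical crustal value, assumed for illustration."""
    return rigidity_pa * radiated_energy_j / seismic_moment_nm

# Example: Es = 1e14 J, M0 = 1e18 N*m  ->  tau_a = 3.0 MPa, i.e. in the
# "high apparent stress" (> 1 MPa) range discussed above.
tau_a = apparent_stress(1.0e14, 1.0e18)
print(f"{tau_a / 1.0e6:.1f} MPa")
```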

  17. Earthquakes and faults at Mt. Etna (Italy): time-dependent approach to the seismic hazard of the eastern flank

    NASA Astrophysics Data System (ADS)

    Peruzza, L.; Azzaro, R.; D'Amico, S.; Tuve', T.

    2009-04-01

    A time-dependent approach to seismic hazard assessment, based on a renewal model using the Brownian Passage Time (BPT) distribution, has been applied to the best-known seismogenic faults at Mt. Etna volcano. These structures are characterised by frequent coseismic surface displacement and a long list of historically well-documented earthquakes that occurred in the last 200 years (CMTE catalogue; Azzaro et al., 2000, 2002, 2006). Seismic hazard estimates, given in terms of earthquake rupture forecasts, are conditioned on the time elapsed since the last event: impending events are expected on the S. Tecla Fault and, secondly, on the Moscatello Fault, both involved in the highly active geodynamic processes affecting the eastern flank of Mt. Etna. The mean recurrence time of major events is calibrated by merging the inter-event times observed at each fault; aperiodicity is tuned on b-values, following the approach proposed by Zoeller et al. (2008). Finally, we compare these mean recurrence times with the values obtained by using only geometrical and kinematic information, as defined in Peruzza et al. (2008) for faults in Italy. The time-dependent hazard assessment is compared with the stationary assumption of seismicity and validated in a retrospective forward model. Forecast rates over a 5-year window (1 April 2009 to 1 April 2014), in magnitude bins compatible with macroseismic data, are available for testing in the frame of the CSEP (Collaboratory for the Study of Earthquake Predictability, www.cseptesting.org) project. Azzaro R., Barbano M.S., Antichi B., Rigano R.; 2000: Macroseismic catalogue of Mt. Etna earthquakes from 1832 to 1998. Acta Volcanol., with CD-ROM, 12 (1), 3-36. http://www.ct.ingv.it/Sismologia/macro/default.htm Azzaro R., D'Amico S., Mostaccio A., Scarfì L.; 2002: Earthquakes with macroseismic effects in eastern Sicily - southern Calabria in the period January 1999 - December 2001. Quad. di Geof., 27, 1-59. Azzaro R., D'Amico S., Mostaccio A
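A renewal forecast of the kind described can be sketched with the BPT (inverse Gaussian) distribution. The mean recurrence, aperiodicity, elapsed time, and 5-year window below are illustrative numbers, not the paper's calibrated values.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence `mean` and aperiodicity `alpha`."""
    if t <= 0.0:
        return 0.0
    lam = mean / alpha**2          # inverse-Gaussian shape parameter
    u = math.sqrt(lam / t)
    return (norm_cdf(u * (t / mean - 1.0))
            + math.exp(2.0 * lam / mean) * norm_cdf(-u * (t / mean + 1.0)))

def conditional_prob(elapsed, window, mean, alpha):
    """P(event in the next `window` years | quiet for `elapsed` years)."""
    f_t = bpt_cdf(elapsed, mean, alpha)
    return (bpt_cdf(elapsed + window, mean, alpha) - f_t) / (1.0 - f_t)

# Illustrative: mean recurrence 100 yr, aperiodicity 0.5, 80 yr elapsed,
# 5-yr forecast window (the length used in the CSEP experiment above).
print(conditional_prob(80.0, 5.0, 100.0, 0.5))
```

The conditional probability rises as the elapsed time approaches the mean recurrence, which is what makes events on long-quiet faults "impending" in this framework.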

  18. Seismicity and earthquake hazard analysis of the Teton-Yellowstone region, Wyoming

    NASA Astrophysics Data System (ADS)

    White, Bonnie J. Pickering; Smith, Robert B.; Husen, Stephan; Farrell, Jamie M.; Wong, Ivan

    2009-11-01

    Earthquakes of the Teton-Yellowstone region represent a high level of seismicity in the Intermountain West (U.S.A.) associated with intraplate extension related to the Yellowstone hotspot, including the nearby Teton and Hebgen Lake faults. The seismicity and the occurrence of high-slip-rate late Quaternary faults in this region lead to a high level of seismic hazard, which was evaluated using new earthquake catalogues determined from three-dimensional (3-D) seismic velocity models, followed by estimation of the probabilistic seismic hazard incorporating fault slip and background earthquake occurrence rates. The 3-D P-wave velocity structure of the Teton region was determined using local earthquake data from the Jackson Lake seismic network, which operated from 1986 to 2002. An earthquake catalog was then developed for 1986-2002 for the Teton region using relocated hypocenters. The resulting data revealed a seismically quiescent Teton fault, at ML (local magnitude) > 3, with diffuse seismicity in the southern Jackson Hole Valley area but notable seismicity eastward into the Gros Ventre Range. Relocated Yellowstone earthquakes determined by the same methods highlight a dominant E-W zone of seismicity that extends from the aftershock area of the 1959 MS (surface-wave magnitude) 7.5 Hebgen Lake, Montana, earthquake along the north side of the 0.64 Ma Yellowstone caldera. Earthquakes are less frequent and shallow beneath the Yellowstone caldera and notably occur along northward-trending zones of activity sub-parallel to the post-caldera volcanic vents. Stress-field orientations derived from inversion of focal mechanism data reveal dominant E-W extension across the Teton fault, with NE-SW extension along the northern Teton fault area and southern Yellowstone. The minimum stress axis directions then rotate to E-W extension across the Yellowstone caldera and to N-S extension northwest of the caldera and along the Hebgen Lake fault zone. The combination of accurate

  19. Evolution trends in vulnerability of R/C buildings exposed to earthquake induced landslide hazard

    NASA Astrophysics Data System (ADS)

    Fotopoulou, S.; Pitilakis, K.

    2012-04-01

    The assessment of landslide risk depends on the evaluation of landslide hazard and the vulnerability of exposed structures, both of which change with time. Realistic, dynamic vulnerability modeling of structures subject to landslides may be significantly affected by aging, anthropogenic actions, cumulative damage from past landslide events, and retrofitting measures. The present work aims to develop an efficient analytical methodology to assess how the vulnerability of buildings exposed to earthquake-induced landslide hazard evolves with time. In particular, the aging of typical RC buildings is considered by including probabilistic models of corrosion deterioration of the RC elements within the vulnerability modeling framework. Two potential adverse corrosion scenarios are examined: chloride- and carbonation-induced corrosion of the steel reinforcement. An application of the proposed methodology to reference low-rise RC buildings exposed to the combined effect of seismically induced differential landslide displacements and reinforcement corrosion is provided. Buildings with both stiff and flexible foundation systems, standing near the crest of a potentially precarious soil slope, are examined. Nonlinear static time-history analyses of the buildings are performed using a fibre-based finite-element code. In this analysis type, the applied loads (displacements) at the foundation level vary in the pseudo-time domain according to a load pattern prescribed by the curves of differential permanent landslide displacement (versus time) triggered by the earthquake. The distribution of the corrosion initiation time is assessed through Monte Carlo simulation using appropriate probabilistic models for carbonation- and chloride-induced corrosion. The loss of steel area over time due to corrosion of the RC elements is then modeled as a reduction in longitudinal reinforcing bar cross-sectional area in the fibre section model. Time-dependent structural limit

  20. Cruise report for A1-00-SC southern California earthquake hazards project, part A

    USGS Publications Warehouse

    Gutmacher, Christina E.; Normark, William R.; Ross, Stephanie L.; Edwards, Brian D.; Sliter, Ray; Hart, Patrick; Cooper, Becky; Childs, Jon; Reid, Jane A.

    2000-01-01

    A three-week cruise to obtain high-resolution boomer and multichannel seismic-reflection profiles supported two project activities of the USGS Coastal and Marine Geology (CMG) Program: (1) evaluating the earthquake and related geologic hazards posed by faults in the near offshore area of southern California and (2) determining the pathways through which sea-water is intruding into aquifers of Los Angeles County in the area of the Long Beach and Los Angeles harbors. The 2000 cruise, A1-00-SC, is the third major data-collection effort in support of the first objective (Normark et al., 1999a, b); one more cruise is planned for 2002. This report deals primarily with the shipboard operations related to the earthquake-hazard activity. The sea-water intrusion survey is confined to shallow water and the techniques used are somewhat different from that of the hazards survey (see Edwards et al., in preparation).

  1. Global Seismic Hazard Assessment Program Maps Are Misleading

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A. K.

    2010-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers, and the public, of the three components of risk, i.e., hazard, exposure, and vulnerability. Contemporary science has not coped with the challenging changes in exposure and vulnerability inflicted by a growing and increasingly concentrated population, which result in a steady increase in losses from natural hazards; scientists owe Society better knowledge, education, and communication. The Global Seismic Hazard Assessment Program (GSHAP) was launched in 1992 by the International Lithosphere Program (ILP) with the support of the International Council of Scientific Unions (ICSU), and endorsed as a demonstration program in the framework of the United Nations International Decade for Natural Disaster Reduction (UN/IDNDR). The GSHAP project terminated in 1999, when the probabilistic seismic hazard assessment maps and digital data were published (e.g., www.seismo.ethz.ch/GSHAP/). The majority of recent disastrous earthquakes, like the 12 January 2010 Port-au-Prince (Haiti), the 12 May 2008 Wenchuan (Sichuan, China), …, and the 26 January 2001 Bhuj (Gujarat, India) events, prove that the maps resulting from GSHAP are evidently misleading. We have performed a systematic comparison of the GSHAP peak ground acceleration (PGA) values with those related to strong earthquakes in 2000-2010. Each of the 1320 shallow magnitude 6 or larger earthquakes has from 4 to 9 GSHAP PGA values within 12 km of its epicenter. When transformations to intensity are applied, e.g., MMI(M) = 1.5 (M − 1) (Gutenberg and Richter, 1954) and MMI(PGA) = 1.27 Ln(PGA) − 3.74 (Shteinberg et al., 1993), the difference between the observed and GSHAP estimates, MMI(M) − MMI(PGA), is above 1.6 on average, while its median equals 2.5. Moreover, for 51 out of 56 magnitude 7.5 or larger events in 2000-2010, the difference is above 1, while for 30 of
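The two intensity conversions quoted above can be applied directly. The PGA value in the example is an arbitrary illustration, and the PGA units (cm/s²) are an assumption, since the abstract does not state them.

```python
import math

def mmi_from_magnitude(m):
    """MMI(M) = 1.5 (M - 1), after Gutenberg and Richter (1954)."""
    return 1.5 * (m - 1.0)

def mmi_from_pga(pga):
    """MMI(PGA) = 1.27 ln(PGA) - 3.74, after Shteinberg et al. (1993)."""
    return 1.27 * math.log(pga) - 3.74

# Hypothetical check of a single map cell: an M 7.0 event over a cell
# with a mapped PGA of 200 (assumed cm/s^2).  A positive difference means
# the observed event implies stronger shaking than the map value.
diff = mmi_from_magnitude(7.0) - mmi_from_pga(200.0)
print(f"MMI(M) - MMI(PGA) = {diff:.1f}")
```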

  2. Citizen Seismology Provides Insights into Ground Motions and Hazard from Injection-Induced Earthquakes

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2014-12-01

    The US Geological Survey "Did You Feel It?" (DYFI) system is a highly successful example of citizen seismology. Users around the world now routinely report felt earthquakes via the Web; this information is used to determine Community Decimal Intensity values. These data can be enormously valuable for addressing a key issue that has arisen recently: quantifying the shaking and hazard associated with injection-induced earthquakes. I consider the shaking from 11 moderate (Mw 3.9-5.7) earthquakes in the central and eastern United States that are believed to be induced by fluid injection. The distance decay of intensities for all events is consistent with that observed for regional tectonic earthquakes, but for all of the events the intensities are lower than the values predicted by an intensity prediction equation derived using data from tectonic events. I introduce an effective intensity magnitude, MIE, defined as the magnitude that on average would generate a given intensity distribution. For all 11 events, MIE is lower than the event magnitude by 0.4-1.3 units, with an average difference of 0.8 units. This suggests that stress drops of injection-induced earthquakes are lower than those of tectonic earthquakes by a factor of 2-10. However, relatively limited data suggest that intensities at epicentral distances of less than 10 km are more commensurate with expectations for the event magnitude, which can be explained by the shallow focal depth of the events. The results suggest that damage from injection-induced earthquakes will be especially concentrated in the immediate epicentral region. They further suggest a potential new discriminant for the identification of induced events. For example, while a systematic analysis of California earthquakes remains to be done, DYFI data from the 2014 Mw 5.1 La Habra, California, earthquake reveal no evidence of unusually low intensities, adding to a growing volume of evidence that this was a natural tectonic event.
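The effective intensity magnitude can be sketched as a grid search for the magnitude whose predicted intensities best fit the felt reports. The intensity prediction equation and its coefficients below are placeholders for illustration, not the equation used in the study.

```python
import math

def ipe(m, r_km, c=(1.5, 1.5, -0.001, -2.0)):
    """Hypothetical intensity prediction equation of the common form
    MMI = c0 + c1*M + c2*R + c3*log10(R); coefficients are placeholders."""
    return c[0] + c[1] * m + c[2] * r_km + c[3] * math.log10(r_km)

def effective_intensity_magnitude(observations, m_grid):
    """Return the magnitude whose IPE predictions best match the observed
    (distance_km, MMI) pairs in a least-squares sense."""
    def misfit(m):
        return sum((ipe(m, r) - mmi) ** 2 for r, mmi in observations)
    return min(m_grid, key=misfit)

# Synthetic reports generated for M 5.0, so the search recovers 5.0; for
# an induced event, MIE would come out below the catalog magnitude.
obs = [(r, ipe(5.0, r)) for r in (10.0, 30.0, 100.0, 300.0)]
m_grid = [round(4.0 + 0.1 * i, 1) for i in range(21)]  # 4.0 .. 6.0
print(effective_intensity_magnitude(obs, m_grid))  # → 5.0
```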

  3. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.

  4. Virtual California, ETAS, and OpenHazards web services: Responding to earthquakes in the age of Big Data

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Schultz, K.; Rundle, J. B.; Glasscoe, M. T.; Donnellan, A.

    2014-12-01

    The response to the 2014 m=6 Napa earthquake showcased data-driven services and technologies that aided first responders and decision makers in quickly assessing damage, estimating aftershock hazard, and efficiently allocating resources where they were most needed. These tools have been developed from fundamental research as part of a broad collaboration between researchers, policy makers, and executive decision makers -- facilitated in no small part by the California Earthquake Clearinghouse -- and practiced and honed during numerous disaster response exercises over the past several years. On 24 August 2014, and in the weeks following the m=6 Napa event, it became evident that these technologies will play an important role in the response to natural (and other) disasters in the 21st century. Given the continued rapid growth of computational capabilities, remote sensing technologies, and data gathering capacities -- including by unpiloted aerial vehicles (UAVs) -- it is reasonable to expect that both the volume and variety of data available during a response scenario will grow significantly in the decades to come. Inevitably, modern Data Science will be critical to effective disaster response in the 21st century. In this work, we discuss the roles that earthquake simulators, statistical seismicity models, and remote sensing technologies played in the 2014 Napa earthquake response. We further discuss "Big Data" technologies and data models that facilitate the transformation of raw data into disseminable information and actionable products, and we outline a framework for the next generation of disaster response data infrastructure.

  5. An evaluation of earthquake hazard parameters in the Iranian Plateau based on the Gumbel III distribution

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Bayrak, Yusuf

    2016-04-01

    Gumbel's third asymptotic distribution (GIII) of the extreme value method is employed to evaluate earthquake hazard parameters in the Iranian Plateau. This research produces spatial maps of earthquake hazard parameters, namely the annual and 100-year modes together with their values at a 90 % probability of not being exceeded (NBE), for the Iranian Plateau. We used a homogeneous and complete earthquake catalogue for the period 1900-2013 with magnitudes Mw ≥ 4.0, and the Iranian Plateau was divided into an equal-area mesh of 1° latitude × 1° longitude. The annual mode with 90 % probability of NBE is expected to exceed Mw 6.0 in the eastern part of Makran, most parts of Central and East Iran, Kopeh Dagh, Alborz, Azerbaijan, and SE Zagros. The 100-year mode with 90 % probability of NBE is expected to exceed Mw 7.0 in the eastern part of Makran, Central and East Iran, Alborz, Kopeh Dagh, and Azerbaijan. The spatial distribution of the 100-year mode with 90 % probability of NBE reveals high values of the earthquake hazard parameters that are frequently associated with the main tectonic regimes of the studied area. It appears that there is a close correspondence between the seismicity and the tectonics of the region.
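    The quantities above follow directly from the GIII annual-maximum CDF, F(m) = exp(-((w - m)/(w - u))^k) for m ≤ w: the value with probability p of not being exceeded in T years solves F(m)^T = p. A minimal sketch, with hypothetical w, u, k values rather than the paper's fitted parameters:

    ```python
    import math

    def giii_quantile(p, years, w, u, k):
        """Magnitude with probability p of not being exceeded (NBE) in
        `years` years, for a GIII annual-maximum distribution
        F(m) = exp(-((w - m)/(w - u))**k), m <= w.
        w: upper-bound magnitude, u: characteristic value, k: shape.
        Solves F(m)**years = p in closed form."""
        y = -math.log(p) / years  # reduced variate for the T-year maximum
        return w - (w - u) * y ** (1.0 / k)

    # Illustrative parameters (not the study's values):
    w, u, k = 8.0, 5.5, 2.0
    annual_90 = giii_quantile(0.90, 1, w, u, k)     # annual mode, 90 % NBE
    century_90 = giii_quantile(0.90, 100, w, u, k)  # 100-year mode, 90 % NBE
    ```

    As in the study, the 100-year value always exceeds the annual value and both remain below the upper bound w.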

  6. Assessment and Prediction of Natural Hazards from Satellite Imagery

    PubMed Central

    Gillespie, Thomas W.; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan

    2013-01-01

    Since 2000, a number of spaceborne satellites have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth’s surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate-resolution active sensors that can quantify the vertical and horizontal movement of the earth’s surface. High-resolution passive sensors have been used to successfully assess flood damage, while predictive maps of flood-vulnerable areas are possible based on physical variables collected from passive and active sensors. Recent moderate-resolution sensors are able to provide near-real-time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and constellations of satellites will be launched in the next five years that offer increased resolution (0.5 m to 1 m pixel resolution for active sensors) and improved revisit times (daily ≤ 2.5 m resolution images from passive sensors), which will significantly improve our ability to assess and predict natural hazards from space. PMID:25170186

  7. Earthquake Hazard When the Rate Is Non-Stationary: The Challenge of the U. S. Midcontinent

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.; Cochran, E. S.; Llenos, A. L.; McGarr, A.; Michael, A. J.; Mueller, C. S.; Petersen, M. D.; Rubinstein, J. L.

    2014-12-01

    In July 2014, the U. S. Geological Survey released an update of the 2008 National Seismic Hazard Map for the conterminous U. S. The Map provides guidance for the seismic provisions of the building codes and portrays ground motions with a 2% chance of being exceeded in an exposure time of 50 years. Over most of the midcontinent the hazard model is derived by projecting the long-term historic, declustered earthquake rate forward in time. However, parts of the midcontinent have experienced increased seismicity levels since 2009 - locally by 2 orders of magnitude - which is incompatible with the underlying assumption of a constant-rate Poisson process. The 2014 Map acknowledged this problem, and for its intended purpose of underpinning seismic design used seismicity rates that are consistent with the entire historic record. Both the developers of the Map and its critics acknowledge that the remarkable rise of seismicity in Oklahoma and nearby states must be addressed if we are to fully capture the hazard in both space and time. The nature of the space/time distribution of the increased seismicity, as well as numerous published case studies, strongly suggests that much of the increase is of anthropogenic origin. If so, the assumptions and procedures used to forecast natural earthquake rates from past rates may not be appropriate. Key issues that must be resolved include: the geographic location of areas with elevated seismicity, either active now or potentially active in the future; local geologic conditions including faults and the state of stress; the spatial smoothing of catalog seismicity; the temporal evolution of the earthquake rate change; earthquake sequence statistics including clustering behavior; the magnitude-frequency distribution of the excess earthquakes, particularly to higher and yet unobserved magnitudes; possible source process differences between natural and induced earthquakes; and the appropriate ground motion prediction equations.
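    The incompatibility with a constant-rate Poisson model can be made concrete: under a Poisson assumption the chance of at least one event in an exposure time t at rate λ is 1 - exp(-λt), so a rate elevated by two orders of magnitude transforms a rare outcome into a near-certain one. A toy comparison with hypothetical rates:

    ```python
    import math

    def prob_exceedance(rate_per_year, years):
        """Probability of at least one event in `years` years under a
        Poisson process with a constant annual rate."""
        return 1.0 - math.exp(-rate_per_year * years)

    # Hypothetical zone: a long-term declustered rate of 0.05 events/yr
    # versus the same zone with a rate elevated by two orders of magnitude.
    p_background = prob_exceedance(0.05, 1)  # ~5% chance in one year
    p_elevated = prob_exceedance(5.0, 1)     # essentially certain
    ```

    The forecasting difficulty discussed in the abstract is that neither rate can simply be projected forward when the process driving the elevated rate is anthropogenic and may change.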

  8. Reducing the Risks of Nonstructural Earthquake Damage: A Practical Guide. Earthquake Hazards Reduction Series 1.

    ERIC Educational Resources Information Center

    Reitherman, Robert

    The purpose of this booklet is to provide practical information to owners, operators, and occupants of office and commercial buildings on the vulnerabilities posed by earthquake damage to nonstructural items and the means available to deal with these potential problems. Examples of dangerous nonstructural damages that have occurred in past…

  9. Multi Hazard Assessment: The Azores Archipelagos (PT) case

    NASA Astrophysics Data System (ADS)

    Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos

    2016-04-01

    The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support the efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelago case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential-impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructure and environment). The risk assessment was based on the hazard and vulnerability of structural elements, road network characteristics, etc. Integration of the different hazards and risks was taken into account in establishing the necessary first response/first aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information common to the assessment of all of the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for. The risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and a mathematical model to calculate seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based Lava Flow Hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100 year extreme monthly rainfall
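    The "Flood Susceptibility Index" is described only as a function of upstream drainage area and local slope. One simple topographic proxy in that spirit (the exact functional form and exponent here are assumptions, not Manfreda et al.'s formulation) ranks cells higher when they drain a large area over a gentle slope:

    ```python
    import math

    def flood_susceptibility_index(upstream_area_m2, local_slope, n=0.1):
        """Toy topographic proxy: larger contributing area and gentler
        slope imply higher flood susceptibility.
        FSI = ln(A**n / slope); n and the form are illustrative."""
        return math.log(upstream_area_m2 ** n / local_slope)

    # A flat cell draining 1 km^2 should rank above a steep cell
    # draining 1 ha.
    fsi_flat_large = flood_susceptibility_index(1e6, 0.01)
    fsi_steep_small = flood_susceptibility_index(1e4, 0.1)
    ```

    In the mapping workflow, such a continuous index would then be binned into the qualitative hazard levels intersected with the vulnerability lookup table.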

  10. Assessment of seismic hazards along the northern Gulf of Aqaba

    NASA Astrophysics Data System (ADS)

    Abueladas, Abdel-Rahman Aqel

    Aqaba and Elat are very important port and recreation cities for the Hashemite Kingdom of Jordan and Israel, respectively. The two cities are highly susceptible to damage from a destructive future earthquake because they are located over the tectonically active Dead Sea transform fault (DST), the source of most of the major historical earthquakes in the region. The largest twentieth-century earthquake on the DST, the magnitude Mw 7.2 Nuweiba earthquake of November 22, 1995, caused damage to structures in both cities. The integration of geological, geophysical, and earthquake engineering studies helps to assess the seismic hazards by determining the location and slip potential of active faults and by mapping areas of high liquefaction susceptibility. Ground Penetrating Radar (GPR), a high-resolution shallow geophysical tool, was used to map the shallow active faults in Aqaba, the Taba Sabkha area, and Elat. The GPR data revealed the onshore continuation of the Evrona, West Aqaba, and Aqaba fault zones, and several transverse faults. The integration of offshore and onshore data confirms the extension of these faults along both sides of the Gulf of Aqaba. A 3D model of GPR data at one site in Aqaba indicates that the NW-trending transverse faults right-laterally offset the older NE-trending faults. The most hazardous fault is the Evrona fault, which extends north to the Taba Sabkha. A geographic information system (GIS) database of the seismic hazard was created in order to facilitate the analysis, manipulation, and updating of the input parameters. Liquefaction potential maps were created for the region based on analysis of borehole data. The liquefaction map shows high and moderate liquefaction susceptibility zones along the northern coast of the Gulf of Aqaba. In Aqaba several hotels are located within the high and moderate liquefaction zones. The Yacht Club, Aqaba, the Ayla archaeological site, and part of the commercial area are also situated in a risk area. A part

  11. Assessment of landslide hazards resulting from the February 13, 2001, El Salvador earthquake; a report to the government of El Salvador and the U. S. Agency for International Development

    USGS Publications Warehouse

    Baum, Rex L.; Crone, Anthony J.; Escobar, Demetreo; Harp, Edwin L.; Major, Jon J.; Martinez, Mauricio; Pullinger, Carlos; Smith, Mark E.

    2001-01-01

    On February 13, 2001, a magnitude 6.5 earthquake occurred about 40 km east-southeast of the capital city of San Salvador in central El Salvador and triggered thousands of landslides in the area east of Lago de Ilopango. The landslides are concentrated in a 2,500-km2 area and are particularly abundant in areas underlain by thick deposits of poorly consolidated, late Pleistocene and Holocene Tierra Blanca rhyolitic tephras that were erupted from Ilopango caldera. Drainages in the tephra deposits are deeply incised, and steep valley walls failed during the strong shaking. Many drainages are clogged with landslide debris that locally buries the adjacent valley floor. The fine grain size of the tephra makes it easily mobilized by rainfall runoff. The potential for remobilizing the landslide debris as debris flows and in floods is significant as this sediment is transported through the drainage systems during the upcoming rainy season. In addition to thousands of shallow failures, two very large landslides occurred that blocked the Rio El Desague and the Rio Jiboa. The Rio El Desague landslide has an estimated volume of 1.5 million m3, and the Rio Jiboa landslide has an estimated volume of 12 million m3. Field studies indicate that catastrophic draining of the Rio El Desague landslide-dammed lake would pose a minimal flooding hazard, whereas catastrophic draining of the Rio Jiboa lake would pose a serious hazard and warrants immediate action. Construction of a spillway across part of the dam could moderate the impact of catastrophic lake draining and the associated flood. Two major slope failures on the northern side of Volcan San Vicente occurred in the upper reaches of the Quebrada Del Muerto and the Quebrada El Blanco. The landslide debris in the Quebrada Del Muerto consists predominantly of blocks of well-lithified andesite, whereas the debris in the Quebrada El Blanco consists of poorly consolidated pyroclastic sediment. The large blocks of lithified rock in

  12. St. Louis Area Earthquake Hazards Mapping Project - A PowerPoint Presentation

    USGS Publications Warehouse

    Williams, Robert A.

    2009-01-01

    This Open-File Report contains illustrative materials, in the form of PowerPoint slides, used for an oral presentation given at the Earthquake Insight St. Louis, Mo., field trip held on May 28, 2009. The presentation focused on summarizing the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) justification, goals, achievements, and products, for an audience of business and public officials. The individual PowerPoint slides highlight, in an abbreviated format, the topics addressed; they are discussed below and are explained with additional text as appropriate.

  13. Next-Level ShakeZoning for Earthquake Hazard Definition in Nevada

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Savran, W. H.; Flinchum, B. A.; Dudley, C.; Prina, N.; Pullammanappallil, S.; Pancha, A.

    2011-12-01

    We are developing "Next-Level ShakeZoning" procedures tailored for defining earthquake hazards in Nevada. The current Federally sponsored tools -- the USGS hazard maps and ShakeMap, and FEMA HAZUS -- were developed as statistical summaries to match earthquake data from California, Japan, and Taiwan. The 2008 Wells and Mogul events in Nevada showed in particular that the generalized statistical approach taken by ShakeMap cannot match actual data on shaking from earthquakes in the Intermountain West, even to first order. Next-Level ShakeZoning relies on physics and geology to define earthquake shaking hazards, rather than statistics. It follows theoretical and computational developments made over the past 20 years, to capitalize on detailed and specific local data sets to more accurately model the propagation and amplification of earthquake waves through the multiple geologic basins of the Intermountain West. Excellent new data sets are now available for Las Vegas Valley. Clark County, Nevada has completed the nation's very first effort to map earthquake hazard class systematically through an entire urban area using Optim's SeisOpt® ReMi technique, which was adapted for large-scale data collection. Using the new Parcel Map in computing shaking in the Valley for scenario earthquakes is crucial for obtaining realistic predictions of ground motions. In an educational element of the project, a dozen undergraduate students have been computing 50 separate earthquake scenarios affecting Las Vegas Valley, using the Next-Level ShakeZoning process. Despite characterizing only the upper 30 meters, the Vs30 geotechnical shear velocity from the Parcel Map shows clear effects on 3-d shaking predictions computed so far at frequencies from 0.1 Hz up to 1.0 Hz. The effect of the Parcel Map even on the 0.1-Hz waves is prominent, despite the large mismatch of wavelength to geotechnical depths. Amplifications and de-amplifications affected by the Parcel Map exceed a factor of two, and are

  14. The Sumatra-Andaman Earthquake and Tsunami of 2004: the hazards, events, and damage.

    PubMed

    Kohl, Patrice A; O'Rourke, Ann P; Schmidman, Dana L; Dopkin, Wendy A; Birnbaum, Marvin L

    2005-01-01

    The Sumatra-Andaman Earthquake and subsequent Asian Tsunami of 26 December 2004 affected multiple countries in the Indian Ocean and beyond, creating disasters of a scale unprecedented in recorded history. Using the Conceptual Framework and terminology described in the Disaster Health Management: Guidelines for Evaluation and Research in the Utstein Style, the hazard, events, and damage associated with the Earthquake and Tsunami are described. Many gaps in the available information regarding this event are present. Standardized indicators and reporting criteria are necessary for research on future disasters and the development of best practice standards internationally. PMID:16496614

  15. St. Louis Area Earthquake Hazards Mapping Project - December 2008-June 2009 Progress Report

    USGS Publications Warehouse

    Williams, R.A.; Bauer, R.A.; Boyd, O.S.; Chung, J.; Cramer, C.H.; Gaunt, D.A.; Hempen, G.L.; Hoffman, D.; McCallister, N.S.; Prewett, J.L.; Rogers, J.D.; Steckel, P.J.; Watkins, C.M.

    2009-01-01

    This report summarizes the mission, the project background, the participants, and the progress of the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) for the period from December 2008 through June 2009. During this period, the SLAEHMP held five conference calls and two face-to-face meetings in St. Louis, participated in several earthquake awareness public meetings, held one outreach field trip for the business and government community, collected and compiled new borehole and digital elevation data from partners, and published a project summary.

  16. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessment, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored for hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  17. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    PubMed Central

    Kanamori, H

    1996-01-01

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding. PMID:11607657

  18. Cruise report for A1-98-SC southern California Earthquake Hazards Project

    USGS Publications Warehouse

    Normark, William R.; Bohannon, Robert G.; Sliter, Ray; Dunhill, Gita; Scholl, David W.; Laursen, Jane; Reid, Jane A.; Holton, David

    1999-01-01

    The focus of the Southern California Earthquake Hazards project, within the Western Region Coastal and Marine Geology team (WRCMG), is to identify the landslide and earthquake hazards and related ground-deformation processes that can potentially impact the social and economic well-being of the inhabitants of the Southern California coastal region, the most populated urban corridor along the U.S. Pacific margin. The primary objective is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this overall objective, we are investigating the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (see Fig. 1). In addition, the project will examine the Pliocene-Pleistocene record of how this deformation has shifted in space and time. The results of this study should improve our knowledge of shifting deformation for both the long-term (10^5 to several 10^6 yr) and short-term (<50 ky) time frames and enable us to identify actively deforming structures that may constitute current significant seismic hazards.

  19. Probabilistic seismic hazard assessment of the Pyrenean region

    NASA Astrophysics Data System (ADS)

    Secanell, R.; Bertil, D.; Martin, C.; Goula, X.; Susagna, T.; Tapia, M.; Dominique, P.; Carbon, D.; Fleta, J.

    2008-07-01

    A unified probabilistic seismic hazard assessment (PSHA) for the Pyrenean region has been performed by an international team composed of experts from Spain and France during the Interreg IIIA ISARD project. It is motivated by incoherencies between the seismic hazard zonations of the design codes of France and Spain and by the need for input data to be used to define earthquake scenarios. A great effort was invested in the homogenisation of the input data. All existing seismic data are collected in a database and lead to a unified catalogue using a local magnitude scale. PSHA has been performed using logic trees combined with Monte Carlo simulations to account for both epistemic and aleatory uncertainties. As an alternative to hazard calculation based on seismic source zone models, a zoneless method is also used to produce a hazard map less dependent on zone boundaries. Two seismogenic source models were defined to take into account the different interpretations existing among specialists. A new regional ground-motion prediction equation based on regional data has been proposed. It was used in combination with published ground-motion prediction equations derived using European and Mediterranean data. The application of this methodology leads to the definition of seismic hazard maps for 475- and 1,975-year return periods for spectral accelerations at periods of 0 (corresponding to peak ground acceleration), 0.1, 0.3, 0.6, 1 and 2 s. Median and 15th- and 85th-percentile acceleration contour lines are represented. Finally, the seismic catalogue is used to produce a map of the maximum acceleration expected for comparison with the probabilistic hazard maps. The hazard maps are produced using a grid of 0.1°. The results obtained may be useful for civil protection and risk prevention purposes in France, Spain and Andorra.
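    The logic-tree-plus-Monte-Carlo combination can be sketched as follows: epistemic alternatives (here, two hypothetical ground-motion model branches with weights) are combined, and aleatory variability is sampled from a lognormal distribution around each branch's median. All numbers are illustrative, not the ISARD project's values:

    ```python
    import math
    import random

    def annual_exceedance_rate(a_target, branches, n_samples=20000, seed=1):
        """Monte Carlo estimate of the annual rate at which PGA exceeds
        a_target, combining weighted epistemic branches with lognormal
        aleatory variability. Each branch is a tuple:
        (weight, median_pga_g, sigma_ln, annual_event_rate)."""
        rng = random.Random(seed)
        rate = 0.0
        for weight, median, sigma, event_rate in branches:
            exceed = 0
            for _ in range(n_samples):
                pga = median * math.exp(rng.gauss(0.0, sigma))  # aleatory draw
                if pga > a_target:
                    exceed += 1
            rate += weight * event_rate * exceed / n_samples
        return rate

    branches = [
        (0.5, 0.10, 0.6, 0.02),  # hypothetical regional GMPE branch
        (0.5, 0.12, 0.5, 0.02),  # hypothetical published-GMPE branch
    ]
    rate_02g = annual_exceedance_rate(0.2, branches)
    ```

    Repeating this over a range of target accelerations yields a hazard curve, from which the 475- and 1,975-year return-period values can be read off.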

  20. The Magnitude Frequency Distribution of Induced Earthquakes and Its Implications for Crustal Heterogeneity and Hazard

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.

    2015-12-01

    . Alternatively, the MFD of induced earthquakes may be controlled by small scale stress concentrations in a spatially variable stress field. Resolving the underlying causes of the MFD for induced earthquakes may provide key insights into the hazard posed by induced earthquakes.

  1. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  2. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992) and fatality rates are likely to continue to rise with increased population and urbanization, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that a large fraction of the world's population still resides in informal, poorly constructed, non-engineered dwellings that are highly susceptible to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
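    The three-step casualty estimation outlined above (shaking hazard, population exposure by building type, collapse vulnerability) reduces, in its simplest form, to a weighted sum over building types. The building types, collapse probabilities, and fatality rates below are illustrative placeholders, not PAGER's calibrated values:

    ```python
    def expected_casualties(exposure_by_type, collapse_prob, fatality_rate):
        """Aggregate expected fatalities over building types at a given
        shaking level: sum over types of (population exposed)
        x P(collapse | shaking) x P(fatality | collapse)."""
        return sum(exposure_by_type[t] * collapse_prob[t] * fatality_rate[t]
                   for t in exposure_by_type)

    # Hypothetical inventory for one shaken region:
    exposure = {"adobe": 10000, "rc_frame": 50000}   # people per type
    p_collapse = {"adobe": 0.08, "rc_frame": 0.01}   # at this shaking level
    f_rate = {"adobe": 0.10, "rc_frame": 0.05}       # fatality given collapse
    est = expected_casualties(exposure, p_collapse, f_rate)
    ```

    This is why the building inventory matters: with the same shaking and population, shifting people between the rows of `exposure` changes the estimate by an order of magnitude.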

  3. Studies of crustal structure, seismic precursors to volcanic eruptions and earthquake hazard in the eastern provinces of the Democratic Republic of Congo

    NASA Astrophysics Data System (ADS)

    Mavonga, T.; Zana, N.; Durrheim, R. J.

    2010-11-01

    In recent decades, civil wars in the eastern provinces of the Democratic Republic of Congo have caused massive social disruptions, which have been exacerbated by volcanic and earthquake disasters. Seismic data were gathered and analysed as part of an effort to monitor the volcanoes and quantitatively assess the earthquake hazard. This information can be used to regulate the settlement of displaced people and to "build back better". In order to investigate volcanic processes in the Virunga area, a local seismic velocity model was derived and used to relocate earthquake hypocenters. It was found that swarm-type seismicity, composed mainly of long-period earthquakes, preceded both the 2004 and 2006 eruptions of Nyamuragira. A steady increase in seismicity was observed to commence ten or eleven months prior to the eruption, which is attributed to the movement of magma in a deep conduit. In the last stage (1 or 2 months) before the eruption, the hypocenters of long-period earthquakes became shallower. Seismic hazard maps were prepared for the DRC using a 90-year catalogue compiled for homogeneous Mw magnitudes, various published attenuation relations, and the EZ-Frisk software package. The highest levels of seismic hazard were found in the Lake Tanganyika Rift seismic zone, where peak ground accelerations (PGA) in excess of 0.32 g, 0.22 g and 0.16 g are expected to occur with 2%, 5% and 10% chance of exceedance in 50 years, respectively.
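    The exceedance probabilities quoted above (2%, 5%, and 10% in 50 years) map directly onto return periods under a Poisson assumption, via T = -t / ln(1 - p):

    ```python
    import math

    def return_period(p_exceed, exposure_years):
        """Return period T such that the probability of at least one
        exceedance in `exposure_years` is p_exceed, assuming a Poisson
        process: p = 1 - exp(-exposure/T)."""
        return -exposure_years / math.log(1.0 - p_exceed)

    t2 = return_period(0.02, 50)   # 2% in 50 yr  -> ~2475-yr return period
    t5 = return_period(0.05, 50)   # 5% in 50 yr  -> ~975-yr return period
    t10 = return_period(0.10, 50)  # 10% in 50 yr -> ~475-yr return period
    ```

    So the PGA values of 0.32 g, 0.22 g, and 0.16 g in the Lake Tanganyika Rift zone correspond to roughly 2475-, 975-, and 475-year return periods, respectively.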

  4. Measuring the environmental context of social vulnerability to urban earthquake hazards: An integrative remote sensing and GIS approach

    NASA Astrophysics Data System (ADS)

    Rashed, Tarek Mohamed Gamal Eldin

    Although vulnerability represents an essential concept in the development of mitigation strategies at the local, national, and international levels, there is little consensus among researchers, planners, and disaster managers regarding the best way to undertake vulnerability analysis. The basic objective of this research is to move that discussion forward by integrating remote sensing and GIS analysis into new ways of thinking about urban vulnerability. The research conceptualizes urban vulnerability as a characteristic of an urban community that can be assessed through a combination of ecological factors associated with the physical conditions of the geographic space in which the urban community is located and the social conditions of the population in that place. The basic hypothesis of the research is that these physical and social conditions are so inextricably bound together in many disaster situations that we can use the former as indicative of the latter. The research proposes an approach through which areas with high levels of vulnerability (hot spots) are first located and differentiated from other areas within a defined urban region. The methodology of this research is tested for the Los Angeles metropolitan area, employing data from the 1990 US census. The findings of this research add to our understanding of how earthquake hazards respond to natural and human-induced changes, and of the consequences of land cover alteration for the increasing worldwide occurrence of earthquake disasters. From an empirical viewpoint, the study shows how advanced GIS and remote sensing procedures can be combined to allow planners and decision makers to focus on the more vulnerable communities in their midst, and thus to help develop mitigation measures that could prevent earthquake hazards from becoming major human disasters. Finally, this study tests the importance of using remote sensing data in vulnerability analysis at the local level, thus laying the foundation of

  5. Recent Achievements of the Neo-Deterministic Seismic Hazard Assessment in the CEI Region

    SciTech Connect

    Panza, G. F.; Kouteva, M.; Vaccari, F.; Peresan, A.; Romanelli, F.; Cioflan, C. O.; Radulian, M.; Marmureanu, G.; Paskaleva, I.; Gribovszki, K.; Varga, P.; Herak, M.; Zaichenco, A.; Zivcic, M.

    2008-07-08

    A review of the recent achievements of the innovative neo-deterministic approach for seismic hazard assessment through realistic earthquake scenarios has been performed. The procedure provides strong ground motion parameters for the purpose of earthquake engineering, based on deterministic seismic wave propagation modelling at different scales: regional, national and metropolitan. The main advantage of this neo-deterministic procedure is the simultaneous treatment of the contributions of the earthquake source and of the seismic wave propagation media to the strong motion at the target site/region, as required by basic physical principles. The neo-deterministic seismic microzonation procedure has been successfully applied to numerous metropolitan areas all over the world in the framework of several international projects. In this study, some examples focused on the CEI region, concerning both regional seismic hazard assessment and seismic microzonation of selected metropolitan areas, are shown.

  6. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  7. Citizen Monitoring during Hazards: The Case of Fukushima Radiation after the 2011 Japanese Earthquake

    NASA Astrophysics Data System (ADS)

    Hultquist, C.; Cervone, G.

    2015-12-01

    Citizen-led movements producing scientific environmental information are increasingly common during hazards. After the earthquake-triggered tsunami in Japan in 2011, the government produced airborne remote sensing data of the radiation levels following the Fukushima nuclear reactor failures. Advances in technology enabled citizens to monitor radiation with innovative mobile devices built from components bought on the Internet. The citizen-led Safecast project measured on-ground radiation levels in the Fukushima prefecture; the dataset totals 14 million entries to date in Japan. This non-authoritative citizen science collection, which records radiation levels at specific coordinates and times, is available online, yet the reliability and validity of the data had not been assessed. The nuclear incident provided a case for assessment with comparable dimensions of citizen science and authoritative data. To compare the datasets, standardization was required. The sensors were calibrated scientifically, but the data were collected in different units of measure. Because radiation decays over time, temporal interpolation was necessary so that measurements could be compared within the same time frame. Finally, the GPS-located points were selected within an overlapping spatial extent of 500 meters. This study spatially analyzes and statistically compares citizen-volunteered and government-generated radiation data. Quantitative measures are used to assess the similarity and difference between the datasets. Radiation measurements from the same geographic extents show similar spatial variations, which suggests that citizen science data can be comparable with government-generated measurements. Validation of Safecast demonstrates that we can infer scientific data from unstructured, unvetted data. Citizen science can provide real-time data for situational awareness, which is crucial for decision making during disasters. This project provides a methodology for comparing datasets of radiological measurements
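The temporal interpolation step described above can be sketched in a few lines. This is a minimal illustration, not the study's actual procedure: it assumes a single dominant isotope (Cs-137, half-life ~30.17 years) and simple exponential decay, and the function name and units are hypothetical.

```python
import math

# Assumed dominant isotope; an illustrative assumption, not taken from the study.
CS137_HALF_LIFE_YEARS = 30.17

def decay_correct(dose_rate, measured_at_years, reference_at_years,
                  half_life=CS137_HALF_LIFE_YEARS):
    """Project a dose-rate measurement to a common reference time,
    assuming exponential decay of a single isotope."""
    dt = reference_at_years - measured_at_years
    return dose_rate * 0.5 ** (dt / half_life)

# A reading of 1.0 uSv/h projected one half-life into the future halves:
print(decay_correct(1.0, 0.0, CS137_HALF_LIFE_YEARS))  # prints 0.5
```

In practice the mix of isotopes (and hence the effective half-life) changes over time after a reactor release, so a real comparison would need a more careful decay model.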

  8. Active tectonics of the Seattle fault and central Puget sound, Washington - Implications for earthquake hazards

    USGS Publications Warehouse

    Johnson, S.Y.; Dadisman, S.V.; Childs, J. R.; Stanley, W.D.

    1999-01-01

    We use an extensive network of marine high-resolution and conventional industry seismic-reflection data to constrain the location, shallow structure, and displacement rates of the Seattle fault zone and crosscutting high-angle faults in the Puget Lowland of western Washington. Analysis of seismic profiles extending 50 km across the Puget Lowland from Lake Washington to Hood Canal indicates that the west-trending Seattle fault comprises a broad (4-6 km) zone of three or more south-dipping reverse faults. Quaternary sediment has been folded and faulted along all faults in the zone but is clearly most pronounced along fault A, the northernmost fault, which forms the boundary between the Seattle uplift and Seattle basin. Analysis of growth strata deposited across fault A indicate minimum Quaternary slip rates of about 0.6 mm/yr. Slip rates across the entire zone are estimated to be 0.7-1.1 mm/yr. The Seattle fault is cut into two main segments by an active, north-trending, high-angle, strike-slip fault zone with cumulative dextral displacement of about 2.4 km. Faults in this zone truncate and warp reflections in Tertiary and Quaternary strata and locally coincide with bathymetric lineaments. Cumulative slip rates on these faults may exceed 0.2 mm/yr. Assuming no other crosscutting faults, this north-trending fault zone divides the Seattle fault into 30-40-km-long western and eastern segments. Although this geometry could limit the area ruptured in some Seattle fault earthquakes, a large event ca. A.D. 900 appears to have involved both segments. Regional seismic-hazard assessments must (1) incorporate new information on fault length, geometry, and displacement rates on the Seattle fault, and (2) consider the hazard presented by the previously unrecognized, north-trending fault zone.

  9. Protection of the human race against natural hazards (asteroids, comets, volcanoes, earthquakes)

    NASA Astrophysics Data System (ADS)

    Smith, Joseph V.

    1985-10-01

    Although we justifiably worry about the danger of nuclear war to civilization, and perhaps even to survival of the human race, we tend to consider natural hazards (e.g., comets, asteroids, volcanoes, earthquakes) as unavoidable acts of God. In any human lifetime, a truly catastrophic natural event is very unlikely, but ultimately one will occur. For the first time in human history we have sufficient technical skills to begin protection of Earth from some natural hazards. We could decide collectively throughout the world to reassign resources: in particular, reduction of nuclear and conventional weapons to a less dangerous level would allow concomitant increase of international programs for detection and prevention of natural hazards. Worldwide cooperation to mitigate natural hazards might help psychologically to lead us away from the divisive bickering that triggers wars. Future generations could hail us as pioneers of peace and safety rather than curse us as agents of death and destruction.

  10. Earthquake damage potential and critical scour depth of bridges exposed to flood and seismic hazards under lateral seismic loads

    NASA Astrophysics Data System (ADS)

    Song, Shin-Tai; Wang, Chun-Yao; Huang, Wen-Hsiu

    2015-12-01

    Many bridges located in seismic hazard regions suffer from serious foundation exposure caused by riverbed scour. Loss of surrounding soil significantly reduces the lateral strength of pile foundations. When the scour depth exceeds a critical level, the strength of the foundation is insufficient to withstand the imposed seismic demand, which induces the potential for unacceptable damage to the piles during an earthquake. This paper presents an analytical approach to assess the earthquake damage potential of bridges with foundation exposure and identify the critical scour depth that causes the seismic performance of a bridge to differ from the original design. The approach employs the well-accepted response spectrum analysis method to determine the maximum seismic response of a bridge. The damage potential of a bridge is assessed by comparing the imposed seismic demand with the strengths of the column and the foundation. The versatility of the analytical approach is illustrated with a numerical example and verified by the nonlinear finite element analysis. The analytical approach is also demonstrated to successfully determine the critical scour depth. Results highlight that relatively shallow scour depths can cause foundation damage during an earthquake, even for bridges designed to provide satisfactory seismic performance.

  11. Induced and Natural Seismicity: Earthquake Hazards and Risks in Ohio:

    NASA Astrophysics Data System (ADS)

    Besana-Ostman, G. M.; Worstall, R.; Tomastik, T.; Simmers, R.

    2013-12-01

    To adapt to the increasing need to regulate all operations related to both the Utica and Marcellus shale plays within the state, ODNR recently strengthened its regulatory capability through stricter permit requirements, additional human resources, and improved infrastructure. These ODNR efforts on seismic risk reduction related to induced seismicity led to stricter regulations and many infrastructure changes, particularly for Class II wells. Permit requirement changes and more seismic monitoring stations were implemented, together with additional injection data reporting from selected Class II well operators. Considering the possible risks related to seismic events in a region with relatively low seismicity, correlations between the limited seismic data and injection volume information were examined. Interestingly, initial results showed some indications of both plugging and fracturing episodes. Real-time data transmission from seismic stations and the availability of injection volume data enabled ODNR to interact with operators and manage wells dynamically. Furthermore, initial geomorphic and structural analyses indicated possible active faults, oriented NE-SW, in the northern and western portions of the state. The newly mapped structures imply the possibility of relatively larger earthquakes in the region and consequently higher seismic risks. With the above-mentioned recent changes, ODNR has not only made critical improvements to its principal regulatory role in the state for oil and gas operations but has also made an important contribution to the state's seismic risk reduction endeavors. Close collaboration with other government agencies and the public, and working together with the well operators, enhanced ODNR's capability to build a safety culture and achieve further public and industry participation towards a safer environment. Keywords: induced seismicity, injection wells, seismic risks

  12. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2009-04-01

    This special issue includes selected papers on the topic of earthquake impact from the sessions held in 2004 in Nice, France and in 2005 in Vienna, Austria at the first and second European Geosciences Union General Assemblies, respectively. Since its start in 1999 in The Hague, Netherlands, the earthquake hazard has been the most popular topic of the session. The call in 2004 read: Nature's forces, including earthquakes, floods, landslides, high winds and volcanic eruptions, can inflict losses on urban settlements and man-made structures such as infrastructure. In Europe, recent years have seen significant losses from earthquakes in southern and south-eastern Europe, floods in central Europe, and wind storms in western Europe. Meanwhile, significant progress has been made in understanding disasters. Several scientific fields contribute to a holistic approach to the evaluation of capacities, vulnerabilities and hazards, the main factors in mitigating urban disasters due to natural hazards. An important part of the session is devoted to the assessment of earthquake shaking and loss scenarios, including both physical damage and human casualties. Early warning and rapid damage evaluation are of utmost importance for addressing the safety of many essential facilities, for emergency management of events, and for disaster response. When an earthquake occurs, strong-motion networks, data processing and interpretation lead to preliminary estimates (scenarios) of the geographical distribution of damage. Factual information on inflicted damage, such as that obtained from shaking maps or aerial imagery, permits a confrontation with simulated damage maps in order to define a more accurate picture of the overall losses. Most recent developments towards quantitative and qualitative simulation of natural hazard impacts on urban areas, which provide decision-making support for urban disaster management, and success stories of and lessons learned from disaster

  13. Regional coseismic landslide hazard assessment without historical landslide inventories: A new approach

    NASA Astrophysics Data System (ADS)

    Kritikos, Theodosios; Robinson, Tom R.; Davies, Tim R. H.

    2015-04-01

    Currently, regional coseismic landslide hazard analyses require comprehensive historical landslide inventories as well as detailed geotechnical data. Consequently, such analyses have not been possible where these data are not available. A new approach is proposed herein to assess coseismic landslide hazard at regional scale for specific earthquake scenarios in areas without historical landslide inventories. The proposed model employs fuzzy logic and geographic information systems to establish relationships between causative factors and coseismic slope failures in regions with well-documented and substantially complete coseismic landslide inventories. These relationships are then utilized to estimate the relative probability of landslide occurrence in regions with neither historical landslide inventories nor detailed geotechnical data. Statistical analyses of inventories from the 1994 Northridge and 2008 Wenchuan earthquakes reveal that shaking intensity, topography, and distance from active faults and streams are the main controls on the spatial distribution of coseismic landslides. Average fuzzy memberships for each factor are developed and aggregated to model the relative coseismic landslide hazard for both earthquakes. The predictive capabilities of the models are assessed and show good-to-excellent model performance for both events. These memberships are then applied to the 1999 Chi-Chi earthquake, using only a digital elevation model, active fault map, and isoseismal data, replicating prediction of a future event in a region lacking historic inventories and/or geotechnical data. This similarly results in excellent model performance, demonstrating the model's predictive potential and confirming it can be meaningfully applied in regions where previous methods could not. 
For such regions, this method may enable a greater ability to analyze coseismic landslide hazard from specific earthquake scenarios, allowing for mitigation measures and emergency response plans
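The fuzzy-logic aggregation the abstract describes, turning each causative factor (shaking intensity, slope, distance to faults and streams) into a membership value and combining them into a relative hazard score, can be sketched as below. This is a generic illustration of fuzzy aggregation, not the authors' calibrated memberships; the ramp bounds and the gamma operator choice are assumptions.

```python
import math

def linear_membership(x, lo, hi):
    """Map a causative-factor value onto [0, 1] with a linear ramp
    (e.g. higher shaking intensity -> higher membership)."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def fuzzy_gamma(memberships, gamma=0.9):
    """Fuzzy gamma operator: a compromise between the fuzzy algebraic
    product (pessimistic) and the fuzzy algebraic sum (optimistic)."""
    product = math.prod(memberships)
    fuzzy_sum = 1.0 - math.prod(1.0 - m for m in memberships)
    return (fuzzy_sum ** gamma) * (product ** (1.0 - gamma))

# Hypothetical cell: strong shaking, steep slope, moderately close to a fault.
score = fuzzy_gamma([
    linear_membership(8.0, 4.0, 10.0),   # shaking intensity (MMI)
    linear_membership(35.0, 10.0, 45.0), # slope angle (degrees)
    linear_membership(2.0, 0.0, 10.0),   # proximity to fault (inverted km scale)
])
```

With gamma near 1 a single strong factor dominates; with gamma near 0 every factor must be high for the cell to score high, a design choice that would be calibrated against the training inventories.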

  14. Multicriteria analysis in hazards assessment in Libya

    NASA Astrophysics Data System (ADS)

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol

    2012-11-01

    Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have shown that these problems can be addressed through planning studies and detailed information about the prone areas. Determining the time, location and size of the problem is important for decision makers in planning and management activities. It is important to know the risk represented by those hazards and to take actions to protect against them. Multicriteria analysis methods (the analytic hierarchy process, pairwise comparison, and the ranking method) are used to analyse which hazard facing Libya is the most dangerous. The multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternative(s) should be preferred. For our problem of environmental risk assessment, the result is a ranking or categorisation of hazards with regard to their risk level.
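The analytic hierarchy process mentioned above derives a ranking from a pairwise comparison matrix. A common lightweight approximation of the priority vector is the row geometric-mean method, sketched below; the 3x3 matrix and hazard labels are hypothetical, not data from the study.

```python
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector of a reciprocal pairwise
    comparison matrix via the row geometric-mean method."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical comparisons on Saaty's 1-9 scale:
# hazard A judged 3x as important as B, 5x as important as C, etc.
pairwise = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
]
weights = ahp_weights(pairwise)  # highest weight -> most dangerous hazard
```

The resulting weights sum to one and give the "more or less stable ranking" the abstract refers to; a full AHP application would also check the consistency ratio of the matrix.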

  15. The Development of an Earthquake Preparedness Plan for a Child Care Center in a Geologically Hazardous Region.

    ERIC Educational Resources Information Center

    Wokurka, Linda

    The director of a child care center at a community college in California developed an earthquake preparedness plan for the center which met state and local requirements for earthquake preparedness at schools. The plan consisted of: (1) the identification and reduction of nonstructural hazards in classrooms, office, and staff rooms; (2) storage of…

  16. Earthquake shaking hazard estimates and exposure changes in the conterminous United States

    USGS Publications Warehouse

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.

    2015-01-01

    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify the population and infrastructure within the conterminous U.S. that are subjected to varying levels of earthquake ground motion, by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008 and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g level of shaking at relatively frequent intervals (annual rate of 1 in 72 years or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years or 2% PE in 50 years). We also show that there is a significant number of critical infrastructure facilities located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with moderately frequent recurrence interval).
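The paired figures in this abstract (a 475-year return period versus 10% probability of exceedance in 50 years, and so on) are related by the standard Poisson occurrence assumption. A short sketch of that conversion, assuming Poisson arrivals, which is the conventional basis for these round numbers:

```python
import math

def prob_exceedance(return_period_years, window_years=50.0):
    """Poisson probability of at least one exceedance within a time window."""
    return 1.0 - math.exp(-window_years / return_period_years)

def return_period(pe, window_years=50.0):
    """Inverse mapping: return period implied by a probability of exceedance."""
    return -window_years / math.log(1.0 - pe)

# The three hazard levels quoted in the abstract:
for rp in (72, 475, 2475):
    print(rp, prob_exceedance(rp))  # ~0.50, ~0.10, ~0.02
```

Note the familiar shorthand is approximate: a 475-year return period gives a PE of just under exactly 10% in 50 years under the Poisson model.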

  17. Strong earthquake evolution and present-day earthquake hazard risk around the eastern boundary faults of Sichuan-Yunnan Rhombic Block

    NASA Astrophysics Data System (ADS)

    Cheng, J.; Xu, X.

    2011-12-01

    Based on specific data on fault segmentation and characteristic earthquakes, and their empirical relationships, we calculated the parameters of the fault segments, such as length, width, and the magnitudes of characteristic earthquakes. Constrained by the GPS velocity field, the slip rates of these fault segments at depth were inverted using a 3-D half-space dislocation model. As not all of the recurrence periods and co-seismic displacements of the characteristic earthquakes were known, we selected the fault segments for which these two parameters were known and calculated the accumulation rate of average co-seismic displacement, which reflects the faults' slip rate in the seismogenic layer. The slip rate at depth was then compared with that in the seismogenic layer, the relationship between them was obtained, and this relationship was used to derive the recurrence periods and co-seismic displacements of all fault segments. We then calculated the post-seismic Coulomb stress changes on each fault segment produced by the earthquakes on these segments since AD 1700, and used the relationship between the stress drop and the slip rate of the fault segments to express those stress changes as equivalent years of strain accumulation. Adding this change in strain accumulation to the elapsed time since the characteristic earthquakes, we obtained the earthquake hazard degree of all the segments from the ratios between elapsed time and recurrence period. Finally, the BPT model was used to estimate the probability of earthquake hazard on the segments in the next 10 and 30 years. Keywords: eastern boundary faults of the Sichuan-Yunnan rhombic block, characteristic earthquake, recurrence period, Coulomb stress, earthquake hazard, BPT
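The BPT (Brownian Passage Time) probability calculation mentioned at the end of this abstract has a closed-form CDF, so the conditional probability of an event in the next 10 or 30 years, given the elapsed time since the last characteristic earthquake, can be computed directly. The sketch below uses the standard BPT parameterization (mean recurrence mu, aperiodicity alpha); the segment parameters in the example are invented for illustration, not taken from the study.

```python
import math

def _phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time distribution with mean recurrence
    interval mu and aperiodicity alpha (Matthews et al., 2002 form)."""
    if t <= 0.0:
        return 0.0
    u1 = (math.sqrt(t / mu) - math.sqrt(mu / t)) / alpha
    u2 = (math.sqrt(t / mu) + math.sqrt(mu / t)) / alpha
    return _phi(u1) + math.exp(2.0 / alpha ** 2) * _phi(-u2)

def conditional_probability(elapsed, horizon, mu, alpha):
    """P(rupture within `horizon` years | no rupture for `elapsed` years)."""
    f_elapsed = bpt_cdf(elapsed, mu, alpha)
    return (bpt_cdf(elapsed + horizon, mu, alpha) - f_elapsed) / (1.0 - f_elapsed)

# Hypothetical segment: 300-yr mean recurrence, aperiodicity 0.5, 250 yr elapsed.
p_30yr = conditional_probability(250.0, 30.0, 300.0, 0.5)
```

Unlike a Poisson model, the BPT conditional probability grows as the elapsed time approaches and exceeds the mean recurrence interval, which is why it suits the elapsed-time/recurrence-period ratios computed in the study.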

  18. Sensitivity analysis for Probabilistic Tsunami Hazard Assessment (PTHA)

    NASA Astrophysics Data System (ADS)

    Spada, M.; Basili, R.; Selva, J.; Lorito, S.; Sorensen, M. B.; Zonker, J.; Babeyko, A. Y.; Romano, F.; Piatanesi, A.; Tiberti, M.

    2012-12-01

    In modern societies, probabilistic hazard assessment of natural disasters is commonly used by decision makers for designing regulatory standards and, more generally, for prioritizing risk mitigation efforts. Systematic formalization of Probabilistic Tsunami Hazard Assessment (PTHA) has started only in recent years, mainly following the giant tsunami disaster of Sumatra in 2004. Typically, PTHA for earthquake sources exploits the long-standing practices developed in probabilistic seismic hazard assessment (PSHA), even though important differences are evident. In PTHA, for example, it is known that far-field sources are more important and that physical models for tsunami propagation are needed for the highly non-isotropic propagation of tsunami waves. However, considering the high impact that PTHA may have on societies, an important effort to quantify the effect of specific assumptions should be performed. Indeed, specific standard hypotheses made in PSHA may prove inappropriate for PTHA, since tsunami waves are sensitive to different aspects of sources (e.g. fault geometry, scaling laws, slip distribution) and propagate differently. In addition, the necessity of running an explicit calculation of wave propagation for every possible event (tsunami scenario) forces analysts to find strategies for reducing the computational burden. In this work, we test the sensitivity of hazard results with respect to several assumptions that are peculiar to PTHA and others that are commonly accepted in PSHA. Our case study is located in the central Mediterranean Sea and considers the Western Hellenic Arc as the earthquake source with Crete and Eastern Sicily as near-field and far-field target coasts, respectively.
Our suite of sensitivity tests includes: a) comparison of random seismicity distribution within area sources as opposed to systematically distributed ruptures on fault sources; b) effects of statistical and physical parameters (a- and b-value, Mc, Mmax, scaling laws

  19. Seismic Hazard and Risk Assessment and Public Policy in the Central United States

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2007-05-01

    United States, however. This inconsistency is caused by the methodology being used for seismic hazard and risk assessment: probabilistic seismic hazard analysis (PSHA). Although PSHA is the most widely used method for seismic hazard and risk assessment, it contains a mathematical error in the formulation: incorrectly treating the ground-motion uncertainty as an independent random variable. The ground-motion uncertainty is an explicit or implicit dependent variable as it is modeled in the ground-motion attenuation relationship. The mathematical error results in difficulty in understanding and applying PSHA. PSHA mixes temporal measurement (occurrence of an earthquake and its consequence [ground motion] at a site) with spatial measurement (ground-motion variability due to the source, path, and site effects). Thus, use of PSHA in seismic hazard and risk assessment is problematic for policy consideration.

  20. Advanced Materials Laboratory hazards assessment document

    SciTech Connect

    Barnett, B.; Banda, Z.

    1995-10-01

    The Department of Energy Order 5500.3A requires facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the AML. The entire inventory was screened according to the potential airborne impact to onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event will produce consequences exceeding the Early Severe Health Effects threshold is 23 meters. The highest emergency classification is a General Emergency. The Emergency Planning Zone is a nominal area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets.

  1. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, B.; Siu, Y. L.; Mitchell, G.

    2015-12-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.

  2. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
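The four relationship types defined in this classification (independent, mutex, parallel, series) imply different rules for combining the occurrence probabilities of two hazards. The sketch below is one illustrative reading of those rules, assumed for demonstration; the abstract does not give the paper's actual formulas.

```python
def multi_hazard_probability(p_a, p_b, relationship):
    """Illustrative joint-occurrence probability of hazards A and B in the
    same period, by interaction type. The combination rules are assumptions
    for illustration, not the paper's own equations."""
    if relationship == "mutex":
        return 0.0                 # mutually exclusive: cannot occur together
    if relationship == "independent":
        return p_a * p_b           # no shared trigger factor
    if relationship == "parallel":
        return min(p_a, p_b)       # shared trigger: joint probability bounded
                                   # by the rarer hazard (comonotonic bound)
    if relationship == "series":
        return p_a * p_b           # cascade: p_b read as P(B | A occurred)
    raise ValueError(f"unknown relationship: {relationship}")
```

The practical point of the classification is exactly this: without knowing which relationship holds, a multi-hazard assessment cannot choose the right combination rule and will mis-estimate the probability of compounded events.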

  3. New Tools for Quality Assessment of Modern Earthquake Catalogs: Examples From California and Japan.

    NASA Astrophysics Data System (ADS)

    Woessner, J.; Wiemer, S.; Giardini, D.

    2002-12-01

    Earthquake catalogs provide a comprehensive knowledge database for studies related to seismicity, seismotectonics, earthquake physics, and hazard analysis. We introduce a set of tools and new software for improving the quality of modern catalogs of microseismicity. Surprisingly little research on detecting seismicity changes and analyzing their causes has been performed in recent years. In particular, the discrimination between artificial and natural causes of transients in seismicity, such as rate changes or alterations in the earthquake size distribution (b-value), often remains difficult. Thus, significant changes in reporting homogeneity are often detected only years after they occurred. We believe that our tools, used regularly and automatically in a 'real-time mode', allow such problems to be addressed shortly after they occur. Based on our experience in analyzing earthquake catalogs, and building on the groundbreaking work by Habermann in the 1980s, we propose a recipe for earthquake catalog quality assessment: 1) decluster as a tool to homogenize the data; 2) identify and remove blast contamination; 3) estimate completeness as a function of space and time; 4) assess reporting homogeneity as a function of space and time using self-consistency and, if possible, comparison with other independent data sources. During this sequence of analysis steps, we produce a series of maps that portray, for a given period, the magnitude of completeness, seismicity rate changes, possible shifts and stretches in the magnitude distribution, and the degree of clustering. We apply our algorithms for quality assessment to data sets from California and Japan, addressing the following questions: 1) Did the 1983 Coalinga earthquake change the rate of small events on the Parkfield segment of the San Andreas system? 2) Did the Kobe earthquake change the rate of earthquakes or the b-value in nearby volumes?
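The b-value analysis this abstract relies on is conventionally done with the Aki/Utsu maximum-likelihood estimator, applied only to events at or above the magnitude of completeness Mc. A minimal sketch (the binning correction and example values are standard practice, not specifics from this abstract):

```python
import math

def b_value_mle(magnitudes, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value estimate for events with
    magnitude >= mc, where dm is the magnitude binning width."""
    above = [m for m in magnitudes if m >= mc]
    if not above:
        raise ValueError("no events at or above the completeness magnitude")
    mean_mag = sum(above) / len(above)
    # Subtract half a bin from mc to correct for binned magnitudes.
    return math.log10(math.e) / (mean_mag - (mc - dm / 2.0))

# Hypothetical catalog slice: mean magnitude ~0.43 units above Mc implies b ~ 1.
catalog = [2.1, 2.3, 2.0, 2.8, 2.2, 2.5, 2.1, 2.4, 1.8, 2.0]
b = b_value_mle(catalog, mc=2.0)
```

Comparing b estimated in volumes near a mainshock before and after the event, as done here for Coalinga and Kobe, is how shifts in the magnitude distribution are detected.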

  4. Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.

    2012-12-01

    An integrated risk assessment includes the analysis of all individual constituents of risk: baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from the multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Thus, risk is the estimated impact that a hazard event would have on people, services, facilities, structures and assets in a community. In this work we assess the risk from natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological location, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analyses of seismic microzonation, landslide and flood susceptibility, and volcanic impact using standard methodologies. Social vulnerability was quantified from data obtained in interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was determined so that the sample was statistically significant. The families interviewed were selected using the simple random sampling technique with replacement. With these
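The five survey variables could be combined into a composite social vulnerability index. The abstract does not give the aggregation rule, so the equal-weight scheme below is purely an assumption:

```python
import numpy as np

def vulnerability_index(scores, weights=None):
    """Composite social vulnerability index as a weighted mean of the five
    survey variables (structure quality, public services, economic
    conditions, preparedness plans, risk perception), each pre-scaled to
    [0, 1] with 1 = most vulnerable. Equal weights by default; both the
    scaling and the weights are assumptions, not from the study."""
    s = np.asarray(scores, dtype=float)
    w = np.full(s.shape, 1.0 / s.size) if weights is None else np.asarray(weights, dtype=float)
    return float(np.dot(w, s))
```

A household scoring high on all five variables gets an index near 1; weights could then be adjusted to reflect expert judgment on the relative importance of each variable.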

  5. Earthquake scenario in West Bengal with emphasis on seismic hazard microzonation of the city of Kolkata, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Maiti, S. K.; Devaraj, N.; Srivastava, N.; Mohapatra, L. D.

    2014-09-01

    Seismic microzonation is a process of estimating site-specific effects due to an earthquake on urban centers for its disaster mitigation and management. The state of West Bengal, located in the western foreland of the Assam-Arakan Orogenic Belt, the Himalayan foothills and Surma Valley, has been struck by several devastating earthquakes in the past, indicating the need for a seismotectonic review of the province, especially in light of probable seismic threat to its capital city of Kolkata, which is a major industrial and commercial hub in the eastern and northeastern region of India. A synoptic probabilistic seismic hazard model of Kolkata is initially generated at engineering bedrock (Vs30 ~ 760 m s-1) considering 33 polygonal seismogenic sources at two hypocentral depth ranges, 0-25 and 25-70 km; 158 tectonic sources; appropriate seismicity modeling; 14 ground motion prediction equations for three seismotectonic provinces, viz. the east-central Himalaya, the Bengal Basin and Northeast India selected through suitability testing; and appropriate weighting in a logic tree framework. Site classification of Kolkata performed following in-depth geophysical and geotechnical investigations places the city in D1, D2, D3 and E classes. Probabilistic seismic hazard assessment at a surface-consistent level - i.e., the local seismic hazard related to site amplification performed by propagating the bedrock ground motion with 10% probability of exceedance in 50 years through a 1-D sediment column using an equivalent linear analysis - predicts a peak ground acceleration (PGA) range from 0.176 to 0.253 g in the city. A deterministic liquefaction scenario in terms of spatial distribution of liquefaction potential index corresponding to surface PGA distribution places 50% of the city in the possible liquefiable zone. A multicriteria seismic hazard microzonation framework is proposed for judicious integration of multiple themes, namely PGA at the surface, liquefaction potential

  6. Tsunami hazard and risk assessment in El Salvador

    NASA Astrophysics Data System (ADS)

    González, M.; González-Riancho, P.; Gutiérrez, O. Q.; García-Aguilar, O.; Aniel-Quiroga, I.; Aguirre, I.; Alvarez, J. A.; Gavidia, F.; Jaimes, I.; Larreynaga, J. A.

    2012-04-01

    Tsunamis are relatively infrequent phenomena, yet they represent a greater threat than earthquakes, hurricanes and tornadoes, causing the loss of thousands of human lives and extensive damage to coastal infrastructure around the world. Several works have attempted to study these phenomena in order to understand their origin, causes, evolution, and consequences, and the magnitude of the damage they cause, in order to finally propose mechanisms to protect coastal societies. Advances in the understanding and prediction of tsunami impacts allow the development of adaptation and mitigation strategies to reduce risk in coastal areas. This work, Tsunami Hazard and Risk Assessment in El Salvador, funded by AECID during the period 2009-12, examines the state of the art and presents a comprehensive methodology for assessing the risk of tsunamis at any coastal area worldwide, applying it to the coast of El Salvador. The conceptual framework is based on the definition of risk as the probability of harmful consequences or expected losses resulting from a given hazard to a given element at danger or peril, over a specified time period (European Commission, Schneiderbauer et al., 2004). The HAZARD assessment (Phase I of the project) is based on propagation models for earthquake-generated tsunamis, developed through the characterization of tsunamigenic sources (seismotectonic faults) and other dynamics under study (tsunami waves, sea level, etc.). The study area lies in a zone of high seismic activity and was hit by 11 tsunamis between 1859 and 1997, nine of them recorded in the twentieth century and all generated by earthquakes. Simulations of historical and potential tsunamis affecting the country's coast to a greater or lesser degree have been performed, including distant, intermediate and local sources. Deterministic analyses of the threats under study (coastal flooding) have been carried out, resulting in different hazard maps (maximum wave height elevation, maximum water depth, minimum tsunami

  7. Seismic hazard in western Canada from GPS strain rates versus earthquake catalog

    NASA Astrophysics Data System (ADS)

    Mazzotti, S.; Leonard, L. J.; Cassidy, J. F.; Rogers, G. C.; Halchuk, S.

    2011-12-01

    Probabilistic seismic hazard analyses (PSHA) are commonly based on frequency-magnitude statistics from 50-100-year-long earthquake catalogs, assuming that these statistics are representative of the longer-term frequency of large earthquakes. We test an alternative PSHA approach in continental western Canada, including adjacent areas of the northwestern U.S.A., using regional strain rates derived from 179 Global Positioning System (GPS) horizontal velocities. GPS strain rates are converted to earthquake statistics, seismic moment rates, and ground shaking probabilities in seismic source zones using a logic-tree method for uncertainty propagation. Median GPS-based moment rates and shaking estimates agree well with those derived from earthquake catalogs in only two zones (Puget Sound and mid-Vancouver Island). In most other zones, median GPS-based moment rates are 6-150 times larger than those derived from earthquake catalogs (shaking estimates 2-5 times larger), although the GPS-based and catalog estimates commonly agree within their 67% uncertainties. This discrepancy may represent an under-sampling of long-term moment rates and shaking by earthquake catalogs in some zones; however, a systematic under-sampling is unlikely over our entire study area. Although not demonstrated with a high confidence level, long-term regional aseismic deformation may account for a significant part of the GPS/catalog discrepancy and, in some areas, represent as much as 90% of the total deformation budget. In order to integrate GPS strain rates in PSHA models, the seismic versus aseismic partitioning of long-term deformation needs to be quantified and understood in terms of the underlying mechanical processes.
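The conversion of GPS strain rates to seismic moment rates is typically done with a Kostrov-style summation over a deforming zone. A minimal sketch, assuming uniform strain over the zone; the constants and the single-strain-rate simplification are ours, not from the paper:

```python
def gps_moment_rate(strain_rate_per_yr, area_m2, thickness_m, mu_pa=3.0e10):
    """Kostrov-style scalar moment rate (N*m/yr) for a source zone:

        M0_rate = 2 * mu * H * A * (max horizontal strain rate)

    with shear modulus mu, seismogenic thickness H and zone area A.
    A full analysis would use the complete strain-rate tensor and
    propagate uncertainties through a logic tree, as in the study."""
    return 2.0 * mu_pa * thickness_m * area_m2 * strain_rate_per_yr


# Illustrative zone: 100 km x 100 km, 15 km seismogenic thickness,
# strain rate of 1e-9 per year.
rate = gps_moment_rate(1.0e-9, 1.0e10, 1.5e4)
```

For these illustrative numbers the zone accumulates 9e15 N*m/yr, which can then be compared against the moment rate summed from a catalog to expose the kind of discrepancy the abstract describes.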

  8. Identifying the Hazard Before the Earthquake: How Far Have We Come, How Well Have We Done?

    NASA Astrophysics Data System (ADS)

    Schwartz, D. P.

    2011-12-01

    With almost half a century of evolving understanding and tools at our disposal, how successful have we been at identifying active faults and correctly quantifying the future hazards associated with them? The characterization of seismic sources has multiple facets: location and geometry, frequency of rupture and slip rate, and amount of fault displacement and expected earthquake magnitude. From these parameters, site-specific design values, regional probabilistic fault rupture and ground motion hazard maps, and national building codes can be developed. The mid-1960s saw the initiation of investigations focused on identifying active faults with the earliest efforts geared to location and the potential for surface rupture; studies for critical facilities--power plants, dams, pipelines--were central to this development. These studies flourished in the 1970s, during which time the importance of fault slip rates was recognized, and the latter part of the decade saw the first major paleoseismic studies aimed at multiple-event earthquake chronologies. During the 1980s paleoseismic data provided the basis for development of fault-specific magnitude-frequency distributions and concepts such as fault segmentation, which advanced source characterization. The 1990s saw active fault and paleoseismic investigations flourish internationally; AMS radiocarbon dating became widely used, which increased information on earthquake recurrence for a great number of faults. In the late 1990s and the 2000s advances in luminescence and cosmogenic radionuclide dating permitted slip rates to be routinely obtained from previously undatable deposits offset by faults, and the development of LiDAR led to identification of previously unrecognized active structures. These data are finding their way into increasingly sophisticated probabilistic ground motion and fault displacement models. How have these developments affected our ability to correctly identify and quantify a hazard prior to the

  9. Near-Field ETAS Constraints and Applications to Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Yoder, Mark R.; Rundle, John B.; Glasscoe, Margaret T.

    2015-08-01

    The epidemic type aftershock sequence (ETAS) statistical model of aftershock seismicity combines various earthquake scaling relations to produce synthetic earthquake catalogs, or estimates of aftershock seismicity rates, based on recent earthquake activity. One challenge to ETAS-based hazard assessment is the large number of free parameters involved. In this paper, we introduce an approach to constrain this parameter space from canonical scaling relations, empirical observations, and fundamental physics. We show that ETAS parameters can be estimated as a function of an earthquake's magnitude m based on the finite temporal and spatial extents of the rupture area. This approach facilitates fast ETAS-based estimates of seismicity from large "seed" catalogs, and it is particularly well suited to web-based deployment and otherwise automated implementations. It constitutes a significant improvement over contemporary ETAS by mitigating variability related to instrumentation and subjective catalog selection.
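The ETAS intensity described above combines a background rate with Omori-Utsu temporal decay and magnitude-dependent productivity. A minimal sketch of the temporal rate; all parameter values are illustrative defaults, not the constrained values derived in the paper:

```python
import numpy as np

def etas_rate(t, event_times, event_mags, mu=0.1, K=0.05,
              alpha=1.0, c=0.01, p=1.1, m_min=3.0):
    """Total ETAS intensity at time t (events per day): background rate mu
    plus Omori-Utsu aftershock contributions from all prior events, with
    productivity scaling as 10**(alpha * (m_i - m_min)). Parameter values
    are illustrative; the paper's point is precisely that they can be
    constrained from scaling relations rather than left free."""
    t_i = np.asarray(event_times, dtype=float)
    m_i = np.asarray(event_mags, dtype=float)
    past = t_i < t
    contrib = (K * 10.0 ** (alpha * (m_i[past] - m_min))
               / (t - t_i[past] + c) ** p)
    return mu + contrib.sum()
```

For a single m=5 seed event at t=0, the rate one day later is the background rate plus K*10**2/(1+c)**p, and it decays roughly as a power law thereafter, as expected from the Omori-Utsu form.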

  10. Earthquake recurrence and risk assessment in circum-Pacific seismic gaps

    USGS Publications Warehouse

    Thatcher, W.

    1989-01-01

    The development of the concept of seismic gaps, regions of low earthquake activity where large events are expected, has been one of the notable achievements of seismology and plate tectonics. Its application to long-term earthquake hazard assessment continues to be an active field of seismological research. Here I have surveyed well documented case histories of repeated rupture of the same segment of circum-Pacific plate boundary and characterized their general features. I find that variability in fault slip and spatial extent of great earthquakes rupturing the same plate boundary segment is typical rather than exceptional, but sequences of major events fill identified seismic gaps with remarkable order. Earthquakes are concentrated late in the seismic cycle and occur with increasing size and magnitude. Furthermore, earthquake rupture starts near zones of concentrated moment release, suggesting that high-slip regions control the timing of recurrent events. The absence of major earthquakes early in the seismic cycle indicates a more complex behaviour for lower-slip regions, which may explain the observed cycle-to-cycle diversity of gap-filling sequences. © 1989 Nature Publishing Group.

  11. An Integrated Geospatial System for earthquake precursors assessment in Vrancea tectonic active zone in Romania

    NASA Astrophysics Data System (ADS)

    Zoran, Maria A.; Savastru, Roxana S.; Savastru, Dan M.

    2015-10-01

    With the development of space-based technologies to measure surface geophysical parameters and deformation at the boundaries of tectonic plates and large faults, earthquake science has entered a new era. Using time-series satellite data for earthquake prediction, it is possible to track the behavior of earthquake precursors and to issue early warnings when the difference between the predicted value and the observed value exceeds a pre-defined threshold. Starting about one week prior to a moderate or strong earthquake, a transient thermal infrared rise in LST of several degrees Celsius (°C) and OLR values higher than normal have been recorded around epicentral areas, as a function of magnitude and focal depth, disappearing after the main shock. Associated geomagnetic and ionospheric disturbances are also recorded. The Vrancea tectonic active zone in Romania presents a high seismic hazard in the European-Mediterranean region, being responsible for the generation of strong or moderate intermediate-depth and normal earthquakes in a confined epicentral area. Based on the recorded geophysical parameter anomalies, an integrated geospatial system for earthquake precursor assessment in the Vrancea active seismic zone was developed. This system integrates multiple geophysical parameters derived from time-series MODIS Terra/Aqua, NOAA-AVHRR, ASTER and Landsat TM/ETM satellite data (land surface temperature, LST; outgoing long-wave radiation, OLR; and mean air temperature, AT) as well as geomagnetic and ionospheric data, in synergy with in-situ data, for the surveillance and forecasting of seismic events.

  12. Multi-disciplinary Hazard Reduction from Earthquakes and Volcanoes in Indonesia - International Research Cooperation Program

    NASA Astrophysics Data System (ADS)

    Kato, Teruyuki

    2010-05-01

    Indonesian and Japanese researchers started a three-year (2009-2011) multi-disciplinary cooperative research project as part of the "Science and Technology Research Partnership for Sustainable Development" supported by the Japanese government. The ultimate goal of this project is to reduce disasters from earthquakes, tsunamis and volcanoes by enhancing the capability of forecasting hazards, reducing social vulnerability, and carrying out education and outreach on research outcomes. We plan to provide a platform for collaboration among researchers in the natural sciences, engineering and social sciences, as well as officials in national and local governments. Research activities are grouped into: (1) geological and geophysical surveys of past earthquakes, monitoring of current crustal activity, and simulation of future ground motion or tsunamis; (2) short-term and long-term prediction of volcanic eruptions by monitoring Semeru, Guntur and other volcanoes, and development of their evaluation method; (3) studies to establish social infrastructure based on engineering technologies and hazard maps; (4) social, cultural and religious studies to reduce the vulnerability of local communities; and (5) studies on education and outreach for disaster reduction and restoration of community. In addition, to coordinate these research activities and to utilize the research results, (6) application of the research and establishment of a collaboration mechanism between researchers and government officials is planned. Beyond mutual visits and collaborative field studies, annual joint seminars are planned (in Indonesia in 2009 and 2011, in Japan in 2010) that will be broadcast over the internet. Meetings of the Joint Coordinating Committee, composed of representatives of relevant Indonesian ministries and institutions as well as project members, will be held annually to oversee the activities. The kick-off workshop was held in Bandung in April 2009 and the research plans from 22 different

  13. Tsunami Hazards along the Eastern Australian Coast from Potential Earthquakes: Results from Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Ding, R. W.; Yuen, D. A.

    2015-08-01

    Australia is surrounded by the Pacific Ocean and the Indian Ocean and thus may suffer from tsunamis due to its proximity to subduction earthquakes around the boundary of the Australian Plate. Potential tsunami risks along the eastern coast, where more and more people currently live, are numerically investigated through a scenario-based method to provide an estimate of the tsunami hazard in this region. We have chosen and calculated the tsunami waves generated at the New Hebrides Trench and the Puysegur Trench, and we further investigated the relevant tsunami hazards along the eastern coast and their sensitivities to various sea-floor frictions and earthquake parameters (i.e. the strike, dip and slip angles and the earthquake magnitude/rupture length). The results indicate that the Puysegur Trench poses a seismic threat capable of causing wave amplitudes over 1.5 m along the coasts of Tasmania, Victoria, and New South Wales, even reaching over 2.6 m near Sydney, Maria Island, and Gabo Island in a certain worst case, while the cities along the coast of Queensland are potentially less vulnerable than those on the southeastern Australian coast.

  14. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in roughly the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that in the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at a high seismic risk level, two districts (Gharb and Wasat) at moderate, and two districts (Al-Gomrok and Burg El-Arab) at a low seismic risk level. Moreover, the building damage estimates show Al-Montazah to be the most vulnerable district, with 73 % of the expected damage reported there. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risk (building damage) is concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  15. Y-12 site-specific earthquake response analysis and soil liquefaction assessment

    SciTech Connect

    Ahmed, S.B.; Hunt, R.J.; Manrod, W.E. III

    1995-09-29

    A site-specific earthquake response analysis and soil liquefaction assessment were performed for the Oak Ridge Y-12 Plant. The main purpose of these studies was to use the results of the analyses in evaluating the safety of the performance category -1, -2, and -3 facilities against natural phenomena seismic hazards. Earthquake response was determined for seven (7) one-dimensional soil columns (Fig. 12) using two horizontal components of the PC-3 design basis 2000-year seismic event. The computer program SHAKE91 (Ref. 7) was used to calculate the absolute response accelerations at the top of the ground (soil/weathered shale) and at rock outcrop. The SHAKE program has been validated for horizontal response calculations at periods of less than 2.0 seconds at several sites and is consequently widely accepted in the geotechnical earthquake engineering field for site response analysis.
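SHAKE-style 1-D site response can be illustrated, in greatly simplified form, by the textbook amplification function of a single uniform damped soil layer over rigid rock. SHAKE91 itself performs a multi-layer equivalent-linear analysis with strain-compatible properties; the closed-form expression and values below are only a sketch:

```python
import numpy as np

def layer_amplification(freq_hz, vs_mps, thickness_m, damping=0.05):
    """Approximate amplification of a uniform damped soil layer over rigid
    rock (standard 1-D solution): |F| ~ 1/sqrt(cos^2(kH) + (damping*kH)^2),
    where kH = 2*pi*f*H/Vs. Peaks occur near the layer's natural
    frequencies, the first at f0 = Vs / (4*H)."""
    kh = 2.0 * np.pi * freq_hz * thickness_m / vs_mps
    return 1.0 / np.sqrt(np.cos(kh) ** 2 + (damping * kh) ** 2)


# Illustrative soft layer: Vs = 200 m/s, H = 20 m -> f0 = 2.5 Hz.
f0 = 200.0 / (4.0 * 20.0)
peak_amp = layer_amplification(f0, 200.0, 20.0, damping=0.05)
```

At the fundamental frequency the cosine term vanishes and the amplification is limited only by damping, approximately 2/(pi*damping), about 12.7 for 5% damping here.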

  16. Assessing the Seismic Potential Hazard of the Makran Subduction Zone

    NASA Astrophysics Data System (ADS)

    Frohling, E.; Szeliga, W. M.; Melbourne, T. I.; Abolghasem, A.; Lodi, S. H.

    2013-12-01

    Assessment of GPS data, paleoseismic history, and convergence rates indicate that the western half has been active in the past and may be accumulating strain at a rate comparable to the eastern half and should be capable of producing large magnitude earthquakes. As this is the first fault coupling model to be assessed in this region, our study provides a good initial model for future seismic and tsunami hazards modeling and planning for Pakistan, Iran, India, and the Arabian Peninsula.

  17. Innovation in earthquake and natural hazards research: Determining soil liquefaction potential

    NASA Astrophysics Data System (ADS)

    Moore, G. B.; Yin, R. K.

    1984-11-01

    This case study analyzes how an innovation in earthquake and natural hazards research was used for practical and policy purposes, why utilization occurred, and what potential policy implications can be drawn. The innovation was the dynamic analysis method, used to identify those soils that are likely to liquefy during earthquakes. The research was designed and undertaken by H. Bolton Seed at the University of California at Berkeley during the 1960s. The research was a major breakthrough in engineering research: liquefaction had never before been reproduced in a laboratory. The work yielded quantitative information about the conditions under which liquefaction occurs. These data were then used to develop procedures for predicting liquefaction; eventually the need to test soil samples in the laboratory was eliminated.

  18. Analysing uncertainties associated with flood hazard assessment

    NASA Astrophysics Data System (ADS)

    Neuhold, Clemens; Stanzel, Philipp; Nachtnebel, Hans-Peter

    2010-05-01

    Risk zonation maps are mostly derived from design floods which propagate through the study area. The respective delineation of inundated flood plains is a fundamental input for the flood risk assessment of exposed objects. It is implicitly assumed that the river morphology will not vary, even though it is obvious that the river bed elevation can change quickly and drastically during flood events. The objectives of this study were (1) to integrate river bed dynamics into flood risk assessment and (2) to quantify uncertainties associated with flood hazard modelling by means of (i) hydrology (input hydrographs); (ii) sediment transport (torrential input, river bed elevation); and (iii) hydrodynamics (water surface levels). The proposed concept was applied to the River Ill in the Western Austrian Alps. In total, 138 flood and associated sediment transport scenarios were considered, simulated and illustrated for the main river stem. The calculated morphological changes of the river bed during peak flow provided a basis for estimating the variability of possible water surface levels and inundated areas, necessary for flood hazard assessment. The applied multi-scenario approach was compared to the normatively defined design flood event to account for the uncertainty of flood risk management decisions based on only a few scenarios. Due to the incorporation of river morphological changes and variations in rainfall characteristics into flood hazard assessment, inundation was calculated for 12 % of the considered cross sections where safety had been expected.
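The multi-scenario approach amounts to computing envelope statistics of the water surface levels across the scenario ensemble, instead of relying on a single design event. A sketch with synthetic data (the study's actual levels are not reproduced here):

```python
import numpy as np

# Synthetic water surface levels (m a.s.l.): rows = scenarios (138 in the
# study), columns = river cross-sections. Real values would come from the
# coupled hydrodynamic/sediment-transport simulations.
rng = np.random.default_rng(1)
levels = 420.0 + rng.normal(0.0, 0.4, size=(138, 10))

wsl_median = np.median(levels, axis=0)              # central estimate per section
wsl_p95 = np.percentile(levels, 95, axis=0)         # near the upper envelope
spread = wsl_p95 - np.percentile(levels, 5, axis=0)  # uncertainty band width
```

Cross-sections where the upper envelope exceeds the levee or bank elevation, but the single design-event level does not, are exactly those where "safety was expected" yet inundation was calculated in some scenarios.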

  19. Seismic hazard and risks estimates for Himalayas and surrounding regions based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Nekrasova, Anastasia; Kossobokov, Vladimir; Parvez, Imtiyaz

    2013-04-01

    under specific environments and conditions. Both conceptual issues must be resolved in multidisciplinary, problem-oriented research performed by specialists in the fields of hazard, objects of risk, and object vulnerability. Here, to illustrate this general concept, we perform the following oversimplified four convolutions of the seismic hazard assessment map H(g) in a cell g with the population density distribution P: (i) H(g)•gP, where gP is the integral of the population density over the cell g; (ii) H(g)•gP•P; (iii) H(g)•gP•P•P; and (iv) H(g)•gP•P•P•P. We have to emphasize that estimates addressing more realistic and practical kinds of seismic risk, not presented here, should involve experts in the distribution of objects of risk of different vulnerability, i.e., specialists in earthquake engineering, the social sciences and economics.
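The four oversimplified convolutions can be written directly on gridded data; the toy hazard and population grids below are illustrative, not from the study:

```python
import numpy as np

# Toy grids: hazard H(g) per cell (e.g., expected shaking measure) and
# population density P(g) in persons per km^2.
H = np.array([[0.1, 0.3],
              [0.2, 0.4]])
P = np.array([[100.0, 50.0],
              [10.0, 500.0]])
cell_km2 = 25.0                 # area of each cell g

gP = P * cell_km2               # persons in cell g (integral of density over g)
risk_i = H * gP                 # (i)   H(g)*gP
risk_ii = risk_i * P            # (ii)  H(g)*gP*P
risk_iii = risk_ii * P          # (iii) H(g)*gP*P*P
risk_iv = risk_iii * P          # (iv)  H(g)*gP*P*P*P
```

Each extra factor of P weights densely populated cells more heavily, so the four maps progressively emphasize urban concentrations over sparsely populated high-hazard cells.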

  20. Occurrence probability assessment of earthquake-triggered landslides with Newmark displacement values and logistic regression: The Wenchuan earthquake, China

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Song, Chongzhen; Lin, Qigen; Li, Juan

    2016-04-01

    The Newmark displacement model has been used to predict earthquake-triggered landslides. Logistic regression (LR) is also a common landslide hazard assessment method. We combined the Newmark displacement model and LR and applied them to Wenchuan County and Beichuan County in China, which were affected by the Ms 8.0 Wenchuan earthquake of May 12th, 2008, to develop a mechanism-based landslide occurrence probability model and improve predictive accuracy. A total of 1904 landslide sites in Wenchuan County and 3800 random non-landslide sites were selected as the training dataset. We applied the Newmark model and obtained the distribution of permanent displacement (Dn) on a 30 × 30 m grid. Four factors (Dn, topographic relief, and distances to drainages and roads) were used as independent variables for LR. A combined model was then obtained, with an AUC (area under the curve) value of 0.797 for Wenchuan County. A total of 617 landslide sites and non-landslide sites in Beichuan County were used as a validation dataset, with AUC = 0.753. The proposed method may also be applied to earthquake-induced landslides in other regions.
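The combined model is a logistic regression over the four predictors. A sketch of the probability evaluation with a hypothetical coefficient vector (the abstract does not report the fitted coefficients):

```python
import numpy as np

def landslide_probability(dn_cm, relief_m, d_drain_m, d_road_m, beta):
    """Logistic-regression landslide occurrence probability from the four
    predictors used in the study: Newmark displacement Dn, topographic
    relief, and distances to drainage and road. The coefficient vector
    beta = [b0, b1, b2, b3, b4] is hypothetical, standing in for the
    fitted values, which the abstract does not give."""
    x = np.array([1.0, dn_cm, relief_m, d_drain_m, d_road_m])
    z = float(np.dot(np.asarray(beta, dtype=float), x))
    return 1.0 / (1.0 + np.exp(-z))
```

Evaluated on every 30 × 30 m grid cell, this yields the occurrence probability map; a positive Dn coefficient makes cells with larger predicted Newmark displacement more probable landslide sites, which is the mechanism-based ingredient of the combined model.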

  1. Earthquake!

    ERIC Educational Resources Information Center

    Markle, Sandra

    1987-01-01

    A learning unit about earthquakes includes activities for primary grade students, including making inferences and defining operationally. Task cards are included for independent study on earthquake maps and earthquake measuring. (CB)

  2. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a ...

  3. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  4. Earthquake hazards of active blind-thrust faults under the central Los Angeles basin, California

    NASA Astrophysics Data System (ADS)

    Shaw, John H.; Suppe, John

    1996-04-01

    We document several blind-thrust faults under the Los Angeles basin that, if active and seismogenic, are capable of generating large earthquakes (M = 6.3 to 7.3). Pliocene to Quaternary growth folds imaged in seismic reflection profiles record the existence, size, and slip rates of these blind faults. The growth structures have shapes characteristic of fault-bend folds above blind thrusts, as demonstrated by balanced kinematic models, geologic cross sections, and axial-surface maps. We interpret the Compton-Los Alamitos trend as a growth fold above the Compton ramp, which extends along strike from west Los Angeles to at least the Santa Ana River. The Compton thrust is part of a larger fault system, including a decollement and ramps beneath the Elysian Park and Palos Verdes trends. The Cienegas and Coyote Hills growth folds overlie additional blind thrusts in the Elysian Park trend that are not closely linked to the Compton ramp. Analysis of folded Pliocene to Quaternary strata yields slip rates of 1.4 ± 0.4 mm/yr on the Compton thrust and 1.7 ± 0.4 mm/yr on a ramp beneath the Elysian Park trend. Assuming that slip is released in large earthquakes, we estimate magnitudes of 6.3 to 6.8 for earthquakes on individual ramp segments based on geometric segment sizes derived from axial surface maps. Multiple-segment ruptures could yield larger earthquakes (M = 6.9 to 7.3). Relations among magnitude, coseismic displacement, and slip rate yield an average recurrence interval of 380 years for single-segment earthquakes and a range of 400 to 1300 years for multiple-segment events. If these newly documented blind thrust faults are active, they will contribute substantially to the seismic hazards in Los Angeles because of their locations directly beneath the metropolitan area.
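The recurrence estimate follows from dividing the coseismic slip per event by the long-term slip rate. A minimal sketch; the 0.53 m per-event slip below is an illustrative value chosen to reproduce the quoted 380-year interval at the 1.4 mm/yr Compton rate, not a figure from the paper:

```python
def recurrence_interval_yr(slip_per_event_m, slip_rate_mm_yr):
    """Mean earthquake recurrence interval assuming all fault slip is
    released seismically in characteristic events: T = D / slip_rate."""
    return slip_per_event_m / (slip_rate_mm_yr * 1.0e-3)
```

With this relation, the 400-1300-year range quoted for multiple-segment events corresponds to larger per-event displacements on the same slip rates.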

  5. Preliminary Hazards Assessment: Iron disulfide purification system

    SciTech Connect

    1991-07-30

    A process for the purification (washing) of iron disulfide (FeS{sub 2}) powder is conducted in the Northeast corner (Area 353) of the main plant building (Building 100). This location is about 130 feet from the fenced boundary of the Partnership School/Child Development Center. In the first steps of the process, raw iron disulfide powder is ground and separated by particle size. The ground and sized powder is then purified in a three-step acid washing process using both hydrochloric acid (HCl) and hydrofluoric acid (HF). The iron disulfide process is an intermittent batch process conducted four to eight times a year. This study is a Preliminary Hazards Assessment (PHA) to assess the hazards associated with the iron disulfide process. This is a preliminary study and will be used to determine if additional safety analysis is necessary. The scope of the PHA includes assessment of the process steps of grinding, size classification, and purification. The purpose is to identify major hazards and determine if the current and newly added safeguards are adequate for operation. The PHA also lists recommendations for additional safety features that should be added to reduce the risks of operation.

  6. The Contribution of Paleoseismology to Seismic Hazard Assessment in Site Evaluation for Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Guerrieri, Luca; Fukushima, Yoshimitsu

    2015-04-01

    In the framework of site evaluation/re-evaluation procedures for nuclear power plants (NPPs), paleoseismology plays an essential role not only in Fault Displacement Hazard Assessment (FDHA) but also in Seismic Hazard Assessment (SHA). The relevance of paleoseismology is recognized in the reference IAEA Safety Guide (IAEA SSG-9) and was dramatically confirmed after the accident at the Fukushima Daiichi NPP caused by the disastrous great Tohoku earthquake and tsunami that occurred on 11 March 2011. After this event, the IAEA International Seismic Safety Center promoted a technical document aimed at encouraging and supporting Member States, especially newcomer countries, to include paleoseismic investigations in their geologic databases, highlighting the value of earthquake geology studies and paleoseismology for nuclear safety and providing standard methodologies for such investigations. In detail, paleoseismic investigations in the context of site evaluation of nuclear installations have the following main objectives: i) identification of seismogenic structures based on the recognition of effects of past earthquakes in the regional area; ii) improvement of the completeness of earthquake catalogs through the identification and dating of ancient moderate to large earthquakes whose trace has been preserved in the geologic record; iii) estimation of the maximum seismic potential associated with an identified seismogenic structure/source, typically on the basis of the amount of displacement per event (measurable in paleoseismic trenches), as well as of the geomorphic and stratigraphic features interpretable as the cumulative effect of repeated large seismic events (the concept of "seismic landscape"); iv) rough calibration of probabilistic seismic hazard assessment (PSHA), using the recurrence interval of large earthquakes detectable by paleoseismic investigations and providing a "reality check" based on direct observations of

  7. Assessment of hazards to workers applying pesticides.

    PubMed

    Carmichael, N G

    1989-01-01

    Exposure to pesticides as a result of their use in agriculture will vary according to the type of formulation, the method of application and the protective measures used. Quantitation of external exposure does not on its own predict the amount absorbed nor does it allow the toxic hazard to be assessed; information on skin penetration is also required. With the use of a suitable generic database for exposure, the assessment of many compounds would only require the measurement of skin penetration. With the knowledge of human dermal pharmacokinetics a field study can be performed which measures the absorbed dose directly and avoids the need for exposure measurement. PMID:2599152

  8. Developing a global tsunami propagation database and its application for coastal hazard assessments in China

    NASA Astrophysics Data System (ADS)

    Wang, N.; Tang, L.; Titov, V.; Newman, J. C.; Dong, S.; Wei, Y.

    2013-12-01

    The tragedies of the 2004 Indian Ocean and 2011 Japan tsunamis have increased awareness of tsunami hazards for many nations, including China. The low land level and high population density of China's coastal areas place it at high risk for tsunami hazards. Recent research (Komatsubara and Fujiwara, 2007) highlighted concerns of a magnitude 9.0 earthquake on the Nankai trench, which may affect China's coasts not only in the South China Sea, but also in the East Sea and Yellow Sea. Here we present our work in progress towards developing a global tsunami propagation database that can be used for hazard assessments by many countries. The propagation scenarios are computed using NOAA's MOST numerical model. Each scenario represents a typical Mw 7.5 earthquake with predefined earthquake parameters (Gica et al., 2008). The model grid was interpolated from ETOPO1 at 4 arc-min resolution, covering -80° to 72°N and 0° to 360°E. We use this database for preliminary tsunami hazard assessment along China's coastlines.

  9. Technical problems in the construction of a map to zone the earthquake ground-shaking hazard in the United States

    USGS Publications Warehouse

    Hays, W.W.

    1984-01-01

    Zoning of the earthquake ground-shaking hazard - the division of a region into geographic areas having a similar relative severity or response to ground shaking - has been a goal in the United States for about fifty years. During this period, two types of ground-shaking hazard maps have been constructed. The first type assumes that, except for scaling differences, approximately the same effects that occurred in historic earthquakes will occur in future earthquakes. The second type integrates historic seismicity data and geologic information and uses probabilistic concepts to estimate the characteristics of future ground shaking within specific exposure times. Construction of zoning maps on both a national and regional scale requires innovative research and good data to resolve technical issues about seismicity, the earthquake source, seismic wave attenuation, and local ground response. Because of unresolved issues, implementation in building codes has proceeded fairly slowly. © 1984.

  10. Assessing hazards along our Nation's coasts

    USGS Publications Warehouse

    Hapke, Cheryl J.; Brenner, Owen; Hehre, Rachel; Reynolds, B.J.

    2013-01-01

    Coastal areas are essential to the economic, cultural, and environmental health of the Nation, yet by nature coastal areas are constantly changing due to a variety of events and processes. Extreme storms can cause dramatic changes to our shorelines in a matter of hours, while sea-level rise can profoundly alter coastal environments over decades. These changes can have a devastating impact on coastal communities, such as the loss of homes built on retreating sea cliffs or protective dunes eroded by storm waves. Sometimes, however, the changes can be positive, such as new habitat created by storm deposits. The U.S. Geological Survey (USGS) is meeting the need for scientific understanding of how our coasts respond to different hazards with continued assessments of current and future changes along U.S. coastlines. Through the National Assessment of Coastal Change Hazards (NACCH), the USGS carries out the unique task of quantifying coastal change hazards along open-ocean coasts in the United States and its territories. Residents of coastal communities, emergency managers, and other stakeholders can use science-based data, tools, models, and other products to improve planning and enhance resilience.

  11. Waste Encapsulation and Storage Facility (WESF) Hazards Assessment

    SciTech Connect

    COVEY, L.I.

    2000-11-28

    This report documents the hazards assessment for the Waste Encapsulation and Storage Facility (WESF) located on the U.S. Department of Energy (DOE) Hanford Site. This hazards assessment was conducted to provide the emergency planning technical basis for WESF. DOE Orders require an emergency planning hazards assessment for each facility that has the potential to reach or exceed the lowest level emergency classification.

  12. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Hazard identification and assessment. 851.21 Section 851.21 Energy DEPARTMENT OF ENERGY WORKER SAFETY AND HEALTH PROGRAM Specific Program Requirements § 851.21 Hazard identification and assessment. (a) Contractors must establish procedures to identify existing and potential workplace hazards and assess...

  13. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  14. Remote sensing and landslide hazard assessment

    NASA Technical Reports Server (NTRS)

    Mckean, J.; Buechel, S.; Gaydos, L.

    1991-01-01

    Remotely acquired multispectral data are used to improve landslide hazard assessments at all scales of investigation. A vegetation map produced from automated interpretation of TM data is used in a GIS context to explore the effect of vegetation type on debris flow occurrence in preparation for inclusion in debris flow hazard modeling. Spectral vegetation indices map spatial patterns of grass senescence which are found to be correlated with soil thickness variations on hillslopes. Grassland senescence is delayed over deeper, wetter soils that are likely debris flow source areas. Prediction of actual soil depths using vegetation indices may be possible up to some limiting depth greater than the grass rooting zone. On forested earthflows, the slow slide movement disrupts the overhead timber canopy, exposes understory vegetation and soils, and alters site spectral characteristics. Both spectral and textural measures from broad band multispectral data are successful at detecting an earthflow within an undisturbed old-growth forest.

  15. Debris flows: behavior and hazard assessment

    USGS Publications Warehouse

    Iverson, Richard M.

    2014-01-01

    Debris flows are water-laden masses of soil and fragmented rock that rush down mountainsides, funnel into stream channels, entrain objects in their paths, and form lobate deposits when they spill onto valley floors. Because they have volumetric sediment concentrations that exceed 40 percent, maximum speeds that surpass 10 m/s, and sizes that can range up to ~10⁹ m³, debris flows can denude slopes, bury floodplains, and devastate people and property. Computational models can accurately represent the physics of debris-flow initiation, motion and deposition by simulating evolution of flow mass and momentum while accounting for interactions of debris' solid and fluid constituents. The use of physically based models for hazard forecasting can be limited by imprecise knowledge of initial and boundary conditions and material properties, however. Therefore, empirical methods continue to play an important role in debris-flow hazard assessment.

  16. How detailed should earthquake hazard maps be: comparing the performance of Japan's maps to uniform, randomized, and smoothed maps

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Spencer, Bruce; Liu, Mian

    2016-04-01

    Earthquake hazard maps forecast future shaking via assumptions about where, when, and how large future earthquakes will be. These assumptions involve the known earthquake history, models of fault geometry and motion, and geodetic data. Maps are made more detailed as additional data and more complicated models become available. However, the extent to which this process produces better forecasts of shaking is unknown. We explore this issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted shaking should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. Similarly, by the squared misfit metric, map performance improves up to a ~75-150 km smoothing window, and then decreases with further smoothing. Because the maps were made using independent data and models to predict future earthquake shaking, rather than by fitting the past shaking data, these results are unlikely to be an artifact of hindcasting rather than forecasting. They suggest that hazard models and the resulting maps can be over-parameterized, in that including too high a level of detail to describe past earthquakes may lower the maps' ability to forecast what will occur in the future. For example, in Nepal, where GPS data show no significant variation in coupling between areas that have had recent large earthquakes and those that have not, past earthquakes likely do not show which parts are more at risk, and the entire area can be regarded as equally hazardous.
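    The two evaluation metrics described above can be written compactly. A hedged sketch (the function names and sample values are mine, not from the study):

```python
def exceedance_fraction(observed_max, predicted):
    """Fraction of sites where observed maximum shaking exceeded the map's value."""
    hits = sum(1 for o, p in zip(observed_max, predicted) if o > p)
    return hits / len(observed_max)

def squared_misfit(observed_max, predicted):
    """Mean squared misfit between observed maximum shaking and map prediction."""
    return sum((o - p) ** 2 for o, p in zip(observed_max, predicted)) / len(observed_max)

# Toy data: intensity-like values at three sites
obs = [1.0, 2.0, 3.0]
pred = [2.0, 2.0, 2.0]
frac = exceedance_fraction(obs, pred)   # fraction of sites exceeding the map
mse = squared_misfit(obs, pred)         # average squared prediction error
```

The first metric is the one implicit in probabilistic maps (shaking should exceed the mapped level at only a target fraction of sites over the interval); the second simply rewards maps whose predicted levels track the observed maxima.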

  17. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  18. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  19. Kinematics, mechanics, and potential earthquake hazards for faults in Pottawatomie County, Kansas, USA

    USGS Publications Warehouse

    Ohlmacher, G.C.; Berendsen, P.

    2005-01-01

    Many stable continental regions have subregions with poorly defined earthquake hazards. Analysis of minor structures (folds and faults) in these subregions can improve our understanding of the tectonics and earthquake hazards. Detailed structural mapping in Pottawatomie County has revealed a suite consisting of two uplifted blocks aligned along a northeast trend and surrounded by faults. The first uplift is located southwest of the second. The northwest and southeast sides of these uplifts are bounded by northeast-trending right-lateral faults. To the east, both uplifts are bounded by north-trending reverse faults, and the first uplift is bounded by a north-trending high-angle fault to the west. The structural suite occurs above a basement fault that is part of a series of north-northeast-trending faults that delineate the Humboldt Fault Zone of eastern Kansas, an integral part of the Midcontinent Rift System. The favored kinematic model is a contractional stepover (push-up) between echelon strike-slip faults. Mechanical modeling using the boundary element method supports the interpretation of the uplifts as contractional stepovers and indicates that an approximately east-northeast maximum compressive stress trajectory is responsible for the formation of the structural suite. This stress trajectory suggests potential activity during the Laramide Orogeny, which agrees with the age of kimberlite emplacement in adjacent Riley County. The current stress field in Kansas has a N85°W maximum compressive stress trajectory that could potentially produce earthquakes along the basement faults. Several epicenters of seismic events (

  20. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... existing and potential workplace hazards and assess the risk of associated workers injury and illness... radiological hazards. (b) Contractors must submit to the Head of DOE Field Element a list of closure facility hazards and the established controls within 90 days after identifying such hazards. The Head of DOE...

  1. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... existing and potential workplace hazards and assess the risk of associated workers injury and illness... radiological hazards. (b) Contractors must submit to the Head of DOE Field Element a list of closure facility hazards and the established controls within 90 days after identifying such hazards. The Head of DOE...

  2. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... existing and potential workplace hazards and assess the risk of associated workers injury and illness... radiological hazards. (b) Contractors must submit to the Head of DOE Field Element a list of closure facility hazards and the established controls within 90 days after identifying such hazards. The Head of DOE...

  3. Influence of the Great Megathrust Earthquakes of the Past Decade on Risk Assessment and Outreach Programs

    NASA Astrophysics Data System (ADS)

    Dengler, L. A.

    2014-12-01

    Four subduction zone earthquakes of magnitude ≥ 8.6 occurred between 2004 and 2013. No earthquakes of this size were reported anywhere in the world in the preceding 36 years. The wealth of seismic, geodetic, geologic and tsunami data from these great megathrust events has advanced the understanding of subduction zones and challenged a number of previously accepted ideas. This talk focuses on how they have also influenced risk assessment and preparedness programs. Megathrust earthquakes differ from other large damaging earthquakes. The size of the megathrust source means a much larger area may be impacted by earthquake shaking, affecting not only the amount of damage but also posing response and recovery challenges. A second factor is tsunami generation. About a third of the 760,000 casualties in the decade were caused by the four mega-earthquakes. All four produced deadly tsunamis, and over 95% of the death toll was attributed to tsunami. Even when the extraordinarily deadly 2004 Andaman-Sumatra tsunami is removed from the data set, 85% of the casualties in the remaining three earthquakes were caused by tsunami. In contrast, non-megathrust events caused over two-thirds of the decade's casualties, but less than 1% of those were caused by tsunami. The Cascadia subduction zone along the coast of northern California, Oregon, Washington and southern British Columbia is the only location in the contiguous 48 states where a great megathrust earthquake will someday occur. Assessing the risk posed by Cascadia and developing effective preparedness programs pose a number of challenges. Awareness of Cascadia is relatively recent, and assessing the magnitude, recurrence and nature of past events depends primarily on paleoseismology. The megathrust events of the past decade provide a proxy for and a general picture of the likely impacts of a future Cascadia earthquake and have influenced preparedness efforts throughout the Cascadia region. The recent events have also posed problems for

  4. Final Report: Seismic Hazard Assessment at the PGDP

    SciTech Connect

    Wang, Zhinmeng

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it not only depends on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.

  5. Seismic hazard along a crude oil pipeline in the event of an 1811-1812 type New Madrid earthquake. Technical report

    SciTech Connect

    Hwang, H.H.M.; Chen, C.H.S.

    1990-04-16

    This report assesses the seismic hazard that exists along the major crude oil pipeline running through the New Madrid seismic zone from southeastern Louisiana to Patoka, Illinois. An 1811-1812 type New Madrid earthquake with moment magnitude 8.2 is assumed to occur at three locations where large historical earthquakes have occurred. Six pipeline crossings of the major rivers in West Tennessee are chosen as the sites for hazard evaluation because of the liquefaction potential at these sites. A seismologically based model is used to predict the bedrock accelerations. Uncertainties in three model parameters, i.e., stress parameter, cutoff frequency, and strong-motion duration, are included in the analysis. Each parameter is represented by three typical values. From the combination of these typical values, a total of 27 earthquake time histories can be generated for each selected site due to an 1811-1812 type New Madrid earthquake occurring at a postulated seismic source.
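    The 27 time histories per site follow from the full factorial combination of three typical values for each of the three model parameters (3 × 3 × 3 = 27). A sketch with assumed illustrative values; the report does not list the actual parameter values used:

```python
from itertools import product

# Illustrative (assumed) typical values for the three stochastic-model parameters
stress_parameters = [100, 150, 200]   # stress parameter, bars
cutoff_frequencies = [30, 40, 50]     # cutoff frequency, Hz
durations = [20, 30, 40]              # strong-motion duration, s

# Full factorial combination: one synthetic time history per parameter triple
scenarios = list(product(stress_parameters, cutoff_frequencies, durations))
```

Each triple in `scenarios` would seed one synthetic bedrock-acceleration simulation at a given site.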

  6. Determination of Bedrock Variations and S-wave Velocity Structure in the NW part of Turkey for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Ozel, A. O.; Arslan, M. S.; Aksahin, B. B.; Genc, T.; Isseven, T.; Tuncer, M. K.

    2015-12-01

    The Tekirdag region (NW Turkey) is quite close to the North Anatolian Fault, which is capable of producing a large earthquake. Earthquake hazard mitigation studies are therefore important for urban areas close to major faults. From this point of view, the integration of different geophysical methods plays an important role in the study of seismic hazard problems, including seismotectonic zoning. Geological mapping and determination of the subsurface structure, which are key to managing newly developed areas, converting current urban areas, and assessing urban geological hazards, can likewise be performed with integrated geophysical methods. This study has been performed in the frame of a national project complementary to the cooperative project between Turkey and Japan (JICA & JST) named "Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education". With this principal aim, the study focuses on Tekirdag and its surrounding region (NW Turkey), where some uncertainties in subsurface knowledge (maps of bedrock depth, thickness of Quaternary sediments, basin geometry, and seismic velocity structure) need to be resolved. Several geophysical methods (microgravity, magnetic, and single-station and array microtremor measurements) are applied, and the results are evaluated to characterize lithological changes in the region. Array microtremor measurements with several radii are taken at 30 locations, 1D S-wave velocity structures are determined by inversion of the phase velocities of surface waves, and the resulting 1D structures are verified by theoretical Rayleigh wave modelling. Following the array measurements, single-station microtremor measurements are made at 75 locations to determine the predominant frequency distribution. The predominant frequencies in the study area range from 0.5 Hz to 8 Hz. In addition, microgravity and magnetic measurements are performed on

  7. Assessment of Liquefaction Susceptibility of Kutahya Soils Based on Recent Earthquakes in Turkey

    NASA Astrophysics Data System (ADS)

    Zengin, Enes; Abiddin Erguler, Zeynal

    2014-05-01

    The plate tectonic setting of Turkey has resulted in many destructive earthquakes with magnitudes higher than 7 in several cities situated close to fault systems. The city of Kutahya and its surrounding counties are notable examples, located in an earthquake-prone region; several earthquakes have recently been recorded there, particularly in the Simav district. A significant part of the residential area of Kutahya lies on alluvial deposits dominated by silt- and fine-sand-size materials, and its southern boundary is controlled by the Kutahya fault zone (KFZ), which extends parallel to the city settlement. In this study, considering the possibility of a potentially destructive earthquake in the future, as well as the increasing demand for new buildings driven by population growth, we investigate the liquefaction potential of these soils for use in earthquake risk mitigation strategies. For this purpose, physical properties, groundwater conditions, and standard penetration test (SPT) results from 283 boreholes spread over a wide area were examined to understand the behaviour of these soils under earthquake-induced dynamic loading. The total assessed drilling depth is about 2140 m. The required corrections were applied to all SPT data to obtain SPT-(N1)60 values for the liquefaction analyses. Estimating a representative magnitude, epicentral depth, and maximum ground acceleration (amax) based on previous earthquakes and the faulting characteristics of the KFZ was an initial target for accurate assessment of liquefaction phenomena in this city. To determine amax in this region, in addition to an attenuation relationship based on Turkish strong ground motion data, individual measurements from earthquake stations close to the study site were also collected. As a result of all analyses and a review of previous earthquake records in this region, earthquake magnitudes varying between 5.0 and 7.4 and amax values varying between 400 and 800 gal were used in liquefaction
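    The SPT-(N1)60 correction mentioned above normalizes field blow counts for overburden stress and hammer energy. A hedged sketch of one common formulation (the Liao-Whitman overburden correction with a 60% energy reference; the abstract does not specify which correction scheme was actually used):

```python
def n1_60(n_field, sigma_v_eff_kpa, energy_ratio=0.60, pa_kpa=100.0):
    """Overburden- and energy-corrected SPT blow count (N1)60.

    Applies the Liao-Whitman overburden correction CN = sqrt(Pa / sigma'_v),
    capped at 1.7, and normalizes hammer energy to the 60% standard.
    """
    cn = min((pa_kpa / sigma_v_eff_kpa) ** 0.5, 1.7)
    ce = energy_ratio / 0.60
    return n_field * cn * ce

# A field blow count of 10 at 50 kPa effective overburden stress:
n_corrected = n1_60(10, 50.0)
```

Additional rod-length, borehole-diameter, and sampler corrections are usually applied in practice; they are omitted here for brevity.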

  8. Geodetic constraints on frictional properties and earthquake hazard in the Imperial Valley, Southern California

    NASA Astrophysics Data System (ADS)

    Lindsey, Eric O.; Fialko, Yuri

    2016-02-01

    We analyze a suite of geodetic observations across the Imperial Fault in southern California that span all parts of the earthquake cycle. Coseismic and postseismic surface slips due to the 1979 M 6.6 Imperial Valley earthquake were recorded with trilateration and alignment surveys by Harsh (1982) and Crook et al. (1982), and interseismic deformation is measured using a combination of multiple interferometric synthetic aperture radar (InSAR)-viewing geometries and continuous and survey-mode GPS. In particular, we combine more than 100 survey-mode GPS velocities with InSAR data from Envisat descending tracks 84 and 356 and ascending tracks 77 and 306 (149 total acquisitions), processed using a persistent scatterers method. The result is a dense map of interseismic velocities across the Imperial Fault and surrounding areas that allows us to evaluate the rate of interseismic loading and along-strike variations in surface creep. We compare available geodetic data to models of the earthquake cycle with rate- and state-dependent friction and find that a complete record of the earthquake cycle is required to constrain key fault properties including the rate-dependence parameter (a - b) as a function of depth, the extent of shallow creep, and the recurrence interval of large events. We find that the data are inconsistent with a high (>30 mm/yr) slip rate on the Imperial Fault and investigate the possibility that an extension of the San Jacinto-Superstition Hills Fault system through the town of El Centro may accommodate a significant portion of the slip previously attributed to the Imperial Fault. Models including this additional fault are in better agreement with the available observations, suggesting that the long-term slip rate of the Imperial Fault is lower than previously suggested and that there may be a significant unmapped hazard in the western Imperial Valley.
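    The rate of interseismic loading discussed above is often summarized, to first order, with the Savage and Burford (1973) elastic screw-dislocation model, in which fault-parallel surface velocity varies as an arctangent of distance from the fault. This is a simpler sketch than the rate-and-state models used in the study; parameter values are illustrative only:

```python
import math

def interseismic_velocity(x_km, slip_rate_mm_yr, locking_depth_km):
    """Fault-parallel surface velocity (mm/yr) at distance x from the fault trace,
    per the Savage-Burford (1973) screw-dislocation model: v = (s/pi) * atan(x/D)."""
    return (slip_rate_mm_yr / math.pi) * math.atan(x_km / locking_depth_km)

# Illustrative: 20 mm/yr deep slip rate, 10 km locking depth
v_near = interseismic_velocity(10.0, 20.0, 10.0)    # at x = D, v = s/4
v_far = interseismic_velocity(1.0e6, 20.0, 10.0)    # approaches s/2 far from fault
```

Fitting this profile to dense InSAR and GPS velocities constrains the slip rate and locking depth; distributed slip between parallel faults, as proposed for the Imperial Valley, would appear as a superposition of two such profiles.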

  9. Earthquake hazard analysis for the different regions in and around Aǧrı

    NASA Astrophysics Data System (ADS)

    Bayrak, Erdem; Yilmaz, Şeyda; Bayrak, Yusuf

    2016-04-01

    We investigated earthquake hazard parameters for the eastern part of Turkey by determining the a and b parameters of the Gutenberg-Richter magnitude-frequency relationship. For this purpose, the study area was divided into seven source zones based on their tectonic and seismotectonic regimes. The database used in this work was compiled from sources and catalogues such as TURKNET, the International Seismological Centre (ISC), the Incorporated Research Institutions for Seismology (IRIS) and The Scientific and Technological Research Council of Turkey (TUBITAK) for the instrumental period. We calculated the a value and the b value, the slope of the Gutenberg-Richter frequency-magnitude relationship, using the maximum likelihood (ML) method. We also estimated the mean return periods, the most probable maximum magnitude in a time period of t years, and the probability of occurrence of an earthquake of magnitude ≥ M during a time span of t years, using the Zmap software. The lowest b value was calculated in Region 1, covering the Cobandede Fault Zone. We obtained the highest a value in Region 2, covering the Kagizman Fault Zone. This conclusion is strongly supported by the probability value, which is largest (87%) for an earthquake with magnitude greater than or equal to 6.0; the mean return period for such a magnitude is lowest in this region (49 years). The most probable magnitude in the next 100 years was calculated, and the highest value was found around the Cobandede Fault Zone. According to these parameters, Region 1, covering the Cobandede Fault Zone, is the most hazardous area in the eastern part of Turkey.
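    The maximum-likelihood b value referred to above is commonly computed with Aki's (1965) estimator, and the derived rates feed directly into Poisson occurrence probabilities and return periods. A sketch under those standard assumptions (the sample magnitudes are illustrative; this is not the study's catalog):

```python
import math

def b_value_ml(mags, m_min):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter b value."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

def annual_rate(a, b, m):
    """Annual rate of events with magnitude >= m, from log10 N(m) = a - b*m."""
    return 10.0 ** (a - b * m)

def occurrence_probability(a, b, m, t_years):
    """Poisson probability of at least one M >= m event within t years."""
    return 1.0 - math.exp(-annual_rate(a, b, m) * t_years)

# Illustrative toy catalog above a completeness magnitude of 4.0
b = b_value_ml([4.0, 4.5, 5.0], 4.0)
p = occurrence_probability(a=4.0, b=1.0, m=6.0, t_years=100)
```

The mean return period is simply the reciprocal of `annual_rate`; the 49-year and 87% figures quoted in the abstract come from the same relations applied to the regional catalog.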

  10. Characterizing soils for hazardous waste site assessments.

    PubMed

    Breckenridge, R P; Keck, J F; Williams, J R

    1994-04-01

    This paper provides a review and justification of the minimum data needed to characterize soils for hazardous waste site assessments and to comply with the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA). Scientists and managers within the regulatory agency and the liable party need to know what are the important soil characteristics needed to make decisions about risk assessment, what areas need remediation and what remediation options are available. If all parties involved in characterizing a hazardous waste site can agree on the required soils data set prior to starting a site investigation, data can be collected in a more efficient and less costly manner. Having the proper data will aid in reaching decisions on how to address concerns at, and close-out, hazardous waste sites. This paper was prepared to address two specific concerns related to soil characterization for CERCLA remedial response. The first concern is the applicability of traditional soil classification methods to CERCLA soil characterization. The second is the identification of soil characterization data type required for CERCLA risk assessment and analysis of remedial alternatives. These concerns are related, in that the Data Quality Objective (DQO) process addresses both. The DQO process was developed in part to assist CERCLA decision-makers in identifying the data types, data quality, and data quantity required to support decisions that must be made during the remedial investigation/feasibility study (RI/FS) process. Data Quality Objectives for Remedial Response Activities: Development Process (US EPA, 1987a) is a guidebook on developing DQOs. This process as it relates to CERCLA soil characterization is discussed in the Data Quality Objective Section of this paper. PMID:24213742

  11. Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS

    NASA Astrophysics Data System (ADS)

    Sabuncu, A.; Garagon Dogru, A.; Ozener, H.

    2013-12-01

    Natural hazards are natural phenomena occurring in the Earth system that include geological and meteorological events such as earthquakes, floods, landslides, droughts, fires and tsunamis. Metropolitan cities are vulnerable to natural hazards due to their population densities, industrial facilities and properties. The urban layout of megacities is complex, since industrial facilities are intermingled with residential areas. The Marmara region, in northwestern Turkey, has suffered from natural hazards (earthquakes, floods, etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as they provide more efficient and reliable analysis and evaluation of data, and better solutions for decision making before, during and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers, and satellite data are used to detect changes before and after natural hazards. GIS is powerful software for combining different types of digital data. A natural hazard database for the Marmara region provides all of these different types of digital data to users. Proper data collection, processing and analysis are critical to evaluating and identifying hazards. The natural hazard database allows users to monitor, analyze and query past and recent disasters in the Marmara region. The long-term aim of this study is to develop a geodatabase and identify the natural hazard vulnerabilities of the region's metropolitan cities.

  12. Methods for probabilistic assessments of geologic hazards

    SciTech Connect

    Mann, C.J.

    1987-01-01

    Although risk analysis today is considered to include three separate aspects: (1) identifying sources of risk, (2) estimating probabilities quantitatively, and (3) evaluating consequences of risk, here, only estimation of probabilities for natural geologic events, processes, and phenomena is addressed. Ideally, evaluation of potential future hazards includes an objective determination of probabilities that has been derived from past occurrences of identical events or components contributing to complex processes or phenomena. In practice, however, data which would permit objective estimation of those probabilities of interest may not be adequate, or may not even exist. Another problem that arises normally, regardless of the extent of data, is that risk assessments involve estimating extreme values. Rarely are extreme values accurately predictable even when an empirical frequency distribution is established well by data. In the absence of objective methods for estimating probabilities of natural events or processes, subjective probabilities for the hazard must be established through Bayesian methods, expert opinion, or Delphi methods. Uncertainty of every probability determination must be stated for each component of an event, process, or phenomenon. These uncertainties also must be propagated through the quantitative analysis so that a realistic estimate of total uncertainty can be associated with each final probability estimate for a geologic hazard.

  13. Assessment of External Hazards at Radioactive Waste and Used Fuel Management Facilities - 13505

    SciTech Connect

    Gerchikov, Mark; Schneider, Glenn; Khan, Badi; Alderson, Elizabeth

    2013-07-01

    One of the key lessons from the Fukushima accident is the importance of a comprehensive identification and evaluation of the risks posed by external events to nuclear facilities. While the primary focus has been on nuclear power plants, the Canadian nuclear industry has also been updating hazard assessments for radioactive waste and used fuel management facilities to ensure that lessons learnt from Fukushima are addressed. External events are events that originate either physically outside the nuclear site or outside its control. They include natural events, such as high winds, lightning, earthquakes or flooding due to extreme rainfall. The approaches that have been applied to the identification and assessment of external hazards in Canada are presented and analyzed. Specific aspects and considerations concerning hazards posed to radioactive waste and used fuel management operations are identified. Relevant hazard identification techniques are described, which draw upon available regulatory guidance and standard assessment techniques such as Hazard and Operability Studies (HAZOPs) and 'What-if' analysis. Consideration is given to ensuring that hazard combinations (for example, high winds and flooding due to rainfall) are properly taken into account. Approaches that can be used to screen out external hazards, through a combination of frequency and impact assessments, are summarized. For those hazards that cannot be screened out, a brief overview of methods that can be used to conduct more detailed hazard assessments is also provided. The lessons learnt from the Fukushima accident have had a significant impact on specific aspects of the approaches used for hazard assessment of waste management facilities. Practical examples of these impacts are provided. (authors)

  14. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk combines three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage loss for a set of intensity measure levels. Seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground-motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which provides two seismic hazard models for Taiwan. The first model adopts the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considers 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, providing detailed information about the location, value and vulnerability classification of the exposed elements. The results of this study were computed with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
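The two design levels quoted above follow from the standard Poissonian relation between exceedance probability and return period. A minimal sketch of that conversion (the function name is ours, not from the study):

```python
import math

def return_period(p_exceed, t_years):
    """Return period T for exceedance probability p in t years,
    assuming Poissonian (memoryless) earthquake occurrence:
    p = 1 - exp(-t / T)  =>  T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

rp_475 = return_period(0.10, 50.0)   # 10% in 50 years -> ~475 years
rp_2475 = return_period(0.02, 50.0)  # 2% in 50 years -> ~2475 years
```

These are the same 475- and 2475-year levels used in most building-code hazard maps.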

  15. Zoning surface rupture hazard along normal faults: Insight from L'Aquila, 2009 (Mw 6.3, Central Italy) and other global earthquakes

    NASA Astrophysics Data System (ADS)

    Boncio, P.; Galli, P.; Naso, G.; Pizzi, A.

    2012-04-01

    Surface fault rupture hazard (SFRH) is a localized seismic hazard due to the breaching of the ground surface by slip along a fault during a large earthquake. This motion may offset, tilt, distort and damage buildings on or in the vicinity of the fault trace. Although SFRH should be one of the most easily detectable seismic hazards, given the visibility of active fault traces, the April 6, 2009 L'Aquila earthquake in central Italy (Mw 6.3) demonstrates that there is much progress to be made in assessing the hazard. Indeed, the 2009 normal-faulting surface ruptures occurred across populated areas, producing mild-to-moderate damage to infrastructure (e.g., pipelines, roads) and buildings, including structures less than a few years old. Like other countries with SFRH, Italy does not have explicit and comprehensive codes and/or regulations concerning this important issue. Following the observation of surface faulting during the 2009 earthquake, we propose general criteria for delineating zones of SFRH along active normal faults. Our proposal, explicitly inspired by the Californian Alquist-Priolo Earthquake Fault Zoning Act, compares the 2009 coseismic surface faults to surface rupture data collected globally for several normal-faulting earthquakes. We propose Earthquake Fault Zones (EFZ) and fault Setbacks (S) that are asymmetrically shaped around the fault trace. The zones are wider on the hanging wall, consistent with the observation of wider coseismic rupture zones in the hanging-wall block compared to the footwall block. For faults mapped in detail, we suggest a 150 m-wide EFZ on the hanging wall and a 30 m-wide EFZ on the footwall. The suggested widths of the S on the hanging wall and footwall are 40 m and 15 m, respectively. 
Considering the data collected for the L'Aquila fault system and abroad, we are confident that our proposal is conservative enough for Apennine-like normal faults and applicable to Italy and other areas with

  16. A seismic source zone model for the seismic hazard assessment of the Italian territory

    NASA Astrophysics Data System (ADS)

    Meletti, Carlo; Galadini, Fabrizio; Valensise, Gianluca; Stucchi, Massimiliano; Basili, Roberto; Barba, Salvatore; Vannucci, Gianfranco; Boschi, Enzo

    2008-04-01

    We designed a new seismic source model for Italy to be used as an input for country-wide probabilistic seismic hazard assessment (PSHA) in the frame of the compilation of a new national reference map. We started off by reviewing existing models available for Italy and for other European countries, then discussed the main open issues in the current practice of seismogenic zoning. The new model, termed ZS9, is largely based on data collected in the past 10 years, including historical earthquakes and instrumental seismicity, active faults and their seismogenic potential, and seismotectonic evidence from recent earthquakes. This information allowed us to propose new interpretations for poorly understood areas where the new data are in conflict with assumptions made in designing the previous and widely used model ZS4. ZS9 is made up of 36 zones where earthquakes with Mw ≥ 5 are expected. It also assumes that earthquakes with Mw up to 5 may occur anywhere outside the seismogenic zones, although the associated probability is rather low. Special care was taken to ensure that each zone sampled a large enough number of earthquakes so that we could compute reliable earthquake production rates. Although it was drawn following criteria that are standard practice in PSHA, ZS9 is also innovative in that every zone is additionally characterised by its mean seismogenic depth (the depth of the crustal volume that will presumably release future earthquakes) and predominant focal mechanism (the most likely rupture mechanism). These properties were determined using instrumental data, and only in a limited number of cases did we resort to geologic constraints and expert judgment to cope with a lack of data or conflicting indications. These attributes allow ZS9 to be used with more accurate regionalized depth-dependent attenuation relations, and are ultimately expected to increase significantly the reliability of seismic hazard estimates.

  17. The Effects of the Passage of Time from the 2011 Tohoku Earthquake on the Public's Anxiety about a Variety of Hazards.

    PubMed

    Nakayachi, Kazuya; Nagaya, Kazuhisa

    2016-01-01

    This research investigated whether the Japanese people's anxiety about a variety of hazards, including earthquakes and nuclear accidents, has changed over time since the Tohoku Earthquake in 2011. Data from three nationwide surveys conducted in 2008, 2012, and 2015 were compared to see the change in societal levels of anxiety toward 51 types of hazards. The same two-phase stratified random sampling method was used to create the list of participants in each survey. The results showed that anxiety about earthquakes and nuclear accidents had increased for a time after the Tohoku Earthquake, and then decreased after a four-year period with no severe earthquakes or nuclear accidents. It was also revealed that the anxiety level for some hazards other than earthquakes and nuclear accidents had decreased ten months after the Earthquake, and then remained unchanged over the following four years. Ironically, therefore, a major disaster might decrease public anxiety in general, at least for several years. PMID:27589780

  18. Geologic Hazards Associated with Longmen Shan Fault zone, During and After the Mw 8.0, May 12, 2008 Earthquake

    NASA Astrophysics Data System (ADS)

    Xu, X.; Kusky, T.; Li, Z.

    2008-12-01

    A magnitude 8.0 earthquake shook the northeastern margin of the Tibetan plateau on May 12, 2008, along the Longmen Shan orogenic belt that marks the boundary between the Songpan Ganzi terrane and the Yangtze block. The Tibetan plateau is expanding eastwards, and GPS observations show that surface motions are directed northeast relative to the Sichuan basin, where the earthquake occurred. This sense of motion of crustal blocks is why the main faults in Longmen Shan are oblique thrust-dextral strike-slip faults. There are three main parallel thrust/dextral-slip faults in Longmen Shan; all three strike northeast and dip to the northwest. The May 12 rupture extends 270 km along the fault zone, and the epicenter of the magnitude 8.0 earthquake was located in Wenchuan, 90 km WNW of Chengdu, Sichuan, China. The devastating earthquake killed at least 87,652 people and destroyed nearly all buildings in the epicentral area. The victims want to rebuild their homes immediately, but they need guidance on the geologic hazards to help them withstand possible future earthquakes. After the earthquake, we therefore went to the disaster areas from July 5th to 10th to collect first-hand field data, including observations of surface ruptures, landslides, features of X joints on damaged buildings, and parameters of the active faults and landslides. If we depended only on field data from accessible locations, we would know only the rupture information at those positions and could not learn about the whole area affected by the earthquake. The earthquake zone shows surface rupture features of both thrust and strike-slip fault activity, indicating oblique slip followed by thrusting during the May 12 earthquake. In this talk, I will show the general regional geological disaster information obtained by processing the pre- and post-earthquake satellite data. Then we combine the raw field data and regional geology as the restrictive conditions to determine the

  19. RiskScape Volcano: Development of a risk assessment tool for volcanic hazards

    NASA Astrophysics Data System (ADS)

    Deligne, Natalia; King, Andrew; Jolly, Gill; Wilson, Grant; Wilson, Tom; Lindsay, Jan

    2013-04-01

    RiskScape is a multi-hazard risk assessment tool developed by GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand that models the risk and impact of various natural hazards on a given built environment. RiskScape has a modular structure: the hazard module models hazard exposure (e.g., ash thickness at a given location), the asset module catalogues assets (built environment, infrastructure, and people) and their attributes exposed to the hazard, and the vulnerability module models the consequences of asset exposure to the hazard. Hazards presently included in RiskScape are earthquakes, river floods, tsunamis, windstorms, and ash from volcanic eruptions (specifically from Ruapehu). Here we present our framework for incorporating other volcanic hazards (e.g., pyroclastic density currents, lava flows, lahars, ground deformation) into RiskScape, along with our approach for assessing asset vulnerability. We will also discuss the challenges of evaluating risk for 'point source' (e.g., stratovolcanoes) vs. 'diffuse' (e.g., volcanic fields) volcanism, using Ruapehu and the Auckland volcanic field as examples. Once operational, RiskScape Volcano will be a valuable resource both in New Zealand and internationally, as a practical tool for evaluating risk and as an example of how to predict the consequences of volcanic eruptions on both rural and urban environments.

  20. Flood hazard assessment for french NPPs

    NASA Astrophysics Data System (ADS)

    Rebour, Vincent; Duluc, Claire-Marie; Guimier, Laurent

    2015-04-01

    This paper presents the approach for flood hazard assessment for NPPs that is ongoing in France in the framework of post-Fukushima activities. These activities were initially defined considering both the European "stress tests" of NPPs pursuant to the request of the European Council, and the French safety audit of civilian nuclear facilities in the light of the Fukushima Daiichi accident. The main actors in that process are the utility (EDF is, to date, the only NPP operator in France), the regulatory authority (ASN) and its technical support organization (IRSN). This paper was prepared by IRSN, considering the official positions of the other main actors in the current review process; it was not officially endorsed by them. In France, the flood hazard to be considered for design basis definition (for new NPPs, and for existing NPPs in periodic safety reviews conducted every 10 years) was revised before the Fukushima Daiichi accident, following the experience at the Blayais NPP in December 1999 (partial site flooding and loss of some safety-classified systems). The first part of the paper presents an overview of the revised guidance for the design basis flood. In order to address design extension conditions (conditions that could result from natural events exceeding the design basis events), a set of flooding scenarios has been defined by adding margins to the scenarios considered for the design. Due to the diversity of phenomena to be considered for the flooding hazard, the margin assessment is specific to each flooding scenario in terms of the parameter to be penalized and the degree of variation of this parameter. The general approach to addressing design extension conditions is presented in the second part of the paper. The remaining parts present the approach for five flooding scenarios, including the design basis scenario and the additional margins defining design extension scenarios.

  1. Premium Rating and Risk Assessment in Earthquake Insurance

    NASA Astrophysics Data System (ADS)

    Jimenez-Huerta, D.

    2005-12-01

    Assessing earthquake risk in a given asset portfolio involves a synthesis of results from two areas of research. The first is knowledge of the earthquake sources that are likely to affect the assets: where they are, how large they are likely to be and how often earthquakes are likely to occur; this issue is addressed via a doubly stochastic Poisson-gamma marked point process model for earthquake occurrence, accounting for the spatial and temporal distribution of seismicity. The second is knowledge of the likely severity of loss that will arise given the occurrence of an earthquake. A beta-regression model is used to relate observed (conditional) losses to site conditions and earthquake characteristics. The calculation of expected losses and associated quantities of interest in an insurance portfolio lies at the interface of the above-mentioned two factors and is the aim of this paper. Of particular interest is the approximation of the aggregate loss distribution, from which any actuarial analysis stems.
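The aggregate loss distribution described above can be approximated by Monte Carlo simulation. The sketch below is illustrative only: event counts are drawn from a doubly stochastic Poisson process with a gamma-mixed rate (negative binomial counts overall), and each event's damage ratio is beta-distributed, standing in for the paper's beta-regression severity model; all parameter values and function names are made up, not taken from the paper.

```python
import math
import random

random.seed(42)

def poisson_draw(lam):
    """Poisson random draw via Knuth's multiplication algorithm."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def simulate_annual_losses(n_years=5000, alpha=2.0, rate_scale=0.25,
                           sev_a=2.0, sev_b=8.0, exposure=1e9):
    """Monte Carlo sketch of the aggregate annual loss distribution:
    a gamma-mixed Poisson event count per year, with a beta-distributed
    damage ratio applied to the total exposure for each event."""
    losses = []
    for _ in range(n_years):
        lam = random.gammavariate(alpha, rate_scale)  # this year's rate
        n_events = poisson_draw(lam)
        year_loss = sum(random.betavariate(sev_a, sev_b) * exposure
                        for _ in range(n_events))
        losses.append(year_loss)
    return losses

losses = simulate_annual_losses()
aal = sum(losses) / len(losses)   # average annual loss estimate
```

Quantiles of the simulated `losses` list give the exceedance-probability curve from which premiums and probable maximum loss are typically read off.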

  2. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    NASA Technical Reports Server (NTRS)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) Continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET), 2) Establishing three North Anatolian fault crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past to examine strain accumulation as a function of time in the earthquake cycle (2004), 3) Repeat observations of selected sites in the fault-crossing profiles (2005), 4) Repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation, 5) Refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area, 6) Continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager, and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003 - 2005.

  3. Probabilistic assessment of decoupling loss-of-coolant accident and earthquake in nuclear power plant design

    SciTech Connect

    Lu, S.C.; Harris, D.O.

    1981-01-01

    This paper describes a research project conducted at Lawrence Livermore National Laboratory to establish a technical basis for reassessing the requirement of combining large loss-of-coolant-accident (LOCA) and earthquake loads in nuclear power plant design. A large LOCA is defined herein as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). A systematic probabilistic approach has been employed to estimate the probability of a large LOCA directly and indirectly induced by earthquakes. The probability of a LOCA directly induced by earthquakes was assessed by a numerical simulation of pipe rupture of a reactor coolant system. The simulation employed a deterministic fracture mechanics model which dictates the fatigue growth of pre-existing cracks in the pipe. The simulation accounts for the stochastic nature of input elements such as the initial crack size distribution, the crack occurrence rate, crack and leak detection probabilities as functions of crack size, plant transient occurrence rates, the seismic hazard, stress histories, and crack growth model parameters. Effects on the final results due to variation and uncertainty of the input elements were assessed by a limited sensitivity study. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10^-12). The probability of a leak was found to be several orders of magnitude greater than that of a complete break.

  4. Introducing ShakeMap Atlas 2.0: An improved suite of recent historical earthquake ShakeMaps for global hazard analyses

    USGS Publications Warehouse

    Garcia, D.; Mah, R.T.; Johnson, K.L.; Worden, B.C.; Hearne, M.G.; Marano, K.D.; Lin, K.; Wald, D.J.

    2011-01-01

    The U.S. Geological Survey (USGS) ShakeMap system is a widely used tool for assessing the ground motion during an earthquake in near-real-time applications, but also for past events and seismic scenarios. The ShakeMap Atlas (Allen et al., 2008) is a compilation of nearly 5,000 ShakeMaps of global events comprising the most damaging and potentially damaging earthquakes between 1973 and 2007. The Atlas is an invaluable resource for investigating strong ground motion near the source, and it is also used for calibrating the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system. Here we present an extensively revised version of the Atlas, which includes as new features the use of: (1) a new version of ShakeMap; (2) an updated source catalog; (3) a refined ground-motion prediction equation (GMPE) selection; and (4) many more macroseismic intensity and ground-motion data. The new version of ShakeMap (V3.5; Worden et al., 2010) treats native and converted data separately when mapping each intensity measure (MMI, PGA, PGV, and PSA). This is especially important for intensity observations, which are the main data source in the aftermath of most global events. ShakeMap V3.5 also allows for the inclusion of intensity prediction equations and makes use of improved mapping techniques and uncertainty estimations. Earthquake global hypocenters have been substituted, when possible, for regional locations and, in some cases, finite source models not included before. The Atlas span has been extended until mid-2010, and some older events have also been added for the 1973-2007 period. In order to improve the adequacy of the GMPE used by ShakeMap to estimate the ground shaking for a given event where data are not available, we use a new global selection scheme to discriminate between different types of earthquakes (García et al., 2011). Finally, we have included a large amount of recently available observations from national and regional networks. 
All these

  5. Assessing community vulnerabilities to natural hazards on the Island of Hawaii

    NASA Astrophysics Data System (ADS)

    Nishioka, Chris; Delparte, Donna

    2010-05-01

    The island of Hawaii is susceptible to numerous natural hazards such as tsunamis, flooding, lava flow, earthquakes, hurricanes, landslides, wildfires and storm surge. The impact of a natural disaster on the island's communities has the potential to endanger peoples' lives and threaten critical infrastructure, homes, businesses and economic drivers such as tourism. A Geographic Information System (GIS) has the ability to assess community vulnerabilities by examining the spatial relationships between hazard zones, socioeconomic infrastructure and demographic data. By drawing together existing datasets, GIS was used to examine a number of community vulnerabilities. Key areas of interest were government services, utilities, property assets, industry and transportation. GIS was also used to investigate population dynamics in hazard zones. Identification of community vulnerabilities from GIS analysis can support mitigation measures and assist planning and response measures to natural hazards.

  6. Earthquake induced landslide hazard: a multidisciplinary field observatory in the Marmara SUPERSITE

    NASA Astrophysics Data System (ADS)

    Bigarré, Pascal

    2014-05-01

    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Considering only disasters from the last fifteen years, among them the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. Those resulted in a staggering death toll and considerable damage, affecting the regional landscape including its main hydrological features. Despite a strong impetus in research during past decades, knowledge of these geohazards is still fragmentary, and databases of high-quality observational data are lacking. These phenomena call for further collaborative research aiming eventually to enhance preparedness and crisis management. As one of the three FP7 SUPERSITE-concept projects dealing with long-term, high-level monitoring of major natural hazards at the European level, the MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara region, one of the most densely populated parts of Europe, rated at high seismic risk since the devastating 1999 Izmit and Duzce earthquakes. Besides the seismic threat, landslides in Turkey and in this region constitute an important source of loss. The 1999 earthquake caused extensive landslides, and tsunami effects were observed during the post-event surveys in several places along the coasts of Izmit Bay. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest. First, the Cekmece-Avcilar peninsula, located west of Istanbul, is a highly urbanized, landslide-prone area showing high susceptibility to rainfall while also being affected by very significant seismic site effects. 
Second, the off-shore entrance of the Izmit Gulf, close to the termination of the surface rupture of the 1999 earthquake

  7. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  8. A preliminary regional assessment of earthquake-induced landslide susceptibility for Vrancea Seismic Region

    NASA Astrophysics Data System (ADS)

    Micu, Mihai; Balteanu, Dan; Ionescu, Constantin; Havenith, Hans; Radulian, Mircea; van Westen, Cees; Damen, Michiel; Jurchescu, Marta

    2015-04-01

    In seismically active regions, earthquakes may trigger landslides, enhancing short- to long-term slope denudation and sediment delivery and conditioning the general landscape evolution. Co-seismic slope failures generally present a low-frequency, high-magnitude pattern, which should be addressed accordingly by landslide hazard assessment, in contrast to the generally more frequent precipitation-triggered landslides. The Vrancea Seismic Region, corresponding to the curvature sector of the Eastern Romanian Carpathians, is the most active sub-crustal (focal depth > 50 km) earthquake province of Europe. It represents the main seismic energy source throughout Romania, with significant transboundary effects recorded as far away as Ukraine and Bulgaria. During the last 300 years, the region featured 14 earthquakes with M > 7, among which seven events with magnitude above 7.5 and three between 7.7 and 7.9. Apart from the direct damage, the Vrancea earthquakes are also responsible for numerous other geohazards, such as ground fracturing, groundwater level disturbances and possible deep-seated landslide occurrences (rock slumps, rock-block slides, rock falls, rock avalanches). The older deep-seated landslides assumed to have been triggered by earthquakes usually affect the entire slope profile. They often formed landslide dams, strongly influencing river morphology and representing potential threats (through flash floods) in case of lake outburst. Despite the large potential of this research issue, the correlation between the region's seismotectonic context and landslide predisposing factors has not yet been entirely understood. The geohazard databases of Vrancea presently lack the information needed to outline the seismic influence on the triggering of slope failures in this region. We only know that the morphology of numerous large, deep-seated and dormant landslides (which can possibly be reactivated in future

  9. The Determination of Earthquake Hazard Parameters Deduced from Bayesian Approach for Different Seismic Source Regions of Western Anatolia

    NASA Astrophysics Data System (ADS)

    Bayrak, Yusuf; Türker, Tuğba

    2016-01-01

    The Bayesian method is used to evaluate the earthquake hazard parameters of maximum regional magnitude (Mmax), β value, and seismic activity rate or intensity (λ), together with their uncertainties, for 15 different source regions in Western Anatolia. A compiled earthquake catalog, homogeneous for Ms ≥ 4 and covering the period from 1900 to 2013, was used. The computed Mmax values are between 6.00 and 8.06. Low values are found in the northern part of Western Anatolia, whereas high values are observed in the southern part, related to the Aegean subduction zone. The largest value is computed in region 10, comprising the Aegean Islands. The quantiles of the distributions of true and apparent magnitude on a given time interval [0, T] are evaluated. The quantiles of the distributions of apparent and true magnitudes for future time intervals of 5, 10, 20, 50, and 100 years are calculated in all seismogenic source regions for confidence limits at probability levels of 50, 70, and 90%. The computed hazard parameters identify the most seismically active regions of Western Anatolia. The Aegean Islands region, with the highest expected magnitude (7.65) over the next 100 years at the 90% probability level, is the most dangerous compared to the other regions. The results of this study can be used in probabilistic seismic hazard studies of Western Anatolia.
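The β and λ parameters above generalize a classical maximum-likelihood formulation. As an illustration only, a minimal sketch of the Aki-Utsu point estimate that such Bayesian treatments place priors on; the toy catalog and 10-year span are invented, not the paper's data:

```python
import numpy as np

# Hypothetical toy catalog of magnitudes (Ms >= 4, matching the paper's
# completeness level); values and time span are illustrative assumptions.
mags = np.array([4.1, 4.3, 5.0, 4.2, 4.8, 6.1, 4.5, 5.5, 4.0, 4.9])
m_min = 4.0
t_years = 10.0

# Classical Aki-Utsu maximum-likelihood estimate of beta = b * ln(10);
# the paper's Bayesian approach treats this quantity as uncertain.
beta = 1.0 / (mags.mean() - m_min)
b_value = beta / np.log(10)

# Mean annual activity rate (lambda) for events above m_min
lam = len(mags) / t_years

print(round(beta, 3), round(b_value, 3), lam)
```

The same point estimates then feed recurrence statements such as the expected maximum magnitude over a future interval.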

  10. GMTSAR Software for Rapid Assessment of Earthquakes

    NASA Astrophysics Data System (ADS)

    Sandwell, D. T.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.

    2010-12-01

    GMTSAR is an open-source (GNU General Public License) InSAR processing system designed for users familiar with Generic Mapping Tools (GMT). The code is written in C and will compile on any computer where GMT and NetCDF are installed. The system has three main components: 1) a preprocessor for each satellite data type (e.g., ERS, Envisat, and ALOS) to convert the native format and orbital information into a generic format; 2) an InSAR processor to focus and align stacks of images, map topography into phase, and form the complex interferogram; 3) a postprocessor, mostly based on GMT, to filter the interferogram and construct interferometric products of phase, coherence, phase gradient, and line-of-sight displacement in both radar and geographic coordinates. GMT is used to display all the products as PostScript files and KML images for Google Earth, which can be shared rapidly with other investigators. A set of C-shell scripts has been developed for standard 2-pass processing as well as image alignment for stacking and time series. ScanSAR processing is also possible but requires a knowledgeable user. The code was used to quickly process and display mosaics of interferograms from the M8.8 Maule, Chile earthquake as well as the M7.2 El Mayor-Cucapah earthquake. Software and test data are available at ftp://topex.ucsd.edu/pub/gmtsar.

  11. Hazards assessment for the Waste Experimental Reduction Facility

    SciTech Connect

    Calley, M.B.; Jones, J.L. Jr.

    1994-09-19

    This report documents the hazards assessment for the Waste Experimental Reduction Facility (WERF) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. It describes the WERF, the area surrounding WERF, associated buildings and structures at WERF, and the processes performed there. All radiological and nonradiological hazardous materials stored, used, or produced at WERF were identified and screened. Although the screening indicated that these materials could be excluded from further analysis, because the inventories of radiological and nonradiological hazardous materials were below the screening thresholds specified by DOE and DOE-ID guidance for DOE Order 5500.3A, the nonradiological hazardous materials were analyzed further because the corresponding screening thresholds were considered too high.

  12. Assessing Tsunami Hazard from the Geologic Record

    NASA Astrophysics Data System (ADS)

    Jaffe, B. E.

    2011-12-01

    The 11 March 2011 Tohoku-Oki tsunami dramatically demonstrated the vulnerability of the world's coastlines to the impact of tsunamis. Although northeast Japan had experienced large tsunamis in the past, there was no historical precedent for the March 11 tsunami. Most areas of the world capable of receiving such catastrophic tsunamis have not experienced them during the short period of written history. Sedimentary deposits left by tsunamis are being used to extend the record of tsunamis back through time. The state of the science for tsunami deposits has now evolved to a point where false positives (e.g., misinterpreting a storm deposit as a tsunami deposit) are less frequent. Tsunami hazard assessment is beginning to incorporate the spatial distribution of tsunami deposits and the record of tsunami recurrence. A recent development in the use of tsunami deposits for tsunami risk assessment is to obtain tsunami magnitude estimates by applying sediment transport models to replicate the observed deposits. Models have focused on estimating two parameters: tsunami height and flow speed. These models are developed and tested using data sets collected from recent tsunamis (Papua New Guinea 1998, Peru 2001, Indian Ocean 2004, Samoa 2009, and most recently Tohoku-Oki 2011). The extent of tsunami deposits was less than the maximum inundation, but typically within 10% of it on gently sloping coastal plains. However, recent field investigations on the coastal plain of Sendai, Japan after the 2011 tsunami bring into question whether the extent of tsunami deposits is a good proxy for maximum inundation distance. There, because of sediment source limitations, an easily recognizable deposit (sand thickness >0.5 cm) extended only about 2/3 of the way to the limit of inundation. These new data highlight the need to incorporate other proxies, such as geochemical signatures, and approaches such as sediment transport modeling in tsunami hazard assessment.

  13. Seismic hazard assessment of Sub-Saharan Africa using geodetic strain rate models

    NASA Astrophysics Data System (ADS)

    Poggi, Valerio; Pagani, Marco; Weatherill, Graeme; Garcia, Julio; Durrheim, Raymond J.; Mavonga Tuluka, Georges

    2016-04-01

    The East African Rift System (EARS) is the major active tectonic feature of the Sub-Saharan Africa (SSA) region. Although the seismicity level of such a divergent plate boundary can be described as moderate, several earthquakes have been reported in historical times causing a non-negligible level of damage, albeit mostly due to the high vulnerability of local buildings and structures. Formulation and enforcement of national seismic codes is therefore an essential future risk mitigation strategy. Nonetheless, a reliable risk assessment cannot be done without the calibration of an updated seismic hazard model for the region. Unfortunately, the major issue in assessing seismic hazard in Sub-Saharan Africa is the lack of basic information needed to construct source and ground motion models. The historical earthquake record is largely incomplete, while the instrumental catalogue is complete down to sufficient magnitude only for a relatively short time span. In addition, mapping of seismogenically active faults is still an ongoing program. Recent studies have identified major seismogenic lineaments, but there is a substantial lack of kinematic information for intermediate-to-small scale tectonic features, information that is essential for the proper calibration of earthquake recurrence models. To compensate for this lack of information, we experiment with the use of a strain rate model recently developed by Stamps et al. (2015), in the framework of an earthquake hazard and risk project along the EARS supported by USAID and jointly carried out by GEM and AfricaArray. We use the inferred geodetic strain rates to derive estimates of the total scalar moment release, subsequently used to constrain earthquake recurrence relationships for both area (distributed seismicity) and fault source models. The rates obtained indirectly from strain rates, and those more classically derived from the available seismic catalogues, are then compared and combined into a unique mixed earthquake recurrence model
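A common route from geodetic strain rate to recurrence-model input is a Kostrov-style conversion of strain rate to scalar moment rate. A hedged sketch of that step, with an assumed shear modulus, seismogenic thickness, and grid-cell size that are illustrative placeholders, not values from the Stamps et al. (2015) model:

```python
# Illustrative constants (assumptions, not the study's calibrated values)
MU = 3.0e10            # shear modulus, Pa
H_SEISMO = 20e3        # assumed seismogenic thickness, m
CELL_AREA = (50e3) ** 2  # area of one strain-rate grid cell, m^2

def moment_rate(strain_rate_max):
    """Kostrov-style scalar moment rate (N*m per year) for one grid cell,
    given the maximum principal strain rate in 1/yr:
    Mdot0 = 2 * mu * H * A * strain_rate."""
    return 2.0 * MU * H_SEISMO * CELL_AREA * strain_rate_max

# e.g. a rift cell straining at 3e-9 per year
print(moment_rate(3e-9))
```

Summing such cell rates gives a regional moment budget that can then be balanced against a truncated Gutenberg-Richter recurrence model.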

  14. Earthquake recordings from the 2002 Seattle Seismic Hazard Investigation of Puget Sound (SHIPS), Washington State

    USGS Publications Warehouse

    Pratt, Thomas L.; Meagher, Karen L.; Brocher, Thomas M.; Yelin, Thomas; Norris, Robert; Hultgrien, Lynn; Barnett, Elizabeth; Weaver, Craig S.

    2003-01-01

    This report describes seismic data obtained during the fourth Seismic Hazard Investigation of Puget Sound (SHIPS) experiment, termed Seattle SHIPS. The experiment was designed to study the influence of the Seattle sedimentary basin on ground shaking during earthquakes. To accomplish this, we deployed seismometers over the basin to record local earthquakes, quarry blasts, and teleseisms during the period of January 26 to May 27, 2002. We plan to analyze the recordings to compute spectral amplitudes at each site and to determine the variability of ground motions over the basin. During the Seattle SHIPS experiment, seismometers were deployed at 87 sites in a 110-km-long east-west line, three north-south lines, and a grid throughout the Seattle urban area (Figure 1). At each of these sites, an L-22, 2-Hz velocity transducer was installed and connected to a REF TEK Digital Acquisition System (DAS), both provided by the Program for Array Seismic Studies of the Continental Lithosphere (PASSCAL) of the Incorporated Research Institutions for Seismology (IRIS). The instruments were installed on January 26 and 27, and were retrieved gradually between April 18 and May 27. All instruments continuously sampled all three components of motion (velocity) at a rate of 50 samples/sec. To ensure accurate computations of amplitude, we calibrated the geophones in situ to obtain the instrument responses. In this report, we discuss the acquisition of these data, describe their processing and merging into 1-hour-long traces and windowed events, discuss the geophone calibration process and its results, and display some of the earthquake recordings.

  15. Source of 1629 Banda Mega-Thrust Earthquake and Tsunami: Implications for Tsunami Hazard Evaluation in Eastern Indonesia

    NASA Astrophysics Data System (ADS)

    Major, J. R.; Liu, Z.; Harris, R. A.; Fisher, T. L.

    2011-12-01

    Using Dutch records of geophysical events in Indonesia over the past 400 years, together with tsunami modeling, we identify tsunami sources that have caused severe devastation in the past and are likely to recur in the near future. The earthquake history of western Indonesia has received much attention since the 2004 Sumatra earthquakes and subsequent events. However, strain rates along a variety of plate boundary segments are just as high in eastern Indonesia, where the earthquake history has not been investigated. Due to the rapid population growth in this region, it is essential and urgent to evaluate its earthquake and tsunami hazards. Arthur Wichmann's 'Earthquakes of the Indian Archipelago' shows that there were 30 significant earthquakes and 29 tsunamis between 1629 and 1877. One of the largest and best documented is the great earthquake and tsunami affecting the Banda Islands on 1 August 1629. It caused severe damage from a 15 m tsunami that arrived at the Banda Islands about half an hour after the earthquake. The earthquake was also recorded 230 km away in Ambon, but no tsunami is mentioned there. This event was followed by at least 9 years of aftershocks. The combination of these observations indicates that the earthquake was most likely a mega-thrust event. We use a numerical simulation of the tsunami to locate the potential sources of the 1629 mega-thrust event and to evaluate the tsunami hazard in eastern Indonesia. The numerical simulation was first tested to establish the tsunami run-up amplification factor for this region through simulations of the 1992 Flores Island (Hidayat et al., 1995) and 2006 Java (Kato et al., 2007) earthquake events. The results yield tsunami run-up amplification factors of 1.5 and 3, respectively. However, the Java earthquake is a unique case of slow rupture that was hardly felt. The fault parameters of recent earthquakes in the Banda region are used for the models.
The modeling narrows the possibilities of mega-thrust events the size of the one

  16. Volcanic and earthquake hazards at eastern Turkey volcanoes investigated by InSAR

    NASA Astrophysics Data System (ADS)

    Bathke, H.; Walter, T. R.

    2010-12-01

    Volcanoes of eastern Turkey have been historically active and are located in a poorly understood tectonic system with abundant faults and fissures. Mt. Ararat and Tendürek, for instance, are located in a pull-apart basin and have been affected by significant tectonic earthquakes reaching magnitudes of 7.4 in 1840 and again in 1976, causing 10,000 and 5,000 fatalities, respectively. Today's tectonic and volcanic deformation processes, however, remain to be elaborated. Here we report on a radar interferometric study using SAR images acquired by the satellites ERS-1, ERS-2, and ENVISAT in ascending and descending orbits. We create interferograms and combine them into a time series, which allows us to investigate temporal deformation patterns of the ground at unprecedented spatial detail. Although the volcanoes have not erupted in 160 years and are considered dormant, we find various localized but evident deformation processes. Our investigation suggests that both earthquakes and volcanic activity are responsible for the observed deformation. The presented satellite radar data analysis thus contributes to understanding these processes and the associated hazards in eastern Turkey.

  17. An Arduino project to record ground motion and to learn on earthquake hazard at high school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Barnaba, Carla; Clocchiatti, Marco; Zuliani, David

    2015-04-01

    Through multidisciplinary work that integrates technology education with Earth sciences, we implemented an educational program to raise students' awareness of seismic hazard and to disseminate good practices of earthquake safety. Using free software and low-cost open hardware, the students of a senior class of the high school Liceo Paschini in Tolmezzo (NE Italy) implemented a seismograph using the Arduino open-source electronics platform and ADXL345 sensors to emulate a low-cost seismometer (e.g., the O-NAVI sensor of the Quake-Catcher Network, http://qcn.stanford.edu). To accomplish their task, the students were directed to web resources for technical support and troubleshooting. Shell scripts, running on local computers under Linux, controlled the recording and display of the data. The main part of the experiment was documented in DokuWiki style. Some propaedeutic lessons in computer science and electronics were needed to build up the necessary skills of the students and to fill gaps in their background knowledge. In addition, lectures by seismologists and laboratory activity allowed the class to explore different aspects of earthquake physics, particularly seismic waves, and to become familiar with the topics of seismic hazard through inquiry-based learning. The resulting Arduino seismograph can be used for educational purposes and can display tremors on the school's local network. It can record the ground motion of a seismic event occurring in the area, but further improvements are necessary for quantitative analysis of the recorded signals.
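To illustrate the data path such a project involves, here is a minimal sketch of converting raw ADXL345 counts (as an Arduino sketch might print them over serial) into accelerations in g. The 'x,y,z' line format and the ±2 g, 10-bit scale factor are assumptions for illustration, not details documented by the school project:

```python
# ADXL345 in 10-bit mode over a +/-2 g full scale: 2 g / 512 counts
# (an assumed configuration; the sensor supports other ranges too).
SCALE_G = 2.0 / 512.0

def parse_sample(line):
    """Convert one serial line of raw counts 'x,y,z' into a tuple of
    three-component acceleration values in g."""
    x, y, z = (int(v) for v in line.split(","))
    return (x * SCALE_G, y * SCALE_G, z * SCALE_G)

print(parse_sample("256,-128,0"))  # -> (1.0, -0.5, 0.0)
```

In a live setup, the Linux-side scripts would feed such lines from the Arduino's serial port into this conversion before plotting or archiving.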

  18. The key role of eyewitnesses in rapid earthquake impact assessment

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

    Uncertainties in rapid earthquake impact models are intrinsically large, even when excluding potential indirect losses (fires, landslides, tsunamis...). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building type inventory, and vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty are of comparable dimension, about 10-15 km. When such an earthquake strikes close to an urban area, as in 1999 in Athens (M5.9), earthquake location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in 2002 in Molise, Italy (M5.7), in Bingol, Turkey (M6.4) in 2003, and in Christchurch, New Zealand (M6.3), where respectively 23 of 30, 84 of 176, and 115 of 185 of the casualties perished in a single building failure. Contrastingly, for major earthquakes (M>7), the point source approximation is no longer valid, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral or bilateral, etc., and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake's occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  19. Were the May 2012 Emilia-Romagna earthquakes induced? A coupled flow-geomechanics modeling assessment

    NASA Astrophysics Data System (ADS)

    Juanes, R.; Jha, B.; Hager, B. H.; Shaw, J. H.; Plesch, A.; Astiz, L.; Dieterich, J. H.; Frohlich, C.

    2016-07-01

    Seismicity induced by fluid injection and withdrawal has emerged as a central element of the scientific discussion around subsurface technologies that tap into water and energy resources. Here we present the application of coupled flow-geomechanics simulation technology to the post-mortem analysis of a sequence of damaging earthquakes (Mw = 6.0 and 5.8) in May 2012 near the Cavone oil field in northern Italy. This sequence raised the question of whether these earthquakes might have been triggered by oil and gas production activities. Our analysis strongly suggests that the combined effects of fluid production and injection from the Cavone field were not a driver of the observed seismicity. More generally, our study illustrates that computational modeling of coupled flow and geomechanics permits the integration of geologic, seismotectonic, well log, fluid pressure and flow rate, and geodetic data, and provides a promising approach for assessing and managing hazards associated with induced seismicity.

  20. Health hazards of fire fighters: exposure assessment.

    PubMed Central

    Brandt-Rauf, P W; Fallon, L F; Tarantini, T; Idema, C; Andrews, L

    1988-01-01

    There is growing concern over the detrimental health effects to firefighters produced by exposure to combustion byproducts of burning materials. To assess the types and levels of exposure encountered by firefighters during their routine occupational duties, members of the Buffalo Fire Department were monitored during firefighting activities with personal, portable, ambient environmental sampling devices. The results indicate that firefighters are frequently exposed to significant concentrations of hazardous materials including carbon monoxide, benzene, sulphur dioxide, hydrogen cyanide, aldehydes, hydrogen chloride, dichlorofluoromethane, and particulates. Furthermore, in many cases of the worst exposure to these materials respiratory protective equipment was not used owing to the visual impression of low smoke intensity, and thus these levels represent actual direct exposure of the firefighters. Many of these materials have been implicated in the production of cardiovascular, respiratory, or neoplastic diseases, which may provide an explanation for the alleged increased risk for these illnesses among firefighters. PMID:3179235

  1. Seismic Hazard Assessment for the Baku City and Absheron Peninsula, Azerbaijan

    SciTech Connect

    Babayev, Gulam R.

    2006-03-23

    This paper deals with the seismic hazard assessment of Baku and the Absheron peninsula. The assessment is based on information on the features of earthquake ground motion excitation, seismic wave propagation (attenuation), and site effects. I analyze active faults, seismicity, soil and rock properties, geological cross-sections, borehole data of measured shear-wave velocities, lithology, the amplification factor of each geological unit, geomorphology, topography, and bedrock and surface ground motions. To estimate peak ground acceleration (PGA) at the surface, the PGA at bedrock is multiplied by the amplification factor of each surface layer. Quaternary soft deposits, which represent a high risk because they increase PGA values at the surface, are studied in detail. For a near-zone target earthquake, PGA values are compared to intensities on the MSK-64 scale for the Absheron peninsula. The amplification factor for the city of Baku is assessed, providing estimates of the level of seismic motion and seismic intensity in the studied area.
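The surface-PGA step described above reduces to multiplying the bedrock PGA by a layer amplification factor. A sketch with illustrative placeholder factors (not the values derived for Baku):

```python
# Hypothetical amplification factors per site class; the paper derives
# such factors from borehole shear-wave velocities and lithology.
AMPLIFICATION = {
    "rock": 1.0,
    "stiff_soil": 1.4,
    "soft_quaternary": 2.2,
}

def surface_pga(rock_pga_g, site_class):
    """Surface PGA (in g) = bedrock PGA times the layer amplification factor."""
    return rock_pga_g * AMPLIFICATION[site_class]

# Soft Quaternary deposits amplify a 0.15 g bedrock motion to ~0.33 g
print(surface_pga(0.15, "soft_quaternary"))
```

The resulting surface PGA values can then be mapped to MSK-64 intensities through an empirical conversion.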

  2. Assessment of the impact of strong earthquakes on the global economy by the example of the Tohoku event

    NASA Astrophysics Data System (ADS)

    Tatiana, Skufina; Peter, Skuf'in; Sergey, Baranov; Vera, Samarina; Taisiya, Shatalova

    2016-04-01

    We examine the economic consequences of strong earthquakes using the example of the M9 Tohoku earthquake that occurred on March 11, 2011 off the northeast coast of Honshu, Japan. This earthquake was the strongest in the whole history of seismological observations in this part of the planet. The generated tsunami killed more than 15,700 people, damaged 332,395 buildings and 2,126 roads. The total economic loss in Japan was estimated at 309 billion. The catastrophe in Japan also impacted the global economy. To estimate its impact, we used regional and global stock indexes, production indexes, stock prices of the main Japanese, European and US companies, import and export dynamics, as well as data provided by the customs of Japan. We also demonstrate that the catastrophe substantially affected the markets; in the short run, some indicators even exceeded the effect of the global financial crisis of 2008. Recent strong earthquakes in Nepal (25.04.2015, M7.8) and Chile (16.09.2015, M8.3) have renewed the relevance of assessing the overall economic cost of seismic hazard. We conclude that strong earthquakes must be treated as an important factor affecting the world economy, depending on their location. The research was supported by the Russian Foundation for Basic Research (Project 16-06-00056A).

  3. Setting the Stage for Harmonized Risk Assessment by Seismic Hazard Harmonization in Europe (SHARE)

    NASA Astrophysics Data System (ADS)

    Woessner, Jochen; Giardini, Domenico; SHARE Consortium

    2010-05-01

    Probabilistic seismic hazard assessment (PSHA) is arguably one of the most useful products that seismology can offer to society. PSHA characterizes the best available knowledge of the seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results form the baseline for informed decision making, such as building codes or insurance rates, and provide essential input to every risk assessment application. Several large-scale national and international projects have recently been launched that aim at improving and harmonizing PSHA standards around the globe. SHARE (www.share-eu.org) is a European Commission-funded project in the Seventh Framework Programme (FP7) that will create an updated, living seismic hazard model for the Euro-Mediterranean region. SHARE is a regional component of the Global Earthquake Model (GEM, www.globalquakemodel.org), a public/private partnership initiated and approved by the Global Science Forum of the OECD-GSF. GEM aims to be the uniform, independent, open-access standard to calculate and communicate earthquake hazard and risk worldwide. SHARE itself will deliver measurable progress in all steps leading to a harmonized assessment of seismic hazard: in the definition of engineering requirements, in the collection of input data, in procedures for hazard assessment, and in engineering applications. SHARE scientists will create a unified framework and computational infrastructure for seismic hazard assessment and produce an integrated European probabilistic seismic hazard assessment (PSHA) model and specific scenario-based modeling tools. The results will deliver long-lasting structural impact in areas of societal and economic relevance, will serve as a reference for Eurocode 8 (EC8) application, and will provide homogeneous input for the correct seismic safety assessment of critical industry, such as energy infrastructures and the re-insurance sector. SHARE will cover the whole European territory, the

  4. Seismogeodesy and Rapid Earthquake and Tsunami Source Assessment

    NASA Astrophysics Data System (ADS)

    Melgar Moctezuma, Diego

    This dissertation presents an optimal combination algorithm for strong motion seismograms and regional high-rate GPS recordings. This seismogeodetic solution produces estimates of ground motion that recover the whole seismic spectrum, from the permanent deformation to the Nyquist frequency of the accelerometer. The algorithm is demonstrated and evaluated through outdoor shake table tests and recordings of large earthquakes, notably the 2010 Mw 7.2 El Mayor-Cucapah and 2011 Mw 9.0 Tohoku-oki events. This dissertation also shows that the strong motion velocity and displacement data obtained from the seismogeodetic solution can be instrumental in quickly determining basic parameters of the earthquake source. We show how GPS and seismogeodetic data can produce rapid estimates of centroid moment tensors, static slip inversions, and, most importantly, kinematic slip inversions. Throughout the dissertation, special emphasis is placed on how to compute these source models with minimal interaction from a network operator. Finally, we show that the incorporation of offshore data, such as ocean-bottom pressure sensors and RTK-GPS buoys, can better constrain the shallow slip of large subduction events. We demonstrate through numerical simulations of tsunami propagation that the earthquake sources derived from the seismogeodetic and ocean-based sensors are detailed enough to provide a timely and accurate assessment of expected tsunami intensity immediately following a large earthquake.
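The optimal combination described above is, in essence, a multirate fusion of high-rate acceleration with lower-rate GPS displacement. A much-simplified one-dimensional Kalman-filter sketch of that idea; the state model, noise levels, and sampling rates are arbitrary assumptions for illustration, not the dissertation's algorithm:

```python
import numpy as np

def seismogeodetic_kf(accel, gps, dt, gps_every, q=1e-4, r=1e-4):
    """Fuse accelerometer samples (every dt seconds) with GPS displacement
    observations (every gps_every-th accelerometer sample).
    Returns the fused displacement time series."""
    x = np.zeros(2)                        # state: [displacement, velocity]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])        # acceleration input mapping
    H = np.array([[1.0, 0.0]])             # GPS observes displacement only
    out = []
    for k, a in enumerate(accel):
        x = F @ x + B * a                  # predict using acceleration
        P = F @ P @ F.T + q * np.eye(2)
        if k % gps_every == 0 and k // gps_every < len(gps):
            z = gps[k // gps_every]        # lower-rate displacement update
            S = H @ P @ H.T + r            # innovation covariance
            K = P @ H.T / S                # Kalman gain
            x = x + (K * (z - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Sanity check: zero acceleration and zero GPS keep the trace flat
trace = seismogeodetic_kf(np.zeros(200), np.zeros(20), dt=0.01, gps_every=10)
print(float(trace.max()))
```

The key property this sketch preserves is that the GPS updates pin down the permanent (zero-frequency) deformation that accelerometer integration alone would drift away from.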

  5. From Seismic Scenarios to Earthquake Risk Assessment: A Case Study for Iquique, Chile.

    NASA Astrophysics Data System (ADS)

    Aguirre, P.; Fortuno, C.; Martin, J. C. D. L. L.; Vasquez, J.

    2015-12-01

    Iquique is a strategic city and economic center in northern Chile, located in a large seismic gap where a megathrust earthquake and tsunami are expected. Although it was hit by a Mw 8.2 earthquake on April 1st, 2014, which caused moderate damage, geophysical evidence still suggests that there is potential for a larger event, so a thorough risk assessment is key to understanding the physical, social, and economic effects of such a potential event and devising appropriate mitigation plans. Hence, Iquique has been selected as a prime study case for the implementation of a risk assessment platform in Chile. Our study integrates research on the three main elements of risk calculations: hazard evaluation, exposure model, and physical vulnerabilities. To characterize the hazard field, a set of synthetic seismic scenarios has been developed based on plate interlocking and the residual slip potential that results from subtracting the slip that occurred during the April 1st, 2014 rupture, obtained using InSAR+GPS inversion. Additional scenarios were developed based on the fault rupture model of the Maule 2010 Mw 8.8 earthquake and on local plate locking models in northern Chile. These rupture models define a collection of possible realizations of earthquake geometries, parameterized in terms of critical variables like slip magnitude, rise time, mean propagation velocity, directivity, and others, which are propagated to obtain a hazard map for Iquique (e.g., PGA, PGV, PGD). Furthermore, a large body of public and local data was used to construct a detailed exposure model for Iquique, including aggregated building counts, demographics, essential facilities, and lifelines.
    This model, together with the PGA maps for the April 1st, 2014 earthquake, is used to calibrate HAZUS outputs against observed damage and adjust the fragility curves of physical systems according to more detailed analyses of typical Chilean building types and their structural properties, plus historical

  6. Assessment of liquefaction potential during earthquakes by arias intensity

    USGS Publications Warehouse

    Kayen, R.E.; Mitchell, J.K.

    1997-01-01

    An Arias intensity approach to assessing the liquefaction potential of soil deposits during earthquakes is proposed, using an energy-based measure of the severity of earthquake shaking recorded on seismograms of the two horizontal components of ground motion. Values representing the severity of strong motion at depth in the soil column are associated with the liquefaction resistance of that layer, as measured by in situ penetration testing (SPT, CPT). This association results in a magnitude-independent boundary that envelops initial liquefaction of soil in Arias intensity-normalized penetration resistance space. The Arias intensity approach is simple to apply and has proven highly reliable in assessing liquefaction potential. The advantages of using Arias intensity as a measure of earthquake-shaking severity in liquefaction assessment are: Arias intensity is derived from integration of the entire seismogram waveform, incorporating both the amplitude and duration elements of ground motion; all frequencies of recorded motion are considered; and Arias intensity is an appropriate measure when evaluating field penetration test methodologies that are inherently energy-based. Predictor equations describing the attenuation of Arias intensity as a function of earthquake magnitude and source distance are presented for rock, deep stiff alluvium, and soft soil sites.
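The core quantity is straightforward to compute from an accelerogram. A sketch of the single-component Arias intensity, Ia = (π / 2g) ∫ a(t)² dt, here approximated with a simple rectangle-rule sum; summing the two horizontal components follows the two-component usage described above:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def arias_intensity(acc, dt):
    """Arias intensity (m/s) of one component:
    (pi / (2 g)) * integral of a(t)^2 dt, rectangle-rule approximation.
    acc is the acceleration series in m/s^2, dt the sample interval in s."""
    return np.pi / (2.0 * G) * np.sum(np.asarray(acc) ** 2) * dt

def arias_two_component(acc_ns, acc_ew, dt):
    """Shaking-severity measure summed over both horizontal components."""
    return arias_intensity(acc_ns, dt) + arias_intensity(acc_ew, dt)

# Synthetic check: 1 m/s^2 sustained for 1 s gives pi / (2 * 9.81) m/s
print(round(arias_intensity(np.ones(100), 0.01), 4))
```

In practice the integration runs over the full recorded seismogram, which is what makes the measure sensitive to both amplitude and duration.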

  7. A fault kinematic based assessment of Maximum Credible Earthquake magnitudes for the slow Vienna Basin Fault

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Beidinger, Andreas; Hintersberger, Esther

    2010-05-01

    Assessing the maximum credible earthquake (MCE) for a specific region is an important step in seismic hazard assessment. In regions of high seismicity and long historic records, the possibility is relatively high that the maximum credible earthquake is included in the regional earthquake catalog. In regions with low or absent historic seismicity, however, the MCE must be determined from geological information. In the Vienna Basin, seismicity along the eastern basin margin is on a moderate level (Imax/Mmax = 8/5.2), concentrated along the left-lateral strike-slip Vienna Basin Transfer Fault (VBTF). In contrast, in the northern and western parts, as well as close to the city of Vienna, there are neither historical nor instrumental earthquake records. New paleoseismological data, however, have shown that several surface-breaking earthquakes occurred in that region during the Late Pleistocene. We consequently try to assess the earthquake potential in that region using an elaborated kinematic model of Quaternary and active faulting. The VBTF comprises several sinistral strike-slip segments with distinct kinematic and seismotectonic properties. Seismicity along the fault highlights four major segments referred to as the Mitterndorf-Schwadorf-, Lassee-, Zohor- and Dobra Voda Segment. Unlike the Lassee Segment, which hardly released any seismic energy in historical times, the three others show abundant moderate earthquakes in the last 400 yrs. Fault mapping using 2D/3D reflection seismic, gravity, and geomorphology shows that these seismotectonically defined segments are delimited by major fault bends including a restraining bend (Dobra Voda) and three releasing bends with negative flower structures overlain by Pleistocene pull-apart basins with up to 150 m of growth strata. The releasing bends are connected by non-transtensive segments. In addition to the overall geometry of the strike-slip fault with releasing / restraining bends, the transfer of displacement to several

  8. Lateral spread hazard mapping of the northern Salt Lake Valley, Utah, for a M7.0 scenario earthquake

    USGS Publications Warehouse

    Olsen, M.J.; Bartlett, S.F.; Solomon, B.J.

    2007-01-01

    This paper describes the methodology used to develop a lateral spread-displacement hazard map for northern Salt Lake Valley, Utah, using a scenario M7.0 earthquake occurring on the Salt Lake City segment of the Wasatch fault. The mapping effort is supported by a substantial amount of geotechnical, geologic, and topographic data compiled for the Salt Lake Valley, Utah. ArcGIS routines created for the mapping project then input this information to perform site-specific lateral spread analyses using methods developed by Bartlett and Youd (1992) and Youd et al. (2002) at individual borehole locations. The distributions of predicted lateral spread displacements from the boreholes located spatially within a geologic unit were subsequently used to map the hazard for that particular unit. The mapped displacement zones consist of low hazard (0-0.1 m), moderate hazard (0.1-0.3 m), high hazard (0.3-1.0 m), and very high hazard (> 1.0 m). As expected, the produced map shows the highest hazard in the alluvial deposits at the center of the valley and in sandy deposits close to the fault. This mapping effort is currently being applied to the southern part of the Salt Lake Valley, Utah, and probabilistic maps are being developed for the entire valley. © 2007, Earthquake Engineering Research Institute.
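    The four displacement bins used for the map can be restated as a tiny classifier. This is an illustrative restatement of the zoning thresholds quoted above, not code from the mapping project; the function name and the assignment of the exact boundary values are our assumptions.

```python
def hazard_zone(displacement_m):
    """Classify a predicted lateral spread displacement (meters)
    into the four hazard zones quoted in the abstract."""
    if displacement_m <= 0.1:
        return "low (0-0.1 m)"
    if displacement_m <= 0.3:
        return "moderate (0.1-0.3 m)"
    if displacement_m <= 1.0:
        return "high (0.3-1.0 m)"
    return "very high (> 1.0 m)"
```

    Which zone the boundary values themselves (0.1, 0.3, 1.0 m) belong to is not stated in the abstract; the `<=` convention here is a guess.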

  9. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.

  10. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery (see figure). In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades.

  11. Seismic Hazard and risk assessment for Romania-Bulgaria cross-border region

    NASA Astrophysics Data System (ADS)

    Simeonova, Stela; Solakov, Dimcho; Alexandrova, Irena; Vaseva, Elena; Trifonova, Petya; Raykova, Plamena

    2016-04-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economical impact on the urban environment. Global seismic hazard and vulnerability to earthquakes are steadily increasing as urbanization and development occupy more areas that are prone to effects of strong earthquakes. The assessment of the seismic hazard and risk is particularly important, because it provides valuable information for seismic safety and disaster mitigation, and it supports decision making for the benefit of society. Romania and Bulgaria, situated in the Balkan Region as a part of the Alpine-Himalayan seismic belt, are characterized by high seismicity, and are exposed to a high seismic risk. Over the centuries, both countries have experienced strong earthquakes. The cross-border region encompassing northern Bulgaria and southern Romania is a territory prone to effects of strong earthquakes. The area is significantly affected by earthquakes that occurred in both countries: on the one hand, the events generated by the Vrancea intermediate-depth seismic source in Romania, and on the other, the crustal seismicity originating in the seismic sources Shabla (SHB), Dulovo and Gorna Orjahovitza (GO) in Bulgaria. The Vrancea seismogenic zone of Romania is a very peculiar seismic source, often described as unique in the world, and it represents a major concern for most of the northern part of Bulgaria as well. In the present study the seismic hazard for the Romania-Bulgaria cross-border region is assessed on the basis of integrated basic geo-datasets. The hazard results are obtained by applying two alternative approaches - probabilistic and deterministic. The MSK64 intensity (the MSK64 scale is practically equal to the new EMS98) is used as the output parameter for the hazard maps. We prefer to use here the macroseismic intensity instead of PGA, because it is directly related to the degree of damages and, moreover, the epicentral intensity is the original

  12. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    2007 Working Group on California Earthquake Probabilities

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).
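    A rupture forecast of this kind assigns probabilities of occurrence over a stated time interval. As a hedged, minimal illustration of the idea (UCERF 2 itself combines time-independent and time-dependent models and is far more elaborate), the simplest conversion from a long-term rupture rate to an interval probability is the time-independent Poisson model; the rate used below is hypothetical.

```python
import math

def poisson_probability(annual_rate, years):
    """P(at least one event in `years`) = 1 - exp(-rate * years),
    assuming events follow a time-independent Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

# Hypothetical fault with a mean recurrence of 200 years, 30-year window:
p = poisson_probability(1.0 / 200.0, 30)  # about 0.14
```

    Time-dependent forecasts replace the exponential with renewal models (e.g. Brownian passage time) whose probability depends on the time since the last rupture.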

  13. Apophis: complex rotation and hazard assessment

    NASA Astrophysics Data System (ADS)

    Farnocchia, Davide; Chesley, Steven R.; Vokrouhlicky, David; Mueller, Thomas G.

    2014-11-01

    (99942) Apophis is one of the most remarkable near-Earth asteroids in terms of impact hazard. In 2004 the probability of an impact in 2029 reached a peak of 2.7%. With the data available today we know that Apophis will pass Earth safely in 2029 at about 38,000 km. However, despite the availability of a well-observed arc and three radar apparitions, the 2029 Earth encounter has such a strong scattering effect on the trajectory of Apophis that post-2029 predictions are only possible in a statistical sense and impacts in the following decades are hard to rule out. To predict the future ephemerides of Apophis, the dominant source of uncertainty is the Yarkovsky effect, a small nongravitational perturbation that arises from the anisotropic re-emission at thermal wavelengths of absorbed solar radiation. Modeling the Yarkovsky effect acting on an asteroid is generally challenging, as we need a good knowledge of the asteroid's physical model or observable deviations from a purely gravitational trajectory. A further complication comes from the complex rotation state of Apophis. We use the available information on the physical properties of Apophis, e.g., shape, size, thermal inertia, and rotation state, to estimate the Yarkovsky effect acting on Apophis by solving the nonlinear heat transfer equation on a finite-element mesh of the faceted shape model of Apophis. We find that the Yarkovsky perturbation significantly affects the trajectory of Apophis despite the complex rotation. We analyze the implications on the hazard assessment by mapping the orbital uncertainty to the 2029 close approach and computing the keyholes, i.e., the locations at the 2029 Earth encounter leading to a resonant impact at a future close approach. Whereas collisions with Earth before 2060 are ruled out, impacts are still possible after 2060.

  14. Generalized probabilistic seismic hazard estimates in terms of macroseismic intensity as a tool for risk assessment in urban areas

    NASA Astrophysics Data System (ADS)

    Albarello, Dario; D'Amico, Vera; Rotondi, Renata; Varini, Elsa; Zonno, Gaetano

    2013-04-01

    The use of macroseismic intensity to parameterize earthquake effects allows a direct link between hazard assessment and risk estimates in urban areas. This is particularly true in most European countries, where a long documentary history of the effects of past earthquakes is available. This is why the computational code SASHA (Site Approach to Seismic Hazard Assessment), developed specifically for a coherent probabilistic analysis of locally available intensity data (site seismic histories) to provide hazard estimates in terms of intensity, taking into account the specific nature of intensity (ordinal, discrete, finite in range, site-dependent) and the relevant uncertainties (completeness, ill-definition of the oldest earthquakes, etc.), proved of specific interest within the EU research project UPStratMAFA "Urban Disaster Prevention Strategies Using MAcroseismic Fields and FAult Sources" (Grant Agreement n. 230301/2011/613486/SUB/A5). In order to extend the application of this approach to sites and countries where local seismic histories are relatively poor, a new implementation of the code was provided, allowing information from different branches (historical studies, instrumental seismology and numerical simulations) to be included in the hazard assessment. In particular, macroseismic information related to the locally documented seismic history, which represents the bulk of the considered information, can be integrated with "virtual" intensities deduced from epicentral data (via earthquake-specific probabilistic attenuation relationships) and "simulated" intensities deduced, via physical/stochastic simulations, from data concerning seismogenic faults activated during past earthquakes. This allows a more complete reconstruction of the local seismic history and also reduces the uncertainty affecting macroseismic data for older earthquakes.
Results of some applications of the new release of the SASHA code will be described.

  15. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential to threaten life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was included in the World Heritage List by UNESCO in 1985 for its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been the subject of many previous studies. However, there are few studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years for bedrock. In this connection, the seismic hazard of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed using both deterministic and probabilistic approaches with local site conditions. A catalog of historical and instrumental earthquakes is prepared and used in this study. The seismic sources have been identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach

  16. Strong earthquakes knowledge base for calibrating fast damage assessment systems

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Kozlov, M.; Larionov, V.; Nikolaev, A.; Suchshev, S.; Ugarov, A.

    2003-04-01

    At present, systems for fast damage and loss assessment due to strong earthquakes may use as input data: (1) information about event parameters (magnitude, depth and coordinates) issued by Alert Seismological Surveys; (2) waveform data obtained by strong-motion seismograph networks; (3) high-resolution space images of the affected area obtained before and after the event. When data about the magnitude, depth and location of an event are used to simulate possible consequences, the reliability of the estimates depends on the completeness and reliability of the databases on elements at risk (population and built environment), the reliability of the vulnerability functions of elements at risk, and errors in the determination of strong-earthquake parameters by Alert Seismological Surveys. Some of these factors may be taken into account by calibrating the system against well-documented past strong earthquakes. The paper describes the structure and content of a knowledge base of well-documented strong events that occurred in the last century. It contains descriptions of more than 1000 events. The data are distributed almost homogeneously as far as earthquake losses are concerned; most events are in the magnitude range 6.5-7.9. Software was created to accumulate and analyze information about the source parameters and social consequences of these events. The knowledge base is used to calibrate the Fast Damage Assessment Tool, which is at present on duty within the framework of the EDRIM Program. It is also used as additional information by experts who analyze the results of computations.

  17. OpenQuake, a platform for collaborative seismic hazard and risk assessment

    NASA Astrophysics Data System (ADS)

    Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben

    2013-04-01

    Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide, from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. 
Other tools that are being developed of direct interest to the hazard community are: • OpenQuake Modeller; fundamental

  18. Implications of rupture complexity for hazard assessment and forecasting of local and regional tsunami

    NASA Astrophysics Data System (ADS)

    Müller, Christof; Power, William; Fraser, Stuart; Wang, Xiaoming; Ristau, John

    2014-05-01

    Traditionally, hazard assessment for tsunami does not take rupture complexity, i.e. the heterogeneity of the slip distribution across the earthquake rupture interface, into account. The authors have demonstrated that the potential extent of inundation will be significantly underestimated if rupture complexity is ignored. For local tsunami it has also been shown that for a target site a strict proportionality between earthquake moment magnitude and inundation extent does not exist. The main difficulty in including the effects of rupture complexity in Probabilistic Tsunami Hazard Assessment (PTHA) for local and regional tsunami lies in the fact that calculations of full inundation require solving non-linear wave equations. These calculations are so computationally expensive that simulating a statistically significant number of scenarios becomes impractical. The hazard assessment process thus requires a de-aggregation procedure that can rely on simulations based on the linear wave equations alone, to identify scenarios significant enough to be considered for full inundation modelling. We correlate properties of the offshore wave field derived from linear simulations with the extent of inundation derived from non-linear tsunami simulations, allowing us to reduce the non-linear calculations in our hazard assessment to a practical number. The effect of rupture complexity on the tsunami wave field is routinely considered in tsunami forecasting for distant and regional sources. Source models are inverted from DART buoy readings as soon as this information becomes available. However, depending on the location of the earthquake causing the tsunami, DART buoy information will not be provided immediately after the event, which poses a challenge to tsunami forecasting for local and regional sources. We propose a concept of tsunami forecasting for regional tsunami, which also provides probabilistic hazard assessment for the event in question. 
This approach considers rupture complexity

  19. Seismic hazard assessments for European nuclear power plants: a review based on the results of the ENSREG Stress Tests

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Hintersberger, Esther

    2015-04-01

    In the aftermath of the Fukushima Daiichi accident, ENSREG and the European Commission reviewed the seismic safety of all European nuclear plants on the basis of a comprehensive and transparent risk and safety assessment ('Stress Tests'). This process resulted in the publication of a large amount of data describing the approaches, methods and results previously used to assess seismic hazards for European NPPs (http://www.ensreg.eu/eu-stress-tests). A review of the published documents reveals considerable differences between the approaches to seismic hazard assessment. Most of the EU countries use probabilistic or a combination of probabilistic and deterministic approaches to estimate hazard. A second group of countries relies on deterministic assessments. Reports from countries adopting probabilistic hazard assessment methodologies reveal a spread of exceedance frequencies defining the design basis earthquake (DBE) between 10^-3 and 10^-5 per year, with a majority of countries referring to a frequency of 10^-4. Deterministic approaches use the maximum earthquake intensities to define the DBE, mostly adding 1° of intensity as a safety margin. In only very few cases 0.5° or even no safety margin was added to the strongest intensity. The hazard levels obtained from the two types of analyses are not comparable to each other, as no benchmark studies appear to exist to define the occurrence probabilities of DBE values established by deterministic methods. The Stress Tests documents do not allow for an in-depth check of the hazard assessments. Assessments for different countries/sites were performed between the 1970s and 2011. Although it is conceded that all assessments were performed according to the state of the art at the time of their performance, only a part of the hazard assessments can be justified as compliant with current scientific standards. Due to the time elapsed since their implementation several decades ago, some assessments do not take advantage of

  20. A Revised Evaluation of Tsunami Hazards along the Chinese Coast in View of the Tohoku-Oki Earthquake

    NASA Astrophysics Data System (ADS)

    Jing, Huimin Helen; Zhang, Huai; Yuen, David A.; Shi, Yaolin

    2013-01-01

    Japan's 2011 Tohoku-Oki earthquake and the accompanying tsunami have reminded us of the potential tsunami hazards from the Manila and Ryukyu trenches to the South China and East China Seas. Statistics of historical seismic records from nearly the last four decades have shown that major earthquakes do not necessarily agree with the local Gutenberg-Richter relationship. The probability of a mega-earthquake may be higher than we have previously estimated. Furthermore, we noted that the percentages of tsunami-associated earthquakes are much higher among major events, and the earthquakes with magnitudes equal to or greater than 8.8 have all triggered tsunamis in the past approximately 100 years. We will emphasize the importance of a thorough study of possible tsunami scenarios for hazard mitigation. We focus on several hypothetical earthquake-induced tsunamis caused by Mw 8.8 events along the Manila and Ryukyu trenches. We carried out numerical simulations based on the shallow-water equations (SWE) to predict the tsunami dynamics in the South China and East China Seas. By analyzing the computed results we found that the height of the potential surge in China's coastal area caused by earthquake-induced tsunamis may reach a couple of meters. Our preliminary results show that tsunamis generated in the Manila and Ryukyu trenches could pose a significant threat to Chinese coastal cities such as Shanghai, Hong Kong and Macao. However, we did not find the highest tsunami wave at Taiwan, partially because it lies right on the extension of an assumed fault line. Furthermore, we put forward a multi-scale model with higher resolution, which enabled us to investigate the edge waves diffracted around Taiwan Island with a closer view.
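    The Gutenberg-Richter relationship referred to above, log10 N = a - b·M, gives the expected annual rate of events at or above a magnitude M. A minimal sketch, with purely hypothetical a and b values rather than any regional fit from the paper:

```python
def gr_annual_rate(a, b, m):
    """Annual rate of earthquakes with magnitude >= m under the
    Gutenberg-Richter relation log10 N(>=m) = a - b*m."""
    return 10.0 ** (a - b * m)

# Hypothetical regional constants a = 5.0, b = 1.0:
rate = gr_annual_rate(5.0, 1.0, 8.8)   # ~1.6e-4 events per year
return_period = 1.0 / rate             # ~6300 years
```

    The paper's point is precisely that observed mega-earthquake occurrence can exceed what such an extrapolation from smaller events predicts.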

  1. Possible Biases in Characteristic Earthquake and Seismic Hazard Studies: Illustrations for the Wasatch, New Madrid, North Africa, and Eastern Canadian Seismic Zones

    NASA Astrophysics Data System (ADS)

    Swafford, L.; Stein, S.; Newman, A.; Friedrich, A.

    2005-12-01

    Attempts to study earthquake recurrence in space and time are limited in simple but frustrating ways by the short history of instrumental seismology compared to the long and variable recurrence time of large earthquakes. If major seismicity is spread uniformly within a seismic zone but the recurrence interval is long compared to the earthquake record, apparent differences in seismic hazard within the seismic zone inferred from the earthquake history are likely to simply reflect the short earthquake record. Simple numerical simulations suggest that this may be the case for the concentrated areas of high predicted hazard within seismic zones in North Africa, along the St. Lawrence valley, and on the eastern coast of Canada. If so, with more time and a longer earthquake record, large earthquakes would likely occur over the entire seismic zone and show that the real hazard is uniform. Similarly, large earthquakes are likely to appear "characteristic", more frequent than would be inferred from the rates of smaller ones, for two reasons. First, a short history is likely to overestimate the rate of large earthquakes because fractions of earthquakes cannot be observed. Second, because the rates of small earthquakes are typically determined from the seismological record whereas the rates of large earthquakes are inferred from paleoseismology, biases in estimating paleomagnitudes can produce apparent characteristic earthquakes, as appears to have occurred for New Madrid. A further complexity is illustrated by results for the Wasatch seismic zone, where some studies find characteristic earthquake behavior whereas others do not. The discrepancy arises primarily because some studies consider the entire Wasatch front area whereas others focus on the Wasatch fault, on which only some of the smaller earthquakes but all of the large paleoearthquakes occur. Similar situations may arise in other seismic zones containing a major fault and a number of smaller ones.
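    The short-record bias described above can be reproduced with a few lines of Monte Carlo simulation. This is our illustrative sketch, not the authors' simulation code, and every parameter value (segment count, recurrence interval, record length) is hypothetical.

```python
import random

def mean_quiet_fraction(n_segments=10, annual_rate=1.0 / 1000.0,
                        record_years=300, trials=1000, seed=42):
    """Fraction of segments showing *no* large earthquake in a short
    record, even though every segment has the same underlying rate.
    Each segment-year is an independent Bernoulli trial."""
    random.seed(seed)
    quiet = 0
    for _ in range(trials):
        for _ in range(n_segments):
            events = sum(1 for _ in range(record_years)
                         if random.random() < annual_rate)
            if events == 0:
                quiet += 1
    return quiet / (trials * n_segments)

# With a 1000-yr recurrence and a 300-yr record, roughly 74% of
# segments look aseismic although the hazard is perfectly uniform.
q = mean_quiet_fraction()
```

    The result can be checked analytically: the per-segment quiet probability is (1 - 0.001)^300 ≈ 0.74, so apparent hazard concentrations of this size need no real rate variation at all.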

  2. GRC Payload Hazard Assessment: Supporting the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Schoren, William R.; Zampino, Edward J.

    2004-01-01

    A hazard assessment was conducted on the GRC managed payloads in support of a NASA Headquarters Code Q request to examine STS-107 payloads and determine if they were credible contributors to the Columbia accident. This assessment utilized each payload's Final Flight Safety Data Package for hazard identification. An applicability assessment was performed and most of the hazards were eliminated because they dealt with payload operations or crew interactions. A Fault Tree was developed for all the hazards deemed applicable and the safety verification documentation was reviewed for these applicable hazards. At the completion of this hazard assessment, it was concluded that none of the GRC managed payloads were credible contributors to the Columbia accident.

  3. Seismic Hazard Assessment for Western Kentucky, Northeastern Kentucky and Southeastern Ohio

    SciTech Connect

    Cobb, James C; Wang, Zhenming; Woolery, Edward W; Kiefer, John D

    2002-07-01

    Earthquakes pose seismic hazards and risk to the Commonwealth of Kentucky, and these hazards and risk vary throughout the Commonwealth. The US Nuclear Regulatory Commission uses the seismic hazard maps developed by the US Geological Survey for seismic safety regulation of nuclear facilities. Under the current US Geological Survey seismic hazard assessment, it is economically unfeasible to build a new uranium plant near Paducah relative to the Portsmouth, Ohio site. This is not to say that the facility cannot be safely engineered to withstand the present seismic load, but it would be enormously expensive to do so. More than 20 years of observations and research at UK have shown that the US Geological Survey has overestimated seismic hazards in western Kentucky, particularly in the Jackson Purchase area that includes Paducah. Furthermore, our research indicates that seismic hazards are underestimated in northeastern Kentucky and southeastern Ohio. Such overestimation and underestimation could jeopardize possible site selection of PGDP for the new uranium plant. The existing database, research experience, and expertise in UK's Kentucky Geological Survey and Department of Geological Science put this institution in a unique position to conduct a comprehensive seismic hazard evaluation.

  4. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
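The population-exposure step that PAGER automates can be illustrated with a toy overlay of a shaking-intensity grid on a co-registered population grid. This is a hedged sketch, not the USGS implementation: the grid values, the MMI band edges, and the `population_exposure` helper are all illustrative assumptions.

```python
import numpy as np

def population_exposure(mmi_grid, pop_grid, bands=(4, 5, 6, 7, 8, 9)):
    """Return {MMI band: exposed population} for co-registered grids.

    Each band covers intensities [lo, lo + 1); the band edges here are
    an illustrative choice, not PAGER's actual binning.
    """
    exposure = {}
    for lo in bands:
        mask = (mmi_grid >= lo) & (mmi_grid < lo + 1)
        exposure[lo] = float(pop_grid[mask].sum())
    return exposure

# Toy 3x3 example: uniform population of 1000 per cell.
mmi = np.array([[4.2, 5.1, 6.7],
                [5.5, 7.3, 8.1],
                [4.9, 6.2, 5.8]])
pop = np.full((3, 3), 1000.0)
exp_by_band = population_exposure(mmi, pop)
```

In a real system the two grids come from a ShakeMap and a gridded population product and must share the same resolution and registration before the overlay.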

  5. Seismic hazard and risk assessment for large Romanian dams situated in the Moldavian Platform

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Popescu, Emilia; Otilia Placinta, Anica; Petruta Constantin, Angela; Toma Danila, Dragos; Borleanu, Felix; Emilian Toader, Victorin; Moldoveanu, Traian

    2016-04-01

    Besides periodical technical inspections and the monitoring and surveillance of dams' related structures and infrastructure, there are additional seismic-specific requirements for dam safety. The most important is seismic risk assessment, which can be accomplished by rating dams into seismic risk classes using the approach of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam site (values obtained using probabilistic hazard assessment approaches; Moldovan et al., 2008), the structures' vulnerability, and the downstream risk characteristics (human, economic, historic and cultural heritage, etc.) in the areas that might be flooded in the case of a dam failure. Probabilistic seismic hazard (PSH), vulnerability and risk studies will be carried out for dams situated in the Moldavian Platform, starting from the Izvorul Muntelui Dam, down the Bistrita and then the Siret River and their tributaries. The most vulnerable dams will be studied in detail, and flooding maps will be drawn to find the most exposed downstream localities, both for risk assessment studies and for warnings. GIS maps that clearly indicate potentially flooded areas are sufficient for these studies, giving information on the number of inhabitants and goods that may be affected; the topography included in geospatial servers is sufficient to produce them, and further studies are not necessary for downstream risk assessment. The results will consist of local and regional seismic information, dam-specific characteristics and locations, seismic hazard maps and risk classes for all dam sites (more than 30 dams), inundation maps (for the most vulnerable dams in the region) and the possibly affected localities. 
The final goal of the studies in this paper is to provide local emergency services with warnings of a potential dam failure and ensuing flood following a large earthquake, allowing further

  6. GNSS-monitoring of Natural Hazards: Ionospheric Detection of Earthquakes and Volcano Eruptions

    NASA Astrophysics Data System (ADS)

    Shults, K.; Astafyeva, E.; Lognonne, P. H.

    2015-12-01

    During the last few decades earthquakes have been reported by many researchers as sources of strong perturbations in the ionosphere, and in the last few years the seismo-ionosphere coupling has been more and more discussed (e.g., Calais and Minster, 1998, Phys. Earth Planet. Inter., 105, 167-181; Afraimovich et al., 2010, Earth, Planets, Space, V.62, No.11, 899-904; Rolland et al., 2011, Earth Planets Space, 63, 853-857). Co-volcanic ionospheric perturbations have come under scientific scrutiny only in recent years, but observations have already shown that the mass and energy injections of volcanic activity can also excite oscillations in the ionosphere (Heki, 2006, Geophys. Res. Lett., 33, L14303; Dautermann et al., 2009, Geophys. Res., 114, B02202). The ionospheric perturbations are induced by acoustic and gravity waves generated in the neutral atmosphere by a seismic source or a volcano eruption. The upward-propagating vibrations of the atmosphere interact with the plasma in the ionosphere through particle collisions and excite variations of electron density detectable with dual-frequency receivers of the Global Navigation Satellite System (GNSS). In addition to co-seismic ionospheric disturbance (CID) observations, ionospheric GNSS measurements have recently proved useful for obtaining ionospheric images of the seismic fault, providing information on its parameters and localization (Astafyeva et al., 2011, Geophys. Res. Letters, 38, L22104). This work describes how GNSS signals can be used for monitoring of natural hazards, using the examples of the 9 March 2011 M7.3 Tohoku foreshock and the April 2015 M7.8 Nepal earthquake, as well as the April 2015 Calbuco volcano eruptions. We also show that high-resolution GNSS data can help plot ionospheric images of the seismic fault.
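The dual-frequency measurement principle mentioned above rests on the dispersive nature of the ionosphere: a geometry-free difference of the two GNSS observables isolates a term proportional to the slant Total Electron Content (TEC). A minimal sketch assuming GPS L1/L2 pseudoranges and the standard 40.3 m³·s⁻² conversion factor; the `slant_tec` helper and the input values are illustrative, and real processing uses carrier phases with bias calibration.

```python
F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz
TECU = 1.0e16   # electrons/m^2 per TEC unit

def slant_tec(p1, p2):
    """Slant TEC (in TECU) from dual-frequency pseudoranges p1, p2 (m).

    The ionosphere delays the two carriers differently, so the
    geometry-free combination p2 - p1 isolates the dispersive term:
        STEC = (p2 - p1) * f1^2 * f2^2 / (40.3 * (f1^2 - f2^2))
    """
    return (p2 - p1) * F1**2 * F2**2 / (40.3 * (F1**2 - F2**2)) / TECU

# A 1 m differential delay corresponds to roughly 9.5 TECU on GPS L1/L2.
tec = slant_tec(20000000.0, 20000001.0)
```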

  7. Research program on Indonesian active faults to support the national earthquake hazard assessments

    NASA Astrophysics Data System (ADS)

    Natawidjaja, D. H.

    2012-12-01

    In mid 2010 an Indonesian team of earthquake scientists published the new Indonesian probabilistic seismic hazard analysis (PSHA) map, replacing the previous version published in 2002. One of the major challenges in developing the new map is that data for many active fault zones in Indonesia are sparse and mapped only at regional scale, so the input fault parameters for the PSHA carry unavoidably large uncertainties. Although most Indonesian islands are cut by active faults, only Sumatra has been mapped and studied in sufficient detail. In other areas, such as Java and Bali, the most populated regions, as well as in eastern Indonesia, where tectonic plate configurations are far more complex and relative plate motions are generally higher, many major active faults and plate boundaries are not well mapped and studied. In early 2011, we initiated a research program to study major active faults in Indonesia, together with a new graduate study program, GREAT (Graduate Research for Earthquake and Active Tectonics), hosted by ITB (Institute of Technology Bandung) and LIPI (Indonesian Institute of Sciences) in partnership with the Australia-Indonesia Facility for Disaster Reduction (AIFDR). The program includes acquisition of the high-resolution topography and imagery required for detailed fault mapping, measurement of geological slip rates, and identification of good sites for paleoseismological studies. It is coupled with seismological studies and GPS surveys to measure geodetic slip rates. To study submarine active faults, we collect and incorporate bathymetric and marine geophysical data. The research will be carried out, in part, through masters and Ph.D. student theses. In the first four years of the program we select several sites for active fault studies, particularly the ones that pose greater risks to society.

  8. Earthquakes for Kids

    MedlinePlus


  9. Geodynamics and seismic hazard in the Calabrian Arc: towards a Messina earthquake supersite

    NASA Astrophysics Data System (ADS)

    Chiarabba, Claudio; Dell'Acqua, Fabio; Faccenna, Claudio; Lanari, Riccardo; Matteuzzi, Francesco; Mattia, Mario; Neri, Giancarlo; Patané, Domenico; Polonia, Alina; Prati, Claudio; Tinti, Stefano; Zerbini, Susanna; Ozener, Haluk

    2015-04-01

    The Messina region represents a key site of the Mediterranean, where active faulting, seismic shaking, volcanism, rapid uplift and landslides represent the surface manifestation of deep processes. Fast deformation results in one of the highest levels of seismic hazard in the Mediterranean, as testified by historic destructive earthquakes occasionally accompanied by submarine mass flows and tsunamis, events that added death and destruction to the already devastating effects of the earthquakes. Several geophysical and geological studies carried out during the last decades have helped define the kinematics and dynamics of the system. The tectonic evolution of the Messina region is strictly linked with the Southern Tyrrhenian and Calabrian Arc system, the retreat of the Ionian slab and the back-arc basin opening. The present-day geometry of the Calabrian slab, as imaged by tomographic analyses and shallow-to-deep seismicity, shows a narrow slab plunging steeply into the mantle. At 100-150 km depth, the southern edge of the slab is positioned beneath Northeastern Sicily, approximately between Tindari and Messina. Within this frame, several relevant questions remain unsolved. For example, it is not clear how the upper plate may deform in response to differential sinking of the subducting slabs, or how deep mantle flow at the slab edge may influence the pattern of surface deformation. Structural and geodetic data show the first-order pattern of deformation in Northeastern Sicily, and define the Tindari-Messina area as the boundary between a region in compression to the west, dominated by the Africa convergence, and a region in extension to the east-northeast, dominated by slab rollback. In addition, geodetic studies also show an increase of crustal motion velocity from Sicily to Calabria with an overall clockwise rotation of the velocity vector. This pattern of surface deformation evidences a sharp extension process active in the Messina region. The elevation of

  10. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews of the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with fieldwork for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multichannel Analysis of Surface Waves (MASW) analysis at
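The HVSR technique used at the 60 sites can be sketched in a few lines: take amplitude spectra of the three components, combine the two horizontals, and divide by the vertical. This is a minimal single-window sketch using synthetic signals; production HVSR processing adds detrending, spectral smoothing (e.g., Konno-Ohmachi), and averaging over many windows, and the quadratic-mean combination of horizontals below is one of several common choices.

```python
import numpy as np

def hvsr(north, east, vert, dt):
    """Return (frequencies, H/V ratio) for one three-component window."""
    n = len(vert)
    freqs = np.fft.rfftfreq(n, d=dt)
    win = np.hanning(n)  # taper to reduce spectral leakage
    amp = lambda x: np.abs(np.fft.rfft(x * win))
    h = np.sqrt((amp(north) ** 2 + amp(east) ** 2) / 2.0)  # quadratic mean
    v = np.maximum(amp(vert), 1e-12)  # guard against zero vertical bins
    return freqs[1:], h[1:] / v[1:]   # drop the zero-frequency bin

# Synthetic window: site "resonates" at 2 Hz on the horizontals, while the
# vertical carries only 20% of that 2 Hz energy plus an 8 Hz component.
t = np.arange(0.0, 40.0, 0.01)
n_comp = np.sin(2 * np.pi * 2.0 * t)
e_comp = np.sin(2 * np.pi * 2.0 * t)
v_comp = 0.2 * np.sin(2 * np.pi * 2.0 * t) + np.sin(2 * np.pi * 8.0 * t)
f, ratio = hvsr(n_comp, e_comp, v_comp, 0.01)
r_2hz = ratio[np.argmin(np.abs(f - 2.0))]  # H/V peak near 1/0.2 = 5
```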

  11. Tsunami hazard assessment and monitoring for the Black Sea area

    NASA Astrophysics Data System (ADS)

    Partheniu, Raluca; Ionescu, Constantin; Constantin, Angela; Moldovan, Iren; Diaconescu, Mihail; Marmureanu, Alexandru; Radulian, Mircea; Toader, Victorin

    2016-04-01

    NIEP has recently improved its research on tsunamis in the Black Sea. As part of the routine earthquake and tsunami monitoring activity, the first tsunami early-warning system in the Black Sea was implemented in 2013 and has been active since. To monitor the seismic activity of the Black Sea, NIEP uses a total of 114 real-time stations and 2 seismic arrays, 18 of the stations being located in the Dobrogea area, near the Romanian Black Sea shoreline. Moreover, there is a data exchange with the surrounding Black Sea countries involving the acquisition of real-time data from 17 stations in Bulgaria, Turkey, Georgia and Ukraine. This improves the capability of the Romanian Seismic Network to monitor and more accurately locate earthquakes occurring in the Black Sea area. For tsunami monitoring and warning, six sea-level monitoring stations, one infrasound barometer, three offshore marine buoys and seven GPS/GNSS stations are installed at different locations along and near the Romanian shoreline. In the framework of the ASTARTE project, several objectives regarding seismic hazard and tsunami wave-height assessment for the Black Sea were accomplished. The seismic hazard estimation was based on statistical studies of the seismic sources and their characteristics, compiled using different seismic catalogues. Two probabilistic methods were used for the evaluation of the seismic hazard: the Cornell method, based on the Gutenberg-Richter distribution parameters, and the Gumbel method, based on extreme-value statistics. The results give maximum possible magnitudes and their recurrence periods for each seismic source. Using the Tsunami Analysis Tool (TAT) software, a set of tsunami modelling scenarios has been generated for the Shabla area, the seismic source that could most affect the Romanian shore. 
These simulations are structured in a database, in order to set maximum possible tsunami waves that could be
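The Cornell-method ingredients named above, Gutenberg-Richter parameters feeding a Poisson occurrence model, reduce to a few lines of arithmetic. A hedged sketch with assumed a- and b-values, not the catalogue fits of this study:

```python
import math

def gr_annual_rate(a, b, m):
    """Annual rate of events with magnitude >= m: log10 N = a - b*m."""
    return 10.0 ** (a - b * m)

def poisson_exceedance(rate, years):
    """P(at least one event in `years`) under a Poisson occurrence model."""
    return 1.0 - math.exp(-rate * years)

a_val, b_val = 3.2, 0.9                       # assumed catalogue fit
rate_m6 = gr_annual_rate(a_val, b_val, 6.0)   # events/yr with M >= 6
return_period = 1.0 / rate_m6                 # mean recurrence, years
p50 = poisson_exceedance(rate_m6, 50.0)       # exceedance prob. in 50 yr
```

With these assumed values the M ≥ 6 rate is about 0.0063 events per year, i.e. a recurrence period of roughly 160 years and about a 27% chance of at least one such event in 50 years.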

  12. Comparison of the historical record of earthquake hazard with seismic-hazard models for New Zealand and the continental United States

    USGS Publications Warehouse

    Stirling, M.; Petersen, M.

    2006-01-01

    We compare the historical record of earthquake hazard experienced at 78 towns and cities (sites) distributed across New Zealand and the continental United States with the hazard estimated from the national probabilistic seismic-hazard (PSH) models for the two countries. The two PSH models are constructed with similar methodologies and data. Our comparisons show a tendency for the PSH models to slightly exceed the historical hazard in New Zealand and westernmost continental United States interplate regions, but show lower hazard than that of the historical record in the continental United States intraplate region. Factors such as non-Poissonian behavior, parameterization of active fault data in the PSH calculations, and uncertainties in estimation of ground-motion levels from historical felt intensity data for the interplate regions may have led to the higher-than-historical levels of hazard at the interplate sites. In contrast, the less-than-historical hazard for the remaining continental United States (intraplate) sites may be largely due to site conditions not having been considered at the intraplate sites, and uncertainties in correlating ground-motion levels to historical felt intensities. The study also highlights the importance of evaluating PSH models at more than one region, because the conclusions reached on the basis of a solely interplate or intraplate study would be very different.

  13. United States Regional GIC Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Gannon, J. L.; Trichtchenko, L.; Fernberg, P.

    2012-12-01

    Geomagnetically-Induced Currents (GICs) are driven by impulsive geomagnetic disturbances created by the interaction between the Earth's magnetosphere and sharp velocity, density, and magnetic field enhancements in the solar wind. These disturbances result in ground-level, time-varying magnetic fields, which, when large, can create induced currents that may interfere with electric transmission systems. We present a first step towards a regional GIC hazard assessment map by considering the spatial distribution and magnitude of this phenomenon for specific physiographic regions. The analysis is based on 1D models of Earth conductivity that were compiled and interpreted from past conductivity measurements, and on specified levels of the rate of change of the magnetic field with time (dB/dt), representing different levels of geomagnetic storm conditions. In addition, we compare the estimated electric field values with derived indices often used as proxies for geomagnetic disturbance, including the real-time USGS-K index and local magnetic disturbance values calculated from magnetic field data at US Geological Survey magnetic observatories.
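The link between dB/dt and the induced geoelectric field can be illustrated for the simplest possible Earth model, a uniform half-space under a plane-wave source, whereas the study itself uses layered 1D conductivity models per physiographic region. The resistivity, storm period, and amplitude below are illustrative assumptions.

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, H/m

def surface_e_field(b_amplitude_t, period_s, resistivity_ohm_m):
    """Peak horizontal E-field (V/m) at the surface of a uniform half-space.

    Uses the plane-wave surface-impedance relation for a sinusoidal
    geomagnetic variation: |E| = |B| * sqrt(omega * rho / mu0).
    """
    omega = 2.0 * math.pi / period_s
    return b_amplitude_t * math.sqrt(omega * resistivity_ohm_m / MU0)

# Example: a 500 nT variation with a 10-minute period over 1000 ohm-m crust.
e_vpm = surface_e_field(500e-9, 600.0, 1000.0)
e_vpkm = e_vpm * 1000.0  # V/km, the unit usually quoted for GIC hazard
```

This yields a field on the order of 1-2 V/km, the right magnitude for a strong storm over resistive crust; layered models change the impedance but not the structure of the calculation.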

  14. Volcanic-hazards assessments; past, present, and future

    USGS Publications Warehouse

    Crandell, D.R.

    1991-01-01

    Worldwide interest in volcanic-hazards assessments was greatly stimulated by the 1980 eruption of Mount St. Helens, just 2 years after a hazards assessment of the volcano was published in U.S. Geological Survey Bulletin 1383-C. Many aspects of the eruption, including the climactic eruption on May 18, had been anticipated, although the extent of the unprecedented and devastating lateral blast was not. 

  15. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability to anticipate ground shaking from future strong earthquakes before being used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in identifying the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how observations of past earthquakes can help assess the performance of the different methods. Neo-deterministic refers to a scenario-based approach that allows consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full-waveform modeling. The method does not make use of empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition

  16. Seismic Hazard and Risk Assessments for Beijing-Tianjin-Tangshan, China, Area

    USGS Publications Warehouse

    Xie, F.; Wang, Z.; Liu, J.

    2011-01-01

    Seismic hazard and risk in the Beijing-Tianjin-Tangshan, China, area were estimated from 500-year intensity observations. First, we digitized the intensity observations (maps) using ArcGIS with a cell size of 0.1° × 0.1°. Second, we performed a statistical analysis on the digitized intensity data, determined an average b value (0.39), and derived the intensity-frequency relationship (hazard curve) for each cell. Finally, based on a Poisson model for earthquake occurrence, we calculated seismic risk in terms of a probability of I ≥ 7, 8, or 9 in 50 years. We also calculated the corresponding 10 percent probability of exceedance of these intensities in 50 years. The advantages of assessing seismic hazard and risk from intensity records are that (1) fewer assumptions (i.e., earthquake source and ground motion attenuation) are made, and (2) site effect is included. Our study shows that the area has high seismic hazard and risk. Our study also suggests that current design peak ground acceleration or intensity for the area may not be adequate. © 2010 Birkhäuser / Springer Basel AG.
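The two probability statements in this abstract follow from the Poisson occurrence model in a few lines: from a per-cell annual exceedance rate (read off the intensity-frequency curve) to the chance of I ≥ 7 in 50 years, and the inverse step from a "10% in 50 years" criterion back to the corresponding return period. The annual rate below is an assumed illustrative value, not one of the digitized rates.

```python
import math

def prob_in_window(annual_rate, years):
    """Poisson probability of at least one exceedance in `years`."""
    return 1.0 - math.exp(-annual_rate * years)

def rate_for_probability(p, years):
    """Annual rate whose Poisson exceedance probability over `years` is p."""
    return -math.log(1.0 - p) / years

p_i7 = prob_in_window(0.004, 50.0)         # assumed annual rate of I >= 7
rate_10_50 = rate_for_probability(0.10, 50.0)
return_period = 1.0 / rate_10_50           # the familiar ~475-year level
```

The second helper recovers the well-known result that a 10%-in-50-years criterion corresponds to a return period of about 475 years.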

  17. New Directions in Seismic Hazard Assessment through Focused Earth Observation in the MARmara SuperSITE

    NASA Astrophysics Data System (ADS)

    Meral Ozel, Nurcan; Necmioglu, Ocal; Favali, Paolo; Douglas, John; Mathieu, Pierre Philippe; Geli, Louis; Tan, Onur; Ergintav, Semih; Oguz Ozel, A.; Gurbuz, Cemil; Erdik, Mustafa

    2013-04-01

    Among the regions around the Mediterranean Sea for which earthquakes represent a major threat to social and economic development, the area around the Marmara Sea, one of the most densely populated parts of Europe, is subject to a high level of seismic hazard. For this region the MARSITE project is proposed with the aim of assessing the state of the art of seismic risk evaluation and management at the European level. This will be the starting point for a step forward towards new concepts of risk mitigation and management through long-term monitoring activities carried out both on land and at sea. MARSITE will serve as the platform for an integrated, multidisciplinary, holistic and articulated framework for fault zone monitoring, capable of developing the next generation of observatories to study earthquake generation processes. The main progress will be the fusion of ground- and space-based monitoring systems dedicated to geo-hazard monitoring. All data (space/sea-bottom/seismology/borehole/geochemistry) will flow to KOERI and be hosted on and served via a secure server. The MARSITE project aims to coordinate research groups with different scientific skills (from seismology to engineering to gas geochemistry) in a comprehensive monitoring activity developed both in the Marmara Sea and in the surrounding urban and rural areas. The project collects multidisciplinary data, to be shared, interpreted and merged into consistent theoretical and practical models suitable for the implementation of good practices, moving the necessary information to the end users in charge of seismic risk management of the Istanbul-Marmara Sea area. 
MARSITE is divided into eleven work packages that consider the processes involved in earthquake generation and the physics of short-term seismic transients, 4D deformations to understand earthquake cycle processes, fluid activity monitoring and seismicity under the sea floor using existing autonomous instrumentation, early warning

  18. Feasibility of anomaly occurrence in aerosols time series obtained from MODIS satellite images during hazardous earthquakes

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi; Jahani Chehrebargh, Fatemeh

    2016-09-01

    Earthquakes are among the most devastating natural disasters, and comprehensive earthquake prediction has not yet been achieved. Remote sensing data can be used to access information closely related to an earthquake. Unusual variations of lithospheric, atmospheric and ionospheric parameters before main earthquakes are considered earthquake precursors, and different precursors have been proposed to date. This paper examines one parameter that can be derived from satellite imagery: Aerosol Optical Depth (AOD), whose relationship with earthquakes is reviewed here. The aerosol parameter can be obtained through various means, such as AERONET ground stations or satellite images processed with algorithms such as DDV (Dark Dense Vegetation), the Deep Blue algorithm and SYNTAM (SYNergy of Terra and Aqua Modis). In this paper, by analyzing AOD time series (derived from the MODIS sensor on the TERRA platform) for 16 major earthquakes, seismic anomalies were observed before and after the earthquakes. Before large earthquakes, AOD increases due to pre-seismic changes that release gaseous molecules. After an earthquake, aftershocks produce further significant changes in AOD due to gaseous molecules and dust. These behaviors suggest a close relationship between earthquakes and unusual AOD variations, so unusual AOD variations around the time of an earthquake can be introduced as an earthquake precursor.
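A common way to flag the "unusual variations" described above is a mean ± k·σ band test on the time series. This is a generic sketch with synthetic numbers and an assumed threshold k, not the paper's exact detection rule, and real precursor studies typically build the band from quiet-time data only.

```python
import numpy as np

def flag_anomalies(series, k=2.0):
    """Return a boolean mask of samples outside mean +/- k*std."""
    x = np.asarray(series, dtype=float)
    mu, sigma = x.mean(), x.std()
    return np.abs(x - mu) > k * sigma

# Synthetic AOD series: quiet background with one sharp pre-seismic spike.
aod = np.array([0.21, 0.19, 0.22, 0.20, 0.18,
                0.21, 0.95, 0.22, 0.20, 0.19])
mask = flag_anomalies(aod, k=2.0)
anomalous_days = np.flatnonzero(mask)  # indices of flagged samples
```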

  19. Development of Tools for the Rapid Assessment of Landslide Potential in Areas Exposed to Intense Storms, Earthquakes, and Other Triggering Mechanisms

    NASA Astrophysics Data System (ADS)

    Highland, Lynn

    2014-05-01

    Landslides frequently occur in connection with other types of hazardous phenomena such as earthquakes, volcanic activity, and intense rainstorms. Strong shaking, for example, often triggers extensive landslides in mountainous areas, which can then complicate response and compound socio-economic impacts beyond the shaking losses alone. The U.S. Geological Survey (USGS) is exploring different ways to add secondary hazards to its Prompt Assessment of Global Earthquakes for Response (PAGER) system, which has been developed to deliver rapid earthquake impact and loss assessments following significant global earthquakes. The PAGER team found that about 22 percent of fatal earthquakes have deaths due to secondary causes; the share of economic losses these causes incur has not been widely studied but is probably significant. The current approach for rapid assessment and reporting of the potential and distribution of secondary earthquake-induced landslides involves empirical models that consider ground acceleration, slope, and rock strength. A complementary situational awareness tool being developed is a region-specific landslide database for the U.S. The latter will be able to define, in narrative form, the landslide types (debris flows, rock avalanches, shallow versus deep) that generally occur in each area, along with the types of soils, geology and meteorological effects that could have a bearing on soil saturation, and thus susceptibility. When a seismic event occurs in the U.S. and the PAGER system generates web-based earthquake information, these landslide narratives will simultaneously be made available, which will help in the assessment of the nature of landslides in that particular region. This landslide profile database could also be applied to landslide events that are not triggered by earthquake shaking, in conjunction with National Weather Service alerts and other landslide/debris-flow alerting systems. 
Currently, prototypes are being developed for both

  20. Thermal anomaly before earthquake and damage assessment using remote sensing data for 2014 Yutian earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Yanmei; Huang, Haiying; Jiang, Zaisen; Fang, Ying; Cheng, Xiao

    2014-12-01

    Thermal anomaly appears to be a significant precursor of some strong earthquakes. In this study, time series of MODIS Land Surface Temperature (LST) products from 2001 to 2014 are processed and analyzed to locate possible anomalies prior to the Yutian earthquake (12 February 2014, Xinjiang, China). In order to reduce seasonal and annual effects in the LST variations, and to avoid the rainy and cloudy weather in this area, a background mean of ten-day nighttime LST was derived from averaged MOD11A2 products for 2001 to 2012. The ten-day LST data from January to February 2014 were then differenced against this background. An abnormal LST increase before the earthquake is quite obvious in the differential images, indicating that this method is useful in such an area of high mountains and wide deserts. In addition, to assess the damage to infrastructure, data from GF-1, China's latest civilian high-resolution remote sensing satellite, were applied to the affected counties in this area. Damaged infrastructure and ground surfaces could be easily interpreted in the fused panchromatic and multi-spectral images, which integrate both texture and spectral information.
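The background-differencing step described above can be sketched as: average the same ten-day nighttime composite across the reference years, then subtract that climatological background from the target-year composite so seasonal structure cancels and only the anomaly remains. The arrays here are tiny illustrative grids, not real MOD11A2 tiles.

```python
import numpy as np

def lst_anomaly(target, reference_years):
    """Difference a target LST grid against a multi-year background mean.

    target: 2D LST grid for the period of interest (e.g., one ten-day
    nighttime composite); reference_years: list of 2D grids for the
    same calendar period in earlier years.
    """
    background = np.mean(np.stack(reference_years), axis=0)
    return target - background

# Three reference years at 270, 271, 272 K -> background of 271 K per cell.
ref = [np.full((2, 2), 270.0 + y) for y in range(3)]
obs = np.array([[271.0, 271.0],
                [276.0, 271.0]])  # one anomalously warm cell
diff = lst_anomaly(obs, ref)      # zero everywhere except the warm cell
```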

  1. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J., II; Bryant, W.A.

    2008-01-01

    This report describes the development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for scientists producing either seismic hazard or deformation models to better understand the current seismic hazards in California. The parameters include descriptions of the geometry and rates of movement of faults throughout the state. They are intended to provide a starting point for the development of more sophisticated deformation models that combine known rates of fault movement with geodetic measurements of crustal movement and the rates of movement of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated among the USGS, CGS and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic access (e.g., for on-the-fly access). A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database and any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards) are part of the USGS Quaternary Fault and Fold Database http://earthquake.usgs.gov/regional/q