Science.gov

Sample records for assessing earthquake hazards

  1. Earthquake hazard assessment after Mexico (1985).

    PubMed

    Degg, M R

    1989-09-01

    The 1985 Mexican earthquake ranks foremost amongst the major earthquake disasters of the twentieth century. One of the few positive aspects of the disaster is that it provided massive quantities of data that would otherwise have been unobtainable. Every opportunity should be taken to incorporate the findings from these data in earthquake hazard assessments. The purpose of this paper is to provide a succinct summary of some of the more important lessons from Mexico. It stems from detailed field investigations, and subsequent analyses, conducted by the author on behalf of reinsurance companies.

  2. Probabilistic earthquake hazard assessment for Peninsular India

    NASA Astrophysics Data System (ADS)

    Ashish; Lindholm, C.; Parvez, I. A.; Kühn, D.

    2016-04-01

    In this paper, a new probabilistic seismic hazard assessment (PSHA) is presented for Peninsular India. The PSHA has been performed using three different recurrence models: a classical seismic zonation model, a fault model, and a grid model. The grid model is based on a non-parameterized recurrence relation using an adaptation of the kernel-based method that has not been applied to this region before. The results obtained from the three models have been combined in a logic-tree structure in order to investigate the impact of different model weights. Three suitable attenuation relations have been considered in terms of spectral acceleration for the stable continental crust as well as for the active crust within the Gujarat region. While Peninsular India has experienced large earthquakes, e.g., Latur and Jabalpur, it represents in general a stable continental region with little earthquake activity, as also confirmed by our hazard results. On the other hand, our study demonstrates that both the Gujarat and the Koyna regions are exposed to a high seismic hazard. The peak ground acceleration for 10% probability of exceedance in 50 years is 0.4 g in Koyna and up to 0.3 g in the Kutch region of Gujarat. With respect to spectral acceleration at 1 Hz, estimated ground-motion amplitudes are higher in Gujarat than in the Koyna region due to the higher frequency of occurrence of larger earthquakes. We discuss the higher PGA levels for Koyna compared to Gujarat and do not accept them uncritically.
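
    As an illustration of the logic-tree combination described above, the following minimal sketch (with assumed branch weights and toy hazard curves, not the authors' actual models or values) shows how exceedance-rate curves from three recurrence models could be weighted and converted to 50-year exceedance probabilities:

    ```python
    import numpy as np

    # Illustrative only: annual rates of exceeding each PGA level for three
    # recurrence models standing in for the zonation, fault, and kernel-grid branches.
    pga = np.array([0.05, 0.1, 0.2, 0.3, 0.4])                 # g
    rate_zonation = np.array([2e-2, 8e-3, 2e-3, 6e-4, 2e-4])
    rate_fault    = np.array([3e-2, 1e-2, 3e-3, 1e-3, 4e-4])
    rate_grid     = np.array([1e-2, 5e-3, 1e-3, 3e-4, 1e-4])

    weights = {"zonation": 0.4, "fault": 0.3, "grid": 0.3}     # assumed branch weights

    # Weighted mean exceedance rate over the logic-tree branches.
    combined = (weights["zonation"] * rate_zonation
                + weights["fault"] * rate_fault
                + weights["grid"] * rate_grid)

    # Convert annual rates to probability of exceedance in 50 years (Poisson assumption).
    p50 = 1.0 - np.exp(-combined * 50.0)
    for g, p in zip(pga, p50):
        print(f"PGA {g:.2f} g: {100 * p:.1f}% chance of exceedance in 50 years")
    ```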

  3. Probabilistic earthquake hazard assessment for Peninsular India

    NASA Astrophysics Data System (ADS)

    Ashish; Lindholm, C.; Parvez, I. A.; Kühn, D.

    2016-01-01

    In this paper, a new probabilistic seismic hazard assessment (PSHA) is presented for Peninsular India. The PSHA has been performed using three different recurrence models: a classical seismic zonation model, a fault model, and a grid model. The grid model is based on a non-parameterized recurrence relation using an adaptation of the kernel-based method that has not been applied to this region before. The results obtained from the three models have been combined in a logic-tree structure in order to investigate the impact of different model weights. Three suitable attenuation relations have been considered in terms of spectral acceleration for the stable continental crust as well as for the active crust within the Gujarat region. While Peninsular India has experienced large earthquakes, e.g., Latur and Jabalpur, it represents in general a stable continental region with little earthquake activity, as also confirmed by our hazard results. On the other hand, our study demonstrates that both the Gujarat and the Koyna regions are exposed to a high seismic hazard. The peak ground acceleration for 10% probability of exceedance in 50 years is 0.4 g in Koyna and up to 0.3 g in the Kutch region of Gujarat. With respect to spectral acceleration at 1 Hz, estimated ground-motion amplitudes are higher in Gujarat than in the Koyna region due to the higher frequency of occurrence of larger earthquakes. We discuss the higher PGA levels for Koyna compared to Gujarat and do not accept them uncritically.

  4. Assessing the earthquake hazards in urban areas

    USGS Publications Warehouse

    Hays, W.W.; Gori, P.L.; Kockelman, W.J.

    1988-01-01

    Major urban areas in widely scattered geographic locations across the United States are at varying degrees of risk from earthquakes. These urban areas include Charleston, South Carolina; Memphis, Tennessee; St. Louis, Missouri; Salt Lake City, Utah; Seattle-Tacoma, Washington; Portland, Oregon; and Anchorage, Alaska; even Boston, Massachusetts, and Buffalo, New York, have a history of large earthquakes. Cooperative research during the past decade has focused on assessing the nature and degree of the risk, or seismic hazard, in the broad geographic regions around each urban area. The strategy since the 1970s has been to bring together local, State, and Federal resources to solve the problem of assessing seismic risk. Successful cooperative programs have been launched in the San Francisco Bay and Los Angeles regions in California and the Wasatch Front region in Utah.

  5. Earthquake Hazard Assessment: Basics of Evaluation

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including the great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or centuries in just a few rare cases). Evidently, all this complicates reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic within the most popular objectivist viewpoint on probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G. M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared with those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies us with a realistic estimate of confidence in SHA results and related recommendations on the level of risks for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified briefly with a few examples, analyses of which are given in more detail in a poster of

  6. Earthquake Hazard Assessment: an Independent Review

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G. M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done before claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared with those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risks for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
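
    A minimal sketch of the error-diagram bookkeeping referred to above, using hypothetical alarm declarations and target earthquakes rather than any data from the study: the alerted space-time fraction tau and the miss rate nu are computed and compared with the random-guess diagonal nu + tau = 1.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical space-time cells: True where an alarm ("expect a target event") is declared.
    n_cells = 10_000
    alarm = rng.random(n_cells) < 0.15            # ~15% of space-time alerted (at random here)

    # Hypothetical target earthquakes assigned to cells (indices are illustrative).
    quake_cells = rng.integers(0, n_cells, size=40)

    tau = alarm.mean()                            # alerted fraction of space-time
    nu = np.mean(~alarm[quake_cells])             # rate of failures-to-predict (misses)

    print(f"tau = {tau:.3f}, nu = {nu:.3f}, nu + tau = {nu + tau:.3f}")
    # Because the alarms above are random, the point (tau, nu) falls near the
    # random-guess diagonal nu + tau = 1; a useful SHA/forecast method should
    # plot significantly below that line on the error diagram.
    ```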

  7. Spatial earthquake hazard assessment of Evansville, Indiana

    USGS Publications Warehouse

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.

    1997-01-01

    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding the groundwater elevations made in previous studies are also considered to have had a significant effect on the results.
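
    Liquefaction-triggering screening of the kind performed by such GIS-based systems is commonly based on the Seed-Idriss simplified procedure; the sketch below uses that generic procedure with assumed soil and shaking parameters, not values or methods from the Evansville study itself:

    ```python
    # Simplified Seed-Idriss cyclic stress ratio (CSR) and factor of safety against
    # liquefaction. All numerical inputs are assumed for illustration only.

    def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d):
        """CSR = 0.65 * (a_max / g) * (sigma_v / sigma_v') * r_d."""
        return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

    a_max_g = 0.25        # peak ground acceleration as a fraction of g (assumed)
    sigma_v = 95.0        # total vertical stress at the depth of interest, kPa (assumed)
    sigma_v_eff = 60.0    # effective vertical stress, kPa (assumed)
    r_d = 0.95            # stress-reduction coefficient for shallow depth (assumed)
    crr = 0.18            # cyclic resistance ratio from an SPT correlation (assumed)

    csr = cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d)
    fs = crr / csr
    verdict = "liquefaction likely" if fs < 1.0 else "liquefaction unlikely"
    print(f"CSR = {csr:.3f}, factor of safety = {fs:.2f} -> {verdict}")
    ```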

  8. USGS Training in Afghanistan: Modern Earthquake Hazards Assessments

    NASA Astrophysics Data System (ADS)

    Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.

    2007-05-01

    Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking, but also from liquefaction and extensive landsliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards, and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth Science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006 in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and be actively involved in earthquake hazard assessments themselves. The December 2006 training course was taught by four lecturers, with all lectures and slides presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments," and the second entitled "Applied Geophysics for Mineral Resource Assessments."

  9. Modern Earthquake Hazard Assessments in Afghanistan: A USGS Training Course

    NASA Astrophysics Data System (ADS)

    Garthwaite, M.; Mooney, W. D.; Medlin, J.; Holzer, T.; McGarr, A.; Bohannon, R.

    2007-12-01

    Afghanistan is located in a tectonically active region at the western extent of the Indo-Asian collision zone, where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can cause damage, not only from strong ground shaking and surface rupture, but also from liquefaction and extensive landsliding. The M=6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghan communities to such hazards, and resulted in at least 1000 fatalities. This training course in modern earthquake hazard assessments is an integral part of the international effort to provide technical assistance to Afghanistan using an "end-to-end" approach. This approach involves providing assistance in all stages of hazard assessment, from identifying earthquakes, to disseminating information on mitigation strategies to the public. The purpose of this training course, held December 2-6, 2006 at the Afghan Geological Survey in Kabul, was to provide a solid background in the relevant seismological and geological methods for preparing for future earthquakes. With this information, participants may now be expected to educate other members of the Afghan community. In addition, they are better prepared to conduct earthquake hazard assessments and to build the capabilities of the Afghan Geological Survey. The training course was taught using a series of Power Point lectures, with all lectures being presented in English and translated into Dari, one of the two main languages of Afghanistan. The majority of lecture slides were also annotated in both English and Dari. Lectures were provided to the students in both hardcopy and digital formats. As part of the on-going USGS participation in the program, additional training sessions are planned in the subjects of field geology, modern concepts in Earth science, mineral resource assessments and applied geophysics.

  10. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviours that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even within a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable or computer simulations and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviours to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can lead to scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses the gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention. The self-evident shortcomings and failures of GSHAP call on all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the underlying methodologies, such that there becomes: (a) a

  11. Earthquake Hazard and Risk Assessment Based on Unified Scaling Law for Earthquakes: State of Gujarat, India

    NASA Astrophysics Data System (ADS)

    Parvez, Imtiyaz A.; Nekrasova, Anastasia; Kossobokov, Vladimir

    2017-03-01

    The Gujarat state of India is one of the most seismically active intercontinental regions of the world. Historically, it has experienced many damaging earthquakes, including the devastating 1819 Rann of Kachchh and 2001 Bhuj earthquakes. The effect of the latter is grossly underestimated by the Global Seismic Hazard Assessment Program (GSHAP). To assess a more adequate earthquake hazard for the state of Gujarat, we apply the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter recurrence relation by taking into account the naturally fractal distribution of earthquake loci. USLE has evident implications, since any estimate of seismic hazard depends on the size of the territory considered and, therefore, may differ dramatically from the actual one when scaled down to the proportion of the area of interest (e.g. of a city) from the enveloping area of investigation. We cross-compare the seismic hazard maps compiled for the same standard regular grid 0.2° × 0.2° (1) in terms of design ground acceleration based on the neo-deterministic approach, (2) in terms of probabilistic exceedance of peak ground acceleration by GSHAP, and (3) the one resulting from the USLE application. Finally, we present the maps of seismic risks for the state of Gujarat, integrating the obtained seismic hazard, population density based on India's Census 2011 data, and a few model assumptions of vulnerability.

  12. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: State of Gujarat, India

    NASA Astrophysics Data System (ADS)

    Nekrasova, Anastasia; Kossobokov, Vladimir; Parvez, Imtiyaz

    2016-04-01

    The Gujarat state of India is one of the most seismically active intercontinental regions of the world. Historically, it has experienced many damaging earthquakes, including the devastating 1819 Rann of Kutch and 2001 Bhuj earthquakes. The effect of the latter is grossly underestimated by the Global Seismic Hazard Assessment Program (GSHAP). To assess a more adequate earthquake hazard for the state of Gujarat, we apply the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter recurrence relation by taking into account the naturally fractal distribution of earthquake loci. USLE has evident implications, since any estimate of seismic hazard depends on the size of the territory considered and, therefore, may differ dramatically from the actual one when scaled down to the proportion of the area of interest (e.g. of a city) from the enveloping area of investigation. We cross-compare the seismic hazard maps compiled for the same standard regular grid 0.2°×0.2° (i) in terms of design ground acceleration (DGA) based on the neo-deterministic approach, (ii) in terms of probabilistic exceedance of peak ground acceleration (PGA) by GSHAP, and (iii) the one resulting from the USLE application. Finally, we present the maps of seismic risks for the state of Gujarat, integrating the obtained seismic hazard, population density based on 2011 census data, and a few model assumptions of vulnerability.

  13. Harmonized Probabilistic Seismic Hazard Assessment in Europe: Earthquake Geology Applied

    NASA Astrophysics Data System (ADS)

    Woessner, J.; Danciu, L.; Giardini, D.; Share Consortium

    2012-04-01

    Probabilistic seismic hazard assessment (PSHA) aims to characterize the best available knowledge on the seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results from PSHAs form the baseline for informed decision-making and provide essential input to each risk assessment application. SHARE is an EC-FP7 funded project to create a testable, time-independent, community-based hazard model for the Euro-Mediterranean region. SHARE scientists are creating a model framework and infrastructure for a harmonized PSHA. The results will serve as a reference for the Eurocode 8 application and are envisioned to provide homogeneous input for state-of-the-art seismic safety assessment for critical industry. Harmonizing hazard is pursued at the input data level and in the model building procedure across borders and tectonic features of the European-Mediterranean region. An updated earthquake catalog, a harmonized database of seismogenic sources, and adjusted ground motion prediction equations (GMPEs) form the basis for a borderless assessment. We require transparent and reproducible strategies to estimate parameter values and their uncertainties within the source model assessment and the contributions of the GMPEs. The SHARE model accounts for uncertainties via a logic tree. Epistemic uncertainties within the seismic source model are represented by four source model options, including area sources, fault sources and kernel-smoothing approaches, and by aleatory uncertainties in activity rates and maximum magnitudes. Epistemic uncertainties in predicted ground motions are considered by using multiple GMPEs as a function of tectonic setting and are treated as being correlated. For practical implementation, epistemic uncertainties in the source model (i.e. dip and strike angles) are treated as aleatory, and a mean seismicity model is considered. The final results contain the full distribution of ground motion variability. This contribution will feature preliminary

  14. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  15. Generating Random Earthquake Events for Probabilistic Tsunami Hazard Assessment

    NASA Astrophysics Data System (ADS)

    LeVeque, Randall J.; Waagan, Knut; González, Frank I.; Rim, Donsub; Lin, Guang

    2016-12-01

    To perform probabilistic tsunami hazard assessment for subduction zone earthquakes, it is necessary to start with a catalog of possible future events along with the annual probability of occurrence, or a probability distribution of such events that can be easily sampled. For near-field events, the distribution of slip on the fault can have a significant effect on the resulting tsunami. We present an approach to defining a probability distribution based on subdividing the fault geometry into many subfaults and prescribing a desired covariance matrix relating slip on one subfault to slip on any other subfault. The eigenvalues and eigenvectors of this matrix are then used to define a Karhunen-Loève expansion for random slip patterns. This is similar to a spectral representation of random slip based on Fourier series but conforms to a general fault geometry. We show that only a few terms in this series are needed to represent the features of the slip distribution that are most important in tsunami generation, first with a simple one-dimensional example where slip varies only in the down-dip direction and then on a portion of the Cascadia Subduction Zone.
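
    A compact sketch of the Karhunen-Loève construction described above, for a one-dimensional down-dip example with an assumed exponential covariance (illustrative parameters, not the authors' Cascadia setup):

    ```python
    import numpy as np

    # One-dimensional fault discretized into subfaults along the down-dip direction.
    n_sub = 50
    depth = np.linspace(0.0, 1.0, n_sub)           # normalized down-dip position

    # Assumed covariance of slip: exponential correlation, sigma = 0.5 * mean slip.
    mean_slip = 10.0                               # metres (illustrative)
    sigma, corr_len = 0.5 * mean_slip, 0.2
    cov = sigma**2 * np.exp(-np.abs(depth[:, None] - depth[None, :]) / corr_len)

    # Karhunen-Loeve expansion: eigen-decomposition of the covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # Keep only a few leading terms, mirroring the truncation discussed above.
    n_terms = 5
    rng = np.random.default_rng(42)
    z = rng.standard_normal(n_terms)
    slip = mean_slip + eigvecs[:, :n_terms] @ (np.sqrt(eigvals[:n_terms]) * z)

    print("random slip sample (m), first 10 subfaults:", np.round(slip[:10], 2))
    ```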

  16. Seismic hazard assessment for Myanmar: Earthquake model database, ground-motion scenarios, and probabilistic assessments

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Thant, M.; Maung Maung, P.; Sieh, K.

    2015-12-01

    We have constructed an earthquake and fault database, conducted a series of ground-shaking scenarios, and proposed seismic hazard maps for all of Myanmar and hazard curves for selected cities. Our earthquake database integrates the ISC, ISC-GEM and global ANSS Comprehensive Catalogues, and includes harmonized magnitude scales without duplicate events. Our active fault database includes active fault data from previous studies. Using the parameters from these updated databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and the elapsed time since the last events), we have determined the earthquake recurrence models of seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests by matching the modelled ground motions to the felt intensities of earthquakes. Through the case of the 1975 Bagan earthquake, we determined that the ground motion prediction equations (GMPEs) of Atkinson and Boore (2003) fit the behaviour of subduction events best. Also, the 2011 Tarlay and 2012 Thabeikkyin events suggested that the GMPEs of Akkar and Cagnan (2010) fit crustal earthquakes best. We thus incorporated the best-fitting GMPEs and site conditions based on Vs30 (the average shear velocity down to 30 m depth) from analysis of topographic slope and microtremor array measurements to assess seismic hazard. The hazard is highest in regions close to the Sagaing Fault and along the western coast of Myanmar, as the seismic sources there produce earthquakes at short intervals and/or their last events occurred a long time ago. The hazard curves for the cities of Bago, Mandalay, Sagaing, Taungoo and Yangon show higher hazards for sites close to an active fault or with a low Vs30, e.g., downtown Sagaing and the Shwemawdaw Pagoda in Bago.

  17. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    NASA Astrophysics Data System (ADS)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to the public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect

  18. Some differences in seismic hazard assessment for natural and fluid-induced earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2014-12-01

    Although there is little doubt that fluid-induced earthquakes contribute significantly to the seismic hazard in some parts of the United States, assessing this contribution in ways consistent with hazard assessment for natural earthquakes is proving to be challenging. For natural earthquakes, the hazard is considered to be independent of time whereas for fluid-induced seismicity there is considerable time dependence as evidenced, for instance, by the dramatic increase in recent years of the seismicity in Oklahoma. Case histories of earthquakes induced by the development of Enhanced Geothermal Systems and wastewater injection at depth illustrate a few of the problems. Analyses of earthquake sequences induced by these operations indicate that the rate of earthquake occurrence is proportional to the rate of injection, a factor that, on a broad scale, depends on the level of energy production activities. For natural earthquakes, in contrast, the rate of earthquake occurrence depends on time-independent tectonic factors including the long-term slip rates across known faults. Maximum magnitude assessments for natural and fluid-induced earthquake sources also show a contrast in behavior. For a natural earthquake source, maximum magnitude is commonly assessed from empirical relations between magnitude and the area of a potentially-active fault. The same procedure applied to fluid-induced earthquakes yields magnitudes that are systematically higher than what is observed. For instance, the maximum magnitude estimated from the fault area of the Prague, OK, main shock of 6 November 2011 is 6.2 whereas the magnitude measured from seismic data is 5.65 (Sun and Hartzell, 2014). For fluid-induced earthquakes, maximum magnitude appears to be limited according to the volume of fluid injected before the largest earthquake. This implies that for a given fluid-injection project, the upper limit on magnitude increases as long as injection continues.
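
    The volume-based limit mentioned above is often written, following McGarr (2014), as a bound of roughly M0,max ≈ G·ΔV on seismic moment for shear modulus G and injected volume ΔV. The sketch below applies that bound with an assumed modulus and injection volume, purely for illustration:

    ```python
    import math

    def max_induced_magnitude(delta_v_m3, shear_modulus_pa=3.0e10):
        """Upper-bound moment magnitude from injected volume, using the
        McGarr (2014)-style bound M0_max ~ G * dV (inputs are illustrative)."""
        m0_max = shear_modulus_pa * delta_v_m3               # seismic moment, N*m
        return (2.0 / 3.0) * (math.log10(m0_max) - 9.1)      # Hanks-Kanamori Mw

    # Assumed cumulative injected volume of 1e6 cubic metres of fluid.
    print(f"Mw_max ~ {max_induced_magnitude(1.0e6):.1f}")    # about Mw 4.9
    ```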

  19. Improving earthquake hazard assessments in Italy: An alternative to “Texas sharpshooting”

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Panza, Giuliano F.

    2012-12-01

    The 20 May 2012 M = 6.1 earthquake that struck the Emilia region of northern Italy illustrates a common problem afflicting earthquake hazard assessment. It occurred in an area classified as "low seismic hazard" on the current national seismic hazard map (Gruppo di Lavoro, Redazione della mappa di pericolosità sismica, rapporto conclusivo, 2004, http://zonesismiche.mi.ingv.it/mappa_ps_apr04/italia.html) adopted in 2006. That revision of the seismic code was motivated by the 2002 M = 5.7 earthquake that struck S. Giuliano di Puglia in central Italy, also an area previously classified as low hazard, resulting in damage and casualties. The previous code had been updated in 1981-1984 after earlier maps failed to anticipate the 1980 M = 6.5 Irpinia earthquake.

  20. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The goal of earthquake hazard mitigation is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure; this product may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
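
    The hazard x vulnerability x exposure product described above can be sketched for a single deterministic scenario; the shaking fractions, damage ratios, and exposure value below are assumed for illustration and are not taken from the abstract:

    ```python
    # Expected-loss sketch for a single deterministic scenario.
    # All numbers below are assumed for illustration only.

    scenario_shaking = {"low": 0.3, "moderate": 0.5, "severe": 0.2}   # fraction of exposure per band
    damage_ratio = {"low": 0.02, "moderate": 0.10, "severe": 0.35}    # mean damage ratio per band
    exposure_value = 5.0e9   # replacement value of the exposed building stock, USD (assumed)

    expected_loss = sum(frac * damage_ratio[band] * exposure_value
                        for band, frac in scenario_shaking.items())
    print(f"Expected scenario loss: ${expected_loss / 1e9:.2f} billion")
    ```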

  1. Long aftershock sequences within continents and implications for earthquake hazard assessment.

    PubMed

    Stein, Seth; Liu, Mian

    2009-11-05

    One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere.
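
    One common way to express the inverse dependence of aftershock duration on loading rate is the Dieterich-style scaling t_a ≈ Aσ / (stressing rate); whether this is the exact formulation used by the authors is an assumption here, and the parameter values below are purely illustrative:

    ```python
    # Aftershock-sequence duration t_a ~ (A * sigma) / stressing_rate, a
    # Dieterich-style scaling. Parameter values are assumed for illustration only.

    A_SIGMA_MPA = 0.05   # rate-and-state parameter A times effective normal stress, MPa (assumed)

    def aftershock_duration_years(stressing_rate_mpa_per_yr):
        return A_SIGMA_MPA / stressing_rate_mpa_per_yr

    # Fast-loaded plate boundary vs. slowly deforming continental interior (assumed rates).
    for label, rate in [("plate boundary", 5e-3), ("continental interior", 5e-5)]:
        print(f"{label}: ~{aftershock_duration_years(rate):,.0f} years of aftershocks")
    ```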

  2. Recent destructive earthquakes and international collaboration for seismic hazard assessment in the East Asia region

    NASA Astrophysics Data System (ADS)

    Hao, K.; Fujiwara, H.

    2013-12-01

    Recent destructive earthquakes in East Asia have claimed a third of a million lives. People learned lessons from them, but the lessons were forgotten after a few generations, even those carved in stone. Probabilistic seismic hazard assessment (SHA) is considered a scientific way to define earthquake zones and to guide urban planning and construction. NIED has promoted SHA as a national mission of Japan for over 10 years and, since the 2008 Wenchuan earthquake, as an international cooperation with neighbouring countries. We initiated the China-Japan-Korea SHA strategic cooperative program for the next-generation map, supported by MOST-JST-NRF, in 2010. We also initiated a cooperative program with the Taiwan Earthquake Model in 2012, as well as with many other parties in the world. Consequently, NIED proudly joined the Global Earthquake Model (GEM), as its SHA methodologies and technologies were highly valued. As a representative of Japan, NIED will continue to work closely with all members of GEM, not only on the GEM global components but also on its regional programs. Seismic hazard assessment has to be carried out with existing information subject to epistemic uncertainty. We routinely improve the existing models to carefully treat active faults, earthquake records, and magnitudes under the newest authorized information provided by the Earthquake Research Committee of the Headquarters for Earthquake Research Promotion. After the 2011 Tohoku earthquake, we have been re-considering the national SHA maps even for long-term and low probabilities. We have set up a platform at http://www.j-shis.bosai.go.jp/en to exchange SHA information and share our experiences, lessons, and knowledge internationally. Some probabilistic SHA concepts and seismic risk mitigation issues need to be promoted internationally on a continuing basis through outreach and media. (Figure: major earthquakes in the East Asian region, which claimed a third of a million lives; slab depth contours after Hayes et al., 2011.)

  3. Scaling of intraplate earthquake recurrence interval with fault length and implications for seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Marrett, Randall

    1994-12-01

    Consensus indicates that faults follow power-law scaling, although significant uncertainty remains about the values of important parameters. Combining these scaling relationships with power-law scaling relationships for earthquakes suggests that intraplate earthquake recurrence interval scales with fault length. Regional scaling data may be locally calibrated to yield a site-specific seismic hazard assessment tool. Scaling data from small faults (those that do not span the seismogenic layer) suggest that recurrence interval varies as a negative power of fault length. Due to uncertainties regarding the recently recognized changes in scaling for large earthquakes, it is unclear whether recurrence interval varies as a negative or positive power of fault length for large faults (those that span the seismogenic layer). This question is of critical importance for seismic hazard assessment.

  4. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

    Advances in understanding the earthquake cycle and in assessing earthquake hazards are topics of great importance. Dynamic earthquake hazard assessments resolved for a range of spatial scales and time scales will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  5. Field-based assessment of landslide hazards resulting from the 2015 Gorkha, Nepal earthquake sequence

    NASA Astrophysics Data System (ADS)

    Collins, B. D.; Jibson, R.

    2015-12-01

    The M7.8 2015 Gorkha, Nepal earthquake sequence caused thousands of fatalities, destroyed entire villages, and displaced millions of residents. The earthquake sequence also triggered thousands of landslides in the steep Himalayan topography of Nepal and China; these landslides were responsible for hundreds of fatalities and blocked vital roads, trails, and rivers. With the support of USAID's Office of Foreign Disaster Assistance, the U.S. Geological Survey responded to this crisis by providing landslide-hazard expertise to Nepalese agencies and affected villages. Assessments of landslide hazards following earthquakes are essential to identify vulnerable populations and infrastructure, and inform government agencies working on rebuilding and mitigation efforts. However, assessing landslide hazards over an entire earthquake-affected region (in Nepal, estimated to be ~30,000 km2), and in exceedingly steep, inaccessible topography, presents a number of logistical challenges. We focused the scope of our assessment by conducting helicopter- and ground-based landslide assessments in 12 priority areas in central Nepal identified a priori from satellite photo interpretation performed in conjunction with an international consortium of remote sensing experts. Our reconnaissance covered 3,200 km of helicopter flight path, extending over an approximate area of 8,000 km2. During our field work, we made 17 site-specific assessments and provided landslide hazard information to both villages and in-country agencies. Upon returning from the field, we compiled our observations and further identified and assessed 74 river-blocking landslide dams, 12% of which formed impoundments larger than 1,000 m2 in surface area. These assessments, along with more than 11 hours of helicopter-based video and an overview of hazards expected during the 2015 summer monsoon, have been publicly released (http://dx.doi.org/10.3133/ofr20151142) for use by in-country and international agencies.

  6. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE is assumed qualitatively on the basis of late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large, important projects, for example dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  7. East Meets West: An Earthquake in India Helps Hazard Assessment in the Central United States

    USGS Publications Warehouse


    2002-01-01

    Although geographically distant, the State of Gujarat in India bears many geological similarities to the Mississippi Valley in the Central United States. The Mississippi Valley contains the New Madrid seismic zone that, during the winter of 1811-1812, produced the three largest historical earthquakes ever in the continental United States and remains the most seismically active region east of the Rocky Mountains. Large damaging earthquakes are rare in ‘intraplate’ settings like New Madrid and Gujarat, far from the boundaries of the world’s great tectonic plates. Long-lasting evidence left by these earthquakes is subtle (fig. 1). Thus, each intraplate earthquake provides unique opportunities to make huge advances in our ability to assess and understand the hazards posed by such events.

  8. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 18. Errors in Probabilistic Seismic Hazard Analysis.

    DTIC Science & Technology

    1982-01-01

    State-of-the-Art for Assessing Earthquake Hazards in the United States, Report 18: Errors in Probabilistic Seismic Hazard Analysis, by Daniele Veneziano, Department of Civil Engineering, Massachusetts Institute of Technology, Cambridge, Mass. 02139, January 1982. Report 18 of a series.

  9. From Earthquake Prediction Research to Time-Variable Seismic Hazard Assessment Applications

    NASA Astrophysics Data System (ADS)

    Bormann, Peter

    2011-01-01

    The first part of the paper defines the terms and classifications common in earthquake prediction research and applications. This is followed by short reviews of major earthquake prediction programs initiated since World War II in several countries, for example the former USSR, China, Japan, the United States, and several European countries. It outlines the underlying expectations, concepts, and hypotheses, introduces the technologies and methodologies applied and some of the results obtained, which include both partial successes and failures. Emphasis is laid on discussing the scientific reasons why earthquake prediction research is so difficult and demanding and why the prospects are still so vague, at least as far as short-term and imminent predictions are concerned. However, classical probabilistic seismic hazard assessments, widely applied during the last few decades, have also clearly revealed their limitations. In their simple form, they are time-independent earthquake rupture forecasts based on the assumption of stable long-term recurrence of earthquakes in the seismotectonic areas under consideration. Therefore, during the last decade, earthquake prediction research and pilot applications have focused mainly on the development and rigorous testing of long- and medium-term rupture forecast models in which event probabilities are conditioned by the occurrence of previous earthquakes, and on their integration into neo-deterministic approaches for improved time-variable seismic hazard assessment. The latter uses stress-renewal models that are calibrated for variations in the earthquake cycle as assessed on the basis of historical, paleoseismic, and other data, often complemented by multi-scale seismicity models, the use of pattern-recognition algorithms, and site-dependent strong-motion scenario modeling. International partnerships and a global infrastructure for comparative testing have recently been developed, for example the Collaboratory for the Study of Earthquake Predictability (CSEP).

  10. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding by the majority of the scientific community, decision makers, and the public of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes of Exposure and Vulnerability inflicted by a growing population, its concentration, etc., which result in a steady increase of losses from natural hazards. Scientists owe Society for the lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing natural hazards, assessing risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A - B·(M-6) + C·log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of USLE are used to estimate, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground-shaking parameters, including macroseismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., the historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on USLE is illustrated by application to the seismic region of the Greater Caucasus.
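
    The USLE relation quoted above lends itself to a direct numerical sketch; the coefficients A, B, and C below are assumed for illustration and are not the values estimated for the Greater Caucasus:

    ```python
    import numpy as np

    def usle_annual_rate(magnitude, length_km, A=-3.0, B=0.9, C=1.2):
        """Expected annual number N(M, L) of earthquakes with magnitude >= M in an
        area of linear dimension L (km), from log10 N = A - B*(M - 6) + C*log10 L.
        The coefficients here are illustrative, not fitted values."""
        return 10.0 ** (A - B * (magnitude - 6.0) + C * np.log10(length_km))

    def expected_max_magnitude(length_km, years=50.0):
        """Largest magnitude whose expected count over `years` is at least one."""
        mags = np.round(np.arange(4.0, 8.6, 0.1), 1)
        feasible = mags[usle_annual_rate(mags, length_km) * years >= 1.0]
        return feasible.max()

    # Note the strong dependence of the estimate on the size of the territory considered.
    for L in (25, 100, 400):
        print(f"L = {L:>3} km: expected M_max in 50 yr ~ {expected_max_magnitude(L):.1f}")
    ```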

  11. Seismic hazard assessment and pattern recognition of earthquake prone areas in the Po Plain (Italy)

    NASA Astrophysics Data System (ADS)

    Gorshkov, Alexander; Peresan, Antonella; Soloviev, Alexander; Panza, Giuliano F.

    2014-05-01

    A systematic and quantitative assessment, capable of providing first-order consistent information about the sites where large earthquakes may occur, is crucial for a knowledgeable seismic hazard evaluation. The methodology for the pattern recognition of areas prone to large earthquakes is based on the morphostructural zoning method (MSZ), which employs topographic data and present-day tectonic structures for the mapping of earthquake-controlling structures (i.e. the nodes formed around lineament intersections) and does not require knowledge of past seismicity. The nodes are assumed to be characterized by a uniform set of topographic, geologic, and geophysical parameters; on the basis of such parameters the pattern recognition algorithm defines a classification rule to discriminate seismogenic and non-seismogenic nodes. This methodology has been successfully applied since the early 1970s in a number of regions worldwide, including California, where it permitted the identification of areas that have subsequently been struck by strong events and that previously were not considered prone to strong earthquakes. Recent studies on the Iberian Peninsula and the Rhone Valley have demonstrated the applicability of MSZ to basins with a relatively flat topography. In this study, the analysis is applied to the Po Plain (Northern Italy), an area characterized by a flat topography, to allow for the systematic identification of the nodes prone to earthquakes with magnitude larger than or equal to M = 5.0. The MSZ method differs from the standard morphostructural analysis, where the term "lineament" is used to define the complex of alignments detectable on topographic maps or on satellite images; according to that definition the lineament is locally defined, and its existence does not depend on the surrounding areas. In MSZ, the primary element is the block, a relatively homogeneous area, while the lineament is a secondary element of the morphostructure.

  12. Synergistic use of geospatial and in-situ data for earthquake hazard assessment in Vrancea area

    NASA Astrophysics Data System (ADS)

    Zoran, M. A.; Savastru, R. S.; Savastru, D. M.

    2016-08-01

    Space-time anomalies of the Earth's emitted radiation (thermal infrared measured from satellites months to weeks before the occurrence of earthquakes, radon in underground water and soil, etc.) and electromagnetic anomalies are considered pre-seismic signals. Satellite remote sensing provides spatially continuous information on the tectonic landscape, but it also contributes to the understanding of specific faults, to information about stress transfer between fault systems from depth to the surface, and to estimates of the energy released by earthquakes and other modes of deformation. This paper presents observations made using time series of MODIS Terra/Aqua, NOAA-AVHRR, and Landsat satellite data for the derived parameters land surface temperature (LST), outgoing long-wave radiation (OLR), and mean air temperature (AT) for some seismic events recorded in the Vrancea active geotectonic region in Romania. For some of the analyzed earthquakes, starting almost one week prior to a moderate or strong event, a transient rise in LST of several degrees Celsius and OLR values higher than normal were observed, as a function of magnitude and focal depth; these anomalies disappeared after the main shock. The synergy of multisensor and multitemporal satellite data with in-situ and GPS data, together with spatial analysis of the magnitude-frequency distributions of Vrancea earthquakes, provides more information on the seismicity of the Vrancea area. Earthquake hazard assessment for the Vrancea region in Romania must operate at different degrees of complexity, consisting of the monitoring, analysis, predictive modeling, and forecasting of derived geospatial and in-situ geophysical/geodetic parameters, as well as decision-making procedures.
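
    A minimal sketch of the kind of thermal-anomaly screening described above, using a synthetic LST anomaly series and an assumed two-sigma threshold (a generic illustration, not the authors' processing chain):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic daily LST anomalies for one pixel (deg C), with the reference
    # climatology already removed, so the series should hover near zero.
    days = 120
    lst_anomaly = rng.normal(0.0, 1.0, days)
    lst_anomaly[100:106] += 4.0     # injected transient mimicking a pre-seismic rise (synthetic)

    # Flag days exceeding an assumed 2-sigma threshold of the reference variability.
    sigma_ref = 1.0
    flagged_days = np.flatnonzero(lst_anomaly > 2.0 * sigma_ref)
    print("days flagged as anomalously warm:", flagged_days)
    ```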

  13. No longer so clueless in Seattle: Current assessment of earthquake hazards

    USGS Publications Warehouse

    Weaver, C.S.

    1998-01-01

    The Pacific Northwest is an active subduction zone. Because of this tectonic setting, there are three distinct earthquake source zones in earthquake hazard assessments of the Seattle area. Offshore, the broad sloping interface between the Juan de Fuca and North America plates produces earthquakes as large as magnitude 9; on average, these events occur every 400-600 years. The second source zone is within the subducting Juan de Fuca plate as it bends, at depths of 40-60 km, beneath the Puget lowland. Five earthquakes in this zone this century have had magnitudes greater than 6, including a magnitude 7.1 event in 1949. The third zone, the crust of the North America plate, is the least well known. Paleoseismic evidence shows that an event of approximate magnitude 7 occurred on the Seattle fault about 1000 years ago. Such crustal events are potentially very damaging to the heavily urbanized areas of Puget Sound, and the rate of occurrence and the area over which large-magnitude crustal events are to be expected are the subject of considerable research.

  14. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    NASA Astrophysics Data System (ADS)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  15. Assessment of earthquake hazard by simultaneous use of the statistical method and the method of fuzzy mathematics

    NASA Astrophysics Data System (ADS)

    Feng, De-Yi; Gu, Jing-Ping; Lin, Ming-Zhou; Xu, Shao-Xie; Yu, Xue-Jun

    1984-11-01

    A probabilistic method and a retrieval method of fuzzy information are studied simultaneously for the assessment of earthquake hazard, or earthquake prediction. Statistical indices of regional seismicity in three adjacent time intervals are used to predict an earthquake in the next interval. The indices are earthquake frequency, the maximum magnitude, and a parameter related to the average magnitude (or b-value), together with their time derivatives. Applying the probabilistic method, we can estimate the probability that a large earthquake, with magnitude larger than a certain threshold, will occur in the next time interval in a given region. By using the retrieval method of fuzzy information we can classify time intervals into several classes according to the regional seismic activity in each interval and then evaluate whether or not the next time interval belongs to the class of seismically hazardous intervals containing a large earthquake. Some examples of applying both methods to the northern section of the North-South Seismic Zone in China are shown. The results obtained are in good agreement with actual earthquake history. A comparison of the probabilistic method with the method of fuzzy mathematics is made, and it is recommended that earthquake hazard be assessed by simultaneous use of both methods.
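
    As a rough illustration of the statistical indices mentioned above, the sketch below computes the event count, maximum magnitude, and a maximum-likelihood b-value (Aki/Utsu form with a half-bin correction) for one time interval of a catalog; the completeness magnitude and bin width are placeholder assumptions, and the fuzzy classification step is not reproduced.

      import numpy as np

      def interval_indices(mags, m_c=4.0, dm=0.1):
          """Seismicity indices for one time interval (illustrative only):
          event count, maximum magnitude, and Aki/Utsu maximum-likelihood
          b-value with the usual half-bin correction."""
          mags = np.asarray(mags, dtype=float)
          mags = mags[mags >= m_c]                 # assume completeness above m_c
          n = mags.size
          m_max = mags.max() if n else np.nan
          b = np.log10(np.e) / (mags.mean() - (m_c - dm / 2.0)) if n > 1 else np.nan
          return n, m_max, b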

  16. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough; the next earthquake is estimated to be of M8 to 9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough using a probabilistic approach, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSA) that ERC (2013) identified. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving the nonlinear, finite-amplitude, long-wave equations with advection and bottom friction terms by a finite-difference method; run-up computation on land is included. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with T and alpha = 0.24 as its aperiodicity. (4) We divide the probability P30 into P30(i) for the i-th subgroup consisting of the earthquakes occurring in each of the 15 HSA, following a probability redistribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the subgroup. Note that such a redistribution of the probability is only tentative, because present seismology cannot provide enough knowledge to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coast by integrating the information about 30 years occurrence
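
    Step (3) can be illustrated with a short sketch of the BPT (inverse Gaussian) renewal calculation; the mean recurrence time and aperiodicity follow the abstract, while the elapsed time (years from the 1944/1946 events to 2013) and the use of scipy are our assumptions, not details taken from the study.

      from scipy.stats import invgauss

      def bpt_conditional_prob(mean_T, alpha, elapsed, window=30.0):
          """P(next event within `window` years | quiet for `elapsed` years)
          for a BPT (inverse Gaussian) renewal model with mean recurrence
          `mean_T` and aperiodicity `alpha`."""
          # scipy's invgauss(mu, scale): mean = mu*scale, coefficient of
          # variation = sqrt(mu), so mu = alpha**2 and scale = mean_T/alpha**2
          dist = invgauss(mu=alpha**2, scale=mean_T / alpha**2)
          return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

      # T and alpha follow the abstract; the elapsed time is our assumption
      print(bpt_conditional_prob(mean_T=88.2, alpha=0.24, elapsed=2013 - 1946))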

  17. Site-specific Earthquake-generated Tsunami Hazard Assessment in U.S. Atlantic Coast

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Titov, V. V.; Moore, C. W.; Gica, E.; Arcas, D.; Spillane, M. C.; Zhou, H.

    2009-12-01

    The Indian Ocean tsunami of 26 December 2004 has changed the perception of a tsunami as an infrequent low-risk hazard. The absence of subduction or convergent zones within the Atlantic Ocean makes coastal communities less aware of the potential tsunami hazard on the East Coast of the US. The existing continental shelf offshore is believed to act as an additional buffer that may significantly attenuate tsunami impact on the U.S. Atlantic coast. However, the uncertainties are still substantial and need to be addressed in a timely manner: 1. the largest tsunami ever recorded in the Atlantic, the 1755 Lisbon event, is understudied; 2. the Hispaniola-Puerto Rico-Lesser Antilles subduction zone, a Sumatra-Andaman type of trench in the northeast Caribbean, is capable of generating a catastrophic tsunami; 3. the South Sandwich Trench has been mostly overlooked; and 4. most previous studies of these issues did not go beyond linear tsunami propagation in the deep ocean to nonlinear tsunami inundation modeling in coastal areas. Using the established NOAA high-resolution tsunami inundation model, the present study explores the above uncertainties and provides a comprehensive modeling assessment of the potential earthquake-generated tsunami hazard for selected coastal communities on the U.S. Atlantic coast, with a highlight on over-shelf tsunami wave dynamics. This study is an extension of the USGS evaluation of earthquake-tsunami impact in the Atlantic (ten Brink et al., 2007; Barkan et al., 2009), in light of the Nuclear Regulatory Commission (NRC) efforts on tsunami risk assessment for existing and potential nuclear power plants on the U.S. East Coast.

  18. Metrics, Bayes, and BOGSAT: Recognizing and Assessing Uncertainties in Earthquake Hazard Maps

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Brooks, E. M.; Spencer, B. D.

    2015-12-01

    Recent damaging earthquakes in areas predicted to be relatively safe illustrate the need to assess how seismic hazard maps perform. At present, there is no agreed way of assessing how well a map performed. The metric implicit in current maps, that during a time interval predicted shaking will be exceeded only at a specific fraction of sites, is useful but permits maps to be nominally successful although they significantly underpredict or overpredict shaking, or nominally unsuccessful but predict shaking well. We explore metrics that measure the effects of overprediction and underprediction. Although no single metric fully characterizes map behavior, using several metrics can provide useful insight for comparing and improving maps. A related question is whether to regard larger-than-expected shaking as a low-probability event allowed by a map, or to revise the map to show increased hazard. Whether and how much to revise a map is complicated, because a new map that better describes the past may or may not better predict the future. The issue is like deciding after a coin has come up heads a number of times whether to continue assuming that the coin is fair and the run is a low-probability event, or to change to a model in which the coin is assumed to be biased. This decision can be addressed using Bayes' Rule, so that how much to change depends on the degree of one's belief in the prior model. Uncertainties are difficult to assess for hazard maps, which require subjective assessments and choices among many poorly known or unknown parameters. However, even rough uncertainty measures for estimates/predictions from such models, sometimes termed BOGSATs (Bunch Of Guys Sitting Around Table) by risk analysts, can give users useful information to make better decisions. We explore the extent of uncertainty via sensitivity experiments on how the predicted hazard depends on model parameters.
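
    The coin analogy can be made concrete with a small Bayes' Rule sketch; the biased-coin probability and the prior are arbitrary illustrative numbers, not values from the paper.

      from math import comb

      def posterior_biased(k_heads, n_flips, p_biased=0.75, prior_biased=0.1):
          """Posterior probability that the coin is biased after k heads in
          n flips, for a two-model comparison (fair vs. biased); all numbers
          are illustrative, not taken from the abstract."""
          like_fair = comb(n_flips, k_heads) * 0.5 ** n_flips
          like_biased = (comb(n_flips, k_heads) * p_biased ** k_heads
                         * (1.0 - p_biased) ** (n_flips - k_heads))
          num = prior_biased * like_biased
          return num / (num + (1.0 - prior_biased) * like_fair)

      # e.g. 8 heads in 10 flips: the stronger the prior belief in the old
      # model, the less the model is revised (the point made for hazard maps)
      print(posterior_biased(8, 10))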

  19. Assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L.; Hays, Walter W.

    2000-01-01

    This report--the second of two volumes--represents an ongoing effort by the U.S. Geological Survey to transfer accurate Earth science information about earthquake hazards along Utah's Wasatch Front to researchers, public officials, design professionals, land-use planners, and emergency managers in an effort to mitigate the effects of these hazards. This volume contains eight chapters on ground-shaking hazards and aspects of loss estimation.

  20. Earthquake risk reduction in the United States: An assessment of selected user needs and recommendations for the National Earthquake Hazards Reduction Program

    SciTech Connect

    1994-12-31

    This Assessment was conducted to improve the National Earthquake Hazards Reduction Program (NEHRP) by providing NEHRP agencies with information that supports their user-oriented setting of crosscutting priorities in the NEHRP strategic planning process. The primary objective of this Assessment was to take a "snapshot" evaluation of the needs of selected users throughout the major program elements of NEHRP. Secondary objectives were to conduct an assessment of the knowledge that exists (or is being developed by NEHRP) to support earthquake risk reduction, and to begin a process of evaluating how NEHRP is meeting user needs. An identification of NEHRP's strengths also resulted from the effort, since those strengths demonstrate successful methods that may be useful to NEHRP in the future. These strengths are identified in the text, and many of them represent important achievements since the Earthquake Hazards Reduction Act was passed in 1977.

  1. Assessment of the 1988 Saguenay earthquake: Implications on attenuation functions for seismic hazard analysis

    SciTech Connect

    Toro, G.R.; McGuire, R.K.

    1991-09-01

    This study investigates the earthquake records from the 1988 Saguenay earthquake and examines the implications of these records with respect to ground-motion models used in seismic-hazard studies in eastern North America (ENA); specifically, to what extent the ground motions from this earthquake support or reject the various attenuation functions used in the EPRI and LLNL seismic-hazard calculations. Section 2 provides a brief description of the EPRI and LLNL attenuation functions for peak acceleration and for spectral velocities. Section 3 compares these attenuation functions with the ground motions from the Saguenay earthquake and from other relevant earthquakes. Section 4 reviews available seismological studies of the Saguenay earthquake, in order to understand its seismological characteristics and why some observations may differ from predictions. Section 5 examines the assumptions and methodology used in the development of the attenuation functions selected by LLNL ground-motion expert 5. Finally, Section 6 draws conclusions about the validity of the various sets of attenuation functions, in light of the Saguenay data and of other evidence presented here. 50 refs., 37 figs., 7 tabs.

  2. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  3. Can Apparent Stress be Used to Time-Dependent Seismic Hazard Assessment or Earthquake Forecast? An Ongoing Approach in China

    NASA Astrophysics Data System (ADS)

    Wu, Zhongliang; Jiang, Changsheng; Zhang, Shengfeng

    2016-08-01

    The approach used in China over the last decade and a half for applying apparent stress to time-dependent seismic hazard assessment or earthquake forecasting is summarized. Retrospective case studies observe that apparent stress exhibits a short-term increase, on a time scale of several months, before moderate to strong earthquakes in a large area surrounding the 'target earthquake'. Apparent stress is also used to estimate the tendency of aftershock activity. The concept relating apparent stress indirectly to stress level is used to understand the properties of some 'precursory' anomalies. Meanwhile, differing opinions have been reported, and problems in the calculation have existed for some cases. Moreover, retrospective studies are limited in their significance compared to forward forecast tests. Nevertheless, this approach, seemingly uniquely carried out on a large scale in mainland China, provides earthquake catalogs for the predictive analysis of seismicity with an additional degree of freedom, and deserves a systematic review and reflection.
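
    For reference, apparent stress is conventionally defined as the rigidity times the radiated seismic energy divided by the seismic moment; the sketch below uses that textbook definition with a typical crustal rigidity and does not reproduce the specific processing used in the Chinese approach.

      def apparent_stress(radiated_energy_J, seismic_moment_Nm, rigidity_Pa=3.0e10):
          """Apparent stress (Pa) from the standard definition
          sigma_a = mu * E_R / M0; the rigidity is a typical crustal value,
          not a parameter quoted in the abstract."""
          return rigidity_Pa * radiated_energy_J / seismic_moment_Nm

      # e.g. an Mw ~5 event with E_R ~ 1e12 J and M0 ~ 4e16 N*m
      print(apparent_stress(1.0e12, 4.0e16))  # roughly 0.75 MPa, order of magnitude only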

  4. Uncertainty in local and regional tsunami earthquake source parameters: Implications for scenario based hazard assessment and forecasting

    NASA Astrophysics Data System (ADS)

    Müller, Christof; Power, William; Burbidge, David; Wang, Xiaoming

    2016-04-01

    Over the last decade, tsunami propagation models have been used extensively for tsunami forecasting and for hazard and risk assessment. However, the effect of uncertainty in the earthquake source parameters, such as the location and distribution of slip in the earthquake source, on the results of tsunami models has not always been examined in great detail. We have developed a preliminary combined and continuous Hikurangi-Kermadec subduction zone interface model. The model is defined by a spline surface and is based on a previously published spline model for the Hikurangi interface and a more traditional unit source model for the Kermadec interface. The model allows the earthquake epicenter to be freely positioned and varied, and non-uniform slip to be considered. Using this model, we have investigated the effects of variability in non-uniform slip and epicenter location on the distribution of offshore maximum wave heights for local New Zealand targets. Which scenario out of an ensemble is responsible for the maximum wave height at a given location is a spatially highly variable function of earthquake location and/or the distribution of slip. We use the coefficient of variation (CoV) to quantify the variability of offshore wave heights as a function of source location and distribution of slip. CoV increases significantly with closer proximity to the shore, in bays, and in shallow water. The study has implications for tsunami hazard assessment and forecasting. As an example, our results challenge the concept of hazard assessment using a single worst-case scenario, in particular for local tsunamis.
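
    The coefficient of variation used above is simply the ensemble standard deviation divided by the ensemble mean at each site; a minimal sketch, with synthetic inputs standing in for the Hikurangi-Kermadec scenario set, is given below.

      import numpy as np

      def coefficient_of_variation(max_heights):
          """CoV of maximum offshore wave height across a scenario ensemble.
          max_heights: array (n_scenarios, n_sites) of the maximum wave height
          predicted by each scenario at each site (synthetic stand-in for the
          Hikurangi-Kermadec ensemble)."""
          h = np.asarray(max_heights, dtype=float)
          return h.std(axis=0, ddof=1) / h.mean(axis=0)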

  5. New seafloor map of the Puerto Rico Trench helps assess earthquake and tsunami hazards

    USGS Publications Warehouse

    ten Brink, Uri S.; Danforth, William; Polloni, Christopher; Andrews, Brian D.; Llanes Estrada, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-01-01

    The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands [McCann et al., 2004]. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918 [Mercado and McCann, 1998]. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico [Mercado et al., 2002; Schwab et al., 1991], although their ages are unknown.

  6. The Effects on Tsunami Hazard Assessment in Chile of Assuming Earthquake Scenarios with Spatially Uniform Slip

    NASA Astrophysics Data System (ADS)

    Carvajal, Matías; Gubler, Alejandra

    2016-12-01

    We investigated the effect that along-dip slip distribution has on near-shore tsunami amplitudes and on coastal land-level changes in the region of central Chile (29°-37°S). Here and all along the Chilean megathrust, the seismogenic zone extends beneath dry land, and thus tsunami generation and propagation are limited to its seaward portion, where the sensitivity of the initial tsunami waveform to dislocation model inputs, such as slip distribution, is greater. We considered four distributions of earthquake slip in the dip direction, including a spatially uniform slip source and three others with typical bell-shaped slip patterns that differ in the depth range of slip concentration. We found that a uniform slip scenario predicts much lower tsunami amplitudes and generally less coastal subsidence than scenarios that assume bell-shaped distributions of slip. Although the finding that uniform slip scenarios underestimate tsunami amplitudes is not new, it has been largely ignored for tsunami hazard assessment in Chile. Our simulation results also suggest that uniform slip scenarios tend to predict later arrival times of the leading wave than bell-shaped sources. The timing of the largest wave at a specific site also depends on how the slip is distributed in the dip direction; however, other factors, such as local bathymetric configurations and standing edge waves, are also expected to play a role. Arrival time differences are especially critical in Chile, where tsunamis arrive earlier than elsewhere. We believe that the results of this study will be useful to both public and private organizations for mapping tsunami hazard in coastal areas along the Chilean coast and, therefore, will help reduce the risk of loss and damage caused by future tsunamis.

  7. Rapid field-based landslide hazard assessment in response to post-earthquake emergency

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Gambini, Stefano; Cancelliere, Giorgio

    2016-04-01

    On April 25, 2015, a Mw 7.8 earthquake occurred 80 km to the northwest of Kathmandu (Nepal). The largest aftershock, which occurred on May 12, 2015, was the Mw 7.3 Nepal earthquake (SE of Zham, China), 80 km to the east of Kathmandu. The earthquakes killed ~9000 people and severely damaged a 10,000 km2 region in Nepal and neighboring countries. Several thousand landslides were triggered during the events, causing widespread damage to mountain villages and the evacuation of thousands of people. Rasuwa was one of the most damaged districts. This contribution describes the landslide hazard analysis of the Saramthali, Yarsa and Bhorle VDCs (122 km2, Rasuwa district). Hazard is expressed in terms of qualitative classes (low, medium, high) through a simple matrix approach that combines frequency classes and magnitude classes. The hazard analysis is based primarily on the experience gained during a field survey conducted in September 2015. During the survey, local knowledge was systematically exploited through interviews with local people who had experienced the earthquake and the coseismic landslides. People helped us recognize fractures and active deformations, and allowed us to reconstruct a correct chronicle of landslide events, in order to assign the landslide events to the first shock, the second shock, or the post-earthquake 2015 monsoon. The field experience was complemented with a standard analysis of the relationship between potential controlling factors and the distribution of landslides reported in Kargel et al. (2016). This analysis allowed us to recognize the most important controlling factors. This information was integrated with the field observations to verify the mapped units and to complete the mapping in areas not accessible for field activity. Finally, the work was completed with the analysis and use of a detailed landslide inventory produced by the University of Milano Bicocca that covers most of the area affected by coseismic landslides in
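
    A minimal sketch of such a frequency-magnitude hazard matrix is given below; the class boundaries and the matrix entries are illustrative assumptions, not the ones adopted in the Rasuwa study.

      # Qualitative frequency and magnitude classes combined into a hazard
      # class; boundaries and entries are illustrative, not those of the study.
      HAZARD_MATRIX = {
          ("low", "low"): "low",      ("low", "medium"): "low",       ("low", "high"): "medium",
          ("medium", "low"): "low",   ("medium", "medium"): "medium", ("medium", "high"): "high",
          ("high", "low"): "medium",  ("high", "medium"): "high",     ("high", "high"): "high",
      }

      def hazard_class(frequency_class, magnitude_class):
          return HAZARD_MATRIX[(frequency_class, magnitude_class)]

      print(hazard_class("high", "medium"))  # -> "high"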

  8. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  9. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    USGS Publications Warehouse

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 percent (1 S.D.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock from less than 1 percent at M ≥ 3 to 6.5 ± 2.5 percent (1 S.D.) at M ≥ 5. The main shock will most likely occur in the first hour after the foreshock, and the probability that a main shock will occur in the first hour decreases with elapsed time from the occurrence of the possible foreshock by approximately the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California increases the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.

  10. Assessment of existing and potential landslide hazards resulting from the April 25, 2015 Gorkha, Nepal earthquake sequence

    USGS Publications Warehouse

    Collins, Brian D.; Jibson, Randall W.

    2015-07-28

    This report provides a detailed account of assessments performed in May and June 2015 and focuses on valley-blocking landslides because they have the potential to pose considerable hazard to many villages in Nepal. First, we provide a seismological background of Nepal and then detail the methods used for both external and in-country data collection and interpretation. Our results consist of an overview of landsliding extent, a characterization of all valley-blocking landslides identified during our work, and a description of video resources that provide high resolution coverage of approximately 1,000 kilometers (km) of river valleys and surrounding terrain affected by the Gorkha earthquake sequence. This is followed by a description of site-specific landslide-hazard assessments conducted while in Nepal and includes detailed descriptions of five noteworthy case studies. Finally, we assess the expectation for additional landslide hazards during the 2015 summer monsoon season.

  11. Hazard assessment of long-period ground motions for the Nankai Trough earthquakes

    NASA Astrophysics Data System (ADS)

    Maeda, T.; Morikawa, N.; Aoi, S.; Fujiwara, H.

    2013-12-01

    We evaluate the seismic hazard from long-period ground motions associated with Nankai Trough earthquakes (M8-9) in southwest Japan. Large interplate earthquakes occurring around the Nankai Trough have caused serious damage due to strong ground motions and tsunami; the most recent events were in 1944 and 1946. Such large interplate earthquakes can also damage high-rise and large-scale structures through long-period ground motions (e.g., the 1985 Michoacan earthquake in Mexico and the 2003 Tokachi-oki earthquake in Japan). Long-period ground motions are amplified particularly in basins. Because major cities along the Nankai Trough have developed on alluvial plains, it is therefore important to evaluate long-period ground motions as well as strong motions and tsunami for the anticipated Nankai Trough earthquakes. The long-period ground motions are evaluated by the finite difference method (FDM) using 'characterized source models' and a 3-D underground structure model. A 'characterized source model' is a source model that includes the source parameters necessary for reproducing the strong ground motions; the parameters are determined based on a 'recipe' for predicting strong ground motion (Earthquake Research Committee (ERC), 2009). We construct various source models (~100 scenarios) covering various cases of source parameters such as source region, asperity configuration, and hypocenter location. Each source region is determined from 'the long-term evaluation of earthquakes in the Nankai Trough' published by the ERC. The asperity configuration and hypocenter location control the rupture directivity effects; these parameters are important because our preliminary simulations are strongly affected by rupture directivity. We simulate the seismic wave propagation with the GMS (Ground Motion Simulator) system, which is based on a 3-D FDM scheme using discontinuous grids (Aoi and Fujiwara, 1999). The grid spacing for the shallow region is 200 m and
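
    As a much-simplified illustration of the finite-difference time stepping that a 3-D code such as GMS performs, the sketch below propagates a 1-D SH wave on a staggered grid; the grid, source, and material values are arbitrary, and none of the 3-D structure, discontinuous grids, or absorbing boundaries of the actual simulations are included.

      import numpy as np

      def sh_wave_1d(nz=400, dz=200.0, nt=2000, dt=0.02, vs=3500.0, rho=2700.0):
          """1-D velocity-stress staggered-grid finite-difference SH-wave
          propagation; all values are arbitrary and no absorbing boundaries,
          discontinuous grids or 3-D structure are included."""
          mu = rho * vs ** 2
          v = np.zeros(nz)        # particle velocity
          s = np.zeros(nz - 1)    # shear stress on the staggered grid
          src = nz // 2
          t = (np.arange(nt) - 100) * dt
          wavelet = -t * np.exp(-(t / 1.0) ** 2)       # simple pulse source
          for it in range(nt):
              s += dt * mu * np.diff(v) / dz           # update stress from velocity
              v[1:-1] += dt * np.diff(s) / (dz * rho)  # update velocity from stress
              v[src] += wavelet[it]                    # inject the source
          return v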

  12. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    NASA Astrophysics Data System (ADS)

    Türker, Tuǧba; Bayrak, Yusuf

    2016-04-01

    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and lies in a region of very high seismic activity. The North Anatolian Fault Zone (NAFZ) has produced very large earthquakes from the past to the present. The aim of this study is to estimate the important parameters of the Gutenberg-Richter relationship (a and b values) and, taking these parameters into account, to examine earthquakes between the years 1900 and 2015 for 10 different seismic source regions in the NAFZ. We then estimate occurrence probabilities and return periods of earthquakes in the fault zone in the coming years and assess the earthquake hazard of the NAFZ with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period, and no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, MS=7.3 and 1897, MS=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 different seismic source regions, the relationships between cumulative number and magnitude are determined, from which the a and b parameters of the Gutenberg-Richter equation LogN = a - bM are estimated. A homogeneous earthquake catalog for MS magnitudes equal to or larger than 4.0 is used for the time period between 1900 and 2015. The catalog used in the study has been compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI). The earthquake data were obtained from KOERI and the ISC for 1900 to 1974 and from KOERI for 1974 to 2015. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 different seismic source regions. The highest earthquake occurrence probability in the coming years among the 10 seismic source regions is estimated for the Tokat-Erzincan region (Region 9), at 99%, with an earthquake
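
    The Poisson part of the calculation can be sketched as follows: an annual Gutenberg-Richter rate gives the expected number of events above a magnitude threshold, and the probability of at least one event in t years is 1 - exp(-rate*t); the a and b values in the example are hypothetical, not the regional values estimated in the study.

      import numpy as np

      def poisson_hazard(a, b, m, t_years):
          """Occurrence probability and return period for events with M >= m,
          given annual Gutenberg-Richter parameters (log10 N = a - b*M); the
          a, b values must already be normalized to one year."""
          rate = 10.0 ** (a - b * m)                # expected events per year
          return 1.0 - np.exp(-rate * t_years), 1.0 / rate

      # hypothetical regional parameters, not the values estimated in the study
      print(poisson_hazard(a=4.5, b=0.9, m=7.0, t_years=30))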

  13. Seismic noise in the shallow subsurface: Methods for using it in earthquake hazard assessment

    NASA Astrophysics Data System (ADS)

    Scott, James B.

    2007-12-01

    The primary focus of this work has been characterization of the shallow subsurface for seismic hazard using naturally occurring seismic noise. Three studies chronicle the further development of the refraction microtremor method for determining shear-wave velocity-depth structure, which is a predictor of earthquake shaking amplification. These studies present results from the first uses of the refraction microtremor method to determine earthquake hazard across entire urban basins. Improved field methods led to speed and efficiency in these deployments. These spatially dense geophysical measurements of shallow shear-wave velocity were conducted to broadly define shaking hazard and to determine the accuracy of earlier methods of prediction. The refraction microtremor method agrees well with borehole and other shear-velocity methods. In Chapter 2, I present results from the first long urban transect, 16 km across the Reno, Nevada basin. In 45 of the 55 (82%) measurements of shear velocity averaged to 30 m depth (Vs30) the result was above 360 m/s. The National Earthquake Hazards Reduction Program (NEHRP) defines Vs30 of 360 m/s as the boundary between site hazard class C and class D, with class C above 360 m/s. Mapped geologic and soil units are not accurate predictors of Vs30 on this transect, and would have predicted most of the transect as NEHRP-D. In Chapter 3, I present Vs30 results along a 13 km-long transect parallel to Las Vegas Blvd. (The Strip), along with borehole and surface-wave measurements of 30 additional sites. Again, our transect measurements correlate poorly against geologic map units, which do not predict Vs30 at any individual site with sufficient accuracy for engineering application. Two models to predict Vs30 were reported in this study. In Chapter 4, I present aggregate results from the Reno and Las Vegas transects and include results from our 60 km-long transect across the Los Angeles basin. Our statistical analyses suggest that the lateral
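
    For reference, the Vs30 used for the NEHRP site classes discussed above is the time-averaged shear-wave velocity over the top 30 m; a minimal sketch with an illustrative layered profile (assumed to reach at least 30 m depth) follows.

      def vs30(thicknesses_m, velocities_ms):
          """Time-averaged shear-wave velocity over the top 30 m,
          Vs30 = 30 / sum(h_i / v_i); the profile is assumed to reach
          at least 30 m depth (layer values below are illustrative)."""
          depth, travel_time = 0.0, 0.0
          for h, v in zip(thicknesses_m, velocities_ms):
              h_used = min(h, 30.0 - depth)
              travel_time += h_used / v
              depth += h_used
              if depth >= 30.0:
                  break
          return 30.0 / travel_time

      value = vs30([5.0, 10.0, 20.0], [200.0, 350.0, 600.0])
      print(value, "NEHRP C" if value > 360.0 else "NEHRP D")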

  14. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    NASA Astrophysics Data System (ADS)

    Wyss, Max

    2013-04-01

    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven who had participated in an emergency meeting on March 30 assessing the probability that a major event would follow the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of retirement pension, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists who were not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings about seismology; they must be carefully prepared by experts. The more significant lesson is that the approach of calming the population, and the standard probabilistic hazard and risk assessment as practiced by GSHAP, are misleading. The latter has been criticized as

  15. Unified Scaling Law for Earthquakes: Seismic hazard and risk assessment for Himalayas, Lake Baikal, and Central China regions

    NASA Astrophysics Data System (ADS)

    Nekrasova, Anastasia; Kossobokov, Vladimir; Parvez, Imtiyaz; Tao, Xiaxin

    2015-04-01

    The Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter recurrence relation, has evident implications, since any estimate of seismic hazard depends on the size of the territory used for investigation, averaging, and extrapolation into the future. Therefore, the hazard may differ dramatically when scaled down from the enveloping area of investigation to the proportion of the area of interest (e.g., the territory occupied by a city). In fact, given the observed patterns of distributed seismic activity, the results of the multi-scale analysis embedded in the USLE approach demonstrate that traditional estimates of seismic hazard and risk for cities and urban agglomerations are usually too low. Moreover, the USLE approach provides a significant improvement when compared to the results of probabilistic seismic hazard analysis, e.g., the maps resulting from the Global Seismic Hazard Assessment Project (GSHAP). We apply the USLE approach to evaluating seismic hazard and risk to the population of three territories of different size, representing one sub-continental and two different regional scales of analysis: the Himalayas and surroundings, Lake Baikal, and Central China regions.
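
    As commonly quoted in the USLE literature, the law takes the form log10 N = A + B(5 - M) + C log10 L for the expected annual number of events of magnitude M or larger in an area of linear size L; the sketch below encodes that form, with the caveat that this is our paraphrase and the region-specific coefficients A, B, C are not given in the abstract.

      import math

      def usle_rate(M, L_km, A, B, C):
          """Expected annual number of earthquakes of magnitude M or larger
          within an area of linear dimension L (km), using the commonly
          quoted USLE form; A, B, C are region-specific coefficients that
          the abstract does not provide."""
          return 10.0 ** (A + B * (5.0 - M) + C * math.log10(L_km))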

  16. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto

    2016-04-01

    The characteristic earthquake hypothesis is not strongly supported by observational data because of the relatively short duration of historical and even paleoseismological records. For instance, for the Calabria (Southern Italy) region, historical information on strong earthquakes exists for at least two thousand years, but it can be considered complete for M > 6.0 only for the last few centuries. As a consequence, characteristic earthquakes are seldom reported for individual fault segments, and hazard cannot be reliably estimated from only the minor seismicity reported in the historical catalogs. Even if they cannot substitute for the information contained in a good historical catalog, physics-based earthquake simulators have become popular in the recent literature, and their application has been justified for a number of reasons. In particular, earthquake simulators can provide interesting information on which renewal models better describe the recurrence statistics, and on how this is affected by features such as local fault geometry and kinematics. The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 100,000 events of magnitude ≥ 4.5. The algorithm on which this simulator is based is constrained by several physical elements, such as an average slip rate due to tectonic loading for every single segment in the investigated fault system, the process of rupture growth and termination, and interaction between earthquake sources, including small magnitude events. Events nucleated in one segment are allowed to expand into neighboring segments if they are separated by no more than a given maximum distance. The application of our simulation algorithm to the Calabria region reproduces typical features in the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term periodicity of strong earthquakes, short

  17. A new earthquake catalogue for seismic hazard assessment of the NPP (Nuclear Power Plant) Jaslovske Bohunice, Slovakia, site

    NASA Astrophysics Data System (ADS)

    Kysel, Robert; Kristek, Jozef; Moczo, Peter; Csicsay, Kristian; Cipciar, Andrej; Srbecky, Miroslav

    2014-05-01

    According to the IAEA (International Atomic Energy Agency) Safety Guide No. SSG-9, an earthquake catalogue should comprise all information on pre-historical, historical and seismometrically recorded earthquakes in the region, which should cover a geographic area not smaller than a circle with a radius of 300 km around the site. Jaslovske Bohunice is an important economic site. Several nuclear facilities are located in Jaslovske Bohunice, either in operation (NPP V2, the national radioactive waste repository) or in decommissioning (NPP A1, NPP V1). Moreover, a new reactor unit is being planned for the site. The Jaslovske Bohunice site is not far from the Dobra Voda seismic source zone, which has been the most active seismic zone in the territory of Slovakia since the beginning of the 20th century. Relatively small distances to Austria, Hungary, the Czech Republic and the Slovak capital Bratislava make the site a prominent priority in terms of seismic hazard assessment. We compiled a new earthquake catalogue for the NPP Jaslovske Bohunice region following the recommendations of the IAEA Safety Guide. The region includes parts of the territories of Slovakia, Hungary, Austria, the Czech Republic and Poland, and it partly extends to Germany, Slovenia, Croatia and Serbia. The catalogue is based on data from six national earthquake catalogues, two regional earthquake catalogues (ACORN, CENEC) and a catalogue from the local NPP network. The initially compiled catalogue for the time period 350-2011 consists of 9,142 events. We then homogenized and declustered the catalogue, and finally checked it for time completeness. For homogenization, we divided the catalogue into preseismometric (350-1900) and seismometric (1901-2011) periods. For earthquakes characterized by epicentral intensity and local magnitude we adopted the relations proposed for the homogenization of the CENEC catalogue (Grünthal et al. 2009). Instead of assuming the equivalency between local magnitudes reported by the

  18. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

    The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Hence, whether and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's is too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus, in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain." Thus society needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total cost of damage plus mitigation. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the
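
    The optimization described above can be sketched as choosing the mitigation level that minimizes mitigation cost plus expected loss; all numbers below are hypothetical placeholders, not values from the study.

      import numpy as np

      def optimal_mitigation(levels, mitigation_cost, expected_loss):
          """Pick the mitigation level minimizing total expected cost
          (mitigation cost + expected damage); all inputs are hypothetical."""
          total = np.asarray(mitigation_cost, dtype=float) + np.asarray(expected_loss, dtype=float)
          i = int(np.argmin(total))
          return levels[i], total[i]

      # e.g. seawall heights (m), construction costs and the expected losses
      # they leave, in arbitrary units (illustrative numbers only)
      print(optimal_mitigation([4, 6, 8, 10, 12], [1, 2, 4, 8, 16], [30, 12, 6, 4, 3]))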

  19. Implications for prediction and hazard assessment from the 2004 Parkfield earthquake.

    PubMed

    Bakun, W H; Aagaard, B; Dost, B; Ellsworth, W L; Hardebeck, J L; Harris, R A; Ji, C; Johnston, M J S; Langbein, J; Lienkaemper, J J; Michael, A J; Murray, J R; Nadeau, R M; Reasenberg, P A; Reichle, M S; Roeloffs, E A; Shakal, A; Simpson, R W; Waldhauser, F

    2005-10-13

    Obtaining high-quality measurements close to a large earthquake is not easy: one has to be in the right place at the right time with the right instruments. Such a convergence happened, for the first time, when the 28 September 2004 Parkfield, California, earthquake occurred on the San Andreas fault in the middle of a dense network of instruments designed to record it. The resulting data reveal aspects of the earthquake process never before seen. Here we show what these data, when combined with data from earlier Parkfield earthquakes, tell us about earthquake physics and earthquake prediction. The 2004 Parkfield earthquake, with its lack of obvious precursors, demonstrates that reliable short-term earthquake prediction still is not achievable. To reduce the societal impact of earthquakes now, we should focus on developing the next generation of models that can provide better predictions of the strength and location of damaging ground shaking.

  20. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  1. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  2. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  3. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  4. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  5. Vulnerability assessment of a port and harbor community to earthquake and tsunami hazards: Integrating technical expert and stakeholder input

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

    Research suggests that the Pacific Northwest could experience catastrophic earthquakes and tsunamis in the near future, posing a significant threat to the numerous ports and harbors along the coast. A collaborative, multiagency initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to these hazards, involving Oregon Sea Grant, Washington Sea Grant, the National Oceanic and Atmospheric Administration Coastal Services Center, and the U.S. Geological Survey Center for Science Policy. One element of this research, planning, and outreach initiative is a natural hazard mitigation and emergency preparedness planning process that combines technical expertise with local stakeholder values and perceptions. This paper summarizes and examines one component of the process, the vulnerability assessment methodology, used in the pilot port and harbor community of Yaquina River, Oregon, as a case study of assessing vulnerability at the local level. In this community, stakeholders were most concerned with potential life loss and other nonstructural vulnerability issues, such as inadequate hazard awareness, communication, and response logistics, rather than structural issues, such as damage to specific buildings or infrastructure.

  6. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  7. Volunteers in the earthquake hazard reduction program

    USGS Publications Warehouse

    Ward, P.L.

    1978-01-01

    With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.

  8. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life; therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated for the same return periods. The contour maps show that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).
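
    The logic-tree combination mentioned above amounts, in its simplest form, to a weighted average of the hazard curves from the alternative branches; the sketch below shows that step only, with the branch curves and weights left as user inputs rather than values from the Cairo study.

      import numpy as np

      def combine_hazard_curves(branch_curves, weights):
          """Weighted mean hazard curve over logic-tree branches.
          branch_curves: array (n_branches, n_acceleration_levels) of annual
          exceedance probabilities from each seismotectonic-model/GMPE branch;
          weights: branch weights summing to 1 (all inputs are user-supplied,
          none are values from the Cairo study)."""
          return np.asarray(weights, dtype=float) @ np.asarray(branch_curves, dtype=float)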

  9. Foreshocks and short-term hazard assessment of large earthquakes using complex networks: the case of the 2009 L'Aquila earthquake

    NASA Astrophysics Data System (ADS)

    Daskalaki, Eleni; Spiliotis, Konstantinos; Siettos, Constantinos; Minadakis, Georgios; Papadopoulos, Gerassimos A.

    2016-08-01

    The monitoring of statistical network properties could be useful for the short-term hazard assessment of the occurrence of mainshocks in the presence of foreshocks. Using successive connections between events acquired from the earthquake catalog of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) for the case of the L'Aquila (Italy) mainshock (Mw = 6.3) of 6 April 2009, we provide evidence that network measures, both global (average clustering coefficient, small-world index) and local (betweenness centrality) ones, could potentially be exploited for forecasting purposes both in time and space. Our results reveal statistically significant increases in the topological measures and a nucleation of the betweenness centrality around the location of the epicenter about 2 months before the mainshock. The results of the analysis are robust even when considering either large or off-centered space windows around the main event.
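
    A minimal sketch of building such an event network and computing one global and one local measure is given below; the gridding of epicenters into cells and the rule of linking cells of consecutive events are our assumptions, since the abstract does not fully specify the connection rule.

      import networkx as nx

      def seismicity_network(lats, lons, cell=0.1):
          """Grid cells are nodes; two cells are linked when consecutive events
          in the catalog occur in them. Returns one global and one local measure."""
          nodes = [(round(la / cell), round(lo / cell)) for la, lo in zip(lats, lons)]
          g = nx.Graph()
          g.add_edges_from(zip(nodes[:-1], nodes[1:]))      # successive connections
          g.remove_edges_from(nx.selfloop_edges(g))         # drop same-cell repeats
          return nx.average_clustering(g), nx.betweenness_centrality(g)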

  10. Multicomponent Body and Surface Wave Seismic Analysis using an Urban Land Streamer System: An Integrative Earthquake Hazards Assessment Approach

    NASA Astrophysics Data System (ADS)

    Gribler, G.; Liberty, L. M.

    2014-12-01

    We present earthquake site response results from a 48-channel multicomponent seismic land streamer and large weight drop system. We acquired data along a grid of city streets in western Idaho at a rate of a few km per day, deriving shear wave velocity profiles to a depth of 40-50 m by incorporating vertical and radial geophone signals to capture the complete elliptical Rayleigh wave motion. We also obtained robust p-wave reflection and refraction results by capturing the returned signals that arrive at non-vertical incidence angles as a result of the high-velocity road surface layer. By integrating the derived shear wave velocity profiles with p-wave reflection results, we include depositional and tectonic boundaries from the upper few hundred meters in our analysis to help assess whether ground motions may be amplified by shallow bedrock. By including p-wave refraction information in the analysis, we can identify zones of high liquefaction potential by comparing shear wave and p-wave velocity (Vp/Vs) measurements relative to refraction-derived water table depths. The use of multicomponent land streamer data improves signal-to-noise levels over single component data with no additional field effort. The added multicomponent data processing step can be as simple as calculating the magnitude of the vector for surface wave and refraction arrivals or rotating the reflected signals to the maximum emergence angle based on near-surface p-wave velocity information. We show example data from a number of Idaho communities where historical earthquakes have been recorded. We also present numerical models and systematic field tests that show the effects of a high velocity road surface layer on surface and body wave measurements. We conclude that multicomponent seismic information derived from seismic land streamers can provide a significant improvement in earthquake hazard assessment over a standard single component approach with only a small addition in
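
    The two simple multicomponent operations mentioned above (vector magnitude of the vertical and radial signals, and rotation by an emergence angle) can be sketched as follows; the emergence angle is assumed to be supplied by the user, for example from the near-surface P-wave velocity.

      import numpy as np

      def total_and_rotated(vertical, radial, emergence_angle_deg):
          """(1) magnitude of the particle-motion vector from the vertical and
          radial signals, and (2) rotation of the two components by an
          emergence angle so one trace is aligned with the incoming ray."""
          v = np.asarray(vertical, dtype=float)
          r = np.asarray(radial, dtype=float)
          magnitude = np.hypot(v, r)
          theta = np.radians(emergence_angle_deg)
          ray = v * np.cos(theta) + r * np.sin(theta)          # along-ray component
          transverse = -v * np.sin(theta) + r * np.cos(theta)  # orthogonal component
          return magnitude, ray, transverse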

  11. 77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... needs for existing buildings, to review the National Earthquake Hazards Reduction Program (NEHRP)...

  12. 76 FR 64325 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... Directive/PPD-8: National Preparedness to National Earthquake Hazards Reduction Program (NEHRP)...

  13. 76 FR 18165 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. ] SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... be sent to National Earthquake Hazards Reduction Program Director, National Institute of...

  14. 77 FR 18792 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  15. 77 FR 19224 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  16. 77 FR 27439 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  17. 75 FR 8042 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-23

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a.... Jack Hayes, National Earthquake Hazards Reduction Program Director, National Institute of Standards...

  18. 75 FR 75457 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-03

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  19. 75 FR 18787 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... Earthquake Hazards Reduction Program Director, National Institute of Standards and Technology, 100...

  20. Historical Earthquakes As Examples To Assess The Seismic Hazard In The Eastern Region of Venezuela

    NASA Astrophysics Data System (ADS)

    Martin, J.; Posadas, A.; Avendaño, J.; Sierra, R.; Bonive, F.

    The North-East region of Venezuela lies on the border of the friction zone between the Caribbean and South-American tectonic plates, a source of great seismicity. The first written account of an earthquake in the American continent is that of the earthquake of September 1530, which caused damage to Cumaná, the first town of that continent. Since then a continuous series of earthquakes has been reported, many of them with damaging effects on Cumaná; the effects of the 1929 earthquake (17-01-1929; intensity IX on the Mercalli scale) were well described by Sidney Paige in Vol. 20 of the B.S.S.A., March 1930. An earthquake of magnitude 5.9 (11-06-1986; 10.26° N, 63.29° W) was the trigger for UNESCO's intention to declare the Estado Sucre a pilot zone for seismological studies. In 1991 a report issued by the International Institute of Earthquake Prediction Theory and Mathematical Geophysics (Academy of Sciences, U.S.S.R.) stated that the occurrence of an earthquake of great magnitude which could affect the North-East region of Venezuela was possible. Other studies of the seismicity of the region have been carried out. The interest of the authorities and of the seismologists reached a peak with the earthquake of July 1997 (10.456° N, 63.555° W), with a magnitude of 6.9; there was a death toll of 73, around 528 people injured, and more than 2000 houses that needed to be completely rebuilt. A study of microzonification of the city of Cumaná has been carried out recently, and the results of this study will also be presented at this Congress.

  1. Applications of Multi-Cycle Earthquake Simulations to Earthquake Hazard

    NASA Astrophysics Data System (ADS)

    Gilchrist, Jacquelyn Joan

    This dissertation seeks to contribute to earthquake hazard analyses and forecasting by conducting a detailed study of the processes controlling the occurrence, and particularly the clustering, of large earthquakes, the probabilities of these large events, and the dynamics of their ruptures. We use the multi-cycle earthquake simulator RSQSim to investigate several fundamental aspects of earthquake occurrence in order to improve the understanding of earthquake hazard. RSQSim, a 3D, boundary element code that incorporates rate- and state-friction to simulate earthquakes in fully interacting, complex fault systems has been successful at modeling several aspects of fault slip and earthquake occurrence. Multi-event earthquake models with time-dependent nucleation based on rate- and state-dependent friction, such as RSQSim, provide a viable physics-based method for modeling earthquake processes. These models can provide a better understanding of earthquake hazard by improving our knowledge of earthquake processes and probabilities. RSQSim is fast and efficient, and therefore is able to simulate very long sequences of earthquakes (from hundreds of thousands to millions of events). This makes RSQSim an ideal instrument for filling in the current gaps in earthquake data, from short and incomplete earthquake catalogs to unrealistic initial conditions used for dynamic rupture models. RSQSim catalogs include foreshocks, aftershocks, and occasional clusters of large earthquakes, the statistics of which are important for the estimation of earthquake probabilities. Additionally, RSQSim finds a near optimal nucleation location that enables ruptures to propagate at minimal stress conditions and thus can provide suites of heterogeneous initial conditions for dynamic rupture models that produce reduced ground motions compared to models with homogeneous initial stresses and arbitrary forced nucleation locations.

  2. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    SciTech Connect

    Wong, I.G.; Green, R.K.; Sun, J.I.; Pezzopane, S.K.; Abrahamson, N.A.; Quittmeyer, R.C.

    1996-12-31

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (M_w) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to M_w 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to M_w 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.

  3. Estimation of fault propagation distance from fold shape: Implications for earthquake hazard assessment

    NASA Astrophysics Data System (ADS)

    Allmendinger, Richard W.; Shaw, John H.

    2000-12-01

    A numerical grid search using the trishear kinematic model can be used to extract both slip and the distance that a fault tip line has propagated during growth of a fault-propagation fold. The propagation distance defines the initial position of the tip line at the onset of slip. In the Santa Fe Springs anticline of the Los Angeles basin, we show that the tip line of the underlying Puente Hills thrust fault initiated at the same position as the 1987 magnitude 6.0 Whittier Narrows earthquake.

  4. Preliminary Earthquake Hazard Map of Afghanistan

    USGS Publications Warehouse

    Boyd, Oliver S.; Mueller, Charles S.; Rukstales, Kenneth S.

    2007-01-01

    Introduction Earthquakes represent a serious threat to the people and institutions of Afghanistan. As part of a United States Agency for International Development (USAID) effort to assess the resource potential and seismic hazards of Afghanistan, the Seismic Hazard Mapping group of the United States Geological Survey (USGS) has prepared a series of probabilistic seismic hazard maps that help quantify the expected frequency and strength of ground shaking nationwide. To construct the maps, we do a complete hazard analysis for each of ~35,000 sites in the study area. We use a probabilistic methodology that accounts for all potential seismic sources and their rates of earthquake activity, and we incorporate modeling uncertainty by using logic trees for source and ground-motion parameters. See the Appendix for an explanation of probabilistic seismic hazard analysis and discussion of seismic risk. Afghanistan occupies a southward-projecting, relatively stable promontory of the Eurasian tectonic plate (Ambraseys and Bilham, 2003; Wheeler and others, 2005). Active plate boundaries, however, surround Afghanistan on the west, south, and east. To the west, the Arabian plate moves northward relative to Eurasia at about 3 cm/yr. The active plate boundary trends northwestward through the Zagros region of southwestern Iran. Deformation is accommodated throughout the territory of Iran; major structures include several north-south-trending, right-lateral strike-slip fault systems in the east and, farther to the north, a series of east-west-trending reverse- and strike-slip faults. This deformation apparently does not cross the border into relatively stable western Afghanistan. In the east, the Indian plate moves northward relative to Eurasia at a rate of about 4 cm/yr. A broad, transpressional plate-boundary zone extends into eastern Afghanistan, trending southwestward from the Hindu Kush in northeast Afghanistan, through Kabul, and along the Afghanistan-Pakistan border

  5. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky and Portsmouth, Ohio

    SciTech Connect

    1997-03-01

    Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, and general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering, respectively, and also the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites, and performed alternative calculations to determine sensitivities of seismic hazard results to various assumptions and models in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate values for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah.

  6. Space-time behavior of continental intraplate earthquakes and implications for hazard assessment in China and the Central U.S.

    NASA Astrophysics Data System (ADS)

    Stein, Seth; Liu, Mian; Luo, Gang; Wang, Hui

    2014-05-01

    much faster than it accumulates today, suggesting that they result from recent fault activation that releases prestored strain energy in the crust. If so, this earthquake sequence is similar to aftershocks in that the rates of energy release should decay with time and the sequence of earthquakes will eventually end. We use simple physical analysis and numerical simulations to show that the current New Madrid earthquake sequence is likely ending or has ended. Recognizing that mid-continental earthquakes have long aftershock sequences and complex spatiotemporal occurrences is critical to improve hazard assessments

  7. Fragility analysis of flood protection structures in earthquake and flood prone areas around Cologne, Germany for multi-hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Tyagunov, Sergey; Vorogushyn, Sergiy; Munoz Jimenez, Cristina; Parolai, Stefano; Fleming, Kevin; Merz, Bruno; Zschau, Jochen

    2013-04-01

    The work presents a methodology for fragility analyses of fluvial earthen dikes in earthquake and flood prone areas. Fragility estimates are being integrated into the multi-hazard (earthquake-flood) risk analysis being undertaken within the framework of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) for the city of Cologne, Germany. Scenarios of probable cascading events due to the earthquake-triggered failure of flood protection dikes and the subsequent inundation of surroundings are analyzed for the area between the gauges Andernach and Düsseldorf along the Rhine River. Along this river stretch, urban areas are partly protected by earthen dikes, which may be prone to failure during exceptional floods and/or earthquakes. The seismic fragility of the dikes is considered in terms of liquefaction potential (factor of safety), estimated by the use of the simplified procedure of Seed and Idriss. It is assumed that initiation of liquefaction at any point throughout the earthen dikes' body corresponds to the failure of the dike and, therefore, this should be taken into account for the flood risk calculations. The estimated damage potential of such structures is presented as a two-dimensional surface (as a function of seismic hazard and water level). Uncertainties in geometrical and geotechnical dike parameters are considered within the framework of Monte Carlo simulations. Taking into consideration the spatial configuration of the existing flood protection system within the area under consideration, seismic hazard curves (in terms of PGA) are calculated for sites along the river segment of interest at intervals of 1 km. The obtained estimates are used to calculate the flood risk when considering the temporal coincidence of seismic and flood events. Changes in flood risk for the considered hazard cascade scenarios are quantified and compared to the single-hazard scenarios.
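
    To make the liquefaction check concrete, the following is a minimal sketch in the style of the simplified procedure cited above (cyclic stress ratio compared against cyclic resistance). The depth-reduction approximation, stresses, peak acceleration, and CRR value are illustrative assumptions, not parameters from the MATRIX dike study.

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Cyclic stress ratio (CSR) in the style of the Seed-Idriss simplified
    procedure: a_max_g is peak ground acceleration as a fraction of g;
    sigma_v and sigma_v_eff are total and effective vertical stress (kPa)."""
    # One common depth-reduction approximation (several exist in the literature).
    if depth_m <= 9.15:
        r_d = 1.0 - 0.00765 * depth_m
    else:
        r_d = 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

def factor_of_safety(crr, csr):
    """FS = CRR / CSR; FS < 1 is taken here as the onset of liquefaction
    (and hence dike failure, following the assumption in the abstract)."""
    return crr / csr

# Hypothetical point inside a dike: 5 m depth, saturated sandy fill.
csr = cyclic_stress_ratio(a_max_g=0.25, sigma_v=90.0, sigma_v_eff=55.0, depth_m=5.0)
print(factor_of_safety(crr=0.20, csr=csr))  # < 1 here, i.e. liquefaction predicted
```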

  8. Quantifying the Seismic Hazard From Natural and Induced Earthquakes (Invited)

    NASA Astrophysics Data System (ADS)

    Rubinstein, J. L.; Llenos, A. L.; Ellsworth, W. L.; McGarr, A.; Michael, A. J.; Mueller, C. S.; Petersen, M. D.

    2013-12-01

    In the past 12 years, seismicity rates in portions of the central and eastern United States (CEUS) have increased. In 2011, the year of peak activity, three M ≥ 5 earthquakes occurred, causing millions of dollars in damage. Much of the increase in seismicity is believed to have been induced by wastewater from oil and gas activity that is injected deep underground. This includes damaging earthquakes in southern Colorado, central Arkansas, and central Oklahoma in 2011. Earthquakes related to oil and gas activities contribute significantly to the total seismic hazard in some areas of the CEUS, but most of the tens of thousands of wastewater disposal wells in the CEUS do not cause damaging earthquakes. The challenge is to better understand this contribution to the hazard in a realistic way for those wells that are inducing earthquakes or wells that may induce earthquakes in the future. We propose a logic-tree approach to estimate the hazard posed by the change in seismicity that deemphasizes the need to evaluate whether the seismicity is natural or man-made. We first compile a list of areas of increased seismicity, including areas of known induced earthquakes. Using areas of increased seismicity (instead of just induced earthquakes) allows us to assess the hazard over a broader region, avoiding the often-difficult task of judging whether an earthquake sequence is induced. With the zones of increased seismicity defined, we then estimate the earthquake hazard for each zone using a four-branch logic tree: (1) The increased seismicity rate is natural, short-term variation within the longer-term background seismicity rate. Thus, these earthquakes would be added to the catalog when computing the background seismicity rate. (2) The increased seismicity rate represents a new and permanent addition to the background seismicity. In this branch, a new background seismicity rate begins at the time of the change in earthquake rate. (3) Induced earthquakes account for the
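
    The branch rates and weights below are invented; the sketch only illustrates the mechanics of the logic-tree idea described above, i.e., combining the hazard implied by each branch into a weighted mean exceedance rate and a Poissonian exceedance probability.

```python
import numpy as np

# Hypothetical annual exceedance rates of one ground-motion level at one site,
# one value per logic-tree branch, with weights that must sum to 1.
branch_rates = np.array([1e-3, 2e-3, 4e-3, 3e-3])
branch_weights = np.array([0.3, 0.3, 0.2, 0.2])

# Weighted mean hazard over the branches.
mean_rate = np.sum(branch_weights * branch_rates)

# Probability of at least one exceedance in a 50-year exposure (Poisson model).
p_50yr = 1.0 - np.exp(-mean_rate * 50.0)
print(mean_rate, p_50yr)
```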

  9. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  10. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy of countermeasures against earthquake disasters, including earthquake hazard evaluations. Before March 11, the Japanese earthquake hazard evaluation was based on the history of earthquakes that repeatedly occur and the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using the renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of damage from a hypothetical Mw9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with the current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using the present techniques, based on the diversity of earthquake phenomena. These reports created sensations throughout the country, and local governments are struggling to prepare countermeasures. The reports comment on the large uncertainty in their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including the authors, are involved in

  11. Simulation-Based Hazard Assessment for Long-Period Ground Motions of the Nankai Trough Megathrust Earthquake

    NASA Astrophysics Data System (ADS)

    Maeda, T.; Morikawa, N.; Iwaki, A.; Aoi, S.; Fujiwara, H.

    2014-12-01

    We evaluate the long-period ground motion hazard for Nankai Trough earthquakes (M8~9) in west Japan. Past large earthquakes in the Nankai Trough, which have occurred at intervals of 100~200 years, showed various occurrence patterns and caused serious damage due to strong ground motion and tsunami. In addition, such large interplate earthquakes can cause damage from long-period ground motion even in distant basins. For evaluating the long-period ground motion of large earthquakes, it is important to take into account the uncertainty of the source model and the effect of the 3-D underground structure. In this study, we evaluate the long-period ground motion by the finite difference method (FDM) using "characterized source models" and the 3-D underground structure model. We construct various characterized source models (369 scenarios). Although most parameters of the model are determined based on the "recipe" for predicting strong ground motion, we assume various possible source parameters including rupture area, asperity configuration, and hypocenter location. To perform the large-scale simulation for many source models, we apply a 3-D FDM scheme using discontinuous grids and utilize GPGPU computing for our simulation. We use the system called GMS (Ground Motion Simulator) for the FD simulation. The grid spacing for the shallow region is 200 m horizontally and 100 m vertically. The grid spacing for the deep region is three times coarser. The total number of grid points is about 3.2 billion, about one-eighth of what uniform grids would require. We use GMS adapted for multi-GPU simulation on the supercomputer TSUBAME operated by Tokyo Institute of Technology. Simulated peak ground velocity (PGV) and velocity response spectra (Sv) are strongly affected by the hypocenter location and show a large variation, up to 10-fold at each site, even in a group that has the same source area. We evaluate hazard curves and maps for PGV and Sv using the

  12. Earthquake hazards on the cascadia subduction zone

    SciTech Connect

    Heaton, T.H.; Hartzell, S.H.

    1987-04-10

    Large subduction earthquakes on the Cascadia subduction zone pose a potential seismic hazard. Very young oceanic lithosphere (10 million years old) is being subducted beneath North America at a rate of approximately 4 centimeters per year. The Cascadia subduction zone shares many characteristics with subduction zones in southern Chile, southwestern Japan, and Colombia, where comparably young oceanic lithosphere is also subducting. Very large subduction earthquakes, ranging in energy magnitude (M/sub w/) between 8 and 9.5, have occurred along these other subduction zones. If the Cascadia subduction zone is also storing elastic energy, a sequence of several great earthquakes (M/sub w/ 8) or a giant earthquake (M/sub w/ 9) would be necessary to fill this 1200-kilometer gap. The nature of strong ground motions recorded during subduction earthquakes of M/sub w/ less than 8.2 is discussed. Strong ground motions from even larger earthquakes (M/sub w/ up to 9.5) are estimated by simple simulations. If large subduction earthquakes occur in the Pacific Northwest, relatively strong shaking can be expected over a large region. Such earthquakes may also be accompanied by large local tsunamis. 35 references, 6 figures.

  13. Earthquake hazards on the cascadia subduction zone.

    PubMed

    Heaton, T H; Hartzell, S H

    1987-04-10

    Large subduction earthquakes on the Cascadia subduction zone pose a potential seismic hazard. Very young oceanic lithosphere (10 million years old) is being subducted beneath North America at a rate of approximately 4 centimeters per year. The Cascadia subduction zone shares many characteristics with subduction zones in southern Chile, southwestern Japan, and Colombia, where comparably young oceanic lithosphere is also subducting. Very large subduction earthquakes, ranging in energy magnitude (M(w)) between 8 and 9.5, have occurred along these other subduction zones. If the Cascadia subduction zone is also storing elastic energy, a sequence of several great earthquakes (M(w) 8) or a giant earthquake (M(w) 9) would be necessary to fill this 1200-kilometer gap. The nature of strong ground motions recorded during subduction earthquakes of M(w) less than 8.2 is discussed. Strong ground motions from even larger earthquakes (M(w) up to 9.5) are estimated by simple simulations. If large subduction earthquakes occur in the Pacific Northwest, relatively strong shaking can be expected over a large region. Such earthquakes may also be accompanied by large local tsunamis.

  14. Seismic hazard and seismic risk assessment based on the unified scaling law for earthquakes: Himalayas and adjacent regions

    NASA Astrophysics Data System (ADS)

    Nekrasova, A. K.; Kossobokov, V. G.; Parvez, I. A.

    2015-03-01

    For the Himalayas and neighboring regions, the maps of seismic hazard and seismic risk are constructed with the use of the estimates for the parameters of the unified scaling law for earthquakes (USLE), in which the Gutenberg-Richter law for magnitude distribution of seismic events within a given area is applied in the modified version with allowance for linear dimensions of the area, namely, log N( M, L) = A + B (5 - M) + C log L, where N( M, L) is the expected annual number of the earthquakes with magnitude M in the area with linear dimension L. The spatial variations in the parameters A, B, and C for the Himalayas and adjacent regions are studied on two time intervals from 1965 to 2011 and from 1980 to 2011. The difference in A, B, and C between these two time intervals indicates that seismic activity experiences significant variations on a scale of a few decades. With a global consideration of the seismic belts of the Earth overall, the estimates of coefficient A, which determines the logarithm of the annual average frequency of the earthquakes with a magnitude of 5.0 and higher in the zone with a linear dimension of 1 degree of the Earth's meridian, differ by a factor of 30 and more and mainly fall in the interval from -1.1 to 0.5. The values of coefficient B, which describes the balance between the number of earthquakes with different magnitudes, gravitate to 0.9 and range from less than 0.6 to 1.1 and higher. The values of coefficient C, which estimates the fractal dimension of the local distribution of epicenters, vary from 0.5 to 1.4 and higher. In the Himalayas and neighboring regions, the USLE coefficients mainly fall in the intervals of -1.1 to 0.3 for A, 0.8 to 1.3 for B, and 1.0 to 1.4 for C. The calculations of the local value of the expected peak ground acceleration (PGA) from the maximal expected magnitude provided the necessary basis for mapping the seismic hazards in the studied region. When doing this, we used the local estimates of the
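
    To make the USLE relation quoted above concrete, the sketch below evaluates log10 N(M, L) = A + B(5 - M) + C log10 L for one hypothetical coefficient set chosen inside the ranges reported for the Himalayan region; the numbers are illustrative only.

```python
import math

def usle_annual_rate(A, B, C, magnitude, L_degrees):
    """Expected annual number of earthquakes of magnitude `magnitude` (and
    above, depending on the convention used) in an area of linear dimension
    L_degrees, from log10 N(M, L) = A + B * (5 - M) + C * log10(L)."""
    log_n = A + B * (5.0 - magnitude) + C * math.log10(L_degrees)
    return 10.0 ** log_n

# Hypothetical coefficients within the quoted Himalayan ranges.
n_m6 = usle_annual_rate(A=-0.5, B=1.0, C=1.2, magnitude=6.0, L_degrees=2.0)
print(f"expected M 6 events per year: {n_m6:.3f} (about one every {1.0 / n_m6:.0f} yr)")
```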

  15. 76 FR 72905 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold a... should be sent to National Earthquake Hazards Reduction Program Director, National Institute of...

  16. 76 FR 8712 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... Effectiveness of the National Earthquake Hazards Reduction Program (NEHRP). The agenda may change to...

  17. Increased Earthquake Rates in the Central and Eastern US Portend Higher Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Rubinstein, J. L.; Ellsworth, W. L.; Mueller, C. S.; Michael, A. J.; McGarr, A.; Petersen, M. D.; Weingarten, M.; Holland, A. A.

    2014-12-01

    Since 2009 the central and eastern United States has experienced an unprecedented increase in the rate of M≥3 earthquakes that is unlikely to be due to natural variation. Where the rates have increased so has the seismic hazard, making it important to understand these changes. Areas with significant seismicity increases are limited to areas where oil and gas production take place. By far the largest contributor to the seismicity increase is Oklahoma, where recent studies suggest that these rate changes may be due to fluid injection (e.g., Keranen et al., Geology, 2013; Science, 2014). Moreover, the area of increased seismicity in northern Oklahoma that began in 2013 coincides with the Mississippi Lime play, where well completions greatly increased the year before the seismicity increase. This suggests a link to oil and gas production either directly or from the disposal of significant amounts of produced water within the play. For the purpose of assessing the hazard due to these earthquakes, should they be treated differently from natural earthquakes? Previous studies suggest that induced seismicity may differ from natural seismicity in clustering characteristics or frequency-magnitude distributions (e.g., Bachmann et al., GJI, 2011; Llenos and Michael, BSSA, 2013). These differences could affect time-independent hazard computations, which typically assume that clustering and size distribution remain constant. In Oklahoma, as well as in other areas of suspected induced seismicity, we find that earthquakes since 2009 tend to be considerably more clustered in space and time than before 2009. However, differences between various regional and national catalogs leave it unclear whether there are significant changes in magnitude distribution. Whether they are due to natural or industrial causes, the increased earthquake rates in these areas could increase the hazard in ways that are not accounted for in current hazard assessment practice. Clearly the possibility of induced
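
    As a generic illustration of comparing frequency-magnitude distributions like those discussed above, the sketch below uses the Aki/Utsu maximum-likelihood b-value estimate; the sample catalog and completeness magnitude are invented.

```python
import numpy as np

def b_value_max_likelihood(magnitudes, m_complete, bin_width=0.1):
    """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter b-value
    for a catalog assumed complete above m_complete; the half-bin term
    corrects for magnitude binning."""
    m = np.asarray(magnitudes)
    m = m[m >= m_complete]
    return np.log10(np.e) / (m.mean() - (m_complete - bin_width / 2.0))

# Invented example catalog (magnitudes), evaluated at one completeness level.
catalog = [3.2, 3.5, 3.1, 4.0, 3.3, 3.8, 3.1, 3.6, 4.4, 3.2]
print(b_value_max_likelihood(catalog, m_complete=3.0))
```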

  18. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale

    NASA Astrophysics Data System (ADS)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.

    2016-05-01

    The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the framework of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In this case and in unpopulated areas, ESI offers a unique way to assess a reliable earthquake intensity. Finally, and importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  19. The 1909 Taipei earthquake: implication for seismic hazard in Taipei

    USGS Publications Warehouse

    Kanamori, Hiroo; Lee, William H.K.; Ma, Kuo-Fong

    2012-01-01

    The 1909 April 14 Taiwan earthquake caused significant damage in Taipei. Most of the information on this earthquake available until now is from the written reports on its macro-seismic effects and from seismic station bulletins. In view of the importance of this event for assessing the shaking hazard in the present-day Taipei, we collected historical seismograms and station bulletins of this event and investigated them in conjunction with other seismological data. We compared the observed seismograms with those from recent earthquakes in similar tectonic environments to characterize the 1909 earthquake. Despite the inevitably large uncertainties associated with old data, we conclude that the 1909 Taipei earthquake is a relatively deep (50–100 km) intraplate earthquake that occurred within the subducting Philippine Sea Plate beneath Taipei with an estimated M_W of 7 ± 0.3. Some intraplate events elsewhere in the world are enriched in high-frequency energy and the resulting ground motions can be very strong. Thus, despite its relatively large depth and a moderately large magnitude, it would be prudent to review the safety of the existing structures in Taipei against large intraplate earthquakes like the 1909 Taipei earthquake.

  20. Assessing earthquake hazards with fault trench and LiDAR maps in the Puget Lowland, Washington, USA (Invited)

    NASA Astrophysics Data System (ADS)

    Nelson, A. R.; Bradley, L.; Personius, S. F.; Johnson, S. Y.

    2010-12-01

    Deciphering the earthquake histories of faults over the past few thousands of years in tectonically complex forearc regions relies on detailed site-specific as well as regional geologic maps. Here we present examples of site-specific USGS maps used to reconstruct earthquake histories for faults in the Puget Lowland. Near-surface faults and folds in the Puget Lowland accommodate 4-7 mm/yr of north-south shortening resulting from northward migration of forearc blocks along the Cascadia convergent margin. The shortening has produced east-trending uplifts, basins, and associated reverse faults that traverse urban areas. Near the eastern and northern flanks of the Olympic Mountains, complex interactions between north-south shortening and mountain uplift are reflected by normal, oblique-slip, and reverse surface faults. Holocene oblique-slip movement has also been mapped on Whidbey Island and on faults in the foothills of the Cascade Mountains in the northeastern lowland. The close proximity of lowland faults to urban areas may pose a greater earthquake hazard there than do much longer but more distant plate-boundary faults. LiDAR imagery of the densely forested lowland flown over the past 12 years revealed many previously unknown 0.5-m to 6-m-high scarps showing Holocene movement on upper-plate faults. This imagery uses two-way traveltimes of laser light pulses to detect as little as 0.2 m of relative relief on the forest floor. The returns of laser pulses with the longest travel times yield digital elevation models of the ground surface, which we vertically exaggerate and digitally shade from multiple directions at variable transparencies to enhance identification of scarps. Our maps include imagery at scales of 1:40,000 to 1:2500 with contour spacings of 100 m to 0.5 m. Maps of the vertical walls of fault-scarp trenches show complex stratigraphies and structural relations used to decipher the histories of large surface-rupturing earthquakes. These logs (field mapping
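
    A minimal sketch of the multi-direction shaded-relief idea described above (vertical exaggeration, then blending hillshades from several illumination azimuths so that subtle scarps stand out); the synthetic DEM, azimuths, and exaggeration factor are arbitrary stand-ins, not the processing chain used for the USGS maps.

```python
import numpy as np

def hillshade(dem, cell_size, azimuth_deg, altitude_deg, z_factor=3.0):
    """Simple Lambertian hillshade of a DEM with vertical exaggeration
    (z_factor); azimuth/altitude give the synthetic illumination direction."""
    dz_dy, dz_dx = np.gradient(dem * z_factor, cell_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    az = np.radians(360.0 - azimuth_deg + 90.0)
    alt = np.radians(altitude_deg)
    shaded = (np.sin(alt) * np.cos(slope) +
              np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# Blend shades from four azimuths so scarps of any orientation are visible.
dem = np.random.default_rng(0).normal(0.0, 1.0, (200, 200)).cumsum(axis=0)
composite = np.mean(
    [hillshade(dem, 1.0, az, 45.0) for az in (315.0, 45.0, 135.0, 225.0)], axis=0)
```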

  1. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    NASA Astrophysics Data System (ADS)

    Takarada, S.

    2012-12-01

    The first Workshop of Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations, such as enhancing collaboration, sharing of resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked with developing the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high-quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which only represent a subset of possible future scenarios. Hence, different distributions from the previous deposits are mainly observed due to the differences in

  2. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L.

    1993-01-01

    INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation (likelihood of occurrence, location, and severity of potential hazards) and the three elements needed for effective transfer (delivery, assistance, and encouragement) are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and

  3. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
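
    For readers weighing the PSHA criticism above, the standard textbook form of the PSHA exceedance-rate calculation (not taken from this paper) is:

```latex
\lambda(\mathrm{IM} > x) \;=\; \sum_{i} \nu_i \iint
  P(\mathrm{IM} > x \mid m, r)\, f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}m\, \mathrm{d}r
```

    Here ν_i is the activity rate of source i, f_{M_i} and f_{R_i} are its magnitude and distance densities, and P(IM > x | m, r) comes from a ground-motion model; DSHA instead fixes a single controlling magnitude-distance scenario and drops the rate integration.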

  4. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-17

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet....m. The primary purpose of this meeting is to receive information on NEHRP earthquake...

  5. Earthquake Hazard and Risk in Alaska

    NASA Astrophysics Data System (ADS)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increased the number of crustal faults from ten in 2007 to 91 in the 2015 model. The additions include the western Denali, Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), and large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as much to the overall risk. We will review these recurrence rates and present the results and their impact for Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the

  6. Assessment of tsunami hazard to the U.S. East Coast using relationships between submarine landslides and earthquakes

    USGS Publications Warehouse

    ten Brink, U.S.; Lee, H.J.; Geist, E.L.; Twichell, D.

    2009-01-01

    Submarine landslides along the continental slope of the U.S. Atlantic margin are potential sources for tsunamis along the U.S. East coast. The magnitude of potential tsunamis depends on the volume and location of the landslides, and tsunami frequency depends on their recurrence interval. However, the size and recurrence interval of submarine landslides along the U.S. Atlantic margin are poorly known. Well-studied landslide-generated tsunamis in other parts of the world have been shown to be associated with earthquakes. Because the size distribution and recurrence interval of earthquakes are generally better known than those of submarine landslides, we propose here to estimate the size and recurrence interval of submarine landslides from the size and recurrence interval of earthquakes in the vicinity of these landslides. To do so, we calculate the maximum expected landslide size for a given earthquake magnitude, use the recurrence interval of earthquakes to estimate the recurrence interval of landslides, and assume a threshold landslide size that can generate a destructive tsunami. The maximum expected landslide size for a given earthquake magnitude is calculated in three ways: by slope stability analysis for catastrophic slope failure on the Atlantic continental margin, by using a land-based compilation of the maximum observed distance from earthquake to liquefaction, and by using a land-based compilation of the maximum observed area of earthquake-induced landslides. We find that the calculated distances and failure areas from the slope stability analysis are similar to or slightly smaller than the maximum triggering distances and failure areas in subaerial observations. The results from all three methods compare well with the slope failure observations of the Mw = 7.2, 1929 Grand Banks earthquake, the only historical tsunamigenic earthquake along the North American Atlantic margin. The results further suggest that a Mw = 7.5 earthquake (the largest expected earthquake in the eastern U

  7. Earthquake Hazard and Risk in New Zealand

    NASA Astrophysics Data System (ADS)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand, we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), will update several key source parameters. These updates include implementing a new set of crustal faults with multi-segment ruptures, updating the subduction zone geometry and recurrence rate, and implementing new background rates and a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model to the 2012 model, which now includes over 500 individual fault sources. This includes the addition of many offshore faults in the northern, east-central, and southwestern regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard changes in our updated model with those from the previous version. Changes between the two maps are discussed as well as the drivers for these changes. We examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the risk exposure in the country (Auckland) lies in the region of lowest hazard, where little information is available about the location of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus small changes to the background rates
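
    As a generic illustration of the two risk metrics named above (average annual loss and the loss exceedance probability curve), the sketch below works from a made-up event-loss table; the rates and loss values are not from the RMS model.

```python
import numpy as np

# Hypothetical event-loss table: annual occurrence rate and loss per event.
rates = np.array([0.01, 0.004, 0.001, 0.0002])   # events per year
losses = np.array([5e7, 2e8, 1e9, 5e9])          # loss per event (currency units)

# Average annual loss (AAL): rate-weighted sum of event losses.
aal = np.sum(rates * losses)

# Occurrence exceedance probability: chance of at least one event per year
# whose loss meets or exceeds each threshold (Poisson assumption).
thresholds = np.array([1e8, 5e8, 1e9])
exceedance_prob = [1.0 - np.exp(-rates[losses >= x].sum()) for x in thresholds]
print(aal, exceedance_prob)
```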

  8. Earthquake Hazard for Aswan High Dam Area

    NASA Astrophysics Data System (ADS)

    Ismail, Awad

    2016-04-01

    Earthquake activity and seismic hazard analysis are important components of the seismic evaluation of critical structures such as major dams. The Aswan High Dam (AHD) created the second man-made reservoir in the world (Lake Nasser) and is constructed near urban areas, posing a high-risk potential for downstream life and property. The dam area is one of the seismically active regions in Egypt and contains several cross faults, which are dominantly oriented east-west and north-south. Epicenters were found to cluster around active faults in the northern part of the lake and near the AHD site. The space-time distribution and the relation of the seismicity to the lake water level fluctuations were studied. The Aswan seismicity separates into shallow and deep seismic zones, between 0 and 14 km and 14 and 30 km, respectively. These two seismic zones behave differently over time, as indicated by the seismicity rate, lateral extent, b-value, and spatial clustering. The seismicity is characterized by earthquake swarm sequences showing activation of the clustered events over time and space. The effect of the North African drought (1982 to present) is clearly seen in the reservoir water level. As it decreased and left the most active fault segments uncovered, the shallow activity was found to be more sensitive to rapid discharging than to filling. This study indicates that geology, topography, lineations in seismicity, offsets in the faults, changes in fault trends, and focal mechanisms are closely related. No relation was found between earthquake activity and either groundwater table fluctuations or water temperatures measured in wells located around the Kalabsha area. The peak ground acceleration is estimated at the dam site based on strong ground motion simulation. These seismic hazard analyses indicate that the AHD is stable under the present seismicity. Recent earthquake epicenters have been located approximately 5 km west of the AHD structure. This suggests that the AHD must be

  9. Late Holocene liquefaction features in the Dominican Republic: A powerful tool for earthquake hazard assessment in the northeastern Caribbean

    USGS Publications Warehouse

    Tuttle, M.P.; Prentice, C.S.; Dyer-Williams, K.; Pena, L.R.; Burr, G.

    2003-01-01

    Several generations of sand blows and sand dikes, indicative of significant and recurrent liquefaction, are preserved in the late Holocene alluvial deposits of the Cibao Valley in northern Dominican Republic. The Cibao Valley is structurally controlled by the Septentrional fault, an onshore section of the North American-Caribbean strike-slip plate boundary. The Septentrional fault was previously studied in the central part of the valley, where it sinistrally offsets Holocene terrace risers and soil horizons. In the eastern and western parts of the valley, the Septentrional fault is buried by Holocene alluvial deposits, making direct study of the structure difficult. Liquefaction features that formed in these Holocene deposits as a result of strong ground shaking provide a record of earthquakes in these areas. Liquefaction features in the eastern Cibao Valley indicate that at least one historic earthquake, probably the moment magnitude, M 8, 4 August 1946 event, and two to four prehistoric earthquakes of M 7 to 8 struck this area during the past 1100 yr. The prehistoric earthquakes appear to cluster in time and could have resulted from rupture of the central and eastern sections of the Septentrional fault circa A.D. 1200. Liquefaction features in the western Cibao Valley indicate that one historic earthquake, probably the M 8, 7 May 1842 event, and two prehistoric earthquakes of M 7-8 struck this area during the past 1600 yr. Our findings suggest that rupture of the Septentrional fault circa A.D. 1200 may have extended beyond the central Cibao Valley and generated an earthquake of M 8. Additional information regarding the age and size distribution of liquefaction features is needed to reconstruct the prehistoric earthquake history of Hispaniola and to define the long-term behavior and earthquake potential of faults associated with the North American-Caribbean plate boundary.

  10. Update earthquake risk assessment in Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-12-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years later, one of the most painful events etched in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families left homeless). Nowadays, the most frequent and important question is "what if this earthquake were repeated today?" In this study, we simulate the ground motion shaking of an earthquake of the same size as the 12 October 1992 event and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Generally, the earthquake risk assessment clearly indicates that the losses and damage may increase two- to threefold in Cairo compared to the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  11. Earthquake hazard after a mainshock in california.

    PubMed

    Reasenberg, P A; Jones, L M

    1989-03-03

    After a strong earthquake, the possibility of the occurrence of either significant aftershocks or an even stronger mainshock is a continuing hazard that threatens the resumption of critical services and reoccupation of essential but partially damaged structures. A stochastic parametric model allows determination of probabilities for aftershocks and larger mainshocks during intervals following the mainshock. The probabilities depend strongly on the model parameters, which are estimated with Bayesian statistics from both the ongoing aftershock sequence and from a suite of historic California aftershock sequences. Probabilities for damaging aftershocks and greater mainshocks are typically well-constrained after the first day of the sequence, with accuracy increasing with time.
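
    A sketch of the kind of calculation the abstract describes: a modified-Omori/Gutenberg-Richter aftershock rate integrated over a forecast window and converted to the probability of one or more qualifying events under a Poisson assumption. The parameter values below are illustrative placeholders, not the Bayesian estimates discussed in the paper.

```python
import numpy as np
from scipy.integrate import quad

def aftershock_rate(t_days, mag_threshold, mainshock_mag, a, b, c, p):
    """Rate (events/day) of aftershocks with M >= mag_threshold at time t
    after the mainshock, combining Gutenberg-Richter scaling with a
    modified-Omori time decay."""
    return 10.0 ** (a + b * (mainshock_mag - mag_threshold)) * (t_days + c) ** (-p)

def prob_aftershock(t_start, t_end, mag_threshold, mainshock_mag, a, b, c, p):
    """Probability of one or more qualifying aftershocks in [t_start, t_end]
    days, treating the sequence as a non-homogeneous Poisson process."""
    expected, _ = quad(aftershock_rate, t_start, t_end,
                       args=(mag_threshold, mainshock_mag, a, b, c, p))
    return 1.0 - np.exp(-expected)

# Illustrative parameters for a hypothetical M 6.5 mainshock, days 1 through 8.
print(prob_aftershock(1.0, 8.0, mag_threshold=5.0, mainshock_mag=6.5,
                      a=-1.67, b=0.91, c=0.05, p=1.08))
```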

  12. 77 FR 75610 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... Director. Any draft meeting materials will be posted prior to the meeting on the National...

  13. 78 FR 8109 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will hold... Director. Any draft meeting materials will be posted prior to the meeting on the National...

  14. National Earthquake Hazards Reduction Program; time to expand

    USGS Publications Warehouse

    Steinbrugge, K.V.

    1990-01-01

    All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role? 

  15. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    NASA Astrophysics Data System (ADS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-11-01

    The Sumatra region is one of the most earthquake-prone areas in Indonesia because it lies in an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred offshore, about 160 km west of Nanggroe Aceh Darussalam, and triggered a tsunami. This event caused many casualties and large material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of future earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study, the collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the values of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented as maps for a return period of 2475 years and as earthquake hazard curves for the cities of Medan and Banda Aceh.
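
    The 2475-year return period quoted here corresponds, under the usual Poisson assumption, to a 2% probability of exceedance in a 50-year exposure time. A minimal sketch of that conversion:

      import math

      def return_period(prob_exceedance, exposure_years):
          """Poissonian return period implied by an exceedance probability over an exposure time."""
          return -exposure_years / math.log(1.0 - prob_exceedance)

      def prob_exceedance(return_period_years, exposure_years):
          """Inverse: probability of at least one exceedance during the exposure time."""
          return 1.0 - math.exp(-exposure_years / return_period_years)

      print(return_period(0.02, 50))     # ~2475 years, the return period used for the maps
      print(prob_exceedance(2475, 50))   # ~0.02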

  16. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 25. Parameters for Specifying Intensity-Related Earthquake Ground Motions.

    DTIC Science & Technology

    1987-09-01

    adjacent to causative faults. ... and Sponheuer, W., 1969, Scale of Seismic Intensity: Proc. Fourth World Conf. on Earthquake Engineering, Santiago, Chile. Murphy, J. R., and O'Brien, L. ...

  17. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 28. Recommended Accelerograms for Earthquake Ground Motions

    DTIC Science & Technology

    1992-06-01

    ... the Krinitzsky-Chang curves for MM intensity. Subject terms: accelerograms, earthquakes, catalogue, ground motions. Contents include: Selection Process; Generic Accelerograms. ...

  18. Vulnerability of port and harbor communities to earthquake and tsunami hazards: The use of GIS in community hazard planning

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.

    2004-01-01

    Earthquakes and tsunamis pose significant threats to Pacific Northwest coastal port and harbor communities. Developing holistic mitigation and preparedness strategies to reduce the potential for loss of life and property damage requires community-wide vulnerability assessments that transcend traditional site-specific analyses. The ability of a geographic information system (GIS) to integrate natural, socioeconomic, and hazards information makes it an ideal assessment tool to support community hazard planning efforts. This article summarizes how GIS was used to assess the vulnerability of an Oregon port and harbor community to earthquake and tsunami hazards, as part of a larger risk-reduction planning initiative. The primary purposes of the GIS were to highlight community vulnerability issues and to identify areas that both are susceptible to hazards and contain valued port and harbor community resources. Results of the GIS analyses can help decision makers with limited mitigation resources set priorities for increasing community resiliency to natural hazards.

  19. Comprehensive Seismic Monitoring for Emergency Response and Hazards Assessment: Recent Developments at the USGS National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Buland, R. P.; Guy, M.; Kragness, D.; Patton, J.; Erickson, B.; Morrison, M.; Bryon, C.; Ketchum, D.; Benz, H.

    2009-12-01

    The USGS National Earthquake Information Center (NEIC) has put into operation a new generation of seismic acquisition, processing and distribution subsystems that seamlessly integrate regional, national and global seismic network data for routine monitoring of earthquake activity and response to large, damaging earthquakes. The system, Bulletin Hydra, was designed to meet Advanced National Seismic System (ANSS) design goals to handle thousands of channels of real-time seismic data, compute and distribute time-critical seismic information for emergency response applications, and manage the integration of contributed earthquake products and information, arriving from near-real-time up to six weeks after an event. Bulletin Hydra is able to meet these goals due to a modular, scalable, and flexible architecture that supports on-the-fly consumption of new data, readily allows for the addition of new scientific processing modules, and provides distributed client workflow management displays. Through the Edge subsystem, Bulletin Hydra accepts waveforms in half a dozen formats. In addition, Bulletin Hydra accepts contributed seismic information including hypocenters, magnitudes, moment tensors, unassociated and associated picks, and amplitudes in a variety of formats including earthworm import/export pairs and EIDS. Bulletin Hydra has state-driven algorithms for computing all IASPEI standard magnitudes (e.g. mb, mb_BB, ML, mb_LG, Ms_20, and Ms_BB) as well as Md and Ms(VMAX), moment tensor algorithms for modeling different portions of the wave field at different distances (e.g. teleseismic body-wave, centroid, and regional moment tensors), and broadband depth. All contributed and derived data are centrally managed in an Oracle database. To improve on single-station observations, Bulletin Hydra also performs continuous real-time beamforming of high-frequency arrays. Finally, workflow management displays are used to assist NEIC analysts in their day-to-day duties. All combined

  20. Prediction of earthquake hazard by hidden Markov model (around Bilecik, NW Turkey)

    NASA Astrophysics Data System (ADS)

    Can, Ceren Eda; Ergun, Gul; Gokceoglu, Candan

    2014-09-01

    Earthquakes are one of the most important natural hazards to be evaluated carefully in engineering projects, due to their severely damaging effects on human life and human-made structures. The hazard of an earthquake is defined by several approaches, and consequently earthquake parameters such as the peak ground acceleration expected in the area of interest can be determined. In an earthquake-prone area, the identification of seismicity patterns is an important task for assessing seismic activity and evaluating the risk of damage and loss associated with an earthquake occurrence. As a powerful and flexible framework to characterize temporal changes in seismicity and reveal unexpected patterns, the Poisson hidden Markov model provides a better understanding of the nature of earthquakes. In this paper, a Poisson hidden Markov model is used to predict the earthquake hazard around Bilecik (NW Turkey), a choice motivated by its important geographic location. Bilecik is in close proximity to the North Anatolian Fault Zone and is situated between Ankara and Istanbul, the two biggest cities of Turkey. Consequently, major highways, railroads and many engineering structures are being constructed in this area. The annual frequencies of earthquakes that occurred within a 100 km radius centered on Bilecik, from January 1900 to December 2012, with magnitudes (M) of at least 4.0, are modeled using a Poisson-HMM. The hazard for the next 35 years, from 2013 to 2047, is obtained from the model by forecasting the annual frequencies of M ≥ 4 earthquakes in the area.
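
    A minimal sketch of a two-state Poisson hidden Markov model applied to annual earthquake counts, in the spirit of the Poisson-HMM used in this study. The transition matrix, state rates, initial distribution, and the example count series are hypothetical values chosen only for illustration; in practice these parameters would be estimated from the 1900-2012 catalog (e.g. by Baum-Welch/EM).

      import numpy as np
      from scipy.stats import poisson

      trans = np.array([[0.9, 0.1],    # assumed state transition probabilities
                        [0.2, 0.8]])
      rates = np.array([2.0, 6.0])     # assumed Poisson rates (events/year) for the two states
      init = np.array([0.5, 0.5])      # assumed initial state distribution

      counts = np.array([3, 2, 5, 7, 6, 2, 1, 4, 8, 5])  # hypothetical annual counts of M >= 4 events

      def forward(counts, init, trans, rates):
          """Scaled forward algorithm: returns log-likelihood and filtered state probabilities."""
          alpha = init * poisson.pmf(counts[0], rates)
          loglik = np.log(alpha.sum())
          alpha /= alpha.sum()
          for n in counts[1:]:
              alpha = (alpha @ trans) * poisson.pmf(n, rates)
              loglik += np.log(alpha.sum())
              alpha /= alpha.sum()
          return loglik, alpha

      loglik, filtered = forward(counts, init, trans, rates)
      # One-step-ahead forecast of next year's expected number of M >= 4 events:
      next_state_probs = filtered @ trans
      print(loglik, next_state_probs @ rates)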

  1. Prediction of earthquake hazard by hidden Markov model (around Bilecik, NW Turkey)

    NASA Astrophysics Data System (ADS)

    Can, Ceren; Ergun, Gul; Gokceoglu, Candan

    2014-09-01

    Earthquakes are one of the most important natural hazards to be evaluated carefully in engineering projects, due to their severely damaging effects on human life and human-made structures. The hazard of an earthquake is defined by several approaches, and consequently earthquake parameters such as the peak ground acceleration expected in the area of interest can be determined. In an earthquake-prone area, the identification of seismicity patterns is an important task for assessing seismic activity and evaluating the risk of damage and loss associated with an earthquake occurrence. As a powerful and flexible framework to characterize temporal changes in seismicity and reveal unexpected patterns, the Poisson hidden Markov model provides a better understanding of the nature of earthquakes. In this paper, a Poisson hidden Markov model is used to predict the earthquake hazard around Bilecik (NW Turkey), a choice motivated by its important geographic location. Bilecik is in close proximity to the North Anatolian Fault Zone and is situated between Ankara and Istanbul, the two biggest cities of Turkey. Consequently, major highways, railroads and many engineering structures are being constructed in this area. The annual frequencies of earthquakes that occurred within a 100 km radius centered on Bilecik, from January 1900 to December 2012, with magnitudes (M) of at least 4.0, are modeled using a Poisson-HMM. The hazard for the next 35 years, from 2013 to 2047, is obtained from the model by forecasting the annual frequencies of M ≥ 4 earthquakes in the area.

  2. Nationwide tsunami hazard assessment project in Japan

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2014-12-01

    In 2012, we began a project of nationwide Probabilistic Tsunami Hazard Assessment (PTHA) in Japan to support various measures (Fujiwara et al., 2013, JpGU; Hirata et al., 2014, AOGS). The most important strategy in the nationwide PTHA is to let aleatory uncertainty dominate the assessment and to limit the use of epistemic uncertainty to a minimum, because the number of possible combinations of epistemic branches diverges quickly as the number of epistemic uncertainties in the assessment increases; we consider only the type of earthquake occurrence probability distribution as an epistemic uncertainty. The outlines of the nationwide PTHA are briefly as follows: (i) we consider all possible future earthquakes, including those already assessed by the Headquarters for Earthquake Research Promotion (HERP) of the Japanese Government. (ii) We construct a set of simplified earthquake fault models, called "Characterized Earthquake Fault Models (CEFMs)", for all of these earthquakes by following prescribed rules (Toyama et al., 2014, JpGU; Korenaga et al., 2014, JpGU). (iii) For all of the initial water-surface distributions caused by the CEFMs, we calculate tsunamis by solving a nonlinear long-wave equation with a finite-difference method, including runup calculation, over a nested grid system with a minimum grid size of 50 meters. (iv) Finally, we integrate the information about the tsunamis calculated from the numerous CEFMs to obtain nationwide tsunami hazard assessments. One of the most popular representations of the integrated information is a tsunami hazard curve for coastal tsunami heights, incorporating uncertainties inherent in the tsunami simulation and in earthquake fault slip heterogeneity (Abe et al., 2014, JpGU). We will show a PTHA along the eastern coast of Honshu, Japan, based on approximately 1,800 tsunami sources located within the subduction zone along the Japan Trench, as a prototype of the nationwide PTHA. This study is supported by part of the research
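
    Step (iv), the integration into a coastal hazard curve, can be sketched as follows, assuming each characterized fault model contributes an independent Poissonian annual rate and a computed coastal tsunami height at the site of interest. The scenario rates and heights below are hypothetical placeholders, not CEFM results.

      import numpy as np

      annual_rate = np.array([1/300, 1/800, 1/1500, 1/4000])   # assumed scenario rates (1/yr)
      coastal_height = np.array([1.2, 3.5, 6.0, 10.0])         # assumed heights at the site (m)

      def hazard_curve(heights_m, rates, thresholds):
          """P(at least one tsunami exceeding each height threshold in one year), Poisson assumption."""
          probs = []
          for h in thresholds:
              total_rate = rates[heights_m > h].sum()
              probs.append(1.0 - np.exp(-total_rate))
          return np.array(probs)

      thresholds = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
      print(hazard_curve(coastal_height, annual_rate, thresholds))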

  3. Active Fault Mapping of Naga-Disang Thrust (Belt of Schuppen) for Assessing Future Earthquake Hazards in NE India

    NASA Astrophysics Data System (ADS)

    Kumar, A.

    2014-12-01

    We present a geodynamic appraisal of the Naga-Disang Thrust, northeast India. The Disang thrust extends NE-SW over a length of 480 km and defines the eastern margin of the Neogene basin. It branches out from the Haflong-Naga thrust and, in the NE at Bulbulia on the right bank of the Noa Dihing River, is terminated by the Mishmi thrust, which extends into Myanmar as the Sagaing fault; the thrusts generally dip towards the SE. The belt extends between the Dauki fault in the SW and the Mishmi thrust in the NE. When the SW end of the 'Belt of Schuppen' moved upwards and towards the east along the Dauki fault, the NE end moved downwards and towards the west along the Mishmi thrust, causing its 'S'-shaped bending. An SRTM-generated DEM is used to map the topographic expression of the schuppen belt, where these thrusts are clearly marked by topographic breaks. Satellite imagery also shows the presence of lineaments supporting post-tectonic activity along the Naga-Disang thrusts. The southern part of the 'Belt of Schuppen' extends along the sheared western limb of the southerly plunging Kohima synform, a part of the Indo-Burma Ranges (IBR), and is seismically active. The crustal velocity SE of the Schuppen belt is 39.90 mm/yr with an azimuth of 70.78° at Lumami, 38.84 mm/yr (azimuth 54.09°) at Senapati, and 36.85 mm/yr (azimuth 54.09°) at Imphal. The crustal velocity NW of the Schuppen belt is 52.67 mm/yr (azimuth 57.66°) near the Dauki fault in Meghalaya. It becomes 43.60 mm/yr (azimuth 76.50°) to 44.25 mm/yr (azimuth 73.27°) at Tiding and Kamlang Nagar around the Mishmi thrust. The presence of the Schuppen belt is marked by a change from high crustal velocity on the Indian plate to low crustal velocity in the Mishmi suture as well as the Indo-Burma Ranges. The difference in crustal velocities results in a build-up of strain along the Schuppen belt, which may trigger a large earthquake in NE India in the future. The Belt of Schuppen appears to be seismically active; however, not enough large earthquakes have been recorded. These observations are significant on Naga

  4. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India, on 26 January 2001 and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have a rift-type geotectonic setting. In both regions the strain rates are of the order of 10⁻⁹/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. All these observations together may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than stresses built up by plate-tectonic loading. The latter model generally underlies basic assumptions made in earthquake hazard assessment, namely that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.

  5. Fault Imaging with High-Resolution Seismic Reflection for Earthquake Hazard and Geothermal Resource Assessment in Reno, Nevada

    SciTech Connect

    Frary, Roxanna

    2012-05-05

    The Truckee Meadows basin is situated adjacent to the Sierra Nevada microplate, on the western boundary of the Walker Lane. Being in the transition zone between a range-front normal fault on the west and northwest-striking right-lateral strike-slip faults to the east, there is no absence of faulting in this basin. The Reno-Sparks metropolitan area is located in this basin, and with a significant population living here, it is important to know where these faults are. High-resolution seismic reflection surveys are used to image these faults along the Truckee River, across which only one fault was previously mapped, and in southern Reno near and along Manzanita Lane, where a swarm of short faults has been mapped. The reflection profiles constrain the geometries of these faults and suggest additional faults not seen before. Used in conjunction with depth-to-bedrock calculations and gravity measurements, the seismic reflection surveys provide definitive locations of faults, as well as their orientations. Offsets on these faults indicate how active they are, and this in turn has implications for seismic hazard in the area. In addition to seismic hazard, the faults imaged here tell us something about the conduits for geothermal fluid resources in Reno.

  6. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region

    USGS Publications Warehouse

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto; Murru, Maura; Falcone, Giuseppe; Parsons, Thomas E.

    2017-01-01

    The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100 kyr and containing more than 100,000 events of magnitudes ≥4.5. The model of the fault system upon which we applied the simulator code was obtained from the DISS 3.2.0 database, selecting all the faults that are recognized in the Calabria region, for a total of 22 fault segments. The application of our simulation algorithm provides typical features in the time, space and magnitude behavior of the seismicity, which can be compared with those of the real observations. The results of the physics-based simulator algorithm were compared with those obtained by an alternative method using a slip-rate-balanced technique. Finally, as an example of a possible use of synthetic catalogs, an attenuation law has been applied to all the events reported in the synthetic catalog for the production of maps showing the exceedance probability of given values of PGA on the territory under investigation.
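
    As a sketch of that final step, the snippet below applies a simple, hypothetical attenuation relation to each event of a synthetic catalog and estimates the probability that a PGA level is exceeded at a site within one year. The catalog, the functional form, and the coefficients are placeholders, not the relation or simulator output used by the authors.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical 100-kyr synthetic catalog: magnitudes and epicentral distances (km)
      # to a site of interest, standing in for the simulator output.
      catalog_years = 100_000
      mags = 4.5 + rng.exponential(0.45, size=120_000)
      dists = rng.uniform(5.0, 150.0, size=mags.size)

      def pga_g(m, r_km):
          """Placeholder attenuation relation (log10 PGA linear in M, decaying with distance).
          Coefficients are illustrative only."""
          return 10.0 ** (-1.5 + 0.5 * m - 1.3 * np.log10(r_km + 10.0))

      def annual_exceedance_prob(target_pga_g):
          """Catalog-counting estimate of the annual rate, converted to a probability."""
          n_exceed = np.count_nonzero(pga_g(mags, dists) > target_pga_g)
          rate = n_exceed / catalog_years
          return 1.0 - np.exp(-rate)

      print(annual_exceedance_prob(0.2))   # annual probability of PGA > 0.2 g at the site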

  7. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto; Murru, Maura; Falcone, Giuseppe; Parsons, Tom

    2017-02-01

    The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100 kyr and containing more than 100,000 events of magnitudes ≥4.5. The model of the fault system upon which we applied the simulator code was obtained from the DISS 3.2.0 database, selecting all the faults that are recognized in the Calabria region, for a total of 22 fault segments. The application of our simulation algorithm provides typical features in the time, space and magnitude behavior of the seismicity, which can be compared with those of the real observations. The results of the physics-based simulator algorithm were compared with those obtained by an alternative method using a slip-rate-balanced technique. Finally, as an example of a possible use of synthetic catalogs, an attenuation law has been applied to all the events reported in the synthetic catalog for the production of maps showing the exceedance probability of given values of PGA on the territory under investigation.

  8. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury Sequence of earthquakes, beginning with the magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued with events in February and June 2011 causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures will be discussed, including the use of rapid response teams, the selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area, and the process of demolition. Through the post-event safety assessment program that operated throughout the Canterbury Sequence of earthquakes, many important lessons can be learned that will benefit future response to natural hazards that have the potential to damage structures.

  9. Workshop on evaluation of earthquake hazards and risk in the Puget Sound and Portland areas

    SciTech Connect

    Hays, W.W.; Kitzmiller, C.

    1988-01-01

    Three tasks were undertaken in the forum provided by the workshop: (1) assessing the present state-of-knowledge of earthquake hazards in Washington and Oregon including scientific, engineering, and hazard-reduction components; (2) determining the need for additional scientific, engineering, and societal response information to implement an effective earthquake-hazard reduction program; and (3) developing a strategy for implementing programs to reduce potential earthquake losses and to foster preparedness and mitigation. Thirty-five papers were given at the workshop and each of these has been abstracted for the U.S. Department of Energy's Energy Data Base (EDB). In addition, the volume includes a glossary of technical terms used in earthquake engineering in Appendix A.

  10. Guide and Checklist for Nonstructural Earthquake Hazards in California Schools.

    ERIC Educational Resources Information Center

    2003

    The recommendations included in this document are intended to reduce seismic hazards associated with the non-structural components of school buildings, including mechanical systems, ceiling systems, partitions, light fixtures, furnishings, and other building contents. It identifies potential earthquake hazards and provides recommendations for…

  11. Tank farms hazards assessment

    SciTech Connect

    Broz, R.E.

    1994-09-30

    Hanford contractors are writing new facility-specific emergency procedures in response to new and revised US Department of Energy (DOE) Orders on emergency preparedness. Emergency procedures are required for each Hanford facility that has the potential to exceed the criteria for the lowest level emergency, an Alert. The set includes: (1) a facility-specific procedure on Recognition and Classification of Emergencies, (2) area procedures on Initial Emergency Response, and (3) an area procedure on Protective Action Guidance. The first steps in developing these procedures are to identify the hazards at each facility, identify the conditions that could release the hazardous material, and calculate the consequences of the releases. These steps are called a Hazards Assessment. The final product is a document that is similar in some respects to a Safety Analysis Report (SAR). The document could be produced in a month for a simple facility but could take much longer for a complex facility; Hanford has both types of facilities. A strategy has been adopted to permit completion of the first version of the new emergency procedures before all the facility Hazards Assessments are complete. The procedures will initially be based on input from a task group for each facility. This strategy will put improved emergency procedures in place sooner and therefore enhance Hanford emergency preparedness. The purpose of this document is to summarize the applicable information contained within the Waste Tank Facility "Interim Safety Basis Document, WHC-SD-WM-ISB-001" as a resource, since the SARs covering Waste Tank Operations are not current in all cases. This hazards assessment serves to collect, organize, document and present the information utilized during the determination process.

  12. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems for contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of events of similar size is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of an expected event. Regretfully, most state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g., Poisson, periodic, etc.) or, conversely, delicately designed (e.g., STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  13. PUREX facility hazards assessment

    SciTech Connect

    Sutton, L.N.

    1994-09-23

    This report documents the hazards assessment for the Plutonium Uranium Extraction Plant (PUREX) located on the US Department of Energy (DOE) Hanford Site. Operation of PUREX is the responsibility of Westinghouse Hanford Company (WHC). This hazards assessment was conducted to provide the emergency planning technical basis for PUREX. DOE Order 5500.3A requires an emergency planning hazards assessment for each facility that has the potential to reach or exceed the lowest level emergency classification. In October of 1990, WHC was directed to place PUREX in standby. In December of 1992 the DOE Assistant Secretary for Environmental Restoration and Waste Management authorized the termination of PUREX and directed DOE-RL to proceed with shutdown planning and terminal clean out activities. Prior to this action, its mission was to reprocess irradiated fuels for the recovery of uranium and plutonium. The present mission is to establish a passively safe and environmentally secure configuration at the PUREX facility and to preserve that condition for 10 years. The ten year time frame represents the typical duration expended to define, authorize and initiate follow-on decommissioning and decontamination activities.

  14. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

    Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

  15. Roaming earthquakes in China highlight midcontinental hazards

    NASA Astrophysics Data System (ADS)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  16. Seismic hazard assessments at Islamic Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Khalil, A. E.; Deif, A.; Abdel Hafiez, H. E.

    2015-12-01

    Islamic Cairo is one of the important Islamic monumental complexes in Egypt, near the center of present-day metropolitan Cairo. The age of these buildings is up to one thousand years. Unfortunately, many of the buildings suffer from serious mishandling that may lead to extensive damage. Many buildings and masjids partially or totally collapsed in the 12 October 1992 Cairo earthquake (Mw = 5.8), which took place some 25 km from the study area. Hence, assessments of potential damage there are essential. Deterministic and probabilistic techniques were used to predict the strong-motion characteristics of expected future large earthquakes in the study area. The current study started by compiling the available studies on the distribution of seismogenic sources and on earthquake catalogs. The deterministic method is used to provide a description of the effect of the largest earthquake on the area of interest, while the probabilistic method is used to define uniform hazard curves for three return periods: 475, 950, and 2475 years. Both deterministic and probabilistic results were obtained for bedrock conditions, and the resulting hazard levels were deaggregated to identify the contribution of each seismic source to the total hazard. The results show that the expected seismic activity, combined with the present condition of the buildings, calls for urgent action to protect the cultural heritage and to limit expected human losses.

  17. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comments, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through the probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.

  18. Central US earthquake catalog for hazard maps of Memphis, Tennessee

    USGS Publications Warehouse

    Wheeler, R.L.; Mueller, C.S.

    2001-01-01

    An updated version of the catalog that was used for the current national probabilistic seismic-hazard maps would suffice for production of large-scale hazard maps of the Memphis urban area. Deaggregation maps provide guidance as to the area that a catalog for calculating Memphis hazard should cover. For the future, the Nuttli and local network catalogs could be examined for earthquakes not presently included in the catalog. Additional work on aftershock removal might reduce hazard uncertainty. Graphs of decadal and annual earthquake rates suggest completeness at and above magnitude 3 for the last three or four decades. Any additional work on completeness should consider the effects of rapid, local population changes during the Nation's westward expansion. © 2001 Elsevier Science B.V. All rights reserved.
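
    The completeness check described here can be sketched as counting catalog events at or above a magnitude threshold per decade and looking for the period over which the rate stabilizes. The sample catalog below is a hypothetical placeholder, not the central US catalog discussed in the abstract.

      import numpy as np

      catalog = np.array([(1905, 4.1), (1938, 3.2), (1954, 3.6), (1968, 5.4),
                          (1972, 3.1), (1976, 3.3), (1983, 3.0), (1987, 3.8),
                          (1990, 4.6), (1991, 3.2), (1995, 3.5), (1999, 3.1)])

      def decadal_rates(catalog, min_mag=3.0):
          """Counts of events with M >= min_mag per decade; a roughly constant count in
          recent decades suggests completeness at that threshold."""
          years = catalog[:, 0][catalog[:, 1] >= min_mag]
          decades = (years // 10 * 10).astype(int)
          labels, counts = np.unique(decades, return_counts=True)
          return dict(zip(labels.tolist(), counts.tolist()))

      print(decadal_rates(catalog, min_mag=3.0))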

  19. Probabilistic Tsunami Hazard Assessment - Application to the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Sorensen, M. B.; Spada, M.; Babeyko, A.; Wiemer, S.; Grünthal, G.

    2009-12-01

    Following several large tsunami events around the world in recent years, tsunami hazard is becoming an increasing concern. The traditional way of assessing tsunami hazard has been through deterministic scenario calculations which provide the expected wave heights due to a given tsunami source, usually a worst-case scenario. For quantitative hazard and risk assessment, however, it is necessary to move towards a probabilistic framework. In this study we focus on earthquake-generated tsunamis and present a scheme for probabilistic tsunami hazard assessment (PTHA). Our PTHA methodology is based on the use of Monte Carlo simulations and closely follows probabilistic seismic hazard assessment methodologies. The PTHA is performed in four steps. First, earthquake and tsunami catalogues are analyzed in order to define a number of potential tsunami sources in the study area. For each of these sources, activity rates, maximum earthquake magnitude and uncertainties are assigned. Second, a synthetic earthquake catalogue is established, based on the information about the sources. The third step is to calculate multiple synthetic tsunami scenarios for all potentially tsunamigenic earthquakes in the synthetic catalogue. The tsunami scenarios are then combined in the fourth step to generate hazard curves and maps. We implement the PTHA methodology in the Mediterranean Sea, where numerous tsunami events have been reported in history. We derive a 100,000-year-long catalogue of potentially tsunamigenic earthquakes and calculate tsunami propagation scenarios for ca. 85,000 M 6.5+ earthquakes from the synthetic catalogue. Results show that the highest tsunami hazard is attributed to the Eastern Mediterranean region, but that the Western Mediterranean can also experience significant tsunami waves at long return periods. Hazard maps will be presented for a range of probability levels, together with hazard curves for selected critical locations.
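
    The second step, building a synthetic earthquake catalogue by Monte Carlo sampling, can be sketched as below for a single source zone, assuming a Poisson occurrence model and a truncated Gutenberg-Richter magnitude distribution. The activity rate, b-value, and magnitude bounds are hypothetical placeholders.

      import numpy as np

      rng = np.random.default_rng(42)

      years = 100_000          # length of the synthetic catalogue
      rate_m_min = 0.05        # assumed annual rate of M >= m_min events in the zone
      b_value = 1.0
      m_min, m_max = 6.5, 8.5  # only M 6.5+ events are carried into tsunami modelling

      n_events = rng.poisson(rate_m_min * years)

      # Sample magnitudes from a truncated Gutenberg-Richter (exponential) distribution
      # by inverting its CDF.
      beta = b_value * np.log(10.0)
      u = rng.random(n_events)
      mags = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

      print(n_events, mags.min(), mags.max(), (mags >= 7.5).sum())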

  20. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
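
    The exposure-to-alert step described here can be sketched as multiplying the population exposed at each shaking intensity by an intensity-dependent loss rate and mapping the estimated total onto an alert level. The exposed-population counts, fatality rates, and alert thresholds below are illustrative assumptions, not the calibrated, country-specific models PAGER actually uses.

      # Minimal sketch of a PAGER-style alert computation with hypothetical numbers.
      exposure_by_mmi = {6: 2_000_000, 7: 500_000, 8: 80_000, 9: 5_000}  # assumed exposure
      fatality_rate_by_mmi = {6: 1e-6, 7: 1e-4, 8: 1e-3, 9: 1e-2}        # assumed rates

      def estimate_fatalities(exposure, rates):
          return sum(pop * rates.get(mmi, 0.0) for mmi, pop in exposure.items())

      def alert_level(fatalities):
          """Map an estimated fatality count onto a colour-coded alert (assumed thresholds)."""
          if fatalities < 1:
              return "green"
          if fatalities < 100:
              return "yellow"
          if fatalities < 1000:
              return "orange"
          return "red"

      est = estimate_fatalities(exposure_by_mmi, fatality_rate_by_mmi)
      print(est, alert_level(est))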

  1. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and by sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  2. Earthquake induced landslide hazard field observatory in the Avcilar peninsula

    NASA Astrophysics Data System (ADS)

    Bigarre, Pascal; Coccia, Stella; Theoleyre, Fiona; Ergintav, Semih; Özel, Oguz; Yalçinkaya, Esref; Lenti, Luca; Martino, Salvatore; Gamba, Paolo; Zucca, Francesco; Moro, Marco

    2015-04-01

    An analysis of SAR temporal series has been undertaken, providing global but accurate identification and characterization of the gravitational phenomena covering the area. The resolution and identification of landslide hazard-related features using space multispectral/hyperspectral image data have also been evaluated. The project has benefited from a vast drilling and geological-geotechnical survey program undertaken by the Istanbul Metropolitan Area, which provided important data to complete the geological model of the landslide, as well as one deep borehole used to set up permanent instrumentation on a fairly large, slow landslide fully encircled by dense building development. The selected landslide was instrumented in 2014 with a real-time observational system including GPS, rainfall, piezometer and seismic monitoring. The objectives of this permanent monitoring system are, first, to detect and quantify the interaction between seismic motion, rainfall and mass movement, building a database that will be opened to the scientific community in the future, and second, to help calibrate dynamic numerical geomechanical simulations intended to study the sensitivity to seismic loading. Last but not least, important geophysical field work has been conducted to assess seismic site effects already noticed during the 1999 earthquake. Data, metadata and main results are now being progressively compiled and formatted for appropriate integration into the cloud monitoring infrastructure for data sharing.

  3. 2016 one-year seismic hazard forecast for the Central and Eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-03-28

    The U.S. Geological Survey (USGS) has produced a 1-year seismic hazard forecast for 2016 for the Central and Eastern United States (CEUS) that includes contributions from both induced and natural earthquakes. The model assumes that earthquake rates calculated from several different time windows will remain relatively stationary and can be used to forecast earthquake hazard and damage intensity for the year 2016. This assessment is the first step in developing an operational earthquake forecast for the CEUS, and the analysis could be revised with updated seismicity and model parameters. Consensus input models consider alternative earthquake catalog durations, smoothing parameters, maximum magnitudes, and ground motion estimates, and represent uncertainties in earthquake occurrence and diversity of opinion in the science community. Ground shaking seismic hazard for 1-percent probability of exceedance in 1 year reaches 0.6 g (as a fraction of standard gravity [g]) in northern Oklahoma and southern Kansas, and about 0.2 g in the Raton Basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas. Near some areas of active induced earthquakes, hazard is higher than in the 2014 USGS National Seismic Hazard Model (NSHM) by more than a factor of 3; the 2014 NSHM did not consider induced earthquakes. In some areas, previously observed induced earthquakes have stopped, so the seismic hazard reverts back to the 2014 NSHM. Increased seismic activity, whether defined as induced or natural, produces high hazard. Conversion of ground shaking to seismic intensity indicates that some places in Oklahoma, Kansas, Colorado, New Mexico, Texas, and Arkansas may experience damage if the induced seismicity continues unabated. The chance of having Modified Mercalli Intensity (MMI) VI or greater (damaging earthquake shaking) is 5–12 percent per year in north-central Oklahoma and southern Kansas, similar to the chance of damage caused by natural earthquakes

  4. Earthquake Hazard Analysis using Geological Characteristics and Geographic Information System (GIS) in the Southeastern Part of Korea

    NASA Astrophysics Data System (ADS)

    Song, Kyo-Young

    2010-05-01

    The purpose of this study is to investigate earthquake hazards using geologic characteristics and a geographic information system (GIS) for the assessment and mitigation of earthquake hazards. The southeastern part of the Korean peninsula, especially the cities of Ulsan and Pohang, was chosen for construction of a GIS database and analysis of earthquake hazards such as liquefaction and landslides. The two municipal areas are among the most populous industrial cities in Korea; however, several large-scale faults, such as the Yangsan fault, occur in their vicinity. In this study, important factors closely related to earthquake hazards, such as seismicity, geology, soil distribution, groundwater depth, and ground slope, were compiled into a spatial database using GIS and ranked by relative susceptibility to earthquake hazards. To classify vulnerable areas and analyze the probability of earthquake hazards, each factor was computed and applied to the established dataset for earthquake-induced liquefaction and landslides. At present, the probability of liquefaction in the study area is calculated to be about 0.012-0.133 when the peak ground acceleration is 0.13-0.14 g, but if the moment magnitude increases to 7.0, the probability of liquefaction increases up to 0.802. The probability of landslide is almost nil at present, but it increases rapidly when the moment magnitude reaches 5.0, and landslides are expected on all unstable slopes when the moment magnitude exceeds 6.0. The results from the study area indicate that earthquake-induced liquefaction and landslides are closely related to the geology; therefore, general geology, such as rock type and rock age, is a very important factor in analyzing earthquake hazards.
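
    The factor-ranking and overlay step described here can be sketched as combining rasterized factor ranks with weights into a single susceptibility score grid. The ranks and weights below are hypothetical placeholders, not the study's dataset.

      import numpy as np

      # Each factor grid holds ranks on a common 1-5 susceptibility scale (hypothetical).
      slope_rank = np.array([[1, 2, 4], [2, 3, 5], [1, 1, 2]])
      soil_rank = np.array([[3, 3, 4], [2, 4, 5], [1, 2, 3]])
      groundwater_rank = np.array([[2, 2, 3], [3, 4, 4], [1, 2, 2]])

      weights = {"slope": 0.4, "soil": 0.35, "groundwater": 0.25}  # assumed weights

      susceptibility = (weights["slope"] * slope_rank
                        + weights["soil"] * soil_rank
                        + weights["groundwater"] * groundwater_rank)

      # Classify cells into relative susceptibility classes for mapping.
      classes = np.digitize(susceptibility, bins=[2.0, 3.0, 4.0])  # 0 = low ... 3 = very high
      print(susceptibility)
      print(classes)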

  5. Community Exposure and Sensitivity to Earthquake Hazards in Washington State

    NASA Astrophysics Data System (ADS)

    Ratliff, J.; Wood, N. J.; Weaver, C. S.

    2011-12-01

    Communities in Washington State are potentially threatened by earthquakes from many sources, including the Cascadia Subduction zone and myriad inland faults (Seattle fault, Tacoma fault, etc.). The USGS Western Geographic Science Center, in collaboration with the State of Washington Military Department Emergency Management Division, has been working to identify Washington community vulnerability to twenty-one earthquake scenarios to provide assistance for mitigation, preparedness, and outreach. We calculate community earthquake exposure and sensitivity by overlaying demographic and economic data with peak ground acceleration values of each scenario in a geographic information system. We summarize community and county earthquake vulnerability to assist emergency managers by the number of earthquake scenarios affecting each area, as well as the number of residents, occupied households, businesses (individual and sector), and employees in each predicted Modified Mercalli Intensity value (ranging from V to IX). Percentages based on community, county, and scenario totals also provide emergency managers insight to community sensitivity to the earthquake scenarios. Results indicate significant spatial and temporal residential variations as well as spatial economic variations in exposure and sensitivity to earthquake hazards in the State of Washington, especially for communities west of the Cascade Range.

  6. Remotely Triggered Earthquakes in Intraplate Regions: Distributed Hazard, Dependent Events

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Mueller, K.; Bilham, R.; Ambraseys, N.; Martin, S.

    2003-12-01

    The central and eastern United States has experienced only 5 historic earthquakes with Mw above 7.0, the 1886 Charleston earthquake and four during the New Madrid sequence of 1811-1812 (three principal mainshocks and the so-called "dawn aftershock" following the first mainshock). Careful consideration of historic accounts yields compelling evidence for a number of remotely triggered earthquakes in both 1812 and 1886, including several events large enough to be potentially damaging. We propose that one of the (alleged) New Madrid mainshocks, on 23 January 1812, may itself be a remotely triggered earthquake, with a location some 200 km north of the New Madrid Seismic Zone. Our proposed source location is near the location of the 1968 southern Illinois earthquake, which occurred on a blind thrust fault at 20-25 km depth. Intensity data for the 1812 event are consistent with expectations for a similarly deep event. Such triggered events presumably do not represent a wholly new source of hazard but rather a potential source of dependent hazard. That is, the common assumption is that the triggering will cause only a "clock advance," rather than causing earthquakes that would not have otherwise occurred. However, in a low strain-rate region, a given dynamic stress change can potentially represent a much larger clock advance than the same change would cause in a high strain-rate region. Moreover, in regions with low seismicity and a short historic record, overlooked remotely triggered historic earthquakes may be important events. It is thus possible that significant events are currently missing from the historic catalogs. Such events--even if large--can be difficult to identify without instrumental data. The (interplate) 1905 Kangra, India earthquake further illustrates this point. In this case, early seismic records provide corroboration of an early triggered event whose existence is suggested--but difficult to prove--based on detailed macroseismic data. In the

  7. Ice Mass Fluctuations and Earthquake Hazard

    NASA Technical Reports Server (NTRS)

    Sauber, J.

    2006-01-01

    In south central Alaska, tectonic strain rates are high in a region that includes large glaciers undergoing ice wastage over the last 100-150 years [Sauber et al., 2000; Sauber and Molnia, 2004]. In this study we focus on the region referred to as the Yakataga segment of the Pacific-North American plate boundary zone in Alaska. In this region, the Bering and Malaspina glacier ablation zones have average ice elevation decreases from 1-3 meters/year (see summary and references in Molnia, 2005). The elastic response of the solid Earth to this ice mass decrease alone would cause several mm/yr of horizontal motion and uplift rates of up to 10-12 mm/yr. In this same region observed horizontal rates of tectonic deformation range from 10 to 40 mm/yr to the north-northwest and the predicted tectonic uplift rates range from -2 mm/year near the Gulf of Alaska coast to 12mm/year further inland [Savage and Lisowski, 1988; Ma et al, 1990; Sauber et al., 1997, 2000, 2004; Elliot et al., 2005]. The large ice mass changes associated with glacial wastage and surges perturb the tectonic rate of deformation at a variety of temporal and spatial scales. The associated incremental stress change may enhance or inhibit earthquake occurrence. We report recent (seasonal to decadal) ice elevation changes derived from data from NASA's ICESat satellite laser altimeter combined with earlier DEM's as a reference surface to illustrate the characteristics of short-term ice elevation changes [Sauber et al., 2005, Muskett et al., 2005]. Since we are interested in evaluating the effect of ice changes on faulting potential, we calculated the predicted surface displacement changes and incremental stresses over a specified time interval and calculated the change in the fault stability margin using the approach given by Wu and Hasegawa [1996]. Additionally, we explored the possibility that these ice mass fluctuations altered the seismic rate of background seismicity. Although we primarily focus on

  8. Model Uncertainty, Earthquake Hazard, and the WGCEP-2002 Forecast

    NASA Astrophysics Data System (ADS)

    Page, M. T.; Carlson, J. M.

    2005-12-01

    Model uncertainty is prevalent in Probabilistic Seismic Hazard Analysis (PSHA) because the true mechanism generating risk is unknown. While it is well-understood how to incorporate parameter uncertainty in PSHA, model uncertainty is more difficult to incorporate due to the high degree of dependence between different earthquake-recurrence models. We find that the method used by the 2002 Working Group on California Earthquake Probabilities (WG02) to combine the probability distributions given by multiple models has several adverse effects on their result. In particular, taking a linear combination of the various models ignores issues of model dependence and leads to large uncertainties in the final hazard estimate. Furthermore, choosing model weights based on data can systematically bias the final probability distribution. The weighting scheme of the WG02 report also depends upon an arbitrary ordering of models. In addition to analyzing current statistical problems, we present alternative methods for rigorously incorporating model uncertainty into PSHA.
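
    The linear combination criticized here can be illustrated as a weighted mixture of the rupture probabilities produced by each recurrence model. This is a schematic sketch only; the per-model probabilities and the weights are hypothetical placeholders, not the WG02 values.

      import numpy as np

      # Hypothetical 30-year rupture probabilities from several recurrence models
      # (e.g. Poisson, time-dependent, hybrid) and assumed model weights (sum to 1).
      model_probs = np.array([0.18, 0.27, 0.35])
      weights = np.array([0.3, 0.5, 0.2])

      combined = float(weights @ model_probs)   # the linear combination of model results

      # A naive spread of the mixture that ignores dependence between models --
      # one of the issues raised in the abstract.
      spread = float(np.sqrt(weights @ (model_probs - combined) ** 2))
      print(combined, spread)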

  9. Integrating Real-time Earthquakes into Natural Hazard Courses

    NASA Astrophysics Data System (ADS)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.

    2001-12-01

    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made bringing such real-time experiences into the classroom problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response is lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). These students are responsible for reacting to the alarm

  10. Maximum Earthquake Magnitude Assessments by Japanese Government Committees (Invited)

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2013-12-01

    The 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and such a gigantic earthquake was not foreseen around Japan. After the 2011 disaster, various government committees in Japan have discussed and assessed the maximum credible earthquake size around Japan, but their values vary without definite consensus. I will review them, taking earthquakes along the Nankai Trough as an example. The Central Disaster Management Council (CDMC), under the Cabinet Office, set up a policy for future tsunami disaster mitigation. Possible future tsunamis are classified into two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with a low frequency of occurrence, for which saving people's lives is the first priority, using soft measures such as tsunami hazard maps, evacuation facilities or disaster education. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared. The assessments of L1 and L2 events are left to local governments. The CDMC also assigned M 9.1 as the maximum size of an earthquake along the Nankai trough, then computed the ground shaking and tsunami inundation for several scenario earthquakes. The estimated loss is about ten times that of the 2011 disaster, with maximum casualties of 320,000 and an economic loss of 2 trillion dollars. The Headquarters for Earthquake Research Promotion (HERP), under MEXT, was set up after the 1995 Kobe earthquake and has made long-term forecasts of large earthquakes and published national seismic hazard maps. The future probability of earthquake occurrence, for example in the next 30 years, was calculated from past data on large earthquakes, on the basis of a characteristic earthquake model. The HERP recently revised the long-term forecast of the Nankai trough earthquake; while the 30-year probability (60 - 70 %) is similar to the previous estimate, they noted the size can be M 8 to 9, considering the variability of past

  11. Awareness and understanding of earthquake hazards at school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that a poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of education activities we performed during the last years are presented here. We show our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, but we also illustrate some teaching interventions for high school students. During the past years we lectured classes, led laboratory and field activities, and organized summer internships for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts in safety procedures. To combine the objective of disseminating earthquake culture, also through knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab through inexpensive tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  12. 2017 one-year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert A.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce the 2017 one-year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one-year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma-Kansas, the Raton Basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 magnitude (M) ≥ 4 and three M ≥ 5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma-Kansas focus area, two earthquakes with M ≥ 4 occurred near Trinidad, Colorado (in the Raton Basin focus area), but no earthquakes with M ≥ 2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared to 2015, which may be related to decreased wastewater injection, caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  13. Earthquake Hazard in the New Madrid Seismic Zone Remains a Concern

    USGS Publications Warehouse

    Frankel, A.D.; Applegate, D.; Tuttle, M.P.; Williams, R.A.

    2009-01-01

    There is broad agreement in the scientific community that a continuing concern exists for a major destructive earthquake in the New Madrid seismic zone. Many structures in Memphis, Tenn., St. Louis, Mo., and other communities in the central Mississippi River Valley region are vulnerable and at risk from severe ground shaking. This assessment is based on decades of research on New Madrid earthquakes and related phenomena by dozens of Federal, university, State, and consulting earth scientists. Considerable interest has developed recently from media reports that the New Madrid seismic zone may be shutting down. These reports stem from published research that used global positioning system (GPS) instruments to make geodetic measurements of strain in the Earth's crust. Because of a lack of measurable strain at the surface in some areas of the seismic zone over the past 14 years, arguments have been advanced that there is no buildup of stress at depth within the New Madrid seismic zone and that the zone may no longer pose a significant hazard. As part of the consensus-building process used to develop the national seismic hazard maps, the U.S. Geological Survey (USGS) convened a workshop of experts in 2006 to evaluate the latest findings in earthquake hazards in the Eastern United States. These experts considered the GPS data from New Madrid available at that time that also showed little to no ground movement at the surface. The experts did not find the GPS data to be a convincing reason to lower the assessment of earthquake hazard in the New Madrid region, especially in light of the many other types of data that are used to construct the hazard assessment, several of which are described here.

  14. Seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir; Nekrasova, Anastasia

    2014-05-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes of Exposures and their Vulnerability inflicted by growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe a debt to Society for this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. Any kind of risk estimate R(g) at location g results from a convolution of the natural hazard H(g) with the exposed object under consideration O(g) along with its vulnerability V(O(g)). Note that g could be a point, a line, or a cell on or under the Earth's surface and that the distribution of hazards, as well as objects of concern and their vulnerability, could be time-dependent. There exist many different risk estimates even if the same object of risk and the same hazard are involved. This may result from different laws of convolution, as well as from different kinds of vulnerability of an object of risk under specific environments and conditions. Both conceptual issues must be resolved in multidisciplinary, problem-oriented research performed by specialists in the fields of hazard, objects of risk, and object vulnerability, i.e. specialists in earthquake engineering, social sciences and economics. To illustrate this general concept, we first construct seismic hazard assessment maps based on the Unified Scaling Law for Earthquakes (USLE). The parameters A, B, and C of USLE, i.e. log N(M,L) = A - B•(M-6) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within an area of linear size L, are used to estimate the expected maximum
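
    As an illustration of the USLE recurrence relation quoted above, the following minimal Python sketch evaluates log N(M,L) = A - B•(M-6) + C•log L for hypothetical parameter values; A, B and C here are placeholders for illustration only, not values from the study.

        import math

        def usle_annual_number(magnitude, linear_size_km, A, B, C):
            """Expected annual number N(M, L) of earthquakes of magnitude M within
            an area of linear size L, following log10 N(M, L) = A - B*(M - 6) + C*log10 L."""
            return 10.0 ** (A - B * (magnitude - 6.0) + C * math.log10(linear_size_km))

        # Hypothetical parameters for illustration only (not taken from the abstract).
        A, B, C = -1.0, 0.9, 1.1
        for m in (5.0, 6.0, 7.0):
            n = usle_annual_number(m, linear_size_km=100.0, A=A, B=B, C=C)
            print(f"M {m}: {n:.4f} events/yr (return period ~{1.0 / n:.0f} yr)")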

  15. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks, which are generally omitted from hazard assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates makes them difficult to forecast even on time-scales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations and changes in rate can be detected by applying change point analysis in ETAS transformed time with methods already developed for Poisson processes.
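
    For readers unfamiliar with the ETAS model mentioned above, a minimal temporal version of its conditional intensity can be sketched as follows. The parameter values and the tiny catalogue are hypothetical and this is not the USGS implementation; it only shows the background-plus-triggering structure the abstract refers to.

        import numpy as np

        def etas_intensity(t, event_times, event_mags, mu, K, alpha, c, p, m_ref):
            """Temporal ETAS conditional intensity (after Ogata, 1988):
            lambda(t) = mu + sum_{t_i < t} K * exp(alpha*(M_i - m_ref)) / (t - t_i + c)**p
            mu is the background rate; the sum models aftershock triggering."""
            past = event_times < t
            dt = t - event_times[past]
            triggered = K * np.exp(alpha * (event_mags[past] - m_ref)) / (dt + c) ** p
            return mu + triggered.sum()

        # Illustrative (hypothetical) parameters and a tiny synthetic catalogue.
        times = np.array([0.0, 1.2, 1.3, 5.0])   # days
        mags = np.array([4.5, 3.2, 3.0, 4.0])
        rate = etas_intensity(t=6.0, event_times=times, event_mags=mags,
                              mu=0.1, K=0.02, alpha=1.0, c=0.01, p=1.1, m_ref=2.7)
        print(f"conditional intensity at t = 6 d: {rate:.3f} events/day")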

  16. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the earthquake source using a regional seismotectonic database and basic source parameters and, if and when possible, estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimation of the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and using shear wave velocity distributions (Shake Mapping). 4. Incorporation of strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimation of the losses (damage, casualty and economic) at different levels of sophistication (0, 1 and 2) commensurate with the availability of the inventory of the human-built environment (Loss Mapping). Level 2 analysis of the ELER Software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties and macro economic loss quantifiers) in urban areas. The basic Shake Mapping is similar to the Level 0 and Level 1 analyses; however, options are available for more sophisticated treatment of site response through externally entered data and improvement of the shake map through incorporation

  17. Nationwide Assessment of Seismic Hazard for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.; Mumladze, T.

    2014-12-01

    The work presents a framework for the assessment of seismic hazards at the national level for Georgia. A historical review of the compilation of seismic hazard zoning maps for Georgia made evident that there were gaps in seismic hazard assessment and that the present normative seismic hazard map needed careful recalculation. The methodology for the probabilistic assessment of seismic hazard used here includes the following steps: produce a comprehensive catalogue of historical earthquakes (up to 1900) and of the period of instrumental observations with a uniform magnitude scale; produce models of seismic source zones (SSZ) and their parameterization; develop appropriate ground motion prediction equation (GMPE) models; and develop seismic hazard curves for spectral amplitudes at each period and maps in digital format. Firstly, a new seismic catalogue of Georgia was created, with 1700 earthquakes of Mw ≥ 4.0 from ancient times to 2012. Secondly, seismic source zones (SSZ) were delineated. The identification of areal SSZ was based on structural geology, seismicity parameters and seismotectonics. In constructing the SSZ, the dip of the relevant active fault plane, the width of the dynamic influence of the fault, and the thickness of the seismoactive layer are taken into account. Each SSZ was then defined by the following parameters: geometry, percentage of focal mechanisms, predominant azimuth and dip angle values, activity rates, maximum magnitude, hypocenter depth distribution, and lower and upper seismogenic depth values. Thirdly, seismic hazard maps were calculated based on a modern approach of selecting and ranking global and regional ground motion prediction equations for the region. Finally, a probabilistic seismic hazard assessment in terms of ground acceleration was carried out for the territory of Georgia. On the basis of the obtained areal seismic sources, probabilistic seismic hazard maps were calculated showing peak ground acceleration (PGA) and spectral accelerations (SA) at

  18. Seismic hazard assessment in Greece: Revisited

    NASA Astrophysics Data System (ADS)

    Makropoulos, Kostas; Chousianitis, Kostas; Kaviris, George; Kassaras, Ioannis

    2013-04-01

    Greece is the most earthquake-prone country in the eastern Mediterranean territory and one of the most active areas globally. Seismic Hazard Assessment (SHA) is a useful procedure to estimate the expected earthquake magnitude and strong ground-motion parameters which are necessary for earthquake resistant design. Several studies on the SHA of Greece are available, constituting the basis of the National Seismic Code. However, the recently available more complete, accurate and homogeneous seismological data (the new earthquake catalogue of Makropoulos et al., 2012), the revised seismic zones determined within the framework of the SHARE project (2012), new empirical attenuation formulas extracted for several regions in Greece, as well as new algorithms of SHA, are innovations that motivated the present study. Herewith, the expected earthquake magnitude for Greece is evaluated by applying the zone-free, upper-bounded Gumbel's third asymptotic distribution of extreme values method. The peak ground acceleration (PGA), velocity (PGV) and displacement (PGD) are calculated at the seismic bedrock using two methods: (a) Gumbel's first asymptotic distribution of extreme values, since it is valid for initial open-ended distributions, and (b) the Cornell-McGuire approach, using the CRISIS2007 (Ordaz et al., 2007) software. The latter takes into account seismic source zones for which seismicity parameters are assigned following a Poisson recurrence model. Thus, each source is characterized by a series of seismic parameters, such as the magnitude recurrence and the recurrence rate for the threshold magnitude, while different predictive equations can be assigned to different seismic source zones. Recently available attenuation parameters were considered. Moreover, new attenuation parameters for the very seismically active Corinth Gulf, deduced during this study from recordings of the RASMON accelerometric array, were used. The hazard parameters such as the most probable annual maximum
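
    To illustrate the Gumbel (Type I) extreme-value approach mentioned above, a minimal sketch that fits annual maximum PGA values and reads off an approximately 475-year level might look like the following. The observations are invented placeholders, not data from the Greek catalogue, and this is not the authors' code.

        import numpy as np
        from scipy.stats import gumbel_r

        # Hypothetical annual maximum PGA observations (in g) at one site.
        annual_max_pga = np.array([0.02, 0.05, 0.03, 0.08, 0.04, 0.11, 0.03, 0.06,
                                   0.02, 0.09, 0.05, 0.07, 0.04, 0.13, 0.06])

        # Fit Gumbel's first asymptotic distribution of extreme values (Type I).
        loc, scale = gumbel_r.fit(annual_max_pga)

        # A 10% chance of exceedance in 50 years corresponds to an annual
        # non-exceedance probability of (1 - 0.10)**(1/50), i.e. ~475-yr return period.
        p_annual = (1.0 - 0.10) ** (1.0 / 50.0)
        pga_475yr = gumbel_r.ppf(p_annual, loc=loc, scale=scale)
        print(f"approx. 475-yr PGA at this site: {pga_475yr:.2f} g")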

  19. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    SciTech Connect

    Blanchard, A.

    2000-02-28

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program.

  20. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country started in 1968 with the setting up of the first seismographic station in Tirana, and more effectively after the Albanian Seismological Network began operating in 1976. There is rich evidence that during the last two thousand years Albania has been hit by many disastrous earthquakes. The highest magnitude estimated is 7.2. After the end of the Communist era and the opening of the country, a construction boom started in Albania and continues even now. This makes it all the more indispensable to produce accurate seismic hazard maps for preventing damage from probable future earthquakes. Some efforts have already been made in seismic hazard assessment (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique has been used in a joint work between the Seismological Institute of Tirana, Albania and the Department of Geophysics of the Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically conceived for this seismic hazard analysis and contains 530 events with magnitude M>4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, giving for them the most representative fault characteristics. The computer code used for the hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were Ambraseys et al., 1996; Sabetta and Pugliese, 1996; and Margaris et al., 2001. The hazard maps are obtained for return periods of 100, 475, 2375 and 4746 years, for rock site conditions. Analyzing the map of PGA values for a return period of 475 years, 5 zones with different ranges of PGA values can be separated: 1) the zone with PGA of 0.20 - 0.24 g, 1.8 percent of Albanian territory; 2) the zone with PGA of 0.16 - 0.20 g, 22.6 percent of Albanian territory; 3) the

  1. Seismic Hazard and Risk Assessment in Multi-Hazard Prone Urban Areas: The Case Study of Cologne, Germany

    NASA Astrophysics Data System (ADS)

    Tyagunov, S.; Fleming, K.; Parolai, S.; Pittore, M.; Vorogushyn, S.; Wieland, M.; Zschau, J.

    2012-04-01

    Most hazard and risk assessment studies usually analyze and represent different kinds of hazards and risks separately, although risk assessment and mitigation programs in multi-hazard prone urban areas should take into consideration possible interactions of different hazards. This is particularly true for communities located in seismically active zones, where, on the one hand, earthquakes are capable of triggering other types of hazards, while, on the other hand, one should bear in mind that temporal coincidence or succession of different hazardous events may influence the vulnerability of the existing built environment and, correspondingly, the level of the total risk. Therefore, possible inter-dependencies and inter-influences of different hazards should be reflected properly in the hazard, vulnerability and risk analyses. This work presents some methodological aspects and preliminary results of a study being implemented within the framework of the MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) project. One of the test cases of the MATRIX project is the city of Cologne, which is one of the largest cities of Germany. The area of Cologne, being exposed to windstorm, flood and earthquake hazards, has already been considered in comparative risk assessments. However, possible interactions of these different hazards have been neglected. The present study is aimed at the further development of a holistic multi-risk assessment methodology, taking into consideration possible time coincidence and inter-influences of flooding and earthquakes in the area.

  2. Volcano and earthquake hazards in the Crater Lake region, Oregon

    USGS Publications Warehouse

    Bacon, Charles R.; Mastin, Larry G.; Scott, Kevin M.; Nathenson, Manuel

    1997-01-01

    Crater Lake lies in a basin, or caldera, formed by collapse of the Cascade volcano known as Mount Mazama during a violent, climactic eruption about 7,700 years ago. This event dramatically changed the character of the volcano so that many potential types of future events have no precedent there. This potentially active volcanic center is contained within Crater Lake National Park, visited by 500,000 people per year, and is adjacent to the main transportation corridor east of the Cascade Range. Because a lake is now present within the most likely site of future volcanic activity, many of the hazards at Crater Lake are different from those at most other Cascade volcanoes. Also significant are many faults near Crater Lake that clearly have been active in the recent past. These faults, and historic seismicity, indicate that damaging earthquakes can occur there in the future. This report describes the various types of volcano and earthquake hazards in the Crater Lake area, estimates of the likelihood of future events, recommendations for mitigation, and a map of hazard zones. The main conclusions are summarized below.

  3. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    SciTech Connect

    Thenhaus, P.C.

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-12 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in US history. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern US are generally regarded as only historical phenomena. A fundamental problem in the Eastern US, therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the tools that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term earthquake risk, on the other hand, refers to aspects of the expected damage to manmade structures and to lifelines as a result of the earthquake hazard.

  4. How Can Museum Exhibits Enhance Earthquake and Tsunami Hazard Resiliency?

    NASA Astrophysics Data System (ADS)

    Olds, S. E.

    2015-12-01

    Creating a natural disaster-ready community requires interoperating scientific, technical, and social systems. In addition to the technical elements that need to be in place, communities and individuals need to be prepared to react when a natural hazard event occurs. Natural hazard awareness and preparedness training and education often take place through informal learning at science centers and formal K-12 education programs, as well as through awareness raising via strategically placed informational tsunami warning signs and placards. Museums and science centers are influential in raising science literacy within a community; however, can science centers enhance earthquake and tsunami resiliency by providing hazard science content and preparedness exhibits? Museum docents and informal educators are uniquely situated within the community. They are transmitters and translators of science information to broad audiences. Through interaction with the public, docents are well positioned to be informants of the knowledge, beliefs, and feelings of science center visitors. They themselves are life-long learners, both constantly learning from the museum content around them and sharing this content with visitors. They are also members of the communities where they live. In-depth interviews with museum informal educators and docents were conducted at a science center in the coastal Pacific Northwest. This region has the potential to be struck by a great Mw 9+ earthquake and subsequent tsunami. During the interviews, docents described how they applied learning from natural hazard exhibits at a science visitor center to their daily lives. During the individual interviews, the museum docents described their awareness (knowledge, attitudes, and behaviors) of natural hazards where they live and work, the feelings evoked as they learned about their hazard vulnerability, the extent to which they applied this learning and awareness to their lives, such as creating an evacuation plan, whether

  5. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
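
    A schematic of how an empirical fatality model of this kind combines shaking exposure with intensity-dependent casualty rates is sketched below. The lognormal rate function follows the general form described in the PAGER literature, but the theta and beta values and the exposure numbers are placeholders, not calibrated PAGER parameters, and this is not the operational PAGER code.

        from math import log
        from scipy.stats import norm

        def fatality_rate(mmi, theta, beta):
            """Lognormal fatality-rate function of shaking intensity (MMI);
            theta and beta would be country-specific calibrated parameters."""
            return norm.cdf(log(mmi / theta) / beta)

        def expected_fatalities(exposure_by_mmi, theta, beta):
            """Sum of exposed population times the fatality rate at each intensity."""
            return sum(pop * fatality_rate(mmi, theta, beta)
                       for mmi, pop in exposure_by_mmi.items())

        # Placeholder exposure (population per MMI bin) and placeholder parameters.
        exposure = {6.0: 500_000, 7.0: 200_000, 8.0: 50_000, 9.0: 5_000}
        print(round(expected_fatalities(exposure, theta=12.0, beta=0.25)))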

  6. Earthquake hazard mapping for lifeline engineering Coquitlam, British Columbia

    SciTech Connect

    Gohl, W.B.; Hawson, H.H.; Dou, H.; Nyberg, N.; Lee, R.; Wong, H.

    1995-12-31

    A series of maps plotted at a 1:15,000 scale were prepared to illustrate geotechnical aspects of seismic hazard for the 475 year return period earthquake event within the City of Coquitlam located in the Vancouver Lower Mainland of British Columbia. The maps were prepared to facilitate evaluation of lifeline damage potential within the City of Coquitlam (e.g. roads, sewers, water supply lines, oil/gas pipelines, power lines, compressor/pumping stations, water reservoirs, bridges, and rail lines) and to assist in evaluation of the impact of seismic ground shaking on new infrastructure.

  7. Using a physics-based earthquake simulator to evaluate seismic hazard in NW Iran

    NASA Astrophysics Data System (ADS)

    Khodaverdian, A.; Zafarani, H.; Rahimian, M.

    2016-07-01

    NW Iran is a region of active deformation in the Eurasia-Arabia collision zone. This high strain field has caused intensive faulting accompanied by several major (M > 6.5) earthquakes, as is evident from historical records. Whereas seismic data (i.e. instrumental and historical catalogues) are either short or inaccurate and inhomogeneous, physics-based long-term simulations are beneficial for better assessing seismic hazard. In this study, a deterministic seismicity model, which consists of major active faults, is first constructed, and used to generate a synthetic catalogue of large-magnitude (M > 5.5) earthquakes. The frequency-magnitude distribution of the synthetic earthquake catalogue, which is based on the physical characteristics and slip rates of the mapped faults, is consistent with the empirical distribution evaluated using the record of instrumental and historical events. The obtained results are also in accordance with palaeoseismic studies and other independent kinematic deformation models of the Iranian Plateau. Using the synthetic catalogue, the characteristic magnitude for all 16 active faults in the study area is determined. The magnitudes and epicentres of these earthquakes are comparable with the historical records. Large earthquake recurrence times and their variations are evaluated, either for an individual fault or for the region as a whole. Goodness-of-fit tests revealed that recurrence times can be well described by the Weibull distribution. Time-dependent conditional probabilities for large earthquakes in the study area are also estimated for different time intervals. The resulting synthetic catalogue can be utilized as a useful data set for hazard and risk assessment instead of short, incomplete and inhomogeneous available catalogues.
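
    As a sketch of the recurrence-time analysis described above, the snippet below fits a Weibull distribution to a set of hypothetical inter-event times and computes a time-dependent conditional probability of the kind the abstract mentions; it is illustrative only and not the authors' code.

        import numpy as np
        from scipy.stats import weibull_min

        # Hypothetical recurrence intervals (years) between large earthquakes on one fault.
        intervals = np.array([180., 240., 150., 310., 205., 170., 260., 230.])

        # Fit a two-parameter Weibull distribution (location fixed at zero).
        shape, loc, scale = weibull_min.fit(intervals, floc=0)

        def conditional_probability(t_elapsed, dt, shape, scale):
            """P(next event within dt years | t_elapsed years have passed without one)."""
            surv = weibull_min.sf([t_elapsed, t_elapsed + dt], shape, scale=scale)
            return 1.0 - surv[1] / surv[0]

        print(f"shape = {shape:.2f}, scale = {scale:.0f} yr")
        print(f"P(event in next 50 yr | 100 yr elapsed) = "
              f"{conditional_probability(100.0, 50.0, shape, scale):.2f}")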

  8. Reducing Vulnerability of Ports and Harbors to Earthquake and Tsunami Hazards

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

    Recent scientific research suggests the Pacific Northwest could experience catastrophic earthquakes in the near future, both from distant and local sources, posing a significant threat to coastal communities. Damage could result from numerous earthquake-related hazards, such as severe ground shaking, soil liquefaction, landslides, land subsidence/uplift, and tsunami inundation. Because of their geographic location, ports and harbors are especially vulnerable to these hazards. Ports and harbors, however, are important components of many coastal communities, supporting numerous activities critical to the local and regional economy and possibly serving as vital post-event, response-recovery transportation links. A collaborative, multi-year initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to earthquake and tsunami hazards, involving Oregon Sea Grant (OSG), Washington Sea Grant (WSG), the National Oceanic and Atmospheric Administration Coastal Services Center (CSC), and the U.S. Geological Survey Center for Science Policy (CSP). Specific products of this research, planning, and outreach initiative include a regional stakeholder issues and needs assessment, a community-based mitigation planning process, a Geographic Information System (GIS)-based vulnerability assessment methodology, an educational website, and a regional data archive. This paper summarizes these efforts, including results of two pilot port-harbor community projects, one in Yaquina Bay, Oregon and the other in Sinclair Inlet, Washington. Finally, plans are outlined for outreach to other port and harbor communities in the Pacific Northwest and beyond, using "getting started" workshops and a web-based tutorial.

  9. Cruise report for 01-99-SC: southern California earthquake hazards project

    USGS Publications Warehouse

    Normark, William R.; Reid, Jane A.; Sliter, Ray W.; Holton, David; Gutmacher, Christina E.; Fisher, Michael A.; Childs, Jonathan R.

    1999-01-01

    The focus of the Southern California Earthquake Hazards project is to identify the landslide and earthquake hazards and related ground-deformation processes occurring in the offshore areas that have significant potential to impact the inhabitants of the Southern California coastal region. The project activity is supported through the Coastal and Marine Geology Program of the Geologic Division of the U. S. Geological Survey (USGS) and is a component of the Geologic Division's Science Strategy under Goal 1—Conduct Geologic Hazard Assessments for Mitigation Planning (Bohlen et al., 1998). The project research is specifically stated under Activity 1.1.2 of the Science Strategy: Earthquake Hazard Assessments and Loss Reduction Products in Urban Regions. This activity involves "research, seismic and geodetic monitoring, field studies, geologic mapping, and analyses needed to provide seismic hazard assessments of major urban centers in earthquake-prone regions including adjoining coastal and offshore areas." The southern California urban areas, which form the most populated urban corridor along the U.S. Pacific margin, are among a few specifically designated for special emphasis under the Division's science strategy (Bohlen et al., 1998). The primary objective of the project is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this objective, we are conducting field investigations to observe the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (Fig. 1). In addition, acoustic imaging should help determine the subsurface dimensions of the faults and identify the size and frequency of submarine landslides, both of which are necessary for evaluating the potential for

  10. The Pacific Northwest; linkage between earthquake and volcano hazards

    USGS Publications Warehouse

    Crosson, R.S.

    1990-01-01

    The Pacific Northwest (Oregon, Washington, and northern California) is experiencing rapid industrial and population growth. The same conditions that make the region attractive (close proximity to both mountains and oceans, volcanoes, and spectacular inland waters) also present significant geologic hazards that are easily overlooked in the normal timetable of human activities. The catastrophic eruption of Mount St. Helens 10 years ago serves as a dramatic reminder of the forces of nature that can be unleashed through volcanism. Other volcanoes, such as Mount Rainier, a majestic symbol of Washington, or Mount Hood in Oregon, lie closer to population centers and could present far greater hazards should they become active. Earthquakes may affect even larger regions, producing more cumulative damage.

  11. Errors in Seismic Hazard Assessment are Creating Huge Human Losses

    NASA Astrophysics Data System (ADS)

    Bela, J.

    2015-12-01

    The current practice of representing earthquake hazards to the public based upon their perceived likelihood or probability of occurrence is now proven by the global record of actual earthquakes to be not only erroneous and unreliable, but also too deadly! Earthquake occurrence is sporadic and therefore assumptions of earthquake frequency and return period are not only misleading, but also categorically false. More than 700,000 people have now lost their lives (2000-2011), wherein 11 of the World's Deadliest Earthquakes have occurred in locations where probability-based seismic hazard assessments had predicted only low seismic hazard. Unless seismic hazard assessment and the setting of minimum earthquake design safety standards for buildings and bridges are based on a more realistic deterministic recognition of "what can happen" rather than on what mathematical models suggest is "most likely to happen," such future huge human losses can only be expected to continue! The actual earthquake events that did occur were at or near the maximum potential size of events that either had already occurred in the past or were geologically known to be possible. Haiti's M7 earthquake of 2010 (with > 222,000 fatalities) meant the dead could not even be buried with dignity. Japan's catastrophic Tohoku earthquake of 2011, an M9 megathrust earthquake, unleashed a tsunami that not only obliterated coastal communities along the northern Japanese coast, but also claimed > 20,000 lives. This tsunami flooded nuclear reactors at Fukushima, causing 4 explosions and 3 reactors to melt down. But while this history of huge human losses due to erroneous and misleading seismic hazard estimates, despite its wrenching pain, cannot be unlived, if faced with courage and a more realistic deterministic estimate of "what is possible", it need not be lived again. An objective testing of the results of global probability-based seismic hazard maps against real occurrences has never been done by the

  12. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  13. Secondary impact hazard assessment

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A series of light gas gun shots (4 to 7 km/sec) were performed with 5 mg nylon and aluminum projectiles to determine the size, mass, velocity, and spatial distribution of spall and ejecta from a number of graphite/epoxy targets. Similar determinations were also performed on a few aluminum targets. Target thickness and material were chosen to be representative of proposed Space Station structure. The data from these shots and other information were used to predict the hazard to Space Station elements from secondary particles resulting from impacts of micrometeoroids and orbital debris on the Space Station. This hazard was quantified as an additional flux over and above the primary micrometeoroid and orbital debris flux that must be considered in the design process. In order to simplify the calculations, ejecta and spall mass were assumed to scale directly with the energy of the projectile. Other scaling systems may be closer to reality. The secondary particles considered are only those particles that may impact other structure immediately after the primary impact. The addition to the orbital debris problem from these primary impacts was not addressed. Data from this study should be fed into the orbital debris model to see if Space Station secondaries make a significant contribution to orbital debris. The hazard to a Space Station element from secondary particles above and beyond the micrometeoroid and orbital debris hazard is categorized in terms of two factors: (1) the 'view factor' of the element to other Space Station structure or the geometry of placement of the element, and (2) the sensitivity to damage, stated in terms of energy. Several example cases were chosen: the Space Station module windows, windows of a Shuttle docked to the Space Station, the habitat module walls, and the photovoltaic solar cell arrays. For the examples chosen the secondary flux contributed no more than 10 percent to the total flux (primary and secondary) above a given calculated

  14. Transparent Global Seismic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen

    2013-04-01

    Vulnerability to earthquakes is increasing, yet advanced reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, that integrates all above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and to share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and data on their vulnerability, for loss assessment around the globe. Furthermore, for a true integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits

  15. Numerical earthquake model of the 20 April 2015 southern Ryukyu subduction zone M6.4 event and its impact on seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong

    2015-10-01

    The M6.4 earthquake that took place on the 20 April 2015 off the shore of eastern Taiwan was the largest event in the vicinity of Taiwan during 2015. The mainshock was located in the southern Ryukyu subduction zone, which is the interface between the Philippine Sea Plate and the Eurasian Plate. People in Taipei experienced strong ground shaking for more than 40 s, even though the epicenter was located more than 150 km away. In order to understand the origin of ground motions from this earthquake and how it caused such strong shaking in Taipei, a numerical earthquake model is analyzed, including models of source rupture and wave propagation. First, a joint source inversion was performed using teleseismic body wave and local ground motion data. Source inversion results show that a large slip occurred near the hypocenter, which rapidly released seismic energy in the first 2 s. Then, the rupture propagated toward the shallow fault plane. A large amount of seismic energy was released during this rupture stage that slipped for more than 8 s before the end of the rupture. The estimated stress drop is 2.48 MPa, which is consistent with values for subduction zone earthquakes. Forward simulation using this inverted source rupture model and a 3D seismic velocity model based on the spectral-element method was then performed. Results indicate that the strong ground motion in Taipei resulted from two factors: (1) the Taipei basin amplification effect and (2) the specific source radiation pattern. The results of this numerical earthquake model imply that future subduction zone events that occur in offshore eastern Taiwan are likely to cause relatively strong ground shaking in northern Taiwan, especially in the Taipei metropolitan area.

  16. Numerical earthquake model of the 20 April 2015 southern Ryukyu subduction zone M6.4 event and its impact on seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Lee, S. J.

    2015-12-01

    The M6.4 earthquake that took place on the 20th April 2015 off the shore of eastern Taiwan was the largest event that occurred in the vicinity of Taiwan during 2015. The mainshock was located in the southern Ryukyu subduction zone, which is the interface between the Philippine Sea Plate and the Eurasian Plate. People in Taipei experienced strong ground shaking for more than 40 s, even though the epicenter was located more than 150 km away. In order to understand the origin of this earthquake and how it caused such strong shaking in Taipei, a numerical earthquake model is analyzed, including models of source rupture and wave propagation. First, a joint source inversion is performed using teleseismic body wave and local ground motion data. Source inversion results show that a large slip occurred near the hypocenter, which rapidly released seismic energy in the first 2 s. Then, the rupture propagated toward the shallow fault plane. A large amount of seismic energy was released during this rupture stage that slipped for more than 8 s before the end of the rupture. The estimated stress drop is 2.48 MPa, which is consistent with values for subduction zone earthquakes. Forward simulation using this inverted source rupture model based on the spectral-element method is then performed. Results indicate that the strong ground motion in Taipei resulted from two factors: (1) the Taipei basin amplification effect and (2) the specific source radiation pattern. The results of this numerical earthquake model imply that future subduction zone events that occur in offshore eastern Taiwan are likely to cause relatively strong ground shaking in northern Taiwan, especially in the Taipei metropolitan area.

  17. Earthquake Risk Assessment and Risk Transfer

    NASA Astrophysics Data System (ADS)

    Liechti, D.; Zbinden, A.; Rüttener, E.

    Research on risk assessment of natural catastrophes is very important for estimating their economic and social impact. The loss potentials of such disasters (e.g., earthquakes and storms) for property owners, insurers and nationwide economies are driven by the hazard and the damageability (vulnerability) of buildings and infrastructure, and depend on the ability to transfer these losses to different parties. In addition, the geographic distribution of the exposed values, the uncertainty of building vulnerability and the individual deductible are main factors determining the size of a loss. The deductible is the key element that steers the distribution of losses between insured and insurer. Therefore the risk analysis concentrates on the deductible and the vulnerability of insured buildings and maps their variations to allow efficient decisions. Considering stochastic event sets, the corresponding event losses can be modelled as expected loss grades of a Beta probability density function. Based on the deductible and the standard deviation of the expected loss grades, the loss to the insured and to the insurer can be quantified. In addition, the varying impact of the deductible on different geographic regions can be described. This analysis has been carried out for earthquake insurance portfolios with various building types and different deductibles. Besides quantifying loss distributions between insured and insurer based on uncertainty assumptions and deductible consideration, mapping yields ideas to optimise the risk transfer process and can be used for developing risk mitigation strategies.
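
    A minimal sketch of how a Beta-distributed loss grade and a deductible split an event loss between insured and insurer is given below; the Beta parameters, the deductible and the insured value are hypothetical, and the Monte Carlo split is only one simple way to carry out the computation the abstract describes.

        import numpy as np
        from scipy.stats import beta

        # Hypothetical Beta parameters for the event loss grade (damage ratio in [0, 1]).
        a, b = 1.5, 15.0
        deductible = 0.02          # deductible expressed as a fraction of insured value
        insured_value = 1_000_000  # currency units

        # Monte Carlo estimate of the expected split between insured and insurer.
        rng = np.random.default_rng(42)
        loss_grades = beta.rvs(a, b, size=200_000, random_state=rng)
        retained = np.minimum(loss_grades, deductible).mean()    # borne by the insured
        ceded = np.maximum(loss_grades - deductible, 0.0).mean()  # borne by the insurer

        print(f"expected loss to insured: {retained * insured_value:,.0f}")
        print(f"expected loss to insurer: {ceded * insured_value:,.0f}")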

  18. Salient beliefs about earthquake hazards and household preparedness.

    PubMed

    Becker, Julia S; Paton, Douglas; Johnston, David M; Ronan, Kevin R

    2013-09-01

    Prior research has found little or no direct link between beliefs about earthquake risk and household preparedness. Furthermore, only limited work has been conducted on how people's beliefs influence the nature and number of preparedness measures adopted. To address this gap, 48 qualitative interviews were undertaken with residents in three urban locations in New Zealand subject to seismic risk. The study aimed to identify the diverse hazard and preparedness-related beliefs people hold and to articulate how these are influenced by public education to encourage preparedness. The study also explored how beliefs and competencies at personal, social, and environmental levels interact to influence people's risk management choices. Three main categories of beliefs were found: hazard beliefs; preparedness beliefs; and personal beliefs. Several salient beliefs found previously to influence the preparedness process were confirmed by this study, including beliefs related to earthquakes being an inevitable and imminent threat, self-efficacy, outcome expectancy, personal responsibility, responsibility for others, and beliefs related to denial, fatalism, normalization bias, and optimistic bias. New salient beliefs were also identified (e.g., preparedness being a "way of life"), as well as insight into how some of these beliefs interact within the wider informational and societal context.

  19. Evansville Area Earthquake Hazards Mapping Project (EAEHMP) - Progress Report, 2008

    USGS Publications Warehouse

    Boyd, Oliver S.; Haase, Jennifer L.; Moore, David W.

    2009-01-01

    Maps of surficial geology, deterministic and probabilistic seismic hazard, and liquefaction potential index have been prepared by various members of the Evansville Area Earthquake Hazard Mapping Project for seven quadrangles in the Evansville, Indiana, and Henderson, Kentucky, metropolitan areas. The surficial geologic maps feature 23 types of surficial geologic deposits, artificial fill, and undifferentiated bedrock outcrop and include alluvial and lake deposits of the Ohio River valley. Probabilistic and deterministic seismic hazard and liquefaction hazard mapping is made possible by drawing on a wealth of information including surficial geologic maps, water well logs, and in-situ testing profiles using the cone penetration test, standard penetration test, down-hole shear wave velocity tests, and seismic refraction tests. These data were compiled and collected with contributions from the Indiana Geological Survey, Kentucky Geological Survey, Illinois State Geological Survey, United States Geological Survey, and Purdue University. Hazard map products are in progress and are expected to be completed by the end of 2009, with a public roll out in early 2010. Preliminary results suggest that there is a 2 percent probability that peak ground accelerations of about 0.3 g will be exceeded in much of the study area within 50 years, which is similar to the 2002 USGS National Seismic Hazard Maps for a firm rock site value. Accelerations as high as 0.4-0.5 g may be exceeded along the edge of the Ohio River basin. Most of the region outside of the river basin has a low liquefaction potential index (LPI), where the probability that LPI is greater than 5 (that is, there is a high potential for liquefaction) for a M7.7 New Madrid type event is only 20-30 percent. Within the river basin, most of the region has high LPI, where the probability that LPI is greater than 5 for a New Madrid type event is 80-100 percent.

  20. Probabilistic Seismic Hazard Assessment from Incomplete and Uncertain Data

    NASA Astrophysics Data System (ADS)

    Smit, Ansie; Kijko, Andrzej

    2016-04-01

    A question that frequently arises with seismic hazard assessment is: why are our assessments so poor? Often the answer is that in many cases the standard applied methodologies do not take into account the nature of seismic event catalogs. In reality these catalogues are incomplete, with uncertain magnitude estimates and a significant discrepancy between the empirical data and the applied occurrence model. Most probabilistic seismic hazard analysis procedures require knowledge of at least three seismic source parameters: the mean seismic activity rate λ, the Gutenberg-Richter b-value, and the area-characteristic (seismogenic source) maximum possible earthquake magnitude Mmax. In almost all currently used seismic hazard assessment procedures utilizing these three parameters, it is explicitly assumed that all three remain constant over a specified time and space. However, closer examination of most earthquake catalogues indicates that there are significant spatial and temporal variations in the seismic activity rate λ as well as the Gutenberg-Richter b-value. In the proposed methodology the maximum likelihood estimation of these earthquake hazard parameters takes into account the incompleteness of catalogues, uncertainty in the earthquake magnitude determination as well as the uncertainty associated with the applied earthquake occurrence models. The uncertainty in the earthquake occurrence models is introduced by assuming that both the mean seismic activity rate λ and the Gutenberg-Richter b-value are random variables, each described by a Gamma distribution. The approach extends the classic frequency-magnitude Gutenberg-Richter relation and the Poisson distribution of the number of earthquakes to their compounded counterparts. The proposed procedure is applied in the estimation of the seismic parameters for the area of Ceres-Tulbagh, South Africa, which experienced the strongest earthquake in the country's recorded history. In this example it is
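
    The compounding step described above can be illustrated with a short sketch: if the activity rate λ is Gamma-distributed, the compounded count of earthquakes in a time window follows a negative binomial distribution, which has the same mean as the plain Poisson model but a wider spread. The parameter values below are placeholders, not results from the Ceres-Tulbagh study.

        from scipy.stats import poisson, nbinom

        # Treat the activity rate lambda as Gamma-distributed (shape k, scale theta).
        k, theta = 4.0, 0.5          # mean rate = k * theta = 2 events/yr (placeholder)
        t = 50.0                     # exposure time in years

        # Mixing a Poisson count with a Gamma-distributed rate gives a negative
        # binomial: N ~ NB(n = k, p = 1 / (1 + theta * t)).
        p = 1.0 / (1.0 + theta * t)
        mean_count = k * theta * t

        print(f"mean = {mean_count:.0f} events in {t:.0f} yr for both models")
        print(f"std dev: Poisson {poisson.std(mean_count):.1f} vs compound {nbinom.std(k, p):.1f}")
        print(f"P(N > 150): Poisson {poisson.sf(150, mean_count):.2e} vs compound {nbinom.sf(150, k, p):.2f}")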

  1. Multi-hazards risk assessment at different levels

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2012-04-01

    Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to those events increase annually, which is clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of natural and technological integrated risk assessment and mapping at different levels [1, 2]. At the country level the most hazardous natural processes, which may result in fatalities, injuries and economic loss in the Russian Federation, are considered. These are earthquakes, landslides, mud flows, floods, storms, and avalanches. A special GIS environment for the country's territory was developed, which includes information about hazard levels and recurrence, impact databases for the last 20 years, as well as models for estimating damage and casualties caused by these hazards. Federal maps of individual and collective seismic risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment taking into account secondary accidents at fire-, explosion- and chemically hazardous facilities, and of regional integrated risk assessment, are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations due to scenario earthquakes taking into account accidents triggered by strong events at critical facilities: fire and chemically hazardous facilities, including oil pipeline routes located in the earthquake-prone areas. The estimations of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures, aimed at saving lives and protecting property against future disastrous events. The

  2. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 15. Tsunamis, Seiches, and Landslide-Induced Water Waves.

    DTIC Science & Technology

    1979-11-01

    Landslides or subaqueous slides can produce Zone 5 elevations (e.g., Lituya Bay, Alaska). The largest landslide-generated wave that has been documented was produced in 1958 by a landslide that was triggered by an earthquake and slid into Lituya Bay, Alaska; landslides also generated waves in Lituya Bay in 1853, 1874, and 1936 (Miller, 1960). Subaqueous landslides triggered by the 1964 Alaskan earthquake caused widespread

  3. Comprehensive seismic hazard assessment of Tripura and Mizoram states

    NASA Astrophysics Data System (ADS)

    Sitharam, T. G.; Sil, Arjun

    2014-06-01

    Northeast India is one of the most seismically active regions in the world, with on average more than seven earthquakes of magnitude 5.0 and above per year. Reliable seismic hazard assessment could provide the necessary design inputs for earthquake-resistant design of structures in this region. In this study, deterministic as well as probabilistic methods have been attempted for seismic hazard assessment of Tripura and Mizoram states at bedrock level. An updated earthquake catalogue was compiled from various national and international seismological agencies for the period 1731 to 2011. Homogenization, declustering and data completeness analysis of the events were carried out before hazard evaluation. Seismicity parameters were estimated using the G-R relationship for each source zone. Based on the seismicity, tectonic features and fault rupture mechanism, the region was divided into six major subzones. Region-specific correlations were used for magnitude conversion to homogenize earthquake size. Ground motion equations (Atkinson and Boore 2003; Gupta 2010) were validated against observed PGA (peak ground acceleration) values before use in the hazard evaluation. In this study, the hazard is estimated using linear sources identified in and around the study area. Results are presented in the form of PGA using both DSHA (deterministic seismic hazard analysis) and PSHA (probabilistic seismic hazard analysis) with 2 and 10% probability of exceedance in 50 years, and spectral acceleration (T = 0.2 s and 1.0 s) for both states (2% probability of exceedance in 50 years). The results provide inputs for planning risk reduction strategies, developing risk acceptance criteria and financial analysis of possible damages in the study area, with a comprehensive analysis and higher-resolution hazard mapping.
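
    As an illustration of the seismicity-parameter step, the sketch below estimates Gutenberg-Richter a- and b-values for one source zone with the Aki-Utsu maximum-likelihood estimator applied to a declustered sub-catalogue. The estimator choice and the synthetic magnitudes are assumptions for illustration; the study's own parameter values are not reproduced here.

      import numpy as np

      def gr_parameters(mags, duration_yr, m_c, dm=0.1):
          """Aki-Utsu maximum-likelihood b-value and annual a-value for magnitudes
          binned at dm, using only events at or above the completeness magnitude m_c."""
          m = np.asarray(mags)
          m = m[m >= m_c]
          b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
          a = np.log10(len(m) / duration_yr) + b * m_c      # log10 annual rate extrapolated to M = 0
          return a, b

      # Hypothetical declustered sub-catalogue for one source zone (b close to 1 by construction)
      rng = np.random.default_rng(0)
      demo_mags = np.round(4.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=500), 1)
      a, b = gr_parameters(demo_mags, duration_yr=80.0, m_c=4.0)
      print(f"a = {a:.2f}, b = {b:.2f}   (annual rate of M >= m: 10**(a - b*m))")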

  4. Hazard maps of earthquake induced permanent displacements validated by site numerical simulation

    NASA Astrophysics Data System (ADS)

    Vessia, Giovanna; Pisano, Luca; Parise, Mario; Tromba, Giuseppe

    2016-04-01

    Hazard maps of seismically induced instability at the urban scale can be drawn by means of GIS spatial interpolation tools starting from (1) a digital terrain model (DTM) and (2) geological and geotechnical hydro-mechanical site characterization. These maps are commonly related to a fixed return period of the natural phenomenon under study, or to a particular hazard scenario from the most significant past events. The maps can be used to guide planning activity as well as emergency actions, but their main limitation is that typically no reliability analysis is performed. Spatial variability and uncertainty in subsoil properties, poor description of geomorphological evidence of active instability, and geometrical approximations and simplifications in DTMs, among others, can be responsible for inaccurate maps. In this study, a possible method is proposed to control and increase the overall reliability of a hazard scenario map for earthquake-induced slope instability. The procedure can be summarized as follows: (1) GIS statistical tools are used to improve the spatial distribution of the hydro-mechanical properties of the surface lithologies; (2) hazard maps are drawn from the resulting information layers on both groundwater and mechanical properties of surficial deposits, combined with seismic parameters propagated by means of ground motion propagation equations; (3) point numerical stability analyses carried out with the finite element method (e.g. Geostudio 2004) are performed to anchor the hazard map predictions to point quantitative analyses. These numerical analyses are used to generate a conversion scale from urban-scale to point estimates in terms of permanent displacements. Although this conversion scale differs from case to case, it is suggested as a general method to convert the results of large-scale map analyses to site hazard assessment. In this study, the procedure is applied to the urban area of Castelfranci (Avellino province
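
    The displacement model used to anchor the maps is not detailed in this record; a common point-scale choice is a rigid-block (Newmark-type) calculation, sketched below under that assumption with an invented input motion. It only illustrates how a critical acceleration and an accelerogram combine into a permanent displacement.

      import numpy as np

      def newmark_displacement(acc, dt, a_crit):
          """Rigid-block (Newmark-type) permanent displacement (m) from an acceleration
          time history acc (m/s^2) sampled at dt (s), for a critical (yield) acceleration
          a_crit (m/s^2). Sliding is allowed in one direction only."""
          vel, disp = 0.0, 0.0
          for a in acc:
              if vel > 0.0 or a > a_crit:
                  # block slides: the excess acceleration drives the relative velocity
                  vel = max(vel + (a - a_crit) * dt, 0.0)
                  disp += vel * dt
          return disp

      # Hypothetical input: a decaying 4 Hz sine standing in for a recorded accelerogram
      dt = 0.005
      t = np.arange(0.0, 10.0, dt)
      acc = 2.5 * np.sin(2 * np.pi * 4.0 * t) * np.exp(-0.3 * t)   # m/s^2
      print(f"permanent displacement: {newmark_displacement(acc, dt, a_crit=0.8):.3f} m")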

  5. Large Historical Earthquakes and Tsunami Hazards in the Western Mediterranean: Source Characteristics and Modelling

    NASA Astrophysics Data System (ADS)

    Harbi, Assia; Meghraoui, Mustapha; Belabbes, Samir; Maouche, Said

    2010-05-01

    The western Mediterranean region was the site of numerous large earthquakes in the past. Most of these earthquakes are located along the east-west-trending Africa-Eurasia plate boundary and along the coastline of North Africa. The most recent recorded tsunamigenic earthquake occurred in 2003 at Zemmouri-Boumerdes (Mw 6.8) and generated a ~2-m-high tsunami wave. The destructive wave affected the Balearic Islands and Almeria in southern Spain and Carloforte in southern Sardinia (Italy). The earthquake provided a unique opportunity to gather instrumental records of seismic waves and tide gauges in the western Mediterranean. A database that includes a historical catalogue of main events, seismic sources and related fault parameters was prepared in order to assess the tsunami hazard of this region. In addition to the analysis of the 2003 records, we study the 1790 Oran and 1856 Jijel historical tsunamigenic earthquakes (Io = IX and X, respectively), which provide detailed observations on the heights and extent of past tsunamis and damage in coastal zones. We performed modelling of wave propagation using the NAMI-DANCE code and tested different fault sources against synthetic tide gauge records. We observe that the characteristics of the seismic sources control the size and directivity of tsunami wave propagation on both the northern and southern coasts of the western Mediterranean.

  6. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    USGS Publications Warehouse

    Thenhaus, P.C.

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-1812 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in U.S. history. The large population currently living within the damage area of these earthquakes means that widespread destruction and loss of life is likely if the sequence were repeated. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern U.S. (1755 Cape Ann, Mass.; 1811-12 New Madrid, Mo.; 1886 Charleston, S.C.; and 1897 Giles County, Va.) are generally regarded as only historical phenomena (fig. 1). The social memory of these earthquakes no longer exists. A fundamental problem in the Eastern U.S., therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the "tools" that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term "earthquake risk," on the other hand, refers to aspects of the expected damage to man-made structures and to lifelines as a result of the earthquake hazard.

  7. Earthquake Hazard and Risk in Sub-Saharan Africa: current status of the Global Earthquake model (GEM) initiative in the region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay; Midzi, Vunganai; Ateba, Bekoa; Mulabisana, Thifhelimbilu; Marimira, Kwangwari; Hlatywayo, Dumisani J.; Akpan, Ofonime; Amponsah, Paulina; Georges, Tuluka M.; Durrheim, Ray

    2013-04-01

    Large-magnitude earthquakes have been observed in Sub-Saharan Africa in the recent past, such as the Machaze event of 2006 (Mw 7.0) in Mozambique and the 2009 Karonga earthquake (Mw 6.2) in Malawi. The December 13, 1910 earthquake (Ms = 7.3) in the Rukwa rift (Tanzania) is the largest of all instrumentally recorded events known to have occurred in East Africa. The overall earthquake hazard in the region is relatively low compared to other earthquake-prone areas of the globe. However, the risk level is high enough to warrant the attention of African governments and the donor community. The latest earthquake hazard map for Sub-Saharan Africa was produced in 1999, and an update is long overdue as development activity in the construction industry is booming all over Sub-Saharan Africa. To this effect, regional seismologists are working together under the GEM (Global Earthquake Model) framework to improve incomplete, inhomogeneous and uncertain catalogues. The working group is also contributing to the UNESCO-IGCP (SIDA) 601 project and is assessing all possible sources of data for the catalogue as well as for the seismotectonic characteristics that will help to develop a reasonable hazard model for the region. Work to date indicates that the region is more seismically active than previously thought. This demands a coordinated effort by the regional experts to systematically compile all available information for a better output, so as to mitigate earthquake risk in Sub-Saharan Africa.

  8. Towards Practical, Real-Time Estimation of Spatial Aftershock Probabilities: A Feasibility Study in Earthquake Hazard

    NASA Astrophysics Data System (ADS)

    Morrow, P.; McCloskey, J.; Steacy, S.

    2001-12-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, of how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the large number of local telemetered seismic networks, and the rapid acquisition of satellite data, coupled with the speed of modern telecommunications and data transfer, mean that these new techniques could, in principle, be applied in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days of the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame, so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real time as data of better quality become available over the following days to tens of days. Specifically, the project aims to assess the
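
    The core quantity of the Coulomb stress technique is the Coulomb failure stress change, dCFS = d_tau + mu' * d_sigma_n, resolved on a receiver fault. The sketch below computes it from an already-known stress-change tensor; the tensor values and fault geometry are hypothetical, and the calculation of the stress field from a mainshock slip model (e.g. by elastic dislocation modelling) is not shown.

      import numpy as np

      def fault_vectors(strike, dip, rake):
          """Unit normal and slip vectors of a fault plane (angles in degrees),
          in north-east-down coordinates (Aki & Richards convention)."""
          phi, delta, lam = np.radians([strike, dip, rake])
          n = np.array([-np.sin(delta) * np.sin(phi),
                         np.sin(delta) * np.cos(phi),
                        -np.cos(delta)])
          u = np.array([np.cos(lam) * np.cos(phi) + np.sin(lam) * np.cos(delta) * np.sin(phi),
                        np.cos(lam) * np.sin(phi) - np.sin(lam) * np.cos(delta) * np.cos(phi),
                        -np.sin(lam) * np.sin(delta)])
          return n, u

      def coulomb_stress_change(d_sigma, strike, dip, rake, mu_eff=0.4):
          """dCFS = d_tau + mu' * d_sigma_n on a receiver fault, for a stress-change
          tensor d_sigma (Pa, tension positive, north-east-down axes)."""
          n, u = fault_vectors(strike, dip, rake)
          traction = d_sigma @ n
          d_sigma_n = traction @ n      # normal stress change (positive = unclamping)
          d_tau = traction @ u          # shear stress change in the rake direction
          return d_tau + mu_eff * d_sigma_n

      # Hypothetical uniform stress-change tensor (Pa) at a point near a mainshock
      d_sigma = np.array([[ 2.0e4, -1.0e4,  0.0  ],
                          [-1.0e4, -3.0e4,  5.0e3],
                          [ 0.0,    5.0e3,  1.0e4]])
      print(f"dCFS = {coulomb_stress_change(d_sigma, strike=30.0, dip=60.0, rake=90.0):.0f} Pa")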

  9. Regional liquefaction hazard evaluation following the 2010-2011 Christchurch (New Zealand) earthquake sequence

    NASA Astrophysics Data System (ADS)

    Begg, John; Brackley, Hannah; Irwin, Marion; Grant, Helen; Berryman, Kelvin; Dellow, Grant; Scott, David; Jones, Katie; Barrell, David; Lee, Julie; Townsend, Dougal; Jacka, Mike; Harwood, Nick; McCahon, Ian; Christensen, Steve

    2013-04-01

    Following the damaging 4 September 2010 Mw 7.1 Darfield earthquake, the 22 February 2011 Christchurch earthquake and subsequent damaging aftershocks, we completed a liquefaction hazard evaluation for c. 2700 km2 of the coastal Canterbury region. Its purpose was to distinguish, at a regional scale, areas of land that, in the event of strong ground shaking, may be susceptible to damaging liquefaction from areas where damaging liquefaction is unlikely. This information will be used by local government to define liquefaction-related geotechnical investigation requirements for consent applications. Following a review of historic records of liquefaction and existing liquefaction assessment maps, we undertook comprehensive new work that included: a geologic context from existing geologic maps; geomorphic mapping using LiDAR and integration of existing soil map data; compilation of lithological data for the surficial 10 m from an extensive drillhole database; and modelling of depth to unconfined groundwater from existing subsurface and surface water data. Integrating and honouring all these sources of information, we distinguished areas underlain by materials susceptible to liquefaction (liquefaction-prone lithologies present, or likely, in the near-surface, with shallow unconfined groundwater) from areas unlikely to suffer widespread liquefaction damage. Comparison of this work with a more detailed liquefaction susceptibility assessment based on closely spaced geotechnical probes in Christchurch City provides a level of confidence in the results. We tested our susceptibility map by assigning a matrix of liquefaction susceptibility rankings to the lithologies recorded in drillhole logs and local groundwater depths, then applying peak ground accelerations for four earthquake scenarios from the regional probabilistic seismic hazard model (25-year return period = 0.13 g; 100-year = 0.22 g; 500-year = 0.38 g; 2500-year = 0.6 g). Our mapped boundary between liquefaction-prone areas and areas
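
    A schematic of the ranking step described above: a lookup matrix combines near-surface lithology and groundwater depth into a susceptibility class, which is then screened against a scenario peak ground acceleration. The matrix, class names and thresholds below are invented placeholders, not those of the Canterbury study.

      # Hypothetical susceptibility matrix by near-surface lithology and groundwater class
      SUSCEPTIBILITY = {
          ("loose sand/silt", "shallow"): "high",
          ("loose sand/silt", "deep"):    "moderate",
          ("gravel",          "shallow"): "moderate",
          ("gravel",          "deep"):    "low",
          ("clay/peat",       "shallow"): "low",
          ("clay/peat",       "deep"):    "low",
      }

      def classify(lithology, gw_depth_m, pga_g, shallow_limit=3.0, pga_trigger=0.13):
          """Combine a lithology/groundwater ranking with a scenario PGA threshold."""
          gw_class = "shallow" if gw_depth_m <= shallow_limit else "deep"
          rank = SUSCEPTIBILITY[(lithology, gw_class)]
          # damaging liquefaction only considered where shaking reaches the scenario level
          return rank if pga_g >= pga_trigger else "unlikely at this shaking level"

      # e.g. a drillhole dominated by loose sand, groundwater at 1.5 m, 100-year scenario PGA of 0.22 g
      print(classify("loose sand/silt", 1.5, 0.22))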

  10. Earthquake and Flood Risk Assessments for Europe and Central Asia

    NASA Astrophysics Data System (ADS)

    Murnane, R. J.; Daniell, J. E.; Ward, P.; Winsemius, H.; Tijssen, A.; Toro, J.

    2015-12-01

    We report on a flood and earthquake risk assessment for 32 countries in Europe and Central Asia, with a focus on how current flood and earthquake risk might evolve in the future due to changes in climate, population, and GDP. The future hazard and exposure conditions used for the risk assessment are consistent with selected IPCC AR5 Representative Concentration Pathways (RCPs) and Shared Socioeconomic Pathways (SSPs). Estimates of 2030 and 2080 population and GDP are derived using the IMAGE model forced by the socioeconomic conditions associated with the SSPs. Flood risk is modeled using the probabilistic GLOFRIS global flood risk modeling cascade, which starts with meteorological fields derived from reanalysis data or climate models. For 2030 and 2080 climate conditions, the meteorological fields are generated from five climate models forced by the RCP4.5 and RCP8.5 scenarios. Future flood risk is estimated using population and GDP exposures consistent with the SSP2 and SSP3 scenarios. Population and GDP are defined as being affected by a flood when a grid cell receives any depth of flood inundation. The earthquake hazard is quantified using a 10,000-year stochastic catalog of over 15.8 million synthetic earthquake events of at least magnitude 5. Ground motion prediction and estimates of local site conditions are used to determine PGA. Future earthquake risk is estimated using population and GDP exposures consistent with all five SSPs. Population and GDP are defined as being affected by an earthquake when a grid cell experiences ground motion equaling or exceeding MMI VI. For most countries, changes in exposure alter flood risk to a greater extent than changes in climate. For both flood and earthquake, the spread in risk grows over time. The methodology involves large uncertainties and the results are not meant to be definitive; instead, they will be used to initiate discussions with governments regarding efforts to manage disaster risk.
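
    The "affected" definition used for the earthquake metric reduces to a simple threshold on gridded ground motion; a toy version with invented grids is sketched below (MMI VI as the cutoff, as in the text).

      import numpy as np

      # Hypothetical gridded inputs: population per cell and modelled MMI per cell
      population = np.array([[1200.,  300.,    0.],
                             [ 450., 2200.,  800.],
                             [   0.,  150., 5000.]])
      mmi        = np.array([[ 5.2,   6.1,   4.8],
                             [ 6.5,   7.3,   5.9],
                             [ 4.1,   6.0,   6.8]])

      # Population counted as "affected" where ground motion reaches or exceeds MMI VI
      affected = population[mmi >= 6.0].sum()
      print(f"affected population: {affected:.0f} of {population.sum():.0f}")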

  11. A Probabilistic Tsunami Hazard Assessment Methodology

    NASA Astrophysics Data System (ADS)

    Gonzalez, Frank; Geist, Eric; Jaffe, Bruce; Kanoglu, Utku; Mofjeld, Harold; Synolakis, Costas; Titov, Vasily; Arcas, Diego

    2010-05-01

    A methodology for probabilistic tsunami hazard assessment (PTHA) will be described for multiple near- and far-field seismic sources. The method integrates tsunami inundation modeling with the approach of probabilistic seismic hazard assessment (PSHA). A database of inundation simulations is developed, with each simulation corresponding to an earthquake source for which the seismic parameters and mean interevent time have been estimated. A Poissonian model is then adopted for estimating the probability that tsunami flooding will exceed a given level during a specified period of time, taking into account multiple sources and multiple causes of uncertainty. Uncertainty in the tidal stage at tsunami arrival is dealt with by developing a parametric expression for the probability density function of the sum of the tides and a tsunami; uncertainty in the slip distribution of the near-field source is dealt with probabilistically by considering multiple sources in which width and slip values vary, subject to the constraint of a constant moment magnitude. The method was applied to Seaside, Oregon, to obtain estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. These results will be presented and discussed, including the primary remaining sources of uncertainty -- those associated with interevent time estimates, the modeling of background sea level, and temporal changes in bathymetry and topography. PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk.
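
    Under the Poissonian model described above, the exceedance probability at a site combines the annual rates of all sources whose simulated inundation exceeds the level of interest. A minimal sketch with hypothetical rates and conditional exceedance probabilities (not the Seaside, Oregon inputs) is given below.

      import numpy as np

      def prob_exceedance(rates, p_exceed_given_event, t_years):
          """Poissonian probability that tsunami flooding exceeds a given level within
          t_years, combining several independent sources. `rates` are mean annual
          occurrence rates (1 / mean interevent time); `p_exceed_given_event` is the
          probability that an event from that source exceeds the level (e.g. from an
          inundation-simulation database including tidal and slip uncertainty)."""
          combined_rate = np.sum(np.asarray(rates) * np.asarray(p_exceed_given_event))
          return 1.0 - np.exp(-combined_rate * t_years)

      # Hypothetical values for three sources
      rates = [1 / 500.0, 1 / 300.0, 1 / 1000.0]
      p_exc = [0.8, 0.15, 0.4]
      for T in (50, 100, 500):
          print(T, "yr:", round(prob_exceedance(rates, p_exc, T), 3))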

  12. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters are estimated for each of these source zones, and they are the input variables for the seismic hazard estimation of a region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5 % in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation, in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part

  13. Tsunami Hazard Assessment in Guam

    NASA Astrophysics Data System (ADS)

    Arcas, D.; Uslu, B.; Titov, V.; Chamberlin, C.

    2008-12-01

    The island of Guam is located approximately 1500 miles south of Japan, in the vicinity of the Mariana Trench. It is surrounded in close proximity by three subduction zones (Nankai-Taiwan, East Philippines and Mariana Trench) that pose a considerable near- to intermediate-field tsunami threat. Tsunami catalogues list 14 tsunamigenic earthquakes with Mw ≥ 8.0 since 1900 in this region alone (Soloviev and Go, 1974; Lander, 1993; Iida, 1984; Lander and Lowell, 2002); however, the island has not been significantly affected by some of the largest far-field events of the past century, such as the 1952 Kamchatka, 1960 Chile, and 1964 Great Alaska earthquakes. An assessment of the tsunami threat to the island from both near- and far-field sources, using forecast tools originally developed at NOAA's Pacific Marine Environmental Laboratory (PMEL) for real-time forecasting of tsunamis, is presented here. Tide gauge records at Apra Harbor from the 1952 Kamchatka, 1964 Alaska, and 1960 Chile earthquakes are used to validate our model setup and to explain the limited impact of these historical events on Guam. Identification of worst-case scenarios and determination of tsunamigenically effective source regions are presented for five vulnerable locations on the island via a tsunami sensitivity study. Apra Harbor is the site of a National Ocean Service (NOS) tide gauge and the biggest harbor on the island. Tumon Bay, Pago Bay, Agana Bay and Inarajan Bay are densely populated areas that require careful investigation. The sensitivity study shows that earthquakes from the East Philippines region present a major threat to west-facing sites, whereas the Mariana Trench poses the biggest concern for east-facing sites.

  14. Tsunami hazard warning and risk prediction based on inaccurate earthquake source parameters

    NASA Astrophysics Data System (ADS)

    Goda, K.; Abilova, K.

    2015-12-01

    This study investigates the issues related to underestimation of the earthquake source parameters in the context of tsunami early warning and tsunami risk assessment. The magnitude of a very large event may be underestimated significantly during the early stage of the disaster, resulting in the issuance of incorrect tsunami warnings. Tsunamigenic events in the Tohoku region of Japan, where the 2011 tsunami occurred, are focused on as a case study to illustrate the significance of the problems. The effects of biases in the estimated earthquake magnitude on tsunami loss are investigated using a rigorous probabilistic tsunami loss calculation tool that can be applied to a range of earthquake magnitudes by accounting for uncertainties of earthquake source parameters (e.g. geometry, mean slip, and spatial slip distribution). The quantitative tsunami loss results provide valuable insights regarding the importance of deriving accurate seismic information as well as the potential biases of the anticipated tsunami consequences. Finally, the usefulness of rigorous tsunami risk assessment is discussed in defining critical hazard scenarios based on the potential consequences due to tsunami disasters.

  15. Tsunami hazard warning and risk prediction based on inaccurate earthquake source parameters

    NASA Astrophysics Data System (ADS)

    Goda, Katsuichiro; Abilova, Kamilla

    2016-02-01

    This study investigates the issues related to underestimation of the earthquake source parameters in the context of tsunami early warning and tsunami risk assessment. The magnitude of a very large event may be underestimated significantly during the early stage of the disaster, resulting in the issuance of incorrect tsunami warnings. Tsunamigenic events in the Tohoku region of Japan, where the 2011 tsunami occurred, are focused on as a case study to illustrate the significance of the problems. The effects of biases in the estimated earthquake magnitude on tsunami loss are investigated using a rigorous probabilistic tsunami loss calculation tool that can be applied to a range of earthquake magnitudes by accounting for uncertainties of earthquake source parameters (e.g., geometry, mean slip, and spatial slip distribution). The quantitative tsunami loss results provide valuable insights regarding the importance of deriving accurate seismic information as well as the potential biases of the anticipated tsunami consequences. Finally, the usefulness of rigorous tsunami risk assessment is discussed in defining critical hazard scenarios based on the potential consequences due to tsunami disasters.

  16. A Quantitative Appraisal of Earthquake Hazard Parameters Evaluated from Bayesian Approach for Different Regions in Iranian Plateau

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Türker, Tügba; Bayrak, Yusuf

    2016-06-01

    In this study, we used the program for Bayesian estimation of seismic hazard developed by Alexey Lyubushin; our study is the next in a sequence of applications of this software to seismic hazard assessment in different regions of the world. Earthquake hazard parameters of maximum regional magnitude (Mmax), β value and seismic activity rate or intensity (λ), together with their uncertainties, have been evaluated for 15 different source regions of the Iranian Plateau with the help of a complete and homogeneous earthquake catalogue covering the period 1900-2014 with Mw ≥ 4.0. The estimated Mmax values vary between 6.25 and 8.37; the lowest value is observed in the Zagros foredeep, whereas the highest is observed in the Makran. It is also observed that there is a strong relationship between the maximum earthquake magnitudes estimated by the Bayesian approach and the maximum observed magnitudes. Moreover, quantiles of the distributions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50 and 100 years are calculated with confidence limits for probability levels of 50, 70 and 90 % in the 15 source regions. The computed earthquake hazard parameters identify the most seismically active regions of the Iranian Plateau. The Makran and East Iran regions show expected earthquake magnitudes greater than 8.0 in the next 100 years at the 90 % probability level, indicating that these regions are more susceptible to the occurrence of large earthquakes than the others. The outcomes obtained in this study may have useful implications for probabilistic seismic hazard studies of the Iranian Plateau.

  17. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    ,

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasize the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  18. Earthquake engineering research: 1982

    NASA Astrophysics Data System (ADS)

    The Committee on Earthquake Engineering Research addressed two questions: what progress has research produced in earthquake engineering, and which elements of the problem should future earthquake engineering research pursue? It examined and reported, in separate chapters of the report: Applications of Past Research; Assessment of Earthquake Hazard; Earthquake Ground Motion; Soil Mechanics and Earth Structures; Analytical and Experimental Structural Dynamics; Earthquake Design of Structures; Seismic Interaction of Structures and Fluids; Social and Economic Aspects; Earthquake Engineering Education; and Research in Japan.

  19. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software, the OpenQuake engine (http://globalquakemodel.org). In this communication we provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently ongoing initiatives such as the development of a suite of tools for preparing PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.

  20. Earthquake and tsunami hazard in West Sumatra: integrating science, outreach, and local stakeholder needs

    NASA Astrophysics Data System (ADS)

    McCaughey, J.; Lubis, A. M.; Huang, Z.; Yao, Y.; Hill, E. M.; Eriksson, S.; Sieh, K.

    2012-04-01

    The Earth Observatory of Singapore (EOS) is building partnerships with local to provincial government agencies, NGOs, and educators in West Sumatra to inform their policymaking, disaster-risk-reduction, and education efforts. Geodetic and paleoseismic studies show that an earthquake as large as M 8.8 is likely sometime in the coming decades on the Mentawai patch of the Sunda megathrust. This earthquake and its tsunami would be devastating for the Mentawai Islands and neighboring areas of the western Sumatra coast. The low-lying coastal Sumatran city of Padang (pop. ~800,000) has been the object of many research and outreach efforts, especially since 2004. Padang experienced deadly earthquakes in 2007 and 2009 that, though tragedies in their own right, served also as wake-up calls for a larger earthquake to come. However, there remain significant barriers to linking science to policy: extant hazard information is sometimes contradictory or confusing for non-scientists, while turnover of agency leadership and staff means that, in the words of one local advocate, "we keep having to start from zero." Both better hazard knowledge and major infrastructure changes are necessary for risk reduction in Padang. In contrast, the small, isolated villages on the outlying Mentawai Islands have received relatively fewer outreach efforts, yet many villages have the potential for timely evacuation with existing infrastructure. Therefore, knowledge alone can go far toward risk reduction. The tragic October 2010 Mentawai tsunami has inspired further disaster-risk reduction work by local stakeholders. In both locations, we are engaging policymakers and local NGOs, providing science to help inform their work. Through outreach contacts, the Mentawai government requested that we produce the first-ever tsunami hazard map for their islands; this aligns well with scientific interests at EOS. We will work with the Mentawai government on the presentation and explanation of the hazard map, as

  1. RISMUR II: New seismic hazard and risk study in Murcia Region after the Lorca Earthquake, 2011

    NASA Astrophysics Data System (ADS)

    Benito, Belen; Gaspar, Jorge; Rivas, Alicia; Quiros, Ligia; Ruiz, Sandra; Hernandez, Roman; Torres, Yolanda; Staller, Sandra

    2016-04-01

    The Murcia Region, located in the SE Iberian Peninsula, is one of the areas of highest seismic activity in Spain. A system of active faults crosses the region, where the most recent damaging earthquakes in the country took place: 1999, 2002, 2005 and 2011. The last one occurred in Lorca, causing 9 deaths and notable material losses, including damage to the artistic heritage. The seismic emergency plan of the Murcia Region was developed in 2006, based on the results of the risk project RISMUR I, which among other conclusions pointed out Lorca as one of the municipalities with the highest risk in the province. After the Lorca earthquake in 2011, a revision of the previous study was carried out through the project RISMUR II, including data from this earthquake as well as updated databases of seismicity, active faults, strong-motion records, cadastre, vulnerability, etc. In addition, the new study includes some methodological innovations: modelling of faults as independent units for hazard assessment, and analytical methods for risk estimation using data from the earthquake to calibrate capacity and fragility curves. In this work the results of RISMUR II are presented and compared with those of RISMUR I. The main conclusions are: an increase of the hazard along the SW-NE central fault system (Alhama de Murcia, Totana and Carrascoy), which implies higher expected damage in the towns nearest to these faults: Lorca, Totana, Alcantarilla and Murcia.

  2. Challenges in assessing seismic hazard in intraplate Europe

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Liu, Mian; Camelbeeck, Thierry; Merino, Miguel; Landgraf, Angela; Hintersberger, Esther; Kübler, Simon

    2016-04-01

    Intraplate seismicity is often characterized by episodic, clustered and migrating earthquakes and extended aftershock sequences. Can these observations - primarily from North America, China and Australia - usefully be applied to seismic hazard assessment for intraplate Europe? Existing assessments are based on instrumental and historical seismicity of the past c. 1000 years, as well as some data for active faults. This time span probably fails to capture typical large-event recurrence intervals of the order of tens of thousands of years. Palaeoseismology helps to lengthen the observation window, but preferentially produces data in regions suspected to be seismically active. Thus the expected maximum magnitudes of future earthquakes are fairly uncertain, possibly underestimated, and earthquakes are likely to occur in unexpected locations. These issues particularly arise in considering the hazards posed by low-probability events to both heavily populated areas and critical facilities. For example, are the variations in seismicity (and thus assumed seismic hazard) along the Rhine Graben a result of short sampling or are they real? In addition to a better assessment of hazards with new data and models, it is important to recognize and communicate uncertainties in hazard estimates. The more users know about how much confidence to place in hazard maps, the more effectively the maps can be used.

  3. Bayesian network learning for natural hazard assessments

    NASA Astrophysics Data System (ADS)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) procedures. In the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks to diverse natural hazard and vulnerability studies. The great potential of Bayesian networks has already been shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables, so that any conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven, be specified by experts, or be obtained by a combination of both. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insight into the workings of the system and allow learning about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
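
    As a toy illustration of the inference described above (e.g. the effect of precautionary measures on damage), the sketch below builds a three-node discrete Bayesian network and computes a conditional distribution by enumeration. All probability tables are invented; real applications would learn them from data or expert knowledge.

      # Toy discrete Bayesian network: Intensity -> Damage <- Precaution (invented CPTs)
      P_intensity  = {"low": 0.7, "high": 0.3}
      P_precaution = {"yes": 0.4, "no": 0.6}
      P_damage = {   # P(damage | intensity, precaution)
          ("low",  "yes"): {"minor": 0.95, "major": 0.05},
          ("low",  "no"):  {"minor": 0.85, "major": 0.15},
          ("high", "yes"): {"minor": 0.60, "major": 0.40},
          ("high", "no"):  {"minor": 0.30, "major": 0.70},
      }

      def joint(i, p, d):
          return P_intensity[i] * P_precaution[p] * P_damage[(i, p)][d]

      def p_damage_given_precaution(d, p):
          """Inference by enumeration: P(damage = d | precaution = p)."""
          num = sum(joint(i, p, d) for i in P_intensity)
          den = sum(joint(i, p, dd) for i in P_intensity for dd in ("minor", "major"))
          return num / den

      print("P(major damage | precaution)    =", round(p_damage_given_precaution("major", "yes"), 3))
      print("P(major damage | no precaution) =", round(p_damage_given_precaution("major", "no"), 3))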

  4. PAGER - Rapid Assessment of an Earthquake's Impact

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.

    2007-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that rapidly assesses the number of people and the regions exposed to severe shaking by an earthquake, and informs emergency responders, government agencies, and the media of the scope of the potential disaster. PAGER monitors the U.S. Geological Survey's near-real-time U.S. and global earthquake detections and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts.

  5. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    NASA Astrophysics Data System (ADS)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, had a very large feedback, also due to the media highlighting the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction vs. probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take a day off and leave town or stay in public parks), we contributed to reducing this feeling and therefore the social cost of this strange Roman day. Another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach Laboratory, the Graphics Laboratory and SissaMedialab. P.S. No large earthquake happened.

  6. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    SciTech Connect

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is performed. Tehran is the capital and most populated city of Iran and, from economic, political and social points of view, its most significant city. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city in terms of Arias intensity is useful. Iso-intensity contour maps of Tehran based on different attenuation relationships for different return periods are plotted. Maps of iso-intensity points in the Tehran region are presented using the corresponding attenuation relationships for rock and soil beds for two hazard levels of 10% and 2% probability of exceedance in 50 years. Seismicity parameters are calculated using two methods on the basis of historical and instrumental earthquakes covering a period that starts in the 4th century BC and ends at the present time. For the calculation of the seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used, and the SEISRISK III software has been employed. The effects of different parameters such as seismicity parameters, fault rupture length relationships and attenuation relationships are considered using a logic tree.
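
    For reference, Arias intensity is obtained from an acceleration time history as IA = pi/(2g) * integral of a(t)^2 dt. The sketch below evaluates it for an invented accelerogram; it is not tied to the attenuation relationships used in the Tehran study.

      import numpy as np

      def arias_intensity(acc, dt, g=9.81):
          """Arias intensity (m/s) from an acceleration time history acc (m/s^2)
          sampled at a constant step dt (s): IA = pi/(2g) * sum(a^2) * dt."""
          return np.pi / (2.0 * g) * np.sum(acc ** 2) * dt

      # Hypothetical accelerogram: a decaying sine pulse standing in for a record
      dt = 0.01
      t = np.arange(0.0, 20.0, dt)
      acc = 1.5 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)    # m/s^2
      print(f"Arias intensity: {arias_intensity(acc, dt):.3f} m/s")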

  7. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    USGS Publications Warehouse

    McNamara, Daniel E.; Yeck, William; Barnhart, William D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, Amod; Hough, S.E.; Benz, Harley M.; Earle, Paul

    2016-01-01

    The Gorkha earthquake on April 25th, 2015 was a long anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a ~150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10–15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high hazard for future damaging earthquakes.

  8. Sedimentary Basins: A Deeper Look at Seattle and Portland's Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Thompson, M.; Frankel, A. D.; Wirth, E. A.; Vidale, J. E.; Han, J.

    2015-12-01

    to assess the shaking hazards for Portland due to local earthquakes and great earthquakes on the CSZ.

  9. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    NASA Astrophysics Data System (ADS)

    Nyst, M.

    2012-12-01

    The recent Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch, New Zealand earthquakes and the following aftershock activity completely changed the existing view on earthquake hazard of the Christchurch area. Not only have several faults been added to the New Zealand fault database, but the main shocks were also followed by significant increases in seismicity due to high aftershock activity throughout the Christchurch region that is still ongoing. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to look at their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return period loss level, and then reserve the amount of money needed to account for that return-period loss level, their so-called capacity. This component of risk management is not too sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because the secondary uncertainties are taken into account when calculating the exceedance probability, even the longer return period losses can still experience significant impact from the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting the annual premiums. By annualizing the expected losses
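
    The AAL and loss-exceedance computations described above are straightforward once an event loss table is available; a minimal sketch with an invented five-event table follows.

      import numpy as np

      # Hypothetical event loss table: annual rate and expected loss for each event
      rates  = np.array([0.02, 0.01, 0.004, 0.001, 0.0004])       # events / year
      losses = np.array([5e6, 20e6, 80e6, 300e6, 900e6])          # loss per event

      # Average annual loss: sum of (expected loss x annual rate) over the event set
      aal = np.sum(rates * losses)

      # Loss exceedance: annual rate (and Poissonian probability) of exceeding each loss level
      order = np.argsort(losses)[::-1]
      exceed_rate = np.cumsum(rates[order])          # rate of events with loss >= threshold
      exceed_prob = 1.0 - np.exp(-exceed_rate)       # annual exceedance probability

      print(f"AAL = {aal / 1e6:.2f} M")
      for L, r, p in zip(losses[order], exceed_rate, exceed_prob):
          print(f"loss >= {L / 1e6:6.0f} M: rate {r:.4f}/yr, return period {1 / r:7.0f} yr, annual prob {p:.4f}")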

  10. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analysis, however, showed that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, the NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full-waveform modelling, based on the possibility of efficiently computing synthetic seismograms in complex laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined at both the national and the local scale, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily

  11. Monogenetic volcanic hazards and assessment

    NASA Astrophysics Data System (ADS)

    Connor, C.; Connor, L. J.; Richardson, J. A.

    2012-12-01

    Many of the Earth's major cities are built on the products of monogenetic volcanic eruptions and within geologically active basaltic volcanic fields. These cities include Mexico City (Mexico), Auckland (New Zealand), Melbourne (Australia), and Portland (USA), to name a few. Volcanic hazards in these areas are complex, and involve the potential formation of new volcanic vents and associated hazards, such as lava flows, tephra fallout, and ballistic hazards. Hazard assessment is complicated by the low recurrence rate of volcanism in most volcanic fields. We have developed a two-stage process for probabilistic modeling of monogenetic volcanic hazards. The first step is an estimation of the possible locations of future eruptive vents based on kernel density estimation and the recurrence rate of volcanism, using Monte Carlo simulation and accounting for uncertainties in age determinations. The second step is convolution of this spatial density / recurrence rate model with hazard codes for modeling lava inundation, tephra fallout, and ballistic impacts. A methodology is presented using this two-stage approach to estimate lava flow hazard in several monogenetic volcanic fields, including at a nuclear power plant site near the Shamiram Plateau, a Quaternary volcanic field in Armenia. The location of possible future vents is determined by estimating spatial density from a distribution of 18 mapped vents using a 2-D elliptical Gaussian kernel function. The SAMSE method, a modified asymptotic mean squared error approach, uses the distribution of known eruptive vents to optimally determine a smoothing bandwidth for the Gaussian kernel function. The result is a probability map of vent density. A large random sample (N=10000) of vent locations is drawn from this probability map. For each randomly sampled vent location, a lava flow inundation model is executed. Lava flow input parameters (volume and average thickness) are determined from distributions fit to field observations of the low
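
    A stripped-down version of the first stage (spatial density of future vents from mapped vents with an elliptical Gaussian kernel) is sketched below. The vent coordinates and the bandwidth matrix are invented, and the SAMSE bandwidth selection used in the study is not implemented; a fixed bandwidth stands in for it.

      import numpy as np

      def vent_density(vents, grid_x, grid_y, bandwidth):
          """Spatial density (per km^2) from mapped vent locations using a 2-D elliptical
          Gaussian kernel with a fixed bandwidth matrix H (km^2)."""
          H_inv = np.linalg.inv(bandwidth)
          norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(bandwidth)) * len(vents))
          gx, gy = np.meshgrid(grid_x, grid_y)
          density = np.zeros_like(gx)
          for vx, vy in vents:
              dx, dy = gx - vx, gy - vy
              # squared Mahalanobis distance under the bandwidth matrix
              m2 = H_inv[0, 0] * dx**2 + 2 * H_inv[0, 1] * dx * dy + H_inv[1, 1] * dy**2
              density += norm * np.exp(-0.5 * m2)
          return density

      # Hypothetical vent coordinates (km) and an assumed anisotropic bandwidth matrix
      vents = [(2.0, 1.0), (2.5, 1.8), (3.1, 2.4), (4.0, 3.5), (1.2, 0.4)]
      H = np.array([[1.2, 0.5],
                    [0.5, 0.8]])
      grid = np.linspace(-2.0, 8.0, 101)
      dens = vent_density(vents, grid, grid, H)
      print("total probability on grid ~", dens.sum() * (grid[1] - grid[0])**2)   # ~1 if the grid covers the field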

  12. Magnetohydrodynamics and its hazard assessment

    NASA Astrophysics Data System (ADS)

    Chan, W.-T.

    1981-11-01

    Potential occupational and environmental hazards of a typical combined open-cycle MHD/steam cycle power plant are critically assessed on the basis of direct and indirect research information. Among the potential occupational hazards, explosion at the coal feed system or at the superconducting magnet, combustor rupture in a confined pit, high-intensity dc magnetic field exposure at the channel, and leakage of combustion products from the pressurized systems are of primary concern. While environmental emissions of SO(x), NO(x) and fine particulates are considered to be under control at the experimental scale, control effectiveness at high-capacity operation remains uncertain. Gaseous emission of some highly toxic trace elements, including radioactive species, may be of concern without a gas-cleaning device in the MHD design.

  13. Hazards assessment for the Hazardous Waste Storage Facility

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency.

  14. NGNP SITE 2 HAZARDS ASSESSMENT

    SciTech Connect

    Wayne Moe

    2011-10-01

    The Next Generation Nuclear Plant (NGNP) Project, initiated at Idaho National Laboratory (INL) by the U.S. Department of Energy pursuant to the 2005 Energy Policy Act, is based on research and development activities supported by the Generation IV Nuclear Energy Systems Initiative. The principal objective of the NGNP Project is to support commercialization of the high temperature gas-cooled reactor (HTGR) technology. The HTGR is a helium-cooled and graphite-moderated reactor that can operate at temperatures much higher than those of conventional light water reactor (LWR) technologies. Accordingly, it can be applied in many industrial applications as a substitute for burning fossil fuels, such as natural gas, to generate process heat in addition to producing electricity, which is the principal application of current LWRs. Nuclear energy in the form of LWRs has been used in the U.S. and internationally principally for the generation of electricity. However, because the HTGR operates at higher temperatures than LWRs, it can be used to displace the use of fossil fuels in many industrial applications. It also provides a carbon emission-free energy supply. For example, the energy needs for the recovery and refining of petroleum, for the petrochemical industry, and for the production of transportation fuels and feedstocks using coal conversion processes require process heat at temperatures approaching 800°C. This temperature range is readily achieved by HTGR technology. This report summarizes a site assessment authorized by INL under the NGNP Project to determine hazards and potential challenges that site owners and HTGR designers need to be aware of when developing the HTGR design for co-location at industrial facilities, and to evaluate the sites for suitability considering certain site characteristics. The objectives of the NGNP site hazard assessments are to perform an initial screening of representative sites in order to identify potential challenges and restraints

  15. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented become too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at

  16. Earthquake stress triggers, stress shadows, and seismic hazard

    USGS Publications Warehouse

    Harris, R.A.

    2000-01-01

    Many aspects of earthquake mechanics remain an enigma at the beginning of the twenty-first century. One potential bright spot is the realization that simple calculations of stress changes may explain some earthquake interactions, just as previous and ongoing studies of stress changes have begun to explain human-induced seismicity. This paper, which is an update of Harris1, reviews many published works and presents a compilation of quantitative earthquake-interaction studies from a stress change perspective. This synthesis supplies some clues about certain aspects of earthquake mechanics. It also demonstrates that much work remains to be done before we have a complete story of how earthquakes work.
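
    The stress-change calculations reviewed here typically evaluate the change in Coulomb failure stress on a receiver fault, ΔCFS = Δτ + μ′Δσn (shear stress change plus effective friction times normal stress change, with unclamping taken as positive). A minimal sketch of that bookkeeping, with purely illustrative stress values, is given below.

      # Sketch: change in Coulomb failure stress (CFS) on a receiver fault.
      # Convention: positive delta_tau promotes slip in the rake direction,
      # positive delta_sigma_n means unclamping. Values are illustrative only.

      def coulomb_stress_change(delta_tau_mpa, delta_sigma_n_mpa, mu_effective=0.4):
          """Return delta CFS in MPa: delta_tau + mu' * delta_sigma_n."""
          return delta_tau_mpa + mu_effective * delta_sigma_n_mpa

      # Example receiver faults with assumed resolved stress changes (MPa)
      receivers = {
          "segment A": (0.15, -0.05),   # loaded in shear, slightly clamped
          "segment B": (-0.10, 0.20),   # shear unloaded, but unclamped
          "segment C": (-0.05, -0.10),  # stress shadow candidate
      }
      for name, (dtau, dsig) in receivers.items():
          dcfs = coulomb_stress_change(dtau, dsig)
          state = "brought closer to failure" if dcfs > 0 else "relaxed (stress shadow)"
          print(f"{name}: dCFS = {dcfs:+.3f} MPa -> {state}")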

  17. Identification of Potential Hazard using Hazard Identification and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Sari, R. M.; Syahputri, K.; Rizkya, I.; Siregar, I.

    2017-03-01

    This research was conducted in a paper manufacturing company whose products are used as cigarette paper. In the production process, the company provides machines and equipment that are operated by workers, and during operations all workers are potentially exposed to injury; such exposures are known as potential hazards. Hazard identification and risk assessment is one part of a safety and health program within the risk management stage, and it is essential for preventing occupational injuries and diseases resulting from work. The problem addressed by this research is that the potential hazards and risks faced by workers during the production process had not been identified. The purpose of this study was therefore to identify potential hazards using hazard identification and risk assessment methods. Risk assessment was carried out using severity criteria and the probability of an accident. The study identified 23 potential hazards with varying severity and probability. A Risk Assessment Code (RAC) was then determined for each potential hazard, yielding 3 extreme risks, 10 high risks, 6 medium risks, and 3 low risks. We have thus successfully identified the potential hazards using the RAC.
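
    The Risk Assessment Code described above is, in essence, a lookup on a severity-by-probability matrix. The sketch below shows one plausible encoding; the category labels, hazards, and the matrix itself are assumptions for illustration, not the matrix used in the study.

      # Sketch: assigning a Risk Assessment Code (RAC) from severity and
      # probability categories. The matrix below is an illustrative example.

      SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]      # index 0..3
      PROBABILITY = ["rare", "unlikely", "possible", "likely", "frequent"]   # index 0..4

      # rows = severity, columns = probability
      RAC_MATRIX = [
          ["low",    "low",    "low",     "medium",  "medium"],
          ["low",    "medium", "medium",  "high",    "high"],
          ["medium", "medium", "high",    "high",    "extreme"],
          ["medium", "high",   "high",    "extreme", "extreme"],
      ]

      def rac(severity, probability):
          return RAC_MATRIX[SEVERITY.index(severity)][PROBABILITY.index(probability)]

      hazards = [("rotating blade unguarded", "critical", "possible"),
                 ("paper dust inhalation", "marginal", "frequent")]
      for name, sev, prob in hazards:
          print(f"{name}: RAC = {rac(sev, prob)}")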

  18. Earthquake hazards to domestic water distribution systems in Salt Lake County, Utah

    USGS Publications Warehouse

    Highland, Lynn M.

    1985-01-01

    A magnitude-7.5 earthquake occurring along the central portion of the Wasatch Fault, Utah, may cause significant damage to Salt Lake County's domestic water system. This system is composed of water treatment plants, aqueducts, distribution mains, and other facilities that are vulnerable to ground shaking, liquefaction, fault movement, and slope failures. Recent investigations into surface faulting, landslide potential, and earthquake intensity provide basic data for evaluating the potential earthquake hazards to water-distribution systems in the event of a large earthquake. Water supply system components may be vulnerable to one or more earthquake-related effects, depending on site geology and topography. Case studies of water-system damage by recent large earthquakes in Utah and in other regions of the United States offer valuable insights into evaluating water system vulnerability to earthquakes.

  19. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but these have been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional, and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java, and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
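
    The Monte Carlo PTHA approach described above amounts to building a long synthetic catalogue of tsunamigenic earthquakes, attaching a modelled coastal wave height to each event, and counting exceedances. The sketch below illustrates only that counting step, with an invented source list and rates; it is not the authors' source model.

      # Sketch: annual exceedance rates of tsunami height at a coastal site
      # from a synthetic event catalogue (Monte Carlo over a long duration).
      # Source rates and wave heights below are invented for illustration.
      import math
      import random

      random.seed(0)
      YEARS = 100_000  # length of synthetic catalogue

      # Hypothetical sources: (annual rate, median coastal height in m, lognormal sigma)
      sources = [(0.01, 2.0, 0.6), (0.002, 5.0, 0.6), (0.05, 0.4, 0.6)]

      heights = []
      for rate, median_h, sigma in sources:
          n_events = sum(1 for _ in range(YEARS) if random.random() < rate)  # approx. Poisson
          heights.extend(random.lognormvariate(math.log(median_h), sigma) for _ in range(n_events))

      for threshold in (0.5, 3.0):
          annual_rate = sum(h >= threshold for h in heights) / YEARS
          p100 = 1.0 - math.exp(-annual_rate * 100)  # probability in a 100-year window
          print(f">= {threshold} m: rate {annual_rate:.4f}/yr, P(100 yr) = {p100:.2%}")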

  20. Seismic hazard assessment in Central Asia using smoothed seismicity approaches

    NASA Astrophysics Data System (ADS)

    Ullah, Shahid; Bindi, Dino; Zuccolo, Elisa; Mikhailova, Natalia; Danciu, Laurentiu; Parolai, Stefano

    2014-05-01

    Central Asia has a long history of frequent moderate-to-large seismicity and is therefore considered one of the most seismically active, high-hazard regions in the world. In the global-scale hazard map produced by the GSHAP project in 1999 (Giardini, 1999), Central Asia is characterized by peak ground accelerations for a return period of 475 years as high as 4.8 m/s2. Central Asia was therefore selected as a target area for the EMCA project (Earthquake Model Central Asia), a regional project of GEM (Global Earthquake Model). In the framework of EMCA, a new generation of seismic hazard maps is foreseen in terms of macroseismic intensity, in turn to be used to obtain seismic risk maps for the region. An Intensity Prediction Equation (IPE) has therefore been developed for the region, based on the intensity distributions of earthquakes that occurred in Central Asia since the end of the 19th century (Bindi et al. 2011). The same observed intensity distributions have been used to assess the seismic hazard following a site approach (Bindi et al. 2012). In this study, we present the probabilistic seismic hazard assessment of Central Asia in terms of MSK-64 intensity based on two kernel estimation methods. We consider the smoothed seismicity approaches of Frankel (1995), modified to use the adaptive kernel proposed by Stock and Smith (2002), and of Woo (1996), modified to consider a grid of sites and to estimate a separate bandwidth for each site. Activity rate maps from the Frankel approach are shown, illustrating the effects of fixed and adaptive kernels. The hazard is estimated for rock site conditions at 10% probability of exceedance in 50 years. A maximum intensity of about 9 is observed in the Hindukush region.
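
    The fixed-kernel smoothed-seismicity approach of Frankel (1995) spreads the observed event count of each grid cell over its neighbours with a Gaussian kernel of correlation distance c. A minimal one-dimensional sketch of that smoothing is shown below with made-up counts; the adaptive-kernel variant would simply let c vary from cell to cell.

      # Sketch: Frankel-style Gaussian smoothing of gridded earthquake counts.
      # n_smooth_i = sum_j n_j * exp(-d_ij^2 / c^2) / sum_j exp(-d_ij^2 / c^2)
      # Grid spacing, counts, and correlation distance are illustrative.
      import math

      counts = [0, 2, 7, 1, 0, 0, 3, 0]   # events per cell (hypothetical)
      dx_km = 10.0                        # cell spacing
      c_km = 15.0                         # fixed correlation distance

      def smoothed(counts, dx, c):
          out = []
          for i in range(len(counts)):
              num = den = 0.0
              for j, n in enumerate(counts):
                  w = math.exp(-((i - j) * dx) ** 2 / c ** 2)
                  num += n * w
                  den += w
              out.append(num / den)
          return out

      print([round(v, 2) for v in smoothed(counts, dx_km, c_km)])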

  1. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  2. Challenges in Assessing Seismic Hazard in Intraplate Europe

    NASA Astrophysics Data System (ADS)

    Hintersberger, E.; Kuebler, S.; Landgraf, A.; Stein, S. A.

    2014-12-01

    Intraplate regions are often characterized by scattered, clustered, and migrating seismicity and by the occurrence of low-strain areas next to high-strain ones. Increasing evidence for large paleoearthquakes in such regions, together with population growth and the development of critical facilities, calls for better assessments of earthquake hazards. Existing seismic hazard assessment for intraplate Europe is based on instrumental and historical seismicity of the past 1000 years, as well as some active fault data. These observations face important limitations due to the quantity and quality of the available databases. Even considering the long record of historical events in some populated areas of Europe, this time span of a thousand years likely fails to capture some faults' typical large-event recurrence intervals, which are on the order of tens of thousands of years. Paleoseismology helps lengthen the observation window, but only produces point measurements, and preferentially in regions suspected to be seismically active. As a result, the expected maximum magnitudes of future earthquakes are quite uncertain, likely to be underestimated, and earthquakes are likely to occur in unexpected locations. These issues arise in particular in the heavily populated Rhine Graben and Vienna Basin areas, and in considering the hazard posed by low-probability events to critical facilities like nuclear power plants.

  3. Multiple-site estimations in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Sokolov, Vladimir; Ismail-Zadeh, Alik

    2016-04-01

    We analyze specific features of multiple-site probabilistic seismic hazard assessment (PSHA), i.e. the annual rate at which a ground motion level is exceeded in at least one of several sites of interest located within an area or along an extended linear object. The relation between multiple-site hazard estimations and the strong ground-motion records obtained during the 2008 Wenchuan (China) Mw 7.9 earthquake is discussed. These ground-motion records may be considered an example of ground motion exceeding the design level estimated using classical point-wise PSHA. We show that the multiple-site hazard (MSH) assessment, when performed for the standard return period of 475 years, provides reasonable estimates of the ground motions that may occur during an earthquake whose parameters are close to the maximum possible events accepted in PSHA for the region. Thus the MSH may be useful in estimating maximum considered earthquake ground motion for a given territory, taking its extent into account.
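
    The distinction drawn here between point-wise and multiple-site hazard can be illustrated with a small simulation: for a common annual exceedance rate at each site, the rate of exceedance at "at least one" of N sites is larger, and the gap depends on how correlated the ground motions are. The sketch below uses invented rates and a crude common-event correlation, purely to show the bookkeeping.

      # Sketch: single-site vs multiple-site exceedance probability per year.
      # Assumes each of N sites has the same annual exceedance probability p1;
      # correlation is mimicked by a shared "event" term. Numbers are illustrative.
      import random

      random.seed(1)
      N_SITES, YEARS, p1 = 20, 100_000, 1.0 / 475.0

      hits_single = hits_any_indep = hits_any_corr = 0
      for _ in range(YEARS):
          # fully independent sites
          exceed = [random.random() < p1 for _ in range(N_SITES)]
          hits_single += exceed[0]
          hits_any_indep += any(exceed)
          # crude correlation: half of p1 comes from a region-wide event
          common = random.random() < p1 / 2
          exceed_corr = [common or (random.random() < p1 / 2) for _ in range(N_SITES)]
          hits_any_corr += any(exceed_corr)

      print(f"single site       : {hits_single / YEARS:.5f} per yr (target {p1:.5f})")
      print(f"any of {N_SITES} (indep.): {hits_any_indep / YEARS:.5f} per yr")
      print(f"any of {N_SITES} (corr.) : {hits_any_corr / YEARS:.5f} per yr")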

  4. Probabilistic Seismic Hazard assessment for Sultanate of Oman

    NASA Astrophysics Data System (ADS)

    El Hussain, I. W.; Deif, A.; El-Hady, S.; Toksoz, M. N.; Al-Jabri, K.; Al-Hashmi, S.; Al-Toubi, K. I.; Al-Shijbi, Y.; Al-Saifi, M.

    2010-12-01

    Seismic hazard assessment for Oman is conducted using a probabilistic approach. The Probabilistic Seismic Hazard Assessment (PSHA) has been performed within a logic tree framework. An earthquake catalogue for Oman was compiled and declustered to include only independent earthquakes. The declustered catalogue was used to define a seismotectonic source model with 26 source zones that characterize earthquakes in the tectonic environments in and around Oman. The recurrence parameters for all seismogenic zones are determined using the doubly bounded exponential distribution, except for the zones of the Makran subduction zone, which were modeled using the characteristic distribution. Maximum earthquakes on known faults were determined geologically; for the remaining zones they were determined statistically from the compiled catalogue. Horizontal ground accelerations in terms of the geometric mean were calculated using ground-motion prediction relationships developed from seismic data obtained from shallow active environments, stable craton environments, and subduction earthquakes. In this analysis, we used alternative seismotectonic source models, maximum magnitudes, and attenuation models and weighted them to account for epistemic uncertainty. The application of this methodology leads to 5% damped seismic hazard maps at rock sites for 72, 475, and 2475 year return periods for spectral accelerations at periods of 0.0 (corresponding to peak ground acceleration), 0.1, 0.2, 0.3, 1.0 and 2.0 sec. Mean and 84th percentile acceleration contour maps were produced. The results were also displayed as uniform hazard spectra for rock sites in the cities of Khasab, Diba, Sohar, Muscat, Nizwa, Sur, and Salalah in Oman and the cities of Abu Dhabi and Dubai in the UAE. The PGA across Oman ranges from 20 cm/sec2 in the Mid-West to 115 cm/sec2 in the northern part for a 475 year return period, and between 40 cm/sec2 and 180 cm/sec2 for 2475 years
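
    The doubly bounded (truncated) exponential recurrence model referred to above gives, for each zone, a magnitude density f(m) = beta*exp(-beta*(m - m_min)) / (1 - exp(-beta*(m_max - m_min))) between m_min and m_max, with beta = b*ln(10). The short sketch below evaluates that density and the corresponding conditional exceedance probability for illustrative b, m_min, and m_max values; the parameters are not those of the Oman model.

      # Sketch: doubly bounded (truncated) exponential magnitude distribution.
      # f(m) = beta * exp(-beta*(m - m_min)) / (1 - exp(-beta*(m_max - m_min)))
      # Parameters below are illustrative, not the values used for Oman.
      import math

      b_value, m_min, m_max = 1.0, 4.0, 7.5
      beta = b_value * math.log(10.0)

      def pdf(m):
          if not (m_min <= m <= m_max):
              return 0.0
          return beta * math.exp(-beta * (m - m_min)) / (1.0 - math.exp(-beta * (m_max - m_min)))

      def prob_exceed(m):
          """P(M >= m | an event occurred in this zone)."""
          if m <= m_min:
              return 1.0
          if m >= m_max:
              return 0.0
          num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
          den = 1.0 - math.exp(-beta * (m_max - m_min))
          return num / den

      for m in (5.0, 6.0, 7.0):
          print(f"M{m}: f(m) = {pdf(m):.3f}, P(M >= m) = {prob_exceed(m):.4f}")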

  5. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

    Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice, Italy, 8 November 2006. The term ``hazard'' can lead to some misunderstanding. In English, hazard has the generic meaning ``potential source of danger,'' but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative way, namely ``the probability of a certain hazardous event in a specific time-space window.'' However, many volcanologists still use ``hazard'' and ``volcanic hazard'' in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled ``Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes'' (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is ``probabilistic volcanic hazard assessment'' (PVHA).

  6. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  7. Damage-consistent hazard assessment - the revival of intensities

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2016-04-01

    Proposed keynote speech (introduction of session). Current civil engineering standards for residential buildings in many countries are based on (frequently probabilistic) seismic hazard assessments using ground motion parameters such as peak ground acceleration or pseudo-displacement as hazard parameters. This approach has its roots in the still widespread force-based design of structures, which uses simplified methods like linear response spectra in combination with equivalent static force procedures. In engineering practice this has led to practical problems, because it is not economical to design structures against the maximum forces of earthquakes, and a completely linear-elastic response of structures is seldom required. Different types of reduction factors (performance-dependent response factors), accounting for example for overstrength, structural redundancy, and structural ductility, have been developed in different countries to compensate for the use of simplified and conservative design methods. The practical consequence is that the methods used in engineering, as well as the output of hazard assessment studies, are poorly related to the physics of damage. Reliable predictions of the response of structures under earthquake loading are not feasible with such simplified design methods. Depending on the type of structure, damage may be controlled by hazard parameters other than ground motion acceleration. Furthermore, a realistic risk assessment has to be based on reliable predictions of damage, which is crucial for effective decision-making. This opens the way for a return to the use of intensities as the key output parameter of seismic hazard assessment. Site intensities (e.g. EMS-98) are very well correlated with the damage of structures. They can easily be converted into the required set of engineering parameters or even directly into earthquake time histories suitable for structural analysis
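
    The force-based design practice criticized here reduces the elastic spectral demand by a performance-dependent response (behavior) factor before computing equivalent static forces. A minimal sketch of that reduction, with an assumed spectral acceleration, behavior factor, and building weight, is given below; it is meant only to make the role of the reduction factor explicit.

      # Sketch: equivalent static base shear in force-based design.
      # V = (Sa_elastic / q) * W, where q lumps overstrength, redundancy and ductility.
      # All numbers are illustrative.

      sa_elastic_g = 0.60   # elastic spectral acceleration at the building period (g)
      q_factor = 4.0        # performance-dependent response (behavior) factor
      weight_kn = 12_000.0  # seismic weight of the building (kN)

      sa_design_g = sa_elastic_g / q_factor
      base_shear_kn = sa_design_g * weight_kn
      print(f"design spectral acceleration: {sa_design_g:.2f} g")
      print(f"equivalent static base shear: {base_shear_kn:.0f} kN")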

  8. Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)

    SciTech Connect

    Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.

    2010-09-24

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that existing non-seismic NPH assessments are consistent with current design methodology and site-specific data.

  9. Microzonation of Seismic Hazards and Estimation of Human Fatality for Scenario Earthquakes in Chianan Area, Taiwan

    NASA Astrophysics Data System (ADS)

    Liu, K. S.; Chiang, C. L.; Ho, T. T.; Tsai, Y. B.

    2015-12-01

    In this study, we assess seismic hazards in the 57 administrative districts of the Chianan area, Taiwan, in the form of ShakeMaps, and estimate potential human fatalities from scenario earthquakes on the three Type I active faults in this area. Two regions show high MMI intensity, greater than IX, in the map of maximum ground motion. One is in the Chiayi area around Minsyong, Dalin and Meishan, due to the presence of the Meishan fault and large site amplification factors, which reach as high as 2.38 and 2.09 for PGA and PGV, respectively, in Minsyong. The other is in the Tainan area around Jiali, Madou, Siaying, Syuejia, Jiangjyun and Yanshuei, due to a disastrous earthquake of magnitude Mw 6.83 that occurred near the border between Jiali and Madou in 1862 and large site amplification factors, which reach as high as 2.89 and 2.97 for PGA and PGV, respectively, in Madou. In addition, the probabilities over 10-, 30-, and 50-year periods of seismic intensity exceeding MMI VIII in the above areas are greater than 45%, 80% and 95%, respectively. Moreover, the probability distributions show high values, greater than 95% over a 10-year period, for seismic intensity corresponding to CWBI V and MMI VI in central and northern Chiayi and northern Tainan. Finally, the estimates of human fatalities for scenario earthquakes on the three active faults in the Chianan area show that the number of fatalities increases rapidly for people above age 45. Compared to the 1946 Hsinhua earthquake, the number of fatalities estimated for the scenario earthquake on the Hsinhua active fault is significantly higher; this higher number is nevertheless reasonable when the probable causes are considered. Hence, we urge local and central governments to pay special attention to seismic hazard mitigation in this highly urbanized area with a large number of old buildings.
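
    The 10-, 30-, and 50-year exceedance probabilities quoted above follow from an annual exceedance rate under the usual Poisson assumption, P(T) = 1 - exp(-lambda*T). The sketch below applies that conversion to an assumed annual rate, chosen only to reproduce numbers of the same order as those in the abstract.

      # Sketch: converting an annual exceedance rate to probabilities over
      # 10-, 30- and 50-year exposure windows (Poisson assumption).
      # The annual rate is an assumed, illustrative value.
      import math

      annual_rate = 0.06  # assumed annual rate of exceeding the intensity of interest

      for t_years in (10, 30, 50):
          p = 1.0 - math.exp(-annual_rate * t_years)
          print(f"P(exceedance in {t_years:>2} yr) = {p:.0%}")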

  10. Assessing volcanic hazards with Vhub

    NASA Astrophysics Data System (ADS)

    Palma, J. L.; Charbonnier, S.; Courtland, L.; Valentine, G.; Connor, C.; Connor, L.

    2012-04-01

    Vhub (online at vhub.org) is a virtual organization and community cyberinfrastructure designed for collaboration in volcanology research, education, and outreach. One of the core objectives of this project is to accelerate the transfer of research tools to organizations and stakeholders charged with volcano hazard and risk mitigation (such as volcano observatories). Vhub offers a clearinghouse for computational models of volcanic processes and data analysis, documentation of those models, and capabilities for online collaborative groups focused on issues such as code development, configuration management, benchmarking, and validation. Vhub supports computer simulations and numerical modeling at two levels: (1) some models can be executed online via Vhub, without the need to download code and compile it on the user's local machine; (2) other models are not available for online execution but can be downloaded for offline use on the user's computer. Vhub also has wikis, blogs, and group functions around specific topics to encourage collaboration, communication, and discussion. Some of the simulation tools currently available to Vhub users are: Energy Cone (rapid delineation of the impact zone of pyroclastic density currents), Tephra2 (tephra dispersion forecast tool), Bent (atmospheric plume analysis), Hazmap (simulation of sedimentation of volcanic particles), and TITAN2D (mass flow simulation tool). The list of online simulations available on Vhub is expected to expand considerably as the volcanological community becomes more involved in the project. This presentation focuses on the implementation of online simulation tools, and other Vhub features, for assessing volcanic hazards following approaches similar to those reported in the literature. Attention is drawn to the minimum computational resources needed by the user to carry out such analyses, and to the tools and media provided to facilitate the effective use of Vhub's infrastructure for hazard and risk assessment. Currently the project

  11. Assessment and Prediction of Natural Hazards from Satellite Imagery.

    PubMed

    Gillespie, Thomas W; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan

    2007-10-01

    Since 2000, there have been a number of spaceborne satellites that have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth's surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate resolution active sensors that can quantify the vertical and horizontal movement of the earth's surface. High-resolution passive sensors have been used to successfully assess flood damage while predictive maps of flood vulnerability areas are possible based on physical variables collected from passive and active sensors. Recent moderate resolution sensors are able to provide near real time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and constellations of satellites will be launched in the next five years that contain increased resolution (0.5 m to 1 m pixel resolution for active sensors) and revisit times (daily ≤ 2.5 m resolution images from passive sensors) that will significantly improve our ability to assess and predict natural hazards from space.

  12. Statistical analysis of time-dependent earthquake occurrence and its impact on hazard in the low seismicity region Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Hainzl, Sebastian; Scherbaum, Frank; Beauval, Céline

    2007-11-01

    The time-dependence of earthquake occurrence is mostly ignored in standard seismic hazard assessment even though earthquake clustering is well known. In this work, we attempt to quantify the impact of more realistic dynamics on the seismic hazard estimations. We include the time and space dependences between earthquakes into the hazard analysis via Monte Carlo simulations. Our target region is the Lower Rhine Embayment, a low seismicity area in Germany. Including aftershock sequences by using the epidemic type aftershock-sequence (ETAS) model, we find that on average the hypothesis of uncorrelated random earthquake activity underestimates the hazard by 5-10 per cent. Furthermore, we show that aftershock activity of past large earthquakes can locally increase the hazard even centuries later. We also analyse the impact of the so-called long-term behaviour, assuming a quasi-periodic occurrence of main events on a major fault in that region. We found that a significant impact on hazard is only expected for the special case of a very regular recurrence of the main shocks.
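
    The epidemic-type aftershock-sequence (ETAS) model used here superimposes, on a background rate, aftershock rates that decay with time according to the modified Omori law and scale exponentially with parent magnitude. The sketch below evaluates such a conditional intensity for a toy catalogue; the parameter values are generic textbook-style choices, not those calibrated for the Lower Rhine Embayment.

      # Sketch: ETAS-style conditional intensity
      # lambda(t) = mu + sum_{t_i < t} K * 10**(alpha*(m_i - m0)) / (t - t_i + c)**p
      # Parameters and the toy catalogue below are illustrative only.

      mu = 0.02          # background rate (events/day)
      K, alpha, c, p = 0.02, 0.8, 0.01, 1.1
      m0 = 3.0           # reference (minimum) magnitude

      catalogue = [(0.0, 5.4), (2.5, 4.1), (10.0, 4.6)]  # (time in days, magnitude)

      def etas_rate(t, events):
          rate = mu
          for t_i, m_i in events:
              if t_i < t:
                  rate += K * 10 ** (alpha * (m_i - m0)) / (t - t_i + c) ** p
          return rate

      for t in (0.1, 1.0, 5.0, 30.0):
          print(f"t = {t:5.1f} d: lambda = {etas_rate(t, catalogue):.3f} events/day")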

  13. Hazards assessment for the INEL Landfill Complex

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-02-01

    This report documents the hazards assessment for the INEL Landfill Complex (LC) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and the DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes the hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding the LC, the buildings and structures at the LC, and the processes that are used at the LC are described in this report. All hazardous materials, both radiological and nonradiological, at the LC were identified and screened against threshold quantities according to DOE Order 5500.3A guidance. Asbestos at the Asbestos Pit was the only hazardous material that exceeded its specified threshold quantity. However, the type of asbestos received and the packaging practices used are believed to limit the potential for an airborne release of asbestos fibers. Therefore, in accordance with DOE Order 5500.3A guidance, no further hazardous material characterization or analysis was required for this hazards assessment.

  14. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and attenuation relationships with a log-normal distribution of PGA or response spectrum. The main positive aspect of this approach is that it is presently a standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors that could significantly influence the expected motion at a site, such as site effects and source characteristics like strong-motion duration and directivity, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their predictive reliability depends on the degree of knowledge of the source, wave propagation, and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes less than 6. We focus on sites that are liquefiable because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions: the looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions
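
    One ingredient of the probabilistic approach described above is the log-normal scatter of the attenuation (ground-motion prediction) relationship: given a median prediction and a logarithmic standard deviation, the probability of exceeding a target PGA follows from the normal CDF of the log residual. The sketch below shows that single step with invented median and sigma values; it is not a full hazard integration.

      # Sketch: probability of exceeding a target PGA for one scenario,
      # assuming a log-normally distributed ground-motion prediction.
      # Median and sigma below are invented for illustration.
      import math

      def prob_exceed(target_g, median_g, sigma_ln):
          """P(PGA > target) when ln(PGA) ~ Normal(ln(median), sigma_ln)."""
          z = (math.log(target_g) - math.log(median_g)) / sigma_ln
          return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

      median_pga_g = 0.12   # assumed median PGA from an attenuation relation (g)
      sigma_ln = 0.6        # assumed logarithmic standard deviation

      for target in (0.1, 0.2, 0.3):
          print(f"P(PGA > {target:.1f} g) = {prob_exceed(target, median_pga_g, sigma_ln):.2f}")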

  15. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    USGS Publications Warehouse

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. This progress report summarizes the

  16. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  17. Probabilistic seismic hazard assessment for the two layer fault system of Antalya (SW Turkey) area

    NASA Astrophysics Data System (ADS)

    Dipova, Nihat; Cangir, Bülent

    2017-03-01

    Southwest Turkey, along the Mediterranean coast, is prone to large earthquakes resulting from subduction of the African plate under the Eurasian plate and from shallow crustal faults. The maximum observed magnitude of subduction earthquakes is Mw = 6.5, whereas that of crustal earthquakes is Mw = 6.6. Crustal earthquakes are sourced from faults associated with the Isparta Angle and Cyprus Arc tectonic structures. The primary goal of this study is to assess seismic hazard for the Antalya area (SW Turkey) using a probabilistic approach. A new earthquake catalog for the Antalya area, with a unified moment magnitude scale, was prepared in the scope of the study. Seismicity of the area has been evaluated using the Gutenberg-Richter recurrence relationship. For the hazard computation, the CRISIS2007 software was used following the standard Cornell-McGuire methodology. The attenuation model of Youngs et al. (Seismol Res Lett 68(1):58-73, 1997) was used for deep subduction earthquakes and the Chiou and Youngs (Earthq Spectra 24(1):173-215, 2008) model for shallow crustal earthquakes. A seismic hazard map was developed for peak ground acceleration on rock ground at a hazard level of 10% probability of exceedance in 50 years. Results of the study show that peak ground acceleration values on bedrock range between 0.215 and 0.23 g in the center of Antalya.
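
    The Gutenberg-Richter evaluation mentioned above involves estimating the a- and b-values of log10 N(>=M) = a - b*M from the catalogue. A common closed-form estimator for b is the Aki maximum-likelihood form with a binning correction, b = log10(e) / (mean(M) - (Mc - dM/2)), where Mc is the completeness magnitude and dM the magnitude bin width. The sketch below applies it to a small invented magnitude list; it is not the Antalya catalogue.

      # Sketch: maximum-likelihood b-value (Aki-type estimator with binning correction)
      # for a Gutenberg-Richter relation log10 N(>=M) = a - b*M.
      # The magnitude list is invented for illustration.
      import math

      mags = [4.1, 4.3, 4.0, 4.6, 5.2, 4.4, 4.0, 4.8, 4.2, 5.6, 4.1, 4.5]
      mc = 4.0          # assumed completeness magnitude
      dm = 0.1          # magnitude binning width

      usable = [m for m in mags if m >= mc]
      mean_m = sum(usable) / len(usable)
      b = math.log10(math.e) / (mean_m - (mc - dm / 2.0))
      a = math.log10(len(usable)) + b * mc   # a-value anchored at Mc (per catalogue duration)
      print(f"b = {b:.2f}, a (over catalogue duration) = {a:.2f}, n = {len(usable)}")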

  18. Earthquakes

    EPA Pesticide Factsheets

    Information on this page will help you understand environmental dangers related to earthquakes, what you can do to prepare, and how to recover. It will also help you recognize possible environmental hazards and learn what you can do to protect yourself and your family

  19. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

    In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the sub-committee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database. This resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the
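
    The first assessment method described above, a qualitative ranking from the historical database, essentially reduces to counting run-ups above a height threshold per region and dividing by the length of the record. The sketch below shows that counting with a few invented database rows; the regions, heights, and record length are placeholders, not NGDC data.

      # Sketch: qualitative tsunami hazard ranking from a historical run-up list.
      # Each record is (region, runup height in m); values are invented placeholders.
      from collections import defaultdict

      RECORD_YEARS = 200  # assumed length of the historical record
      THRESHOLD_M = 1.0   # run-up height considered hazardous here

      runups = [("Pacific", 3.2), ("Pacific", 0.4), ("Pacific", 1.8),
                ("Atlantic", 1.1), ("Gulf", 0.2), ("Caribbean", 2.5)]

      counts = defaultdict(int)
      for region, height in runups:
          if height >= THRESHOLD_M:
              counts[region] += 1

      for region, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
          print(f"{region}: {n} run-ups >= {THRESHOLD_M} m, ~{n / RECORD_YEARS:.3f} per year")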

  20. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  1. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  2. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  3. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  4. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions,...

  5. Preliminary Probabilistic Tsunami Hazard Assessment of Canadian Coastlines

    NASA Astrophysics Data System (ADS)

    Leonard, L. J.; Rogers, G. C.; Mazzotti, S.

    2012-12-01

    We present a preliminary probabilistic tsunami hazard assessment of Canadian coastlines from local and far-field, earthquake and large landslide sources. Our multifaceted analysis is based on published historical, paleotsunami and paleoseismic data, modelling, and empirical relations between fault area, earthquake magnitude and tsunami runup. We consider geological sources with known tsunami impacts on Canadian coasts (e.g., Cascadia and other Pacific subduction zones; the 1755 Lisbon tsunami source; Atlantic continental slope failures) as well as potential sources with previously unknown impact (e.g., Explorer plate subduction; Caribbean subduction zones; crustal faults). The cumulative estimated tsunami hazard for potentially damaging runup (≥ 1.5 m) of the outer Canadian Pacific coastline is ~40-80% in 50 y, respectively one and two orders of magnitude greater than the outer Atlantic (~1-15%) and the Arctic (< 1%). For larger runup with significant damage potential (≥ 3 m), Pacific hazard is ~10-30% in 50 y, again much larger than both the Atlantic (~1-5%) and Arctic (< 1%). For outer Pacific coastlines, the ≥ 1.5 m runup hazard is dominated by far-field subduction zone sources, but the probability of runup ≥ 3 m is highest for local megathrust sources, particularly the Cascadia subduction zone; potential thrust sources along the Explorer and Queen Charlotte margins may also be significant for the more northern coasts of British Columbia, where there is a lack of known paleo-event data. For the more sheltered inner Pacific coastlines of Juan de Fuca and Georgia Straits, the hazard at both levels is contributed mainly by Cascadia megathrust events. Tsunami hazard on the Atlantic coastline is dominated by poorly-constrained far-field subduction sources; a lesser hazard is posed by near-field continental slope failures similar to the 1929 Grand Banks event. Tsunami hazard on the Arctic coastline is poorly constrained, but is likely dominated by continental
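
    The cumulative 50-year probabilities reported above combine several independent sources; if source i alone gives a probability P_i of damaging runup in 50 years, the combined probability is 1 - prod(1 - P_i). The sketch below performs that combination for a few invented source probabilities, not the values of the Canadian assessment.

      # Sketch: combining independent tsunami sources into a cumulative
      # 50-year probability of damaging runup at one coastline.
      # P_total = 1 - prod(1 - P_i); the per-source values are illustrative.

      source_probs_50yr = {
          "far-field subduction zones": 0.30,
          "local megathrust":           0.15,
          "continental slope failures": 0.05,
      }

      p_none = 1.0
      for p in source_probs_50yr.values():
          p_none *= (1.0 - p)
      p_total = 1.0 - p_none
      print(f"combined 50-year probability: {p_total:.0%}")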

  6. The U.S. Geological Survey Earthquake Hazards Program Website: Summary of Recent and Ongoing Developments

    NASA Astrophysics Data System (ADS)

    Wald, L. A.; Zirbes, M.; Robert, S.; Wald, D.; Presgrace, B.; Earle, P.; Schwarz, S.; Haefner, S.; Haller, K.; Rhea, S.

    2003-12-01

    The U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) website (http://earthquake.usgs.gov/) focuses on 1) earthquake reporting for informed decisions after an earthquake, 2) hazards information for informed decisions and planning before an earthquake, and 3) the basics of earthquake science to help users of the information understand what is presented. The majority of website visitors are looking for information about current earthquakes in the U.S. and around the world, and the second most visited portion of the website is the education-related pages. People are eager for information, and they are most interested in "what's in my backyard?" Recent and future web developments are aimed at answering this question, making the information more relevant to users, and enabling users to more quickly and easily find the information they are looking for. Recent and/or current web developments include the new enhanced Recent Global Earthquakes and U.S. Earthquakes webpages, the Earthquake in the News system, the Rapid Accurate Tectonic Summaries (RATS), online Significant Earthquake Summary Posters (ESPs), and the U.S. Quaternary Fault & Fold Database, the details of which are covered individually in greater detail in this or other sessions. Future planned developments include a consistent look across all EHP webpages, an integrated one-stop-shopping earthquake notification (EQMail) subscription webpage, new navigation tabs, and a backend database allowing the user to search for earthquake information across all the various EHP websites (on different webservers) based on a topic or region. Another goal is to eventually allow a user to input their address (Zip Code?) and in return receive all the relevant EHP information (and links to more detailed information), such as the closest fault, the last significant nearby earthquake, a local seismicity map, and a local hazard map, for example. This would essentially be a dynamic report based on the entered location

  7. Increasing seismicity in the U. S. midcontinent: Implications for earthquake hazard

    USGS Publications Warehouse

    Ellsworth, William L.; Llenos, Andrea L.; McGarr, Arthur F.; Michael, Andrew J.; Rubinstein, Justin L.; Mueller, Charles S.; Petersen, Mark D.; Calais, Eric

    2015-01-01

    Earthquake activity in parts of the central United States has increased dramatically in recent years. The space-time distribution of the increased seismicity, as well as numerous published case studies, indicates that the increase is of anthropogenic origin, principally driven by injection of wastewater coproduced with oil and gas from tight formations. Enhanced oil recovery and long-term production also contribute to seismicity at a few locations. Preliminary hazard models indicate that areas experiencing the highest rate of earthquakes in 2014 have a short-term (one-year) hazard comparable to or higher than the hazard in the source region of tectonic earthquakes in the New Madrid and Charleston seismic zones.

  8. Aftereffects of Subduction-Zone Earthquakes: Potential Tsunami Hazards along the Japan Sea Coast.

    PubMed

    Minoura, Koji; Sugawara, Daisuke; Yamanoi, Tohru; Yamada, Tsutomu

    2015-01-01

    The 2011 Tohoku-Oki Earthquake is a typical subduction-zone earthquake and is the 4th largest earthquake since the beginning of instrumental observation of earthquakes in the 19th century. The 2011 Tohoku-Oki Earthquake displaced the northeast Japan island arc horizontally and vertically, and the displacement largely changed the tectonic situation of the arc from compressive to tensile. The 9th century in Japan was a period of natural hazards caused by frequent large-scale earthquakes. The aseismic tsunamis that inflicted damage on the Japan Sea coast in the 11th century were related to the occurrence of massive earthquakes that represented the final stage of a period of high seismic activity. Anti-compressive tectonics triggered by the subduction-zone earthquakes induced gravitational instability, which resulted in the generation of tsunamis caused by slope failure at the arc-back-arc boundary. The crustal displacement after the 2011 earthquake implies an increased risk of unexpected local tsunami flooding in the Japan Sea coastal areas.

  9. Investigating the Radiation Pattern of Earthquakes in the Central and Eastern United States and Comments on Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Bekaert, D. P.; Hooper, A. J.; Samsonov, S. V.; Wright, T. J.; González, P. J.; Pathier, E.; Kostoglodov, V.

    2014-12-01

    The radiation pattern emitted from earthquakes is not currently considered in many seismic hazard assessments. This may be due to the fact that the focal mechanisms of potential ruptures are not well studied or are assumed to be random. In this case, all mechanisms are given equal likelihood, and the effect of radiation pattern is essentially averaged. For about a dozen earthquake sources in the central and eastern United States (CEUS), however, faults with known mechanism are incorporated into the hazard assessment, but the radiation pattern is not included. In this study, we investigate the radiation pattern from larger CEUS earthquakes, one of which, the 2011 M5.7 Prague earthquake, was sampled by the relatively uniform and broad coverage of USArray. The radiation pattern from this event is readily apparent below about 1 Hz out to several hundred kilometers from the epicenter and decays with increasing frequency and distance, consistent with the effects of scattering attenuation. This decay is modeled with an apparent attenuation that is 5-10 times greater than the attenuation of Lg waves for the CEUS. We consider the radiation pattern of potential sources in the New Madrid seismic zone to show the effect of radiation pattern on the seismic hazard assessment of major metropolitan areas in the region including Memphis, Tenn., Evansville, Ind., St Louis, Mo., and Little Rock, Ark. For the scenarios we choose, earthquakes with expected mechanisms within the seismic zone, both strike-slip and thrust, tend to focus energy to the southwest towards Little Rock and to the northeast towards Evansville. Eastern Memphis and St Louis, on the other hand, tend to be in lobes of reduced seismic shaking. This can have a significant impact on seismic hazard assessment for these cities, increasing hazard for the former and decreasing it for the latter, particularly for larger structures that are sensitive to longer shaking periods. It is more complicated, however, when considering
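
    The azimuthal focusing described above can be caricatured with the textbook far-field radiation pattern of a double couple combined with a simple attenuation term: in the horizontal plane of a vertical strike-slip source, P-wave amplitude varies roughly as |sin 2φ| with azimuth φ from strike, and amplitudes decay with distance R and frequency f as exp(-π f R / (Q v)) / R. The sketch below evaluates that toy model with assumed Q, wave speed, and frequency; it is not the CEUS-calibrated model of the study.

      # Sketch: toy azimuth/distance dependence of ground-motion amplitude
      # combining a double-couple radiation pattern with anelastic attenuation
      # and geometric spreading. Q, wave speed, and frequency are assumed values.
      import math

      Q = 600.0        # assumed quality factor
      v_kms = 3.5      # assumed wave speed (km/s)
      freq_hz = 0.5    # frequency of interest

      def relative_amplitude(azimuth_deg, distance_km):
          """Four-lobed |sin(2*phi)| pattern (vertical strike-slip, horizontal plane),
          scaled by 1/R geometric spreading and exp(-pi*f*R/(Q*v)) attenuation."""
          pattern = abs(math.sin(2.0 * math.radians(azimuth_deg)))
          attenuation = math.exp(-math.pi * freq_hz * distance_km / (Q * v_kms))
          return pattern * attenuation / distance_km

      for az in (0, 45, 90, 135):
          amps = [relative_amplitude(az, r) for r in (100, 300)]
          print(f"azimuth {az:>3} deg: A(100 km) = {amps[0]:.2e}, A(300 km) = {amps[1]:.2e}")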

  10. Investigating the Radiation Pattern of Earthquakes in the Central and Eastern United States and Comments on Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Boyd, O. S.

    2015-12-01

    The radiation pattern emitted from earthquakes is not currently considered in many seismic hazard assessments. This may be due to the fact that the focal mechanisms of potential ruptures are not well studied or are assumed to be random. In this case, all mechanisms are given equal likelihood, and the effect of radiation pattern is essentially averaged. For about a dozen earthquake sources in the central and eastern United States (CEUS), however, faults with known mechanism are incorporated into the hazard assessment, but the radiation pattern is not included. In this study, we investigate the radiation pattern from larger CEUS earthquakes, one of which, the 2011 M5.7 Prague earthquake, was sampled by the relatively uniform and broad coverage of USArray. The radiation pattern from this event is readily apparent below about 1 Hz out to several hundred kilometers from the epicenter and decays with increasing frequency and distance, consistent with the effects of scattering attenuation. This decay is modeled with an apparent attenuation that is 5-10 times greater than the attenuation of Lg waves for the CEUS. We consider the radiation pattern of potential sources in the New Madrid seismic zone to show the effect of radiation pattern on the seismic hazard assessment of major metropolitan areas in the region including Memphis, Tenn., Evansville, Ind., St Louis, Mo., and Little Rock, Ark. For the scenarios we choose, earthquakes with expected mechanisms within the seismic zone, both strike-slip and thrust, tend to focus energy to the southwest towards Little Rock and to the northeast towards Evansville. Eastern Memphis and St Louis, on the other hand, tend to be in lobes of reduced seismic shaking. This can have a significant impact on seismic hazard assessment for these cities, increasing hazard for the former and decreasing it for the latter, particularly for larger structures that are sensitive to longer shaking periods. It is more complicated, however, when considering

  11. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    USGS Publications Warehouse

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011 Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  12. Earthquake related tsunami hazard along the western coast of Thailand

    NASA Astrophysics Data System (ADS)

    Løvholt, F.; Bungum, H.; Harbitz, C. B.; Glimsdal, S.; Lindholm, C. D.; Pedersen, G.

    2006-11-01

    The primary background for the present study was a project to assist the authorities in Thailand with the development of plans for how to deal with the future tsunami risk in both short- and long-term perspectives, in the wake of the devastating 26 December 2004 Sumatra-Andaman earthquake and tsunami. The study focuses on defining and analyzing a number of possible future earthquake scenarios (magnitudes 8.5, 8.0 and 7.5) with associated return periods, each accompanied by specific tsunami modelling. Along the most affected part of the western coast of Thailand, the 2004 tsunami wave caused a maximum water level ranging from 5 to 15 m above mean sea level. These levels and their spatial distributions have been confirmed by detailed numerical simulations. The applied earthquake source was developed from available seismological and geodetic inversions, and the simulation using this source as the initial condition agrees well with sea level records and run-up observations. A conclusion from the study is that another megathrust earthquake generating a tsunami affecting the coastline of western Thailand is not likely to occur again for several hundred years. This is based in part on the assumption that the Southern Andaman Microplate Boundary near the Simeulue Islands constitutes a geologic barrier that will prohibit significant rupture across it, and in part on the decreasing subduction rates north of the Banda Aceh region. It is also concluded that the largest credible earthquake to be prepared for along the part of the Sunda-Andaman arc that could affect Thailand within the next 50-100 years is an earthquake of magnitude 8.5, which is expected to occur with more spatial and temporal irregularity than the megathrust events. Numerical simulations show that such earthquakes cause tsunamis with maximum water levels up to 1.5-2.0 m along the western coast of Thailand, and possibly 2.5-3.0 m at high tide. However, in a longer time perspective (say more than 50-100 years

  13. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  14. Tsunami Forecast Technology for Asteroid Impact Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Moore, C. W.

    2015-12-01

    Over 75% of all historically documented tsunamis have been generated by earthquakes. As a result, all existing tsunami warning and forecast systems focus almost exclusively on detecting, warning of, and forecasting earthquake-generated tsunamis. The sequence of devastating tsunamis across the globe over the past 10 years has significantly heightened awareness and preparation activities associated with these high-impact events. Since the catastrophic 2004 Sumatra tsunami, NOAA has invested significant effort in modernizing the U.S. tsunami warning system. Recent developments in tsunami modeling capability, inundation forecasting, sensing networks, dissemination capability, and local preparation and mitigation activities have gone a long way toward enhancing tsunami resilience within the United States. The remaining quarter of the tsunami hazard problem is related to other mechanisms of tsunami generation that may not have received adequate attention. Among those tsunami sources, asteroid impact may be the most exotic, but possibly one of the most devastating, tsunami generation mechanisms. The tsunami forecast capabilities that have been developed for the tsunami warning system can be used to explore both hazard assessment and forecasting of a tsunami generated by an asteroid impact. Existing tsunami flooding forecast technology allows forecasting of non-seismically generated tsunamis (asteroid impacts, meteo-generated tsunamis, landslides, etc.), given adequate data on the tsunami source parameters. Problems and opportunities for forecasting tsunamis from asteroid impacts will be discussed. Preliminary results of impact-generated tsunami analysis for forecast and hazard assessment will be presented.

  15. Near real-time aftershock hazard maps for earthquakes

    NASA Astrophysics Data System (ADS)

    McCloskey, J.; Nalbant, S. S.

    2009-04-01

    Stress interaction modelling is routinely used to explain the spatial relationships between earthquakes and their aftershocks. On 28 October 2008 a M6.4 earthquake occurred near the Pakistan-Afghanistan border, killing several hundred people and causing widespread devastation. A second M6.4 event occurred 12 hours later, 20 km to the southeast. By making some well-supported assumptions concerning the source event and the geometry of any likely triggered event, it was possible to map those areas most likely to experience further activity. Using Google Earth, it would further have been possible to identify particular settlements in the source area that were particularly at risk and to publish their locations globally within about 3 hours of the first earthquake. Such actions could have significantly focused the initial emergency response management. We argue for routine prospective testing of such forecasts and for dialogue between social and physical scientists and emergency response professionals around the practical application of these techniques.

  16. Assessing natural hazard risk using images and data

    NASA Astrophysics Data System (ADS)

    Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.

    2012-12-01

    Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping and historic records. Instrumental data only reveal a portion of the whole story; photographs explicitly illustrate the physical and societal impacts from the event. Visual data are increasing rapidly as portable high-resolution cameras and video recorders become more widely available. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs, enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context to the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.

  17. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning and imagery differencing are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor-Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn helps in understanding the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing have led to sub-second overall system latency. Within the past few years, the final challenges of

  18. Too generous to a fault? Is reliable earthquake safety a lost art? Errors in expected human losses due to incorrect seismic hazard estimates

    NASA Astrophysics Data System (ADS)

    Bela, James

    2014-11-01

    "One is well advised, when traveling to a new territory, to take a good map and then to check the map with the actual territory during the journey." In just such a reality check, Global Seismic Hazard Assessment Program (GSHAP) maps (prepared using PSHA) portrayed a "low seismic hazard," which was then also assumed to be the "risk to which the populations were exposed." But time-after-time-after-time the actual earthquakes that occurred were not only "surprises" (many times larger than those implied on the maps), but they were often near the maximum potential size (Maximum Credible Earthquake or MCE) that geologically could occur. Given these "errors in expected human losses due to incorrect seismic hazard estimates" revealed globally in these past performances of the GSHAP maps (> 700,000 deaths 2001-2011), we need to ask not only: "Is reliable earthquake safety a lost art?" but also: "Who and what were the `Raiders of the Lost Art?' "

  19. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-member Earthquake Investigation Commission, drawing on some 25 geologists, seismologists, geodesists, biologists and engineers as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting and detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.

  20. Understanding Earthquake Hazard & Disaster in Himalaya - A Perspective on Earthquake Forecast in Himalayan Region of South Central Tibet

    NASA Astrophysics Data System (ADS)

    Shanker, D.; Paudyal, ,; Singh, H.

    2010-12-01

    Not only a basic understanding of the earthquake phenomenon and of the resistance offered by designed structures, but also an understanding of socio-economic factors, the engineering properties of indigenous materials, local skills, and technology-transfer models is of vital importance. It is important that the engineering aspects of mitigation be made part of public policy documents. Earthquakes, therefore, are and have long been thought of as one of the worst enemies of mankind. Because of the very nature of the energy release, damage is evident; it will not, however, culminate in a disaster unless it strikes a populated area. The word mitigation may be defined as the reduction in severity of something. Earthquake disaster mitigation therefore implies measures that help reduce the severity of damage caused by earthquakes to life, property and the environment. “Earthquake disaster mitigation” usually refers primarily to interventions to strengthen the built environment, while “earthquake protection” is now considered to include the human, social and administrative aspects of reducing earthquake effects. It should, however, be noted that reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies. Earthquake prediction does not guarantee safety; even if an earthquake is predicted correctly, damage to life and property on such a large scale warrants the use of other aspects of mitigation. While earthquake prediction may be of some help, mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of the occurrence of future seismic activity. In all the cases, the identified episodes of anomalous seismic activity were

  1. Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S. A.; Spencer, B. D.

    2015-12-01

    The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered by arguing that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well these maps actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted ground motion should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that what factors control map performance is complicated, and that learning more about how maps perform and why would be valuable in making more effective policy.
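
    The two performance metrics contrasted in this abstract can be made concrete with a small sketch. The arrays below are invented, not taken from the 510-year Japanese shaking record; the functions simply compute (a) how far the observed fraction of sites exceeding the mapped value departs from the fraction implied by the map's stated exceedance probability, and (b) the mean squared misfit between maximum observed and predicted shaking.

        import numpy as np

        def fractional_exceedance_misfit(observed_max, predicted, target_fraction):
            """Difference between the fraction of sites whose maximum observed
            shaking exceeded the mapped value and the fraction the map implies."""
            exceeded_fraction = np.mean(observed_max > predicted)
            return abs(exceeded_fraction - target_fraction)

        def squared_misfit(observed_max, predicted):
            """Mean squared difference between maximum observed and predicted shaking."""
            return np.mean((observed_max - predicted) ** 2)

        # Hypothetical shaking intensities at four sites (observed maxima vs. mapped values).
        obs = np.array([5.2, 6.8, 4.9, 7.1])
        pred = np.array([5.5, 6.0, 5.5, 6.5])

        print(fractional_exceedance_misfit(obs, pred, target_fraction=0.10))
        print(squared_misfit(obs, pred))

    Under the first metric a map is rewarded for matching the target fraction of exceedances; under the second it is rewarded for predicting shaking close to what was actually observed, which is why the two metrics can rank the same maps differently.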

  2. Oklahoma experiences largest earthquake during ongoing regional wastewater injection hazard mitigation efforts

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Hayes, G. P.; McNamara, D. E.; Rubinstein, J. L.; Barnhart, W. D.; Earle, P. S.; Benz, H. M.

    2017-01-01

    The 3 September 2016 Mw 5.8 Pawnee earthquake was the largest recorded earthquake in the state of Oklahoma. Seismic and geodetic observations of the Pawnee sequence, including precise hypocenter locations and moment tensor modeling, show that the Pawnee earthquake occurred on a previously unknown left-lateral strike-slip basement fault that intersects the mapped right-lateral Labette fault zone. The Pawnee earthquake is part of an unprecedented increase in the earthquake rate in Oklahoma that is largely considered the result of the deep injection of waste fluids from oil and gas production. If this is, indeed, the case for the Mw 5.8 Pawnee earthquake, then this would be the largest event to have been induced by fluid injection. Since 2015, Oklahoma has undergone wide-scale mitigation efforts primarily aimed at reducing injection volumes. Thus far in 2016, the rate of M3 and greater earthquakes has decreased as compared to 2015, while the cumulative moment—or energy released from earthquakes—has increased. This highlights the difficulty in earthquake hazard mitigation efforts given the poorly understood long-term diffusive effects of wastewater injection and their connection to seismicity.

  3. 283-E and 283-W hazards assessment

    SciTech Connect

    Sutton, L.N.

    1994-09-26

    This report documents the hazards assessment for the 200 Area water treatment plants 283-E and 283-W located on the US DOE Hanford Site. Operation of the water treatment plants is the responsibility of ICF Kaiser Hanford Company (ICF KH). This hazards assessment was conducted to provide the emergency planning technical basis for the water treatment plants. This document represents an acceptable interpretation of the implementing guidance document for DOE Order 5500.3A, which requires an emergency planning hazards assessment for each facility that has the potential to reach or exceed the lowest-level emergency classification.

  4. KSC VAB Aeroacoustic Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.

    2010-01-01

    NASA Kennedy Space Center (KSC) carried out an analysis of the effects of aeroacoustics produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the source of ignition and plume can be severe; acoustic hazards in the far-field are significantly lower.

  5. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-01-01

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses from potential earthquakes and in preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping (NSHM) Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault to the east of the study area. Earthquake scenarios are intended to depict the potential consequences of significant earthquakes. They are not necessarily the largest or most damaging earthquakes possible. Earthquake scenarios are both large enough and likely enough that emergency planners should consider them in regional emergency response plans. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). For the Hilton Creek Fault, two alternative scenarios were developed in addition to the NSHM scenario to account for different opinions on how far north the fault extends into the Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice

  6. Earthquake hazard and damage on traditional rural structures in Turkey

    NASA Astrophysics Data System (ADS)

    Korkmaz, H. H.; Korkmaz, S. Z.; Donduren, M. S.

    2010-03-01

    During recent earthquakes in Turkey, reinforced concrete structures in the cities and masonry structures in rural areas were exposed to damage and failure. Masonry houses such as earthen, brick and stone structures are composed of building blocks with weak inter-binding action and low tension capacity. Bending and shear forces generate tensile stresses which cannot be well tolerated. In this paper, the performance of masonry structures during recent earthquakes in Turkey is discussed with illustrative photographs taken after the earthquakes. The main weaknesses in the materials and in unreinforced masonry construction, and other reasons for the extensive damage to masonry buildings, are the following: very low tensile and shear strength, particularly with poor mortar; brittle behaviour in tension as well as compression; stress concentration at the corners of windows and doors; overall asymmetry in the plan and elevation of the building; asymmetry due to imbalance in the sizes and positions of walls and of openings in the walls; and construction defects such as the use of substandard materials, unfilled joints between bricks, out-of-plumb walls, and improper bonding between walls at right angles.

  7. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations.

  8. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

    Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is accepted universally for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare the results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved to be effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and geographic information systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in the detection, mapping and monitoring of landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geospatial technologies have opened the door to detailed and accurate assessment of landslide hazards.

  9. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    USGS Publications Warehouse

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes that have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large earthquake rates would be grossly underestimated by extrapolating small earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude
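
    The consequence of the kinked magnitude distribution described above can be illustrated with a quick calculation. The anchoring rate below is invented for illustration (it is not a value from the Hawaiian Volcano Observatory catalog); the point is only how strongly the M ≥ 7 rate depends on whether the b = 1.0 slope is extrapolated or the distribution flattens to b = 0.6 above M 5.0.

        # Annual rate of M >= 5.0 in a hypothetical flank zone (illustrative only).
        rate_m5 = 0.5

        # Straight extrapolation of the small-earthquake slope (b = 1.0) up to M 7.0:
        rate_m7_extrapolated = rate_m5 * 10.0 ** (-1.0 * (7.0 - 5.0))   # 0.005 /yr, ~200-yr recurrence

        # Kinked distribution with b = 0.6 above M 5.0:
        rate_m7_kinked = rate_m5 * 10.0 ** (-0.6 * (7.0 - 5.0))         # ~0.032 /yr, ~32-yr recurrence

        print(rate_m7_extrapolated, rate_m7_kinked)

    With these placeholder numbers the kinked model yields roughly six times more M ≥ 7 earthquakes than the extrapolated one, which is the sense in which extrapolating small-earthquake rates would grossly underestimate the hazard.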

  10. Earthquakes and faults at Mt. Etna (Italy): time-dependent approach to the seismic hazard of the eastern flank

    NASA Astrophysics Data System (ADS)

    Peruzza, L.; Azzaro, R.; D'Amico, S.; Tuve', T.

    2009-04-01

    A time-dependent approach to seismic hazard assessment, based on a renewal model using the Brownian Passage Time (BPT) distribution, has been applied to the best-known seismogenic faults at Mt. Etna volcano. These structures are characterised by frequent coseismic surface displacement and by a long list of historically well-documented earthquakes over the last 200 years (CMTE catalogue; Azzaro et al., 2000, 2002, 2006). Seismic hazard estimates, given in terms of earthquake rupture forecasts, are conditioned on the time elapsed since the last event: impending events are expected on the S. Tecla Fault, and secondly on the Moscatello Fault, both involved in the highly active geodynamic processes affecting the eastern flank of Mt. Etna. The mean recurrence time of major events is calibrated by merging the inter-event times observed at each fault; aperiodicity is tuned on b-values, following the approach proposed by Zoeller et al. (2008). Finally, we compare these mean recurrence times with the values obtained by using only geometrical and kinematic information, as defined in Peruzza et al. (2008) for faults in Italy. The time-dependent hazard assessment is compared with the stationary assumption of seismicity and validated in a retrospective forward model. Forecast rates for a 5-year window (1 April 2009 to 1 April 2014), in magnitude bins compatible with macroseismic data, are available for testing in the frame of the CSEP (Collaboratory for the Study of Earthquake Predictability, www.cseptesting.org) project. References: Azzaro R., Barbano M.S., Antichi B., Rigano R.; 2000: Macroseismic catalogue of Mt. Etna earthquakes from 1832 to 1998. Acta Volcanol., 12 (1), 3-36 (with CD-ROM). http://www.ct.ingv.it/Sismologia/macro/default.htm Azzaro R., D'Amico S., Mostaccio A., Scarfì L.; 2002: Terremoti con effetti macrosismici in Sicilia orientale - Calabria meridionale nel periodo Gennaio 1999 - Dicembre 2001. Quad. di Geof., 27, 1-59. Azzaro R., D'Amico S., Mostaccio A
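
    A minimal sketch of the conditional-probability calculation that underlies such a renewal model is given below. The mean recurrence time, aperiodicity and elapsed time are placeholders (the abstract does not quote the fitted values for the S. Tecla or Moscatello faults); the BPT distribution is the inverse Gaussian, whose closed-form CDF is used directly.

        from math import sqrt, exp, erf

        def _phi(z):
            """Standard normal CDF."""
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))

        def bpt_cdf(t, mean, alpha):
            """CDF of the Brownian Passage Time (inverse Gaussian) distribution with
            mean recurrence time `mean` and aperiodicity (coefficient of variation) `alpha`."""
            if t <= 0.0:
                return 0.0
            lam = mean / alpha ** 2                      # inverse Gaussian shape parameter
            u = sqrt(lam / t)
            return _phi(u * (t / mean - 1.0)) + exp(2.0 * lam / mean) * _phi(-u * (t / mean + 1.0))

        def conditional_probability(elapsed, window, mean, alpha):
            """P(rupture within `window` years | no rupture in the `elapsed` years since the last event)."""
            f_now = bpt_cdf(elapsed, mean, alpha)
            f_later = bpt_cdf(elapsed + window, mean, alpha)
            return (f_later - f_now) / (1.0 - f_now)

        # Placeholder fault: 200-yr mean recurrence, aperiodicity 0.5, 150 yr elapsed, 5-yr forecast window.
        print(conditional_probability(elapsed=150.0, window=5.0, mean=200.0, alpha=0.5))

    The conditioning on elapsed time is exactly what makes the forecast time dependent: the same fault yields a different 5-year probability depending on how long ago it last ruptured, unlike the stationary assumption used for comparison in the study.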

  11. Apparent stress, fault maturity and seismic hazard for normal-fault earthquakes at subduction zones

    USGS Publications Warehouse

    Choy, G.L.; Kirby, S.H.

    2004-01-01

    The behavior of apparent stress for normal-fault earthquakes at subduction zones is derived by examining the apparent stress (τa = μEs/M0, where μ is rigidity, Es is radiated energy and M0 is seismic moment) of all globally distributed shallow (depth, … 1 MPa) are also generally intraslab, but occur where the lithosphere has just begun subduction beneath the overriding plate. They usually occur in cold slabs near trenches where the direction of plate motion across the trench is oblique to the trench axis, or where there are local contortions or geometrical complexities of the plate boundary. Lower τa (< 1 MPa) is associated with events occurring at the outer rise (OR) complex (between the OR and the trench axis), as well as with intracrustal events occurring just landward of the trench. The average apparent stress of intraslab-normal-fault earthquakes is considerably higher than the average apparent stress of interplate-thrust-fault earthquakes. In turn, the average τa of strike-slip earthquakes in intraoceanic environments is considerably higher than that of intraslab-normal-fault earthquakes. The variation of average τa with focal mechanism and tectonic regime suggests that the level of τa is related to fault maturity. Lower stress drops are needed to rupture mature faults such as those found at plate interfaces that have been smoothed by large cumulative displacements (from hundreds to thousands of kilometres). In contrast, immature faults, such as those on which intraslab-normal-fault earthquakes generally occur, are found in cold and intact lithosphere in which total fault displacement has been much less (from hundreds of metres to a few kilometres). Also, faults on which high-τa oceanic strike-slip earthquakes occur are predominantly intraplate or at evolving ends of transforms. At subduction zones, earthquakes occurring on immature faults are likely to be more hazardous as they tend to generate higher amounts of radiated energy per unit of moment than
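
    For orientation, the apparent stress used throughout this abstract is the radiated energy scaled by the seismic moment and multiplied by the rigidity; a worked example with assumed values (chosen only so the result lands in the >1 MPa range discussed above) is:

        \tau_a = \mu \frac{E_s}{M_0}, \qquad
        \mu = 3\times 10^{10}\ \mathrm{Pa},\quad
        E_s = 5\times 10^{14}\ \mathrm{J},\quad
        M_0 = 10^{19}\ \mathrm{N\,m}
        \;\Rightarrow\; \tau_a = 3\times 10^{10} \times 5\times 10^{-5}\ \mathrm{Pa} = 1.5\ \mathrm{MPa}.

    The ratio Es/M0 (here 5 × 10^-5) is the scaled energy, so for a fixed moment a higher apparent stress means more radiated energy, which is the link the abstract draws between fault maturity and hazard.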

  12. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.

  13. Multi Hazard Assessment: The Azores Archipelagos (PT) case

    NASA Astrophysics Data System (ADS)

    Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos

    2016-04-01

    The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support the efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelago case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructure and environment). The risk assessment was based on hazard and on the vulnerability of structural elements, road network characteristics, etc. The integration of the different hazards and risks was taken into account in establishing the necessary first-response/first-aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information common to the assessment of all of the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for. The risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and a mathematical model to calculate the seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based lava flow hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100 year extreme monthly rainfall
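
    The exact formulation of the "Flood Susceptibility Index" is not given in the abstract. A minimal sketch, assuming a Manfreda-type modified topographic index in which susceptibility grows with upstream drainage area and shrinks with local slope along the drainage network (the exponent n is a placeholder, not the calibrated value), could look like this:

        import numpy as np

        def flood_susceptibility_index(drainage_area_km2, local_slope, n=0.4):
            """Illustrative flood susceptibility index: larger contributing area and
            gentler local slope along the drainage network give higher susceptibility.
            The exponent n is a tunable placeholder."""
            slope = np.maximum(local_slope, 1e-4)      # guard against division by zero on flat cells
            return np.log(drainage_area_km2 ** n / slope)

        # Hypothetical cells along a drainage network: area (km^2) and local slope (m/m).
        areas = np.array([1.0, 50.0, 500.0])
        slopes = np.array([0.20, 0.05, 0.01])
        print(flood_susceptibility_index(areas, slopes))

    Thresholding such an index into classes is one way a qualitative flood hazard level can be assigned before it is intersected with the vulnerability layers described above.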

  14. Assessment and Prediction of Natural Hazards from Satellite Imagery

    PubMed Central

    Gillespie, Thomas W.; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan

    2013-01-01

    Since 2000, there have been a number of spaceborne satellites that have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth’s surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate resolution active sensors that can quantify the vertical and horizontal movement of the earth’s surface. High-resolution passive sensors have been used to successfully assess flood damage while predictive maps of flood vulnerability areas are possible based on physical variables collected from passive and active sensors. Recent moderate resolution sensors are able to provide near real time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and constellations of satellites will be launched in the next five years that contain increased resolution (0.5 m to 1 m pixel resolution for active sensors) and revisit times (daily ≤ 2.5 m resolution images from passive sensors) that will significantly improve our ability to assess and predict natural hazards from space. PMID:25170186

  15. Simulations of seismic hazard for the Pacific Northwest of the United States from earthquakes associated with the Cascadia subduction zone

    USGS Publications Warehouse

    Petersen, M.D.; Cramer, C.H.; Frankel, A.D.

    2002-01-01

    We investigate the impact of different rupture and attenuation models for the Cascadia subduction zone by simulating seismic hazard models for the Pacific Northwest of the U.S. at 2% probability of exceedance in 50 years. We calculate the sensitivity of hazard (probabilistic ground motions) to the source parameters and the attenuation relations for both intraslab and interface earthquakes and present these in the framework of the standard USGS hazard model that includes crustal earthquakes. Our results indicate that allowing the deep intraslab earthquakes to occur anywhere along the subduction zone increases the peak ground acceleration hazard near Portland, Oregon by about 20%. Alternative attenuation relations for deep earthquakes can result in ground motions that differ by a factor of two. The hazard uncertainty for the plate interface and intraslab earthquakes is analyzed through a Monte-Carlo logic tree approach and indicates a seismic hazard exceeding 1 g (0.2 s spectral acceleration) consistent with the U.S. National Seismic Hazard Maps in western Washington, Oregon, and California and an overall coefficient of variation that ranges from 0.1 to 0.4. Sensitivity studies indicate that the paleoseismic chronology and the magnitude of great plate interface earthquakes contribute significantly to the hazard uncertainty estimates for this region. Paleoseismic data indicate that the mean earthquake recurrence interval for great earthquakes is about 500 years and that it has been 300 years since the last great earthquake. We calculate the probability of such a great earthquake along the Cascadia plate interface to be about 14% when considering a time-dependent model and about 10% when considering a time-independent Poisson model during the next 50-year interval.
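
    The two 50-year probabilities quoted at the end of this abstract can be reproduced to first order. The sketch below uses the stated 500-year mean recurrence and 300 years elapsed; the renewal distribution and its coefficient of variation are assumptions (a lognormal with CV = 0.5 here), since the abstract does not state which time-dependent model was used.

        from math import exp, log, sqrt, erf

        def poisson_prob(window, mean_rec):
            """Time-independent probability of at least one event in `window` years."""
            return 1.0 - exp(-window / mean_rec)

        def _phi(z):
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))

        def lognormal_cdf(t, mean_rec, cov):
            """CDF of a lognormal recurrence model with the given mean and coefficient of variation."""
            sigma = sqrt(log(1.0 + cov ** 2))
            mu = log(mean_rec) - 0.5 * sigma ** 2
            return _phi((log(t) - mu) / sigma)

        def conditional_prob(elapsed, window, mean_rec, cov):
            """P(event within `window` years | `elapsed` years since the last great earthquake)."""
            f_now = lognormal_cdf(elapsed, mean_rec, cov)
            f_later = lognormal_cdf(elapsed + window, mean_rec, cov)
            return (f_later - f_now) / (1.0 - f_now)

        print(poisson_prob(50.0, 500.0))                       # ~0.095, i.e. the ~10% Poisson figure
        print(conditional_prob(300.0, 50.0, 500.0, cov=0.5))   # ~0.13 with these assumptions, near the ~14% quoted

    The gap between the two numbers is the time-dependence effect: three centuries into a roughly 500-year cycle, a renewal model assigns more probability to the next 50 years than a memoryless Poisson model does.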

  16. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons for performing hazards assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations and associated factors to facilitate decision making and achieve best practice.

  17. Post-earthquake ignition vulnerability assessment of Küçükçekmece District

    NASA Astrophysics Data System (ADS)

    Yildiz, S. S.; Karaman, H.

    2013-12-01

    In this study, a geographic information system (GIS)-based model was developed to calculate the post-earthquake ignition probability of a building, considering damage to the building's interior gas and electrical distribution systems and the overturning of appliances. In order to make the model more reliable and realistic, a weighting factor was used to account for the possible presence of each appliance or other contents in a given occupancy. A questionnaire was prepared to weigh the relevance of the different components of post-earthquake ignition using the analytical hierarchy process (AHP). The questionnaire was evaluated by researchers experienced in earthquake engineering and post-earthquake fires. The developed model was implemented in HAZTURK (Hazards Turkey), the earthquake loss assessment software developed by the Mid-America Earthquake Center with the help of Istanbul Technical University. The post-earthquake ignition tool was applied to Küçükçekmece, Istanbul, in Turkey. The results were evaluated according to structure type, occupancy type, number of storeys, building code and specified district. The results support the theory that post-earthquake ignition probability is inversely proportional to the number of storeys and the construction year, depending upon the building code.
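
    The analytical hierarchy process step is only named in the abstract, so the sketch below is generic: the pairwise judgments are invented (they are not the questionnaire results), and the three components are stand-ins for the ignition contributors mentioned above. AHP derives the weights from the principal eigenvector of the pairwise-comparison matrix and checks their consistency.

        import numpy as np

        # Hypothetical pairwise-comparison matrix for three ignition components
        # (gas distribution system, electrical distribution system, appliance overturning).
        # Entry [i, j] says how much more important component i is judged to be than j.
        A = np.array([
            [1.0,   3.0,   5.0],
            [1/3.0, 1.0,   2.0],
            [1/5.0, 1/2.0, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = eigvecs[:, k].real
        weights = weights / weights.sum()              # normalised AHP weights

        # Consistency ratio (random index 0.58 for a 3x3 matrix); values < 0.1 are usually accepted.
        lambda_max = eigvals.real[k]
        consistency_ratio = ((lambda_max - 3.0) / 2.0) / 0.58

        print(weights, consistency_ratio)

    Weights obtained this way can then be combined with the occupancy-dependent presence factors described above to build up each building's ignition probability.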

  18. An evaluation of earthquake hazard parameters in the Iranian Plateau based on the Gumbel III distribution

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Bayrak, Yusuf

    2016-04-01

    Gumbel's third asymptotic distribution (GIII) of extreme values is employed to evaluate the earthquake hazard parameters in the Iranian Plateau. This research maps earthquake hazard parameters such as the annual and 100-year modes, together with the values having a 90 % probability of not being exceeded (NBE), across the Iranian Plateau. We used a homogeneous and complete earthquake catalogue for the period 1900-2013 with magnitude Mw ≥ 4.0, and the Iranian Plateau was divided into an equal-area mesh of 1° latitude × 1° longitude. The estimated annual mode with 90 % probability of NBE exceeds Mw 6.0 in the eastern part of Makran, most parts of Central and East Iran, Kopeh Dagh, Alborz, Azerbaijan, and the SE Zagros. The 100-year mode with 90 % probability of NBE exceeds Mw 7.0 in the eastern part of Makran, Central and East Iran, Alborz, Kopeh Dagh, and Azerbaijan. The spatial distribution of the 100-year mode with 90 % probability of NBE reveals high values of the earthquake hazard parameters that are frequently associated with the main tectonic regimes of the study area, indicating a close connection between the seismicity and the tectonics of the region.
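
    As a sketch of how such values are read from a fitted GIII model: if G(m) = exp(-((ω - m)/(ω - u))^k) is the CDF of the annual extreme magnitude, with upper bound ω, characteristic value u and shape k, then the magnitude with probability q of not being exceeded in T years solves G(m)^T = q. The parameter values below are placeholders chosen only so the outputs land near the M 6.0 and M 7.0 thresholds highlighted in the abstract; they are not fitted values for any cell of the Iranian Plateau.

        from math import log

        def magnitude_not_exceeded(q, years, omega, u, k):
            """Magnitude with probability q of not being exceeded in `years` years
            under a Gumbel III annual-extreme model (closed-form inversion of G(m)**years = q)."""
            return omega - (omega - u) * (-log(q) / years) ** (1.0 / k)

        omega, u, k = 7.5, 4.93, 4.2        # placeholder GIII parameters for one grid cell

        print(magnitude_not_exceeded(0.90, 1, omega, u, k))     # annual value with 90% prob. of NBE, ~M 6.0
        print(magnitude_not_exceeded(0.90, 100, omega, u, k))   # 100-year value with 90% prob. of NBE, ~M 7.0

    The inversion follows directly from G(m)^T = q: taking logarithms gives ((ω - m)/(ω - u))^k = -ln(q)/T, which is then solved for m.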

  19. Virtual California, ETAS, and OpenHazards web services: Responding to earthquakes in the age of Big Data

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Schultz, K.; Rundle, J. B.; Glasscoe, M. T.; Donnellan, A.

    2014-12-01

    The response to the 2014 m=6 Napa earthquake showcased data-driven services and technologies that aided first responders and decision makers in quickly assessing damage, estimating aftershock hazard, and efficiently allocating resources where they were most needed. These tools have been developed from fundamental research as part of a broad collaboration between researchers, policy makers, and executive decision makers -- facilitated in no small part by the California Earthquake Clearinghouse -- and practiced and honed during numerous disaster response exercises over the past several years. On 24 August 2014, and in the weeks following the m=6 Napa event, it became evident that these technologies will play an important role in the response to natural (and other) disasters in the 21st century. Given the continued rapid growth of computational capabilities, remote sensing technologies, and data gathering capacities -- including by unpiloted aerial vehicles (UAVs) -- it is reasonable to expect that both the volume and variety of data available during a response scenario will grow significantly in the decades to come. Inevitably, modern Data Science will be critical to effective disaster response in the 21st century. In this work, we discuss the roles that earthquake simulators, statistical seismicity models, and remote sensing technologies played in the 2014 Napa earthquake response. We further discuss "Big Data" technologies and data models that facilitate the transformation of raw data into disseminable information and actionable products, and we outline a framework for the next generation of disaster response data infrastructure.

  20. Earthquake Hazard When the Rate Is Non-Stationary: The Challenge of the U. S. Midcontinent

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.; Cochran, E. S.; Llenos, A. L.; McGarr, A.; Michael, A. J.; Mueller, C. S.; Petersen, M. D.; Rubinstein, J. L.

    2014-12-01

    In July 2014, the U. S. Geological Survey released an update of the 2008 National Seismic Hazard Map for the coterminous U. S. The Map provides guidance for the seismic provisions of the building codes and portrays ground motions with a 2% chance of being exceeded in an exposure time of 50 years. Over most of the midcontinent the hazard model is derived by projecting the long-term historic, declustered earthquake rate forward in time. However, parts of the midcontinent have experienced increased seismicity levels since 2009 - locally by 2 orders of magnitude - which is incompatible with the underlying assumption of a constant-rate Poisson process. The 2014 Map acknowledged this problem, and for its intended purpose of underpinning seismic design used seismicity rates that are consistent with the entire historic record. Both the developers of the Map and its critics acknowledge that the remarkable rise of seismicity in Oklahoma and nearby states must be addressed if we are to fully capture the hazard in both space and time. The nature of the space/time distribution of the increased seismicity, as well as numerous published case studies, strongly suggests that much of the increase is of anthropogenic origin. If so, the assumptions and procedures used to forecast natural earthquake rates from past rates may not be appropriate. Here we discuss the key issues that must be resolved, including: the geographic location of areas with elevated seismicity, either active now or potentially active in the future; local geologic conditions including faults and the state of stress; the spatial smoothing of catalog seismicity; the temporal evolution of the earthquake rate change; earthquake sequence statistics including clustering behavior; the magnitude-frequency distribution of the excess earthquakes, particularly to higher and yet unobserved magnitudes; possible source process differences between natural and induced earthquakes; and the appropriate ground motion prediction equations.

  1. The Cascadia Subduction Zone and related subduction systems: seismic structure, intraslab earthquakes and processes, and earthquake hazards

    USGS Publications Warehouse

    Kirby, Stephen H.; Wang, Kelin; Dunlop, Susan

    2002-01-01

    The following report is the principal product of an international workshop titled “Intraslab Earthquakes in the Cascadia Subduction System: Science and Hazards” and was sponsored by the U.S. Geological Survey, the Geological Survey of Canada and the University of Victoria. This meeting was held at the University of Victoria’s Dunsmuir Lodge, Vancouver Island, British Columbia, Canada on September 18–21, 2000 and brought together 46 participants from the U.S., Canada, Latin America and Japan. This gathering was organized to bring together active research investigators in the science of subduction and intraslab earthquake hazards. Special emphasis was given to “warm-slab” subduction systems, i.e., those systems involving young oceanic lithosphere subducting at moderate to slow rates, such as the Cascadia system in the U.S. and Canada, and the Nankai system in Japan. All the speakers and poster presenters provided abstracts of their presentations that were made available in an abstract volume at the workshop. Most of the authors subsequently provided full articles or extended abstracts for this volume on the topics that they discussed at the workshop. Where updated versions were not provided, the original workshop abstracts have been included. By organizing this workshop and assembling this volume, our aim is to provide a global perspective on the science of warm-slab subduction, to thereby advance our understanding of internal slab processes and to use this understanding to improve appraisals of the hazards associated with large intraslab earthquakes in the Cascadia system. These events have been the most frequent and damaging earthquakes in western Washington State over the last century. As if to underscore this fact, just six months after this workshop was held, the magnitude 6.8 Nisqually earthquake occurred on February 28th, 2001 at a depth of about 55 km in the Juan de Fuca slab beneath the southern Puget Sound region of western Washington. The Governor

  2. St. Louis Area Earthquake Hazards Mapping Project - December 2008-June 2009 Progress Report

    USGS Publications Warehouse

    Williams, R.A.; Bauer, R.A.; Boyd, O.S.; Chung, J.; Cramer, C.H.; Gaunt, D.A.; Hempen, G.L.; Hoffman, D.; McCallister, N.S.; Prewett, J.L.; Rogers, J.D.; Steckel, P.J.; Watkins, C.M.

    2009-01-01

    This report summarizes the mission, the project background, the participants, and the progress of the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) for the period from December 2008 through June 2009. During this period, the SLAEHMP held five conference calls and two face-to-face meetings in St. Louis, participated in several earthquake awareness public meetings, held one outreach field trip for the business and government community, collected and compiled new borehole and digital elevation data from partners, and published a project summary.

  3. St. Louis Area Earthquake Hazards Mapping Project - A PowerPoint Presentation

    USGS Publications Warehouse

    Williams, Robert A.

    2009-01-01

    This Open-File Report contains illustrative materials, in the form of PowerPoint slides, used for an oral presentation given at the Earthquake Insight St. Louis, Mo., field trip held on May 28, 2009. The presentation focused on summarizing the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) justification, goals, achievements, and products, for an audience of business and public officials. The individual PowerPoint slides highlight, in an abbreviated format, the topics addressed; they are discussed below and are explained with additional text as appropriate.

  4. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped to determine whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and to establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  5. Protocol for aquatic hazard assessment of selenium

    SciTech Connect

    Lemly, A.D.

    1995-05-24

    A procedure is described for conducting an aquatic hazard assessment of selenium. Hazard is characterized in terms of the potential for food-chain bioaccumulation and reproductive impairment in fish and aquatic birds, which are the most sensitive biological responses for estimating ecosystem-level impacts of selenium contamination. Five degrees of hazard are possible depending on the expected environmental concentrations of selenium, the exposure of fish and aquatic birds to toxic concentrations, and the resultant potential for reproductive impairment. An example is given to illustrate how the protocol is applied to selenium data from a typical contaminant monitoring program.

  6. Cruise report for A1-98-SC southern California Earthquake Hazards Project

    USGS Publications Warehouse

    Normark, William R.; Bohannon, Robert G.; Sliter, Ray; Dunhill, Gita; Scholl, David W.; Laursen, Jane; Reid, Jane A.; Holton, David

    1999-01-01

    The focus of the Southern California Earthquake Hazards project, within the Western Region Coastal and Marine Geology team (WRCMG), is to identify the landslide and earthquake hazards and related ground-deformation processes that can potentially impact the social and economic well-being of the inhabitants of the Southern California coastal region, the most populated urban corridor along the U.S. Pacific margin. The primary objective is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this overall objective, we are investigating the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (see Fig. 1). In addition, the project will examine the Pliocene-Pleistocene record of how this deformation has shifted in space and time. The results of this study should improve our knowledge of shifting deformation for both the long-term (10^5 to several 10^6 yr) and short-term (<50 ky) time frames and enable us to identify actively deforming structures that may constitute current significant seismic hazards.

  7. Reducing the Risks of Nonstructural Earthquake Damage: A Practical Guide. Earthquake Hazards Reduction Series 1.

    ERIC Educational Resources Information Center

    Reitherman, Robert

    The purpose of this booklet is to provide practical information to owners, operators, and occupants of office and commercial buildings on the vulnerabilities posed by earthquake damage to nonstructural items and the means available to deal with these potential problems. Examples of dangerous nonstructural damages that have occurred in past…

  8. Assessment of landslide hazards resulting from the February 13, 2001, El Salvador earthquake; a report to the government of El Salvador and the U. S. Agency for International Development

    USGS Publications Warehouse

    Baum, Rex L.; Crone, Anthony J.; Escobar, Demetreo; Harp, Edwin L.; Major, Jon J.; Martinez, Mauricio; Pullinger, Carlos; Smith, Mark E.

    2001-01-01

    On February 13, 2001, a magnitude 6.5 earthquake occurred about 40 km east-southeast of the capital city of San Salvador in central El Salvador and triggered thousands of landslides in the area east of Lago de Ilopango. The landslides are concentrated in a 2,500-km² area and are particularly abundant in areas underlain by thick deposits of poorly consolidated, late Pleistocene and Holocene Tierra Blanca rhyolitic tephras that were erupted from Ilopango caldera. Drainages in the tephra deposits are deeply incised, and steep valley walls failed during the strong shaking. Many drainages are clogged with landslide debris that locally buries the adjacent valley floor. The fine grain-size of the tephra facilitates its easy mobilization by rainfall runoff. The potential for remobilizing the landslide debris as debris flows and in floods is significant as this sediment is transported through the drainage systems during the upcoming rainy season. In addition to thousands of shallow failures, two very large landslides occurred that blocked the Rio El Desague and the Rio Jiboa. The Rio El Desague landslide has an estimated volume of 1.5 million m³, and the Rio Jiboa landslide has an estimated volume of 12 million m³. Field studies indicate that catastrophic draining of the Rio El Desague landslide-dammed lake would pose a minimal flooding hazard, whereas catastrophic draining of the Rio Jiboa lake would pose a serious hazard and warrants immediate action. Construction of a spillway across part of the dam could moderate the impact of catastrophic lake draining and the associated flood. Two major slope failures on the northern side of Volcan San Vicente occurred in the upper reaches of Quebrada Del Muerto and the Quebrada El Blanco. The landslide debris in the Quebrada Del Muerto consists dominantly of blocks of well-lithified andesite, whereas the debris in the Quebrada El Blanco consists of poorly consolidated pyroclastic sediment. The large blocks of lithified rock in

  9. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    PubMed

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  10. Earthquake catalogs for the 2017 Central and Eastern U.S. short-term seismic hazard model

    USGS Publications Warehouse

    Mueller, Charles S.

    2017-01-01

    The U. S. Geological Survey (USGS) makes long-term seismic hazard forecasts that are used in building codes. The hazard models usually consider only natural seismicity; non-tectonic (man-made) earthquakes are excluded because they are transitory or too small. In the past decade, however, thousands of earthquakes related to underground fluid injection have occurred in the central and eastern U.S. (CEUS), and some have caused damage. In response, the USGS is now also making short-term forecasts that account for the hazard from these induced earthquakes. Seismicity statistics are analyzed to develop recurrence models, accounting for catalog completeness. In the USGS hazard modeling methodology, earthquakes are counted on a map grid, recurrence models are applied to estimate the rates of future earthquakes in each grid cell, and these rates are combined with maximum-magnitude models and ground-motion models to compute the hazard. The USGS published a forecast for the years 2016 and 2017. Here, we document the development of the seismicity catalogs for the 2017 CEUS short-term hazard model. A uniform earthquake catalog is assembled by combining and winnowing pre-existing source catalogs. The initial, final, and supporting earthquake catalogs are made available here.
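
    A highly simplified sketch of the counting-and-rate step described above is given below. The tiny catalog, completeness magnitude, grid and b-value are invented for illustration; the real procedure additionally applies declustering or induced-event handling, spatial smoothing, maximum-magnitude models and ground-motion models before hazard is computed.

        import numpy as np

        def gridded_rates(lons, lats, mags, m_min, b_value, lon_edges, lat_edges, years):
            """Count catalog earthquakes with M >= m_min in each grid cell, convert the
            counts to annual rates, and return a Gutenberg-Richter extrapolation helper."""
            keep = mags >= m_min
            counts, _, _ = np.histogram2d(lons[keep], lats[keep], bins=[lon_edges, lat_edges])
            annual_rate = counts / years                     # events/yr per cell at M >= m_min

            def rate_at(m):
                """Implied annual rate of M >= m per cell under Gutenberg-Richter scaling."""
                return annual_rate * 10.0 ** (-b_value * (m - m_min))

            return annual_rate, rate_at

        # Invented catalog (longitude, latitude, magnitude) spanning 10 years of observation.
        lons = np.array([-97.5, -97.4, -96.9, -97.1, -96.8])
        lats = np.array([36.1, 36.2, 35.8, 36.0, 35.9])
        mags = np.array([3.1, 4.0, 3.4, 2.8, 3.7])

        rate, rate_at = gridded_rates(lons, lats, mags, m_min=3.0, b_value=1.0,
                                      lon_edges=np.arange(-98.0, -96.4, 0.5),
                                      lat_edges=np.arange(35.5, 36.6, 0.5),
                                      years=10.0)
        print(rate)
        print(rate_at(5.0))      # implied annual rate of M >= 5 in each cell

    In the published forecasts these cell rates are then paired with maximum-magnitude and ground-motion models to turn earthquake rates into probabilities of exceeding given shaking levels.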

  11. Probabilistic seismic hazard assessment of Italy using kernel estimation methods

    NASA Astrophysics Data System (ADS)

    Zuccolo, Elisa; Corigliano, Mirko; Lai, Carlo G.

    2013-07-01

    A representation of seismic hazard is proposed for Italy based on the zone-free approach developed by Woo (BSSA 86(2):353-362, 1996a), which relies on a kernel estimation method governed by concepts of fractal geometry and self-organized seismicity and does not require the definition of seismogenic zones. The purpose is to assess the influence of seismogenic zoning on the results obtained for the probabilistic seismic hazard analysis (PSHA) of Italy using the standard Cornell method. The hazard has been estimated for outcropping rock site conditions in terms of maps and uniform hazard spectra for a selected site, with 10 % probability of exceedance in 50 years. Both spectral acceleration and spectral displacement have been considered as ground motion parameters. Differences in the results of PSHA between the two methods are compared and discussed. The analysis shows that, in areas such as Italy, characterized by a reliable earthquake catalog and in which faults are generally not easily identifiable, a zone-free approach can be considered a valuable tool to address epistemic uncertainty within a logic tree framework.
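
    A minimal sketch of the zone-free rate estimate, following the general form of Woo's kernel approach: each past epicentre spreads its contribution over a magnitude-dependent bandwidth, and the activity-rate density at a site is the sum of kernel values divided by each event's effective observation period. The kernel shape, bandwidth law and constants below are illustrative assumptions, not the values calibrated for Italy in this study.

        import numpy as np

        def bandwidth(mag, c=1.0, d=0.8):
            """Magnitude-dependent smoothing bandwidth H(M) = c*exp(d*M), in km
            (c and d are placeholders; in practice they are fitted to the catalogue)."""
            return c * np.exp(d * mag)

        def kernel(r_km, h_km, n=1.5):
            """Isotropic power-law kernel, normalised to integrate to 1 over the plane."""
            return (n - 1.0) / (np.pi * h_km ** 2) * (1.0 + (r_km / h_km) ** 2) ** (-n)

        def rate_density(site_km, epicentres_km, mags, obs_years):
            """Zone-free activity-rate density (events/yr/km^2) at `site_km`."""
            r = np.hypot(epicentres_km[:, 0] - site_km[0], epicentres_km[:, 1] - site_km[1])
            return np.sum(kernel(r, bandwidth(mags)) / obs_years)

        # Invented epicentres (km coordinates), magnitudes and effective observation periods.
        epicentres = np.array([[10.0, 5.0], [40.0, -20.0], [-15.0, 30.0]])
        mags = np.array([4.5, 5.2, 6.0])
        obs_years = np.array([50.0, 80.0, 200.0])     # larger events are complete over longer periods

        print(rate_density(np.array([0.0, 0.0]), epicentres, mags, obs_years))

    Because the smoothing is anchored to the epicentres themselves, no seismogenic zone boundaries are needed, which is the property being compared against the standard zoned Cornell analysis in this study.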

  12. The Magnitude Frequency Distribution of Induced Earthquakes and Its Implications for Crustal Heterogeneity and Hazard

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.

    2015-12-01

    . Alternatively, the MFD of induced earthquakes may be controlled by small scale stress concentrations in a spatially variable stress field. Resolving the underlying causes of the MFD for induced earthquakes may provide key insights into the hazard posed by induced earthquakes.

  13. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that the highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
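
    One way to picture a combination rule of the kind described above is to min-max normalize each method's hazard scores per map unit and then average them. The equal weighting of the three methods below is an assumption for illustration and may differ from the rule actually used in the GAR2013 exercise.

      # Normalize each method's scores to 0-1 per map unit, then average them.
      def normalize(scores):
          lo, hi = min(scores.values()), max(scores.values())
          span = hi - lo or 1.0
          return {k: (v - lo) / span for k, v in scores.items()}

      def combine(method_maps):
          """method_maps: list of {map_unit: hazard_score} dicts, one per method."""
          normed = [normalize(m) for m in method_maps]
          units = set().union(*(m.keys() for m in normed))
          return {u: sum(m.get(u, 0.0) for m in normed) / len(normed) for u in units}

      heuristic = {"A": 3, "B": 7, "C": 5}               # e.g. Mora & Vahrson classes
      landslide_index = {"A": 0.2, "B": 0.9, "C": 0.4}
      weights_of_evidence = {"A": -0.5, "B": 1.8, "C": 0.3}
      print(combine([heuristic, landslide_index, weights_of_evidence]))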

  14. Recent Achievements of the Neo-Deterministic Seismic Hazard Assessment in the CEI Region

    SciTech Connect

    Panza, G. F.; Kouteva, M.; Vaccari, F.; Peresan, A.; Romanelli, F.; Cioflan, C. O.; Radulian, M.; Marmureanu, G.; Paskaleva, I.; Gribovszki, K.; Varga, P.; Herak, M.; Zaichenco, A.; Zivcic, M.

    2008-07-08

    A review of the recent achievements of the innovative neo-deterministic approach for seismic hazard assessment through realistic earthquake scenarios has been performed. The procedure provides strong ground motion parameters for the purpose of earthquake engineering, based on the deterministic seismic wave propagation modelling at different scales--regional, national and metropolitan. The main advantage of this neo-deterministic procedure is the simultaneous treatment of the contribution of the earthquake source and seismic wave propagation media to the strong motion at the target site/region, as required by basic physical principles. The neo-deterministic seismic microzonation procedure has been successfully applied to numerous metropolitan areas all over the world in the framework of several international projects. In this study, some examples focused on the CEI region, concerning both regional seismic hazard assessment and seismic microzonation of the selected metropolitan areas, are shown.

  15. Studies of crustal structure, seismic precursors to volcanic eruptions and earthquake hazard in the eastern provinces of the Democratic Republic of Congo

    NASA Astrophysics Data System (ADS)

    Mavonga, T.; Zana, N.; Durrheim, R. J.

    2010-11-01

    In recent decades, civil wars in the eastern provinces of the Democratic Republic of Congo have caused massive social disruptions, which have been exacerbated by volcanic and earthquake disasters. Seismic data were gathered and analysed as part of an effort to monitor the volcanoes and quantitatively assess the earthquake hazard. This information can be used to regulate the settlement of displaced people and to "build back better". In order to investigate volcanic processes in the Virunga area, a local seismic velocity model was derived and used to relocate earthquake hypocenters. It was found that swarm-type seismicity, composed mainly of long-period earthquakes, preceded both the 2004 and 2006 eruptions of Nyamuragira. A steady increase in seismicity was observed to commence ten or eleven months prior to the eruption, which is attributed to the movement of magma in a deep conduit. In the last stage (1 or 2 months) before the eruption, the hypocenters of long-period earthquakes became shallower. Seismic hazard maps were prepared for the DRC using a 90-year catalogue compiled for homogeneous Mw magnitudes, various published attenuation relations, and the EZ-Frisk software package. The highest levels of seismic hazard were found in the Lake Tanganyika Rift seismic zone, where peak ground accelerations (PGA) in excess of 0.32 g, 0.22 g and 0.16 g are expected to occur with 2%, 5% and 10% chance of exceedance in 50 years, respectively.

  16. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  17. Seismic Hazard Assessment in the Aspropirgos Area, Athens - Greece

    NASA Astrophysics Data System (ADS)

    Voulgaris, N.; Drakatos, G.; Lekkas, E.; Karastathis, V.; Valadaki, A.; Plessas, S.

    2005-12-01

    The extensive damage and loss of human life related to the September 7, 1999 earthquake in the Athens area (Greece) initiated an effort to re-evaluate seismic hazard in various regions around the capital. One of the target areas selected within the framework of the specially designed research project ESTIA was the industrial area of Aspropirgos, where the epicenter of the main shock was located. The multidisciplinary approach towards seismic hazard assessment included a microseismicity survey and detailed geological and tectonic studies in order to delineate and define the recently activated seismic sources in the area. Initially, a portable network consisting of seventeen (17) digital seismographs was installed and operated for 2 months during the autumn of 2004. A total of five hundred forty five (545) earthquakes (M<3) have been recorded. The results of the geological survey in the region were summarised in two maps compiled at scales of 1:5,000 and 1:25,000, respectively. These data sets were combined with all the available historical and instrumental seismological data, and a revised seismic source zone model was defined for the broader area and subsequently used for hazard assessment calculations. The results were presented as maximum expected peak ground acceleration and velocity distribution maps for 475- and 949-year return periods, corresponding to a 90% probability of non-exceedance over the next 50 and 100 years, respectively. Finally, in order to facilitate the implementation of the above results according to the current Greek Aseismic Code, the required distribution for the 3 different soil types was mapped using the results of the geological survey. By combining the above types of data, the engineer is able to calculate specific design spectra for every site, while combination with available vulnerability estimates could lead to more realistic seismic risk calculations. Acknowledgments We would like to thank the General Secretariat for Research and Technology of Greece for
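
    The 475- and 949-year return periods quoted above follow from the standard relation between return period and probability of exceedance over an exposure time; a quick arithmetic check, assuming the usual Poisson occurrence model:

      # Return period T from probability of exceedance P in exposure time t,
      # assuming a Poisson occurrence model: T = -t / ln(1 - P).
      import math

      def return_period(p_exceedance, exposure_years):
          return -exposure_years / math.log(1.0 - p_exceedance)

      print(round(return_period(0.10, 50)))    # ~475 yr: 10% in 50 yr = 90% non-exceedance
      print(round(return_period(0.10, 100)))   # ~949 yr: 10% in 100 yr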

  18. Protection of the human race against natural hazards (asteroids, comets, volcanoes, earthquakes)

    NASA Astrophysics Data System (ADS)

    Smith, Joseph V.

    1985-10-01

    Although we justifiably worry about the danger of nuclear war to civilization, and perhaps even to survival of the human race, we tend to consider natural hazards (e.g., comets, asteroids, volcanoes, earthquakes) as unavoidable acts of God. In any human lifetime, a truly catastrophic natural event is very unlikely, but ultimately one will occur. For the first time in human history we have sufficient technical skills to begin protection of Earth from some natural hazards. We could decide collectively throughout the world to reassign resources: in particular, reduction of nuclear and conventional weapons to a less dangerous level would allow concomitant increase of international programs for detection and prevention of natural hazards. Worldwide cooperation to mitigate natural hazards might help psychologically to lead us away from the divisive bickering that triggers wars. Future generations could hail us as pioneers of peace and safety rather than curse us as agents of death and destruction.

  19. Earthquake resistant construction of gas and liquid fuel pipeline systems serving, or regulated by, the Federal government. Earthquake hazard reduction series No. 67

    SciTech Connect

    Yokel, F.Y.; Mathey, R.G.

    1992-07-01

    The vulnerability of gas and liquid fuel pipeline systems to damage in past earthquakes, as well as available standards and technologies that can protect these facilities against earthquake damage, are reviewed. An overview is presented of measures taken by various Federal Agencies to protect pipeline systems under their jurisdiction against earthquake hazards. It is concluded that the overall performance of pipeline systems in past earthquakes was relatively good; however, older pipelines and above-ground storage tanks were damaged in many earthquakes. Standards and regulations for liquid fuel pipelines contain only general references to seismic loads. Standards and regulations for above-ground fuel storage tanks and for liquefied natural gas facilities contain explicit seismic design provisions. It is recommended that a guideline for earthquake resistant design of gas and liquid fuel pipeline systems be prepared for Federal Agencies to ensure a uniform approach to the protection of these systems.

  20. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increasing population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed and non-engineered dwellings that are highly susceptible to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
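
    A schematic sketch of the three-step loss chain in points (1)-(3) above: shaking hazard, population exposure by building type, and casualties from collapse. All building types, collapse probabilities, and fatality rates below are invented placeholders, not values used by the PAGER system.

      # Shaking level -> exposure by building type -> collapse -> fatalities.
      # All numbers below are placeholders for illustration only.
      def expected_fatalities(exposed_population, building_mix, p_collapse, fatality_rate):
          """
          exposed_population: people exposed to a given shaking level
          building_mix:       {building_type: fraction of exposed population}
          p_collapse:         {building_type: collapse probability at that shaking level}
          fatality_rate:      fraction of occupants killed when a building collapses
          """
          total = 0.0
          for btype, share in building_mix.items():
              occupants = exposed_population * share
              total += occupants * p_collapse.get(btype, 0.0) * fatality_rate
          return total

      mix = {"adobe": 0.3, "unreinforced_masonry": 0.5, "engineered_rc": 0.2}
      collapse = {"adobe": 0.15, "unreinforced_masonry": 0.08, "engineered_rc": 0.01}
      print(expected_fatalities(100_000, mix, collapse, fatality_rate=0.1))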

  1. Active tectonics of the Seattle fault and central Puget sound, Washington - Implications for earthquake hazards

    USGS Publications Warehouse

    Johnson, S.Y.; Dadisman, S.V.; Childs, J. R.; Stanley, W.D.

    1999-01-01

    We use an extensive network of marine high-resolution and conventional industry seismic-reflection data to constrain the location, shallow structure, and displacement rates of the Seattle fault zone and crosscutting high-angle faults in the Puget Lowland of western Washington. Analysis of seismic profiles extending 50 km across the Puget Lowland from Lake Washington to Hood Canal indicates that the west-trending Seattle fault comprises a broad (4-6 km) zone of three or more south-dipping reverse faults. Quaternary sediment has been folded and faulted along all faults in the zone, but deformation is clearly most pronounced along fault A, the northernmost fault, which forms the boundary between the Seattle uplift and Seattle basin. Analysis of growth strata deposited across fault A indicates minimum Quaternary slip rates of about 0.6 mm/yr. Slip rates across the entire zone are estimated to be 0.7-1.1 mm/yr. The Seattle fault is cut into two main segments by an active, north-trending, high-angle, strike-slip fault zone with cumulative dextral displacement of about 2.4 km. Faults in this zone truncate and warp reflections in Tertiary and Quaternary strata and locally coincide with bathymetric lineaments. Cumulative slip rates on these faults may exceed 0.2 mm/yr. Assuming no other crosscutting faults, this north-trending fault zone divides the Seattle fault into 30-40-km-long western and eastern segments. Although this geometry could limit the area ruptured in some Seattle fault earthquakes, a large event ca. A.D. 900 appears to have involved both segments. Regional seismic-hazard assessments must (1) incorporate new information on fault length, geometry, and displacement rates on the Seattle fault, and (2) consider the hazard presented by the previously unrecognized, north-trending fault zone.

  2. Citizen Monitoring during Hazards: The Case of Fukushima Radiation after the 2011 Japanese Earthquake

    NASA Astrophysics Data System (ADS)

    Hultquist, C.; Cervone, G.

    2015-12-01

    Citizen-led movements producing scientific environmental information are increasingly common during hazards. After the Japanese earthquake-triggered tsunami in 2011, the government produced airborne remote sensing data of the radiation levels after the Fukushima nuclear reactor failures. Advances in technology enabled citizens to monitor radiation by innovative mobile devices built from components bought on the Internet. The citizen-led Safecast project measured on-ground levels of radiation in the Fukushima prefecture, with measurements totaling 14 million entries to date in Japan. This non-authoritative citizen science collection of radiation levels recorded at specific coordinates and times is available online, yet the reliability and validity of the data had not been assessed. The nuclear incident provided a case for assessment with comparable dimensions of citizen science and authoritative data. To perform a comparison of the datasets, standardization was required. The sensors were calibrated scientifically, but the data were collected in different units of measure. Radiation decays over time, so temporal interpolation was necessary to compare measurements within the same time frame. Finally, GPS-located points were selected within an overlapping spatial extent of 500 meters. This study spatially analyzes and statistically compares citizen-volunteered and government-generated radiation data. Quantitative measures are used to assess the similarity and difference in the datasets. Radiation measurements from the same geographic extents show similar spatial variations, which suggests that citizen science data can be comparable with government-generated measurements. Validation of Safecast demonstrates that scientific information can be inferred from unstructured, non-authoritative data. Citizen science can provide real-time data for situational awareness, which is crucial for decision making during disasters. This project provides a methodology for comparing datasets of radiological measurements
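
    The two preprocessing steps described above can be pictured with the sketch below: (1) decay-correct each reading to a common reference date, here with a single assumed half-life, and (2) pair citizen and government points that lie within 500 m of each other. The single-isotope correction and the simple pairing rule are illustrative assumptions, not the study's exact procedure.

      # (1) single-isotope decay correction, (2) pairing of nearby points.
      import math

      def decay_correct(dose_rate, days_after_ref, half_life_days=754.0):  # ~Cs-134 half-life
          """Scale a reading taken `days_after_ref` days after the reference date back to that date."""
          return dose_rate * 2.0 ** (days_after_ref / half_life_days)

      def pair_within(citizen_pts, gov_pts, max_m=500.0):
          """Points are (lat, lon, value); pairs each citizen point with gov points <= max_m away."""
          pairs = []
          for clat, clon, cval in citizen_pts:
              for glat, glon, gval in gov_pts:
                  dy = (glat - clat) * 111_000.0                       # metres per degree latitude
                  dx = (glon - clon) * 111_000.0 * math.cos(math.radians(clat))
                  if math.hypot(dx, dy) <= max_m:
                      pairs.append((cval, gval))
          return pairs

      corrected = decay_correct(0.42, days_after_ref=90)
      print(corrected, pair_within([(37.42, 141.03, corrected)], [(37.421, 141.031, 0.45)]))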

  3. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2009-04-01

    This special issue includes selected papers on the topic of earthquake impact from the sessions held in 2004 in Nice, France and in 2005 in Vienna, Austria at the first and second European Geosciences Union General Assemblies, respectively. Since its start in 1999 in The Hague, Netherlands, the hazard of earthquakes has been the most popular topic of the session. The call in 2004 was for: Nature's forces including earthquakes, floods, landslides, high winds and volcanic eruptions can inflict losses on urban settlements and man-made structures such as infrastructure. In Europe, recent years have seen such significant losses from earthquakes in south and south-eastern Europe, floods in central Europe, and wind storms in western Europe. Meanwhile, significant progress has been made in understanding disasters. Several scientific fields contribute to a holistic approach in the evaluation of capacities, vulnerabilities and hazards, the main factors in mitigating urban disasters due to natural hazards. An important part of the session is devoted to assessment of earthquake shaking and loss scenarios, including both physical damage and human casualties. Early warning and rapid damage evaluation are of utmost importance for addressing the safety of many essential facilities, for emergency management of events and for disaster response. When an earthquake occurs, strong-motion networks, data processing and interpretation lead to preliminary estimates (scenarios) of the geographical distribution of damage. Factual information on inflicted damage, such as that obtained from shaking maps or aerial imagery, permits comparison with simulated damage maps in order to define a more accurate picture of the overall losses. Most recent developments towards quantitative and qualitative simulation of natural hazard impacts on urban areas, which provide decision-making support for urban disaster management, and success stories of and lessons learned from disaster

  4. Earthquake shaking hazard estimates and exposure changes in the conterminous United States

    USGS Publications Warehouse

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.

    2015-01-01

    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify population and infrastructure exposure within the conterminous U.S. that are subjected to varying levels of earthquake ground motions by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008 and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g level of shaking at relatively frequent intervals (annual rate of 1 in 72 years or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years or 2% PE in 50 years). We also show that there is a significant number of critical infrastructure facilities located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with moderately frequent recurrence interval).

  5. Advanced Materials Laboratory hazards assessment document

    SciTech Connect

    Barnett, B.; Banda, Z.

    1995-10-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the AML. The entire inventory was screened according to the potential airborne impact to onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event will produce consequences exceeding the Early Severe Health Effects threshold is 23 meters. The highest emergency classification is a General Emergency. The Emergency Planning Zone is a nominal area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets.

  6. Techniques for assessing industrial hazards: a manual

    SciTech Connect

    Not Available

    1988-01-01

    This manual provides guidelines for the identification of the potential hazards of new or existing plants or processes in the chemical and energy industries, and for the assessment of the consequences of the release of toxic, flammable, or explosive materials to the atmosphere. It presents a structured, simplified approach for identifying the most-serious potential hazards and for calculating their effect distances or damage ranges. It is the intention that by presenting a simplified approach, the manual should be amenable to use by engineers and scientists with little or no experience of hazard analysis. Further analysis with a view to mitigation of the hazards identified may be appropriate in many cases; at this stage, it may be necessary to seek the advice of a specialist. The basic procedure in a hazard analysis is: identify potential failures, calculate release quantities for each failure, and calculate the impact of each release on people and property. For large plants this can become highly complex, and therefore a simplified method is presented, in which the analysis has been divided into 14 steps. A spreadsheet technique was devised to permit the analyses to be carried out on a programmable calculator or personal computer. After the introductory material, the manual outlines the 14 steps that make up the hazard analysis.
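
    The basic three-part procedure quoted above can be sketched as follows; the cube-root consequence model and its coefficient are placeholders for illustration only, since the manual's 14-step method uses its own correlations.

      # Enumerate failure cases, assign a release quantity, estimate an effect distance.
      def effect_distance(release_kg, k=8.0):
          return k * release_kg ** (1.0 / 3.0)          # placeholder consequence model, metres

      failure_cases = [
          {"case": "pump seal leak", "release_kg": 50.0},
          {"case": "hose rupture",   "release_kg": 400.0},
          {"case": "vessel failure", "release_kg": 12_000.0},
      ]

      for fc in failure_cases:
          d = effect_distance(fc["release_kg"])
          print(f'{fc["case"]:>16}: release {fc["release_kg"]:>8.0f} kg, effect distance ~{d:5.0f} m')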

  7. Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.

    2012-12-01

    An integrated risk assessment includes the analysis of all components of individual constituents of risk such as baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Thus, risk is the estimated impact that a hazard event would have on people, services, facilities, structures and assets in a community. In this work we assess the risk to natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological location, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analysis of seismic microzonation, landslide and flood susceptibility as well as volcanic impact using standard methodologies. Social vulnerability was quantified from data obtained from interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was chosen to yield a statistically significant sample. The families that were interviewed were selected using the simple random sampling technique with replacement. With these
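
    A minimal sketch of the risk framing described above: a relative risk index per neighbourhood as the product of normalized hazard, exposure, and vulnerability scores. The 0-1 scores and the equal treatment of the three factors are assumptions for illustration.

      # Relative risk index as the product of hazard, exposure and vulnerability.
      def risk_index(hazard, exposure, vulnerability):
          """All inputs expected on a 0-1 scale; returns a relative 0-1 risk index."""
          return hazard * exposure * vulnerability

      neighbourhoods = {
          "river terrace": (0.8, 0.6, 0.7),   # (hazard, exposure, vulnerability)
          "hillslope":     (0.9, 0.3, 0.8),
          "town centre":   (0.5, 0.9, 0.4),
      }
      for name, (h, e, v) in sorted(neighbourhoods.items(),
                                    key=lambda kv: -risk_index(*kv[1])):
          print(f"{name:>15}: risk index {risk_index(h, e, v):.2f}")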

  8. The hazard in using probabilistic seismic hazard analysis

    SciTech Connect

    Krinitzsky, E.L. (Geotechnical Lab.)

    1993-11-01

    Earthquake experts rely on probabilistic seismic hazard analysis for everything from emergency-response planning to development of building codes. Unfortunately, says the author, the analysis is defective for the large earthquakes that pose the greater risks. Structures have short lifetimes, and the distances over which earthquakes cause damage are relatively small. Exceptions serve to prove the rule. To be useful in engineering, earthquake hazard assessment must focus narrowly in both time and space.

  9. Widespread seismicity excitation following the 2011 M=9.0 Tohoku, Japan, earthquake and its implications for seismic hazard

    NASA Astrophysics Data System (ADS)

    Toda, S.; Stein, R. S.; Lin, J.

    2011-12-01

    The 11 March 2011 Tohoku-chiho Taiheiyo-oki earthquake (Tohoku earthquake) was followed by massive offshore aftershocks, including 6 M≧7 and 94 M≧6 shocks during the following 4.5 months (until July 26). It is also unprecedented that a broad increase in seismicity was observed over inland Japan at distances of up to 425 km from the locus of high seismic slip on the megathrust. Such an increase was not seen for the 2004 M=9.1 Sumatra or 2010 M=8.8 Chile earthquakes, but those regions lacked the seismic networks necessary to detect such small events. Here we explore the possibility that the rate changes are the product of static Coulomb stress transfer to small faults. We use the nodal planes of M≧3.5 earthquakes as proxies for such small active faults, and find that of fifteen regions averaging ˜80 by 80 km in size, 11 show a positive association between calculated stress changes and the observed seismicity rate change, 3 show a negative correlation, and for one the changes are too small to assess. This work demonstrates that seismicity can turn on in the nominal stress shadow of a mainshock as long as small, geometrically diverse active faults exist there, which is likely quite common in areas with a complex geologic background like Tohoku. In Central Japan, however, there are several regions where the usual tectonic stress has been enhanced by the Tohoku earthquake, and moderate and large faults have been brought closer to failure, producing M˜5 to 6 shocks in Nagano, near Mt. Fuji, and in the Tokyo metropolitan area and its offshore region. We confirmed that at least five of the seven large, exotic, or remote aftershocks were brought ≧0.3 bars closer to failure. Validated by such correlations, we evaluate the effects of the Tohoku event on the other subduction zones nearby and major active faults inland. The majority of thrust faults inland of Tohoku were brought farther from failure by the M9 event. However, we found that the large sections of the Japan trench megathrust, the outer
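
    The static Coulomb criterion underlying the analysis above can be written as ΔCFS = Δτ + μ′·Δσ_n, with the normal-stress change taken positive for unclamping. A minimal sketch, with an assumed effective friction of 0.4 and example stress changes:

      # Coulomb failure stress change: shear change plus friction times unclamping.
      # The 0.4 effective friction and the example values are illustrative only.
      def coulomb_stress_change(d_shear_bar, d_normal_bar, mu_eff=0.4):
          return d_shear_bar + mu_eff * d_normal_bar   # bars; positive promotes failure

      # A receiver fault with +0.2 bar shear change and +0.3 bar unclamping:
      print(coulomb_stress_change(0.2, 0.3))           # 0.32 bar, above the often-cited ~0.1-bar triggering level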

  10. Inundation Mapping and Hazard Assessment of Tectonic and Landslide Tsunamis in Southeast Alaska

    NASA Astrophysics Data System (ADS)

    Suleimani, E.; Nicolsky, D.; Koehler, R. D., III

    2014-12-01

    The Alaska Earthquake Center conducts tsunami inundation mapping for coastal communities in Alaska, and is currently focused on the southeastern region and communities of Yakutat, Elfin Cove, Gustavus and Hoonah. This activity provides local emergency officials with tsunami hazard assessment, planning, and mitigation tools. At-risk communities are distributed along several segments of the Alaska coastline, each having a unique seismic history and potential tsunami hazard. Thus, a critical component of our project is accurate identification and characterization of potential tectonic and landslide tsunami sources. The primary tectonic element of Southeast Alaska is the Fairweather - Queen Charlotte fault system, which has ruptured in 5 large strike-slip earthquakes in the past 100 years. The 1958 "Lituya Bay" earthquake triggered a large landslide into Lituya Bay that generated a 540-m-high wave. The M7.7 Haida Gwaii earthquake of October 28, 2012 occurred along the same fault, but was associated with dominantly vertical motion, generating a local tsunami. Communities in Southeast Alaska are also vulnerable to hazards related to locally generated waves, due to proximity of communities to landslide-prone fjords and frequent earthquakes. The primary mechanisms for local tsunami generation are failure of steep rock slopes due to relaxation of internal stresses after deglaciation, and failure of thick unconsolidated sediments accumulated on underwater delta fronts at river mouths. We numerically model potential tsunami waves and inundation extent that may result from future hypothetical far- and near-field earthquakes and landslides. We perform simulations for each source scenario using the Alaska Tsunami Model, which is validated through a set of analytical benchmarks and tested against laboratory and field data. Results of numerical modeling combined with historical observations are compiled on inundation maps and used for site-specific tsunami hazard assessment by

  11. Near-Field ETAS Constraints and Applications to Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Yoder, Mark R.; Rundle, John B.; Glasscoe, Margaret T.

    2015-08-01

    The epidemic type aftershock sequence (ETAS) statistical model of aftershock seismicity combines various earthquake scaling relations to produce synthetic earthquake catalogs, or estimates of aftershock seismicity rates, based on recent earthquake activity. One challenge to ETAS-based hazard assessment is the large number of free parameters involved. In this paper, we introduce an approach to constrain this parameter space from canonical scaling relations, empirical observations, and fundamental physics. We show that ETAS parameters can be estimated as a function of an earthquake's magnitude m based on the finite temporal and spatial extents of the rupture area. This approach facilitates fast ETAS-based estimates of seismicity from large "seed" catalogs, and it is particularly well suited to web-based deployment and otherwise automated implementations. It constitutes a significant improvement over contemporary ETAS by mitigating variability related to instrumentation and subjective catalog selection.
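
    A minimal sketch of the temporal part of an ETAS rate estimate of the kind discussed above: a background rate plus Omori-Utsu aftershock contributions with magnitude-dependent productivity. The parameter values are illustrative, not the constrained values derived in the paper.

      # Temporal ETAS rate: background mu plus Omori-Utsu terms from past events.
      def etas_rate(t, catalog, mu=0.05, K=0.02, alpha=1.0, c=0.01, p=1.1, m_ref=3.5):
          """catalog: list of (t_i, m_i) with times in days; returns events/day at time t."""
          rate = mu
          for t_i, m_i in catalog:
              if t_i < t:
                  productivity = K * 10.0 ** (alpha * (m_i - m_ref))
                  rate += productivity * (t - t_i + c) ** (-p)
          return rate

      seed = [(0.0, 6.8), (2.5, 5.1)]                  # (origin time in days, magnitude)
      print(etas_rate(3.0, seed), etas_rate(30.0, seed))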

  12. Earthquake scenario in West Bengal with emphasis on seismic hazard microzonation of the city of Kolkata, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Maiti, S. K.; Devaraj, N.; Srivastava, N.; Mohapatra, L. D.

    2014-09-01

    Seismic microzonation is a process of estimating site-specific effects due to an earthquake on urban centers for disaster mitigation and management. The state of West Bengal, located in the western foreland of the Assam-Arakan Orogenic Belt, the Himalayan foothills and Surma Valley, has been struck by several devastating earthquakes in the past, indicating the need for a seismotectonic review of the province, especially in light of the probable seismic threat to its capital city of Kolkata, which is a major industrial and commercial hub in the eastern and northeastern region of India. A synoptic probabilistic seismic hazard model of Kolkata is initially generated at engineering bedrock (Vs30 ~ 760 m s^-1) considering 33 polygonal seismogenic sources at two hypocentral depth ranges, 0-25 and 25-70 km; 158 tectonic sources; appropriate seismicity modeling; 14 ground motion prediction equations for three seismotectonic provinces, viz. the east-central Himalaya, the Bengal Basin and Northeast India, selected through suitability testing; and appropriate weighting in a logic tree framework. Site classification of Kolkata, performed following in-depth geophysical and geotechnical investigations, places the city in the D1, D2, D3 and E classes. Probabilistic seismic hazard assessment at a surface-consistent level - i.e., the local seismic hazard related to site amplification, performed by propagating the bedrock ground motion with 10% probability of exceedance in 50 years through a 1-D sediment column using an equivalent linear analysis - predicts a peak ground acceleration (PGA) range from 0.176 to 0.253 g in the city. A deterministic liquefaction scenario in terms of the spatial distribution of the liquefaction potential index corresponding to the surface PGA distribution places 50% of the city in the possible liquefiable zone. A multicriteria seismic hazard microzonation framework is proposed for judicious integration of multiple themes, namely PGA at the surface, liquefaction potential

  13. The 2011 Mineral, Virginia, earthquake and its significance for seismic hazards in eastern North America: overview and synthesis

    USGS Publications Warehouse

    Horton, J. Wright; Chapman, Martin C.; Green, Russell A.

    2015-01-01

    The earthquake and aftershocks occurred in crystalline rocks within Paleozoic thrust sheets of the Chopawamsic terrane. The main shock and majority of aftershocks delineated the newly named Quail fault zone in the subsurface, and shallow aftershocks defined outlying faults. The earthquake induced minor liquefaction sand boils, but notably there was no evidence of a surface fault rupture. Recurrence intervals, and evidence for larger earthquakes in the Quaternary in this area, remain important unknowns. This event, along with similar events during historical time, is a reminder that earthquakes of similar or larger magnitude pose a real hazard in eastern North America.

  14. Quantifying potential earthquake and tsunami hazard in the Lesser Antilles subduction zone of the Caribbean region

    USGS Publications Warehouse

    Hayes, Gavin P.; McNamara, Daniel E.; Seidman, Lily; Roger, Jean

    2013-01-01

    In this study, we quantify the seismic and tsunami hazard in the Lesser Antilles subduction zone, focusing on the plate interface offshore of Guadeloupe. We compare potential strain accumulated via GPS-derived plate motions to strain release due to earthquakes that have occurred over the past 110 yr, and compute the resulting moment deficit. Our results suggest that enough strain is currently stored in the seismogenic zone of the Lesser Antilles subduction arc in the region of Guadeloupe to cause a large and damaging earthquake of magnitude Mw ∼ 8.2 ± 0.4. We model several scenario earthquakes over this magnitude range, using a variety of earthquake magnitudes and rupture areas, and utilizing the USGS ShakeMap and PAGER software packages. Strong ground shaking during the earthquake will likely cause loss of life and damage estimated to be in the range of several tens to several hundreds of fatalities and hundreds of millions to potentially billions of U.S. dollars of damage. In addition, such an event could produce a significant tsunami. Modelled tsunamis resulting from these scenario earthquakes predict meter-scale wave amplitudes even for events at the lower end of our magnitude range (M 7.8), and heights of over 3 m in several locations with our favoured scenario (M 8.0, partially locked interface from 15–45 km depth). In all scenarios, only short lead-times (on the order of tens of minutes) would be possible in the Caribbean before the arrival of damaging waves.
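
    The moment-deficit bookkeeping described above can be sketched as follows: accumulate moment from plate convergence on a locked interface, subtract the moment released by catalogued earthquakes, and express the deficit as an equivalent moment magnitude. The rigidity, interface dimensions, convergence rate, coupling, and released moment below are round illustrative numbers, not the study's values.

      # Accumulated moment = rigidity * area * slip deficit; deficit -> equivalent Mw.
      import math

      def moment_magnitude(m0_newton_metres):          # Hanks-Kanamori scale
          return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

      rigidity = 3.0e10            # Pa (assumed)
      area = 300e3 * 120e3         # m^2: 300 km along strike x 120 km down dip (assumed)
      rate = 0.02                  # m/yr plate convergence (assumed)
      coupling = 0.5               # fraction of convergence stored as slip deficit (assumed)
      years = 110.0
      released = 1.0e20            # N*m released by catalogued earthquakes (placeholder)

      accumulated = rigidity * area * rate * coupling * years
      deficit = accumulated - released
      # With these placeholder numbers the deficit corresponds to roughly Mw 8.
      print(f"deficit {deficit:.2e} N*m -> equivalent Mw {moment_magnitude(deficit):.1f}")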

  15. Application of optical remote sensing in the Wenchuan earthquake assessment

    NASA Astrophysics Data System (ADS)

    Zhang, Bing; Lei, Liping; Zhang, Li; Liu, Liangyun; Zhu, Boqin; Zuo, Zhengli

    2009-06-01

    A mega-earthquake of magnitude 8 on the Richter scale occurred in Wenchuan County, Sichuan Province, China on May 12, 2008. The earthquake inflicted heavy loss of human lives and properties. The Wenchuan earthquake induced geological disasters, house collapse, and road blockage. In this paper, we demonstrate an application of optical remote sensing images acquired from airborne and satellite platforms in assessing the earthquake damages. The high-resolution airborne images were acquired by the Chinese Academy of Sciences (CAS). The pre- and post-earthquake satellite images of QuickBird, IKONOS, Landsat TM, ALOS, and SPOT were collected by the Center for Earth Observation & Digital Earth (CEODE), CAS, and some of the satellite data were provided by the United States, Japan, and the European Space Agency. The pre- and post-earthquake remote sensing images integrated with DEM and GIS data were adopted to monitor and analyze various earthquake disasters, such as road blockage, house collapse, landslides, avalanches, rock debris flows, and barrier lakes. The results showed that airborne optical images provide a convenient tool for quick and timely monitoring and assessment of the distribution and dynamic changes of the disasters over the earthquake-struck regions. In addition, our study showed that the optical remote sensing data integrated with GIS data can be used to assess disaster conditions such as damaged farmlands, soil erosion, etc., which in turn provides useful information for post-disaster reconstruction.

  16. Shallow slip amplification and enhanced tsunami hazard unravelled by dynamic simulations of mega-thrust earthquakes.

    PubMed

    Murphy, S; Scala, A; Herrero, A; Lorito, S; Festa, G; Trasatti, E; Tonini, R; Romano, F; Molinari, I; Nielsen, S

    2016-10-11

    The 2011 Tohoku earthquake produced an unexpected large amount of shallow slip greatly contributing to the ensuing tsunami. How frequent are such events? How can they be efficiently modelled for tsunami hazard? Stochastic slip models, which can be computed rapidly, are used to explore the natural slip variability; however, they generally do not deal specifically with shallow slip features. We study the systematic depth-dependence of slip along a thrust fault with a number of 2D dynamic simulations using stochastic shear stress distributions and a geometry based on the cross section of the Tohoku fault. We obtain a probability density for the slip distribution, which varies both with depth, earthquake size and whether the rupture breaks the surface. We propose a method to modify stochastic slip distributions according to this dynamically-derived probability distribution. This method may be efficiently applied to produce large numbers of heterogeneous slip distributions for probabilistic tsunami hazard analysis. Using numerous M9 earthquake scenarios, we demonstrate that incorporating the dynamically-derived probability distribution does enhance the conditional probability of exceedance of maximum estimated tsunami wave heights along the Japanese coast. This technique for integrating dynamic features in stochastic models can be extended to any subduction zone and faulting style.

  17. Tsunami Hazards along the Eastern Australian Coast from Potential Earthquakes: Results from Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Ding, R. W.; Yuen, D. A.

    2015-08-01

    Australia is surrounded by the Pacific Ocean and the Indian Ocean and, thus, may suffer from tsunamis due to its proximity to the subduction earthquakes around the boundary of the Australian Plate. Potential tsunami risks along the eastern coast, where more and more people currently live, are numerically investigated through a scenario-based method to provide an estimation of the tsunami hazard in this region. We have chosen and calculated the tsunami waves generated at the New Hebrides Trench and the Puysegur Trench, and we further investigated the relevant tsunami hazards along the eastern coast and their sensitivities to various sea floor frictions and earthquake parameters (i.e. the strike, the dip and the slip angles and the earthquake magnitude/rupture length). The results indicate that the Puysegur Trench poses a seismic threat capable of causing wave amplitudes of over 1.5 m along the coast of Tasmania, Victoria, and New South Wales, even reaching over 2.6 m in the regions close to Sydney, Maria Island, and Gabo Island for a certain worst case, while the cities along the coast of Queensland are potentially less vulnerable than those on the southeastern Australian coast.

  18. Shallow slip amplification and enhanced tsunami hazard unravelled by dynamic simulations of mega-thrust earthquakes

    PubMed Central

    Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Festa, G.; Trasatti, E.; Tonini, R.; Romano, F.; Molinari, I.; Nielsen, S.

    2016-01-01

    The 2011 Tohoku earthquake produced an unexpected large amount of shallow slip greatly contributing to the ensuing tsunami. How frequent are such events? How can they be efficiently modelled for tsunami hazard? Stochastic slip models, which can be computed rapidly, are used to explore the natural slip variability; however, they generally do not deal specifically with shallow slip features. We study the systematic depth-dependence of slip along a thrust fault with a number of 2D dynamic simulations using stochastic shear stress distributions and a geometry based on the cross section of the Tohoku fault. We obtain a probability density for the slip distribution, which varies both with depth, earthquake size and whether the rupture breaks the surface. We propose a method to modify stochastic slip distributions according to this dynamically-derived probability distribution. This method may be efficiently applied to produce large numbers of heterogeneous slip distributions for probabilistic tsunami hazard analysis. Using numerous M9 earthquake scenarios, we demonstrate that incorporating the dynamically-derived probability distribution does enhance the conditional probability of exceedance of maximum estimated tsunami wave heights along the Japanese coast. This technique for integrating dynamic features in stochastic models can be extended to any subduction zone and faulting style. PMID:27725733

  19. Shallow slip amplification and enhanced tsunami hazard unravelled by dynamic simulations of mega-thrust earthquakes

    NASA Astrophysics Data System (ADS)

    Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Festa, G.; Trasatti, E.; Tonini, R.; Romano, F.; Molinari, I.; Nielsen, S.

    2016-10-01

    The 2011 Tohoku earthquake produced an unexpected large amount of shallow slip greatly contributing to the ensuing tsunami. How frequent are such events? How can they be efficiently modelled for tsunami hazard? Stochastic slip models, which can be computed rapidly, are used to explore the natural slip variability; however, they generally do not deal specifically with shallow slip features. We study the systematic depth-dependence of slip along a thrust fault with a number of 2D dynamic simulations using stochastic shear stress distributions and a geometry based on the cross section of the Tohoku fault. We obtain a probability density for the slip distribution, which varies both with depth, earthquake size and whether the rupture breaks the surface. We propose a method to modify stochastic slip distributions according to this dynamically-derived probability distribution. This method may be efficiently applied to produce large numbers of heterogeneous slip distributions for probabilistic tsunami hazard analysis. Using numerous M9 earthquake scenarios, we demonstrate that incorporating the dynamically-derived probability distribution does enhance the conditional probability of exceedance of maximum estimated tsunami wave heights along the Japanese coast. This technique for integrating dynamic features in stochastic models can be extended to any subduction zone and faulting style.

  20. Multi-disciplinary Hazard Reduction from Earthquakes and Volcanoes in Indonesia - International Research Cooperation Program

    NASA Astrophysics Data System (ADS)

    Kato, Teruyuki

    2010-05-01

    Indonesian and Japanese researchers started a three-year (2009-2011) multi-disciplinary cooperative research project as a part of "Science and Technology Research Partnership for Sustainable Development" supported by the Japanese government. The ultimate goal of this project is to reduce disaster from earthquakes, tsunamis and volcanoes by enhancing the capability of forecasting hazards, reducing social vulnerability, and carrying out education and outreach activity on research outcomes. We plan to provide a platform for collaboration among researchers in natural science, engineering and social sciences, as well as officials in national and local governments. Research activities are grouped into: (1) geological and geophysical surveys of past earthquakes, monitoring current crustal activity, and simulation of future ground motion or tsunamis, (2) short-term and long-term prediction of volcanic eruptions by monitoring Semeru, Guntur and other volcanoes, and development of their evaluation method, (3) studies to establish social infrastructure based on engineering technologies and hazard maps, (4) social, cultural and religious studies to reduce vulnerability of local communities, and (5) studies on education and outreach on disaster reduction and restoration of community. In addition, to coordinate these research activities and to utilize the research results, (6) application of the research and establishment of a collaboration mechanism between researchers and government officials is planned. In addition to mutual visits and collaborative field studies, it is planned to hold annual joint seminars (in Indonesia in 2009 and 2011, in Japan in 2010) that will be broadcast over the Internet. Meetings with the Joint Coordinating Committee, composed of representatives of relevant Indonesian ministries and institutions as well as project members, will be held annually to oversee the activities. The kick-off workshop was held in Bandung in April 2009 and the research plans from 22 different

  1. Hazard assessment research strategy for ocean disposal

    SciTech Connect

    Gentile, J.H.; Bierman, V.J.; Paul, J.F.; Walker, H.A.; Miller, D.C.

    1989-01-01

    A decision rationale for ocean disposal based on a predictive hazard assessment research strategy is presented. The conceptual framework for hazard assessment is outlined, and its major components are identified and discussed. The strategy involves the synthesis of results from separate exposure and effects components in order to provide a scientific basis for estimating the probability (risk) of harm to the aquatic environment. The exposure assessment component consists of methodologies for determining biological effects as a function of contaminant exposure concentrations. Two case studies illustrate how a hazard assessment strategy synthesizes exposure and effects information to provide a causal linkage between mass inputs of contaminants and biological effects. The first study examines sewage-sludge disposal at Deep-water Dumpsite-106. The second study, which examines the disposal of dredged material in a shallow coastal site in central Long Island Sound, is a field verification program designed to test methodologies required for the acquisition of exposure and effects information. Both the laboratory and field data are synthesized to evaluate the accuracy and confidence of predictions of the individual methods, the tiered hierarchical concept, and the final prediction.

  2. An Integrated Geospatial System for earthquake precursors assessment in Vrancea tectonic active zone in Romania

    NASA Astrophysics Data System (ADS)

    Zoran, Maria A.; Savastru, Roxana S.; Savastru, Dan M.

    2015-10-01

    With the development of space-based technologies to measure surface geophysical parameters and deformation at the boundaries of tectonic plates and large faults, earthquake science has entered a new era. Using time-series satellite data, it is possible to track the behavior of earthquake precursors and to issue early warnings when the difference between the predicted value and the observed value exceeds a pre-defined threshold. Starting about one week prior to a moderate or strong earthquake, a transient thermal infrared rise in land surface temperature (LST) of several degrees Celsius and outgoing long-wave radiation (OLR) values higher than normal have been recorded around epicentral areas, depending on the magnitude and focal depth, and these anomalies disappeared after the main shock. Associated geomagnetic and ionospheric disturbances are also recorded. The Vrancea tectonically active zone in Romania is characterized by a high seismic hazard within the European-Mediterranean region, being responsible for the generation of strong or moderate intermediate-depth and normal earthquakes in a confined epicentral area. Based on recorded anomalies in geophysical parameters, an integrated geospatial system for earthquake precursor assessment was developed for the Vrancea active seismic zone. This system integrates multiple geophysical parameters derived from time-series MODIS Terra/Aqua, NOAA-AVHRR, ASTER, and Landsat TM/ETM satellite data (LST, OLR, and mean air temperature, AT), as well as geomagnetic and ionospheric data, in synergy with in-situ data for surveillance and forecasting of seismic events.
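
    A minimal sketch of the thresholding idea described above: flag an anomaly when an observed parameter (here LST) exceeds its predicted value by more than a pre-defined margin. The two-sigma margin and the toy series are illustrative assumptions only.

      # Flag days where observed minus predicted exceeds n_sigma * sigma.
      def flag_anomalies(observed, predicted, sigma, n_sigma=2.0):
          return [i for i, (o, p) in enumerate(zip(observed, predicted))
                  if o - p > n_sigma * sigma]

      lst_observed  = [14.1, 14.6, 15.0, 18.9, 19.4, 15.2]   # deg C, daily means (toy data)
      lst_predicted = [14.0, 14.4, 14.9, 15.1, 15.3, 15.2]   # climatological reference (toy data)
      print(flag_anomalies(lst_observed, lst_predicted, sigma=1.2))   # -> [3, 4]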

  3. Earthquake recurrence and risk assessment in circum-Pacific seismic gaps

    USGS Publications Warehouse

    Thatcher, W.

    1989-01-01

    The development of the concept of seismic gaps, regions of low earthquake activity where large events are expected, has been one of the notable achievements of seismology and plate tectonics. Its application to long-term earthquake hazard assessment continues to be an active field of seismological research. Here I have surveyed well documented case histories of repeated rupture of the same segment of circum-Pacific plate boundary and characterized their general features. I find that variability in fault slip and spatial extent of great earthquakes rupturing the same plate boundary segment is typical rather than exceptional, but sequences of major events fill identified seismic gaps with remarkable order. Earthquakes are concentrated late in the seismic cycle and occur with increasing size and magnitude. Furthermore, earthquake rupture starts near zones of concentrated moment release, suggesting that high-slip regions control the timing of recurrent events. The absence of major earthquakes early in the seismic cycle indicates a more complex behaviour for lower-slip regions, which may explain the observed cycle-to-cycle diversity of gap-filling sequences. © 1989 Nature Publishing Group.

  4. Earthquake Education in Tajikistan: An assessment of perceptions, preparedness, and a pilot science-based curriculum

    NASA Astrophysics Data System (ADS)

    Mohadjer, S.; Halvorson, S. J.

    2008-12-01

    This study examines a sample of Tajik eighth and ninth graders' perceptions of earthquakes and their hazards with the intent to identify the most effective approaches for conveying earthquake science, hazards, and mitigation techniques to children in Tajikistan. We provide the results of the development of a pilot earthquake education curriculum that was implemented with Tajik students in Dushanbe, Tajikistan in winter of 2008. Prior to implementation of the curriculum, 58% of students used disconnected concepts with scientific or technical terminology when describing earthquakes, none of which accurately explained an earthquake mechanism. A notable portion of students (14%), lacking a scientific explanation for earthquakes, described earthquakes in the context of ancient legends or acts of God. The remaining students gave no explanation for earthquakes. The concept of earthquake preparation was unfamiliar to all students, with most lacking basic knowledge of procedures to follow before, during, or after an earthquake, despite having previously conducted earthquake drills at their schools. The pilot curriculum integrates earthquake science, hazard awareness, and mitigation techniques at the middle school level. Following implementation, almost all students demonstrated a basic understanding of current earthquake science, hazards, and preparedness activities. This is particularly important in Tajikistan where a growing urban earthquake risk poses a significant threat to residents and the country's future economic stability.

  5. Some Factors Controlling the Seismic Hazard due to Earthquakes Induced by Fluid Injection at Depth

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2012-12-01

    The maximum seismic moment (or moment magnitude) is an important measure of the seismic hazard associated with earthquakes induced by deep fluid injection. Although it would be advantageous to be able to predict the induced earthquake outcome, including the maximum seismic moment, of a specified fluid injection project in advance, this capability has, to date, proved to be elusive because the geomechanical and hydrological factors that control the seismic response to injection are too poorly understood. Fortunately, the vast majority of activities involving the injection of fluids into deep aquifers do not cause earthquakes that are large enough to be of any consequence. There have been, however, significant exceptions during the past 50 years, starting with the earthquakes induced by injection of wastewater at the Rocky Mountain Arsenal Well, during the 1960s, that caused extensive damage in the Denver, CO, area. Results from numerous case histories of earthquakes induced by injection activities, including wastewater disposal at depth and the development of enhanced geothermal systems, suggest that it may be feasible to estimate bounds on maximum magnitudes based on the volume of injected liquid. For these cases, volumes of injected liquid ranged from approximately 11.5 thousand to 5 million cubic meters and resulted in main shock moment magnitudes from 3.4 to 5.3. Because the maximum seismic moment appears to be linearly proportional to the total volume of injected fluid, this upper bound is expected to increase with time as long as a given injection well remains active. For example, in the Raton Basin, southern Colorado and northern New Mexico, natural gas is produced from an extensive coal bed methane field. The deep injection of wastewater associated with this gas production has induced a sequence of earthquakes starting in August 2001, shortly after the beginning of major injection activities. Most of this seismicity defines a northeast striking plane dipping
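
    The linear proportionality noted above is often expressed as an upper bound of roughly M0max ≈ G·ΔV, with G the modulus of rigidity and ΔV the net injected volume. A quick check against the quoted volume range, assuming G = 3e10 Pa:

      # Upper-bound moment from injected volume, converted to moment magnitude.
      import math

      def mw_from_moment(m0):                          # Hanks-Kanamori, M0 in N*m
          return (2.0 / 3.0) * (math.log10(m0) - 9.1)

      G = 3.0e10                                       # Pa, assumed crustal rigidity
      for dv in (11.5e3, 5.0e6):                       # m^3, the volume range quoted above
          m0 = G * dv
          print(f"dV = {dv:.3g} m^3 -> M0 ~ {m0:.2e} N*m -> Mw ~ {mw_from_moment(m0):.1f}")

    With these assumptions the bound gives moment magnitudes of roughly 3.6 and 5.4, close to the observed 3.4 to 5.3 range quoted above.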

  6. Seismic hazard assessment of Western Coastal Province of Saudi Arabia: deterministic approach

    NASA Astrophysics Data System (ADS)

    Rehman, Faisal; El-Hady, Sherif M.; Atef, Ali H.; Harbi, Hussein M.

    2016-10-01

    Seismic hazard assessment is carried out using a deterministic approach to evaluate the maximum expected earthquake ground motions along the Western Coastal Province of Saudi Arabia. The analysis incorporates a seismotectonic source model, determination of the maximum earthquake magnitude (Mmax), a set of appropriate ground motion prediction equations (GMPEs), and a logic tree sequence. The logic tree is built to assign weights to the ground motion scaling relationships. Contour maps of ground acceleration are generated at different spectral periods. These maps show that the largest ground motion values emerge in the northern and southern regions of the western coastal province, in comparison with the central region.
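
    As a hedged illustration of how a logic tree assigns weights to ground motion scaling relationships, the sketch below combines the median predictions of three hypothetical GMPEs into a single weighted estimate; the names, weights, and median values are placeholders, not those used in the study.

        # Hypothetical logic-tree branches: (name, branch weight, median PGA in g at some site).
        gmpe_branches = [
            ("GMPE_A", 0.5, 0.22),
            ("GMPE_B", 0.3, 0.18),
            ("GMPE_C", 0.2, 0.27),
        ]

        # Branch weights on a logic tree must sum to one.
        assert abs(sum(w for _, w, _ in gmpe_branches) - 1.0) < 1e-9

        weighted_pga = sum(w * pga for _, w, pga in gmpe_branches)
        print(f"weighted-mean PGA = {weighted_pga:.3f} g")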

  7. Y-12 site-specific earthquake response analysis and soil liquefaction assessment

    SciTech Connect

    Ahmed, S.B.; Hunt, R.J.; Manrod, W.E. III

    1995-09-29

    A site-specific earthquake response analysis and soil liquefaction assessment were performed for the Oak Ridge Y-12 Plant. The main purpose of these studies was to use the results of the analyses to evaluate the safety of the performance category (PC) 1, 2, and 3 facilities against natural phenomena seismic hazards. Earthquake response was determined for seven one-dimensional soil columns (Fig. 12) using two horizontal components of the PC-3 design basis 2000-year seismic event. The computer program SHAKE91 (Ref. 7) was used to calculate the absolute response accelerations on top of the ground (soil/weathered shale) and at rock outcrop. The SHAKE program has been validated for horizontal response calculations at periods of less than 2.0 seconds at several sites and consequently is widely accepted in geotechnical earthquake engineering for site response analysis.

  8. The Contribution of Paleoseismology to Seismic Hazard Assessment in Site Evaluation for Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Guerrieri, Luca; Fukushima, Yoshimitsu

    2015-04-01

    In the framework of site evaluation/re-evaluation procedures for nuclear power plants (NPPs), paleoseismology plays an essential role not only for Fault Displacement Hazard Assessment (FDHA) but also for Seismic Hazard Assessment (SHA). The relevance of paleoseismology is emphasized in the reference IAEA Safety Guide (IAEA SSG-9) and has been dramatically confirmed in recent times, especially after the accident at the Fukushima Daiichi NPP caused by the disastrous great Tohoku earthquake and tsunami of 11 March 2011. After this event, the IAEA International Seismic Safety Center promoted a technical document aimed at encouraging and supporting Member States, especially newcomer countries, to include paleoseismic investigations in the geologic database, highlighting the value of earthquake geology studies and paleoseismology for nuclear safety and providing standard methodologies for performing such investigations. In detail, paleoseismic investigations in the context of site evaluation of nuclear installations have the following main objectives: i) identification of seismogenic structures based on the recognition of effects of past earthquakes in the regional area; ii) improvement of the completeness of earthquake catalogs, through the identification and dating of ancient moderate to large earthquakes whose traces have been preserved in the geologic record; iii) estimation of the maximum seismic potential associated with an identified seismogenic structure/source, typically on the basis of the amount of displacement per event (measurable in paleoseismic trenches), as well as of the geomorphic and stratigraphic features interpretable as the cumulative effect of repeated large seismic events (the concept of "seismic landscape"); iv) rough calibration of probabilistic seismic hazard assessment (PSHA), using the recurrence intervals of large earthquakes detectable by paleoseismic investigations, and providing a "reality check" based on direct observations of

  9. Landslide Hazard Assessment In Mountaneous Area of Uzbekistan

    NASA Astrophysics Data System (ADS)

    Nyazov, R. A.; Nurtaev, B. S.

    Because of population growth and the use of the flat areas for agriculture, mountain areas in Uzbekistan have been intensively developed in recent years, producing an increase in natural and technogenic processes. Landslides are the most dangerous phenomena, and 7240 of them occurred during the last 40 years; more than 50 % took place between 1991 and 2000. The situation is aggravated because these regions are situated in zones where disastrous earthquakes with M > 7 have occurred in the past and are expected in the future. The continuing seismic gap in Uzbekistan over the last 15-20 years and the recent disastrous earthquakes in Afghanistan, Iran, Turkey, Greece, Taiwan and India are causes for concern. On the basis of long-term observations, criteria for landslide hazard assessment (suddenness, displacement interval, straight-line directivity, and type of destruction of residential buildings) are proposed. This methodology was developed at two geographic levels: local (town scale) and regional (region scale). Detailed risk analysis is performed at the local scale and extrapolated to the regional scale. The engineering-geologic parameters used for hazard estimation of landslides and mud flows are likewise divided into regional and local levels. Four degrees of landslide hazard are distinguished for compiling small-, medium- and large-scale maps. The Angren industrial area in the Tien Shan mountains is characterized by an initial seismic intensity of 8-9 (MSK scale). Here, human technological activity (open-cast mining) has initiated the formation of a large landslide that covers more than 8 square kilometers and corresponds to a volume of 800 billion cubic meters. In turn, the landslide can become the source of industrial emergencies. Using the Angren industrial mining region as an example, different scenarios for the safety of residents and the movement of transport, the regulation of technologies, definition of field improvement and exploitation of mountain

  10. Earthquake and Volcanic Hazard Mitigation and Capacity Building in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Ayele, A.

    2012-04-01

    The East African Rift System (EARS) is a classic example of active continental rifting, and a natural laboratory setting to study the initiation and early-stage evolution of continental rifts. The EARS is at different stages of development, varying from a relatively mature rift (16 mm/yr) in the Afar to the weakly extended Okavango Delta in the south, with a predicted opening velocity of < 3 mm/yr. Recent studies in the region have helped researchers highlight the length and timescales of magmatism and faulting, the partitioning of strain between faulting and magmatism, and their implications for the development of along-axis segmentation. Although human resources and instrument coverage are sparse on the continent, our understanding of rift processes and deep structure has improved in the last decade following the advent of space geodesy and broadband seismology. The recent major earthquakes, volcanic eruptions and mega dike intrusions that occurred along the EARS have attracted several teams of Earth scientists from across the globe. However, most African countries traversed by the rift do not have the full capacity to monitor and mitigate earthquake and volcanic hazards. Few monitoring facilities exist in some countries, and the data acquired are rarely available in real time for mitigation purposes. Many sub-Saharan African governments are currently focused on achieving the millennium development goals with massive infrastructure development schemes and urbanization, while impending natural hazards of this nature are severely overlooked. Collaborations with overseas researchers and other joint efforts by the international community are opportunities to be used by African institutions to best utilize limited resources and to mitigate earthquake and volcano hazards.

  11. Contributions to Earthquake Hazard Characterization in Canada from Precision GPS Data

    NASA Astrophysics Data System (ADS)

    Dragert, H.; Hyndman, R. D.; Mazzotti, S.; Wang, K.

    2004-05-01

    In the active seismic regions of Canada, the hazard posed by the recurrence of potentially devastating (M >7) earthquakes is not well defined due to the brevity of the instrumental and historical records, the lack of clear paleoseismic evidence for past large events, and the inexact nature of extrapolating the rate of occurrence of frequent small events to the occurrence of rare large events. This serious shortcoming of probabilistic seismic hazard estimation can be addressed through high-precision GPS measurements which can monitor crustal motions and regional crustal strain associated with the build-up of stress before a large earthquake. In southwestern British Columbia, over a decade of observations of motions of GPS sites of the Western Canada Deformation Array (WCDA) and GPS campaign sites have led to improved models of the locked plate interface on the Cascadia Subduction Zone and better estimates of the landward extent for the next megathrust (M~9) rupture. Regional strain rates based on continuous GPS data from the WCDA and PANGA (Pacific Northwest Geodetic Array) show that the recurrence interval for M7 crustal earthquakes is of the order of 400 years, not several decades as once estimated. Continuous GPS data from these arrays have also led to the discovery of "silent slip" or "slow earthquakes" on the deeper plate interface which do not generate impulsive seismic waves but relieve stress over periods of one to two weeks. For southern Vancouver Island and northwestern Washington State, these slip events appear to occur regularly at ~14 month intervals and have now been found to be associated with distinct, non-earthquake tremors, coining the name "Episodic Tremor and Slip" (ETS) for this newly discovered phenomenon. The repeated relief of small amounts of stress in the ETS zone provides an additional definition of the down-dip limit of megathrust rupture, and the onset of ETS activity could mark times of higher probability for the occurrence of thrust

  12. The 24th January 2016 Hawassa earthquake: Implications for seismic hazard in the Main Ethiopian Rift

    NASA Astrophysics Data System (ADS)

    Wilks, Matthew; Ayele, Atalay; Kendall, J.-Michael; Wookey, James

    2017-01-01

    Earthquakes of low to intermediate magnitudes are a commonly observed feature of continental rifting, particularly in regions of Quaternary to Recent volcanism such as the Main Ethiopian Rift (MER). Although the seismic hazard is estimated to be lower in the Hawassa region of the MER than further north and south, a significant earthquake occurred on the 24th January 2016 in the Hawassa caldera basin, close to the Corbetti volcanic complex. The event was felt up to 100 km away and caused structural damage and public anxiety in the city of Hawassa itself. In this paper we first refine the earthquake's location using data from global network and Ethiopian network stations. The resulting location is at 7.0404°N, 38.3478°E and at 4.55 km depth, which suggests that the event occurred on structures associated with the caldera collapse of the Hawassa caldera in the early Pleistocene and not through volcano-tectonic processes at Corbetti. We calculate local and moment magnitudes at four stations; these magnitude scales are more appropriate at regional hypocentral distances than body-wave magnitude (mb). This is done using a local scale (attenuation term) previously determined for the MER for ML and spectral analysis for MW, giving magnitude estimates of 4.68 and 4.29, respectively. The event indicates predominantly normal slip on a N-S striking fault structure, which suggests that slip continues to occur on Wonji faults that have exploited weaknesses inherited from the preceding caldera collapse. These results and two previous earthquakes of M > 5 in the Hawassa caldera highlight that earthquakes continue to pose a risk to structures within the caldera basin. With this in mind, it is suggested that enhanced monitoring and public outreach should be considered.
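
    The moment magnitude quoted above comes from spectral analysis of the seismic moment; as a generic illustration (not the authors' code), the standard Hanks-Kanamori relation converts a seismic moment in N*m to MW. The 3.2e15 N*m value below is an assumed example chosen to land near MW 4.3.

        import math

        def mw_from_moment(m0_newton_meters):
            """Moment magnitude from seismic moment (Hanks & Kanamori, 1979)."""
            return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

        # An assumed seismic moment of ~3.2e15 N*m corresponds roughly to an Mw 4.3 event.
        print(round(mw_from_moment(3.2e15), 1))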

  13. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in roughly the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at a high seismic risk level, two districts (Gharb and Wasat) at a moderate level, and two districts (Al-Gomrok and Burg El-Arab) at a low seismic risk level. Moreover, the building damage estimates indicate that Al-Montazah is the most vulnerable district, where 73 % of the expected damage is concentrated. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risk. For instance, more than 90 % of the estimated earthquake risk (building damage) is concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  14. Developing a global tsunami propagation database and its application for coastal hazard assessments in China

    NASA Astrophysics Data System (ADS)

    Wang, N.; Tang, L.; Titov, V.; Newman, J. C.; Dong, S.; Wei, Y.

    2013-12-01

    The tragedies of the 2004 Indian Ocean and 2011 Japan tsunamis have increased awareness of tsunami hazards for many nations, including China. The low land level and high population density of China's coastal areas place it at high risk for tsunami hazards. Recent research (Komatsubara and Fujiwara, 2007) highlighted concerns of a magnitude 9.0 earthquake on the Nankai trench, which may affect China's coasts not only in the South China Sea, but also in the East Sea and Yellow Sea. Here we present our work in progress towards developing a global tsunami propagation database that can be used for hazard assessments by many countries. The propagation scenarios are computed using NOAA's MOST numerical model. Each scenario represents a typical Mw 7.5 earthquake with predefined earthquake parameters (Gica et al., 2008). The model grid was interpolated from ETOPO1 at 4 arc-min resolution, covering -80° to 72°N and 0° to 360°E. We use this database for preliminary tsunami hazard assessment along China's coastlines.

  15. Site characterization and hazard assessment criteria for natural phenomena hazards at DOE sites

    SciTech Connect

    Chen, J.C.; Lu, S.C.; Ueng, T.S.; Boissonnade, A.C.

    1993-09-01

    This paper briefly summarizes requirements for site characterization and hazard assessment of Natural Phenomena Hazards for compliance with DOE Order 5480.28. The site characterization criteria for NPH evaluation are provided in a draft DOE-STD-1022-XX and the assessment criteria of natural phenomena hazards are provided in draft DOE-STD-1023-XX.

  16. Waste Encapsulation and Storage Facility (WESF) Hazards Assessment

    SciTech Connect

    COVEY, L.I.

    2000-11-28

    This report documents the hazards assessment for the Waste Encapsulation and Storage Facility (WESF) located on the U.S. Department of Energy (DOE) Hanford Site. This hazards assessment was conducted to provide the emergency planning technical basis for WESF. DOE Orders require an emergency planning hazards assessment for each facility that has the potential to reach or exceed the lowest level emergency classification.

  17. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  18. Elevation uncertainty in coastal inundation hazard assessments

    USGS Publications Warehouse

    Gesch, Dean B.; Cheval, Sorin

    2012-01-01

    Coastal inundation has been identified as an important natural hazard that affects densely populated and built-up areas (Subcommittee on Disaster Reduction, 2008). Inundation, or coastal flooding, can result from various physical processes, including storm surges, tsunamis, intense precipitation events, and extreme high tides. Such events cause quickly rising water levels. When rapidly rising water levels overwhelm flood defenses, especially in heavily populated areas, the potential of the hazard is realized and a natural disaster results. Two noteworthy recent examples of such natural disasters resulting from coastal inundation are the Hurricane Katrina storm surge in 2005 along the Gulf of Mexico coast in the United States, and the tsunami in northern Japan in 2011. Longer term, slowly varying processes such as land subsidence (Committee on Floodplain Mapping Technologies, 2007) and sea-level rise also can result in coastal inundation, although such conditions do not have the rapid water level rise associated with other flooding events. Geospatial data are a critical resource for conducting assessments of the potential impacts of coastal inundation, and geospatial representations of the topography in the form of elevation measurements are a primary source of information for identifying the natural and human components of the landscape that are at risk. Recently, the quantity and quality of elevation data available for the coastal zone have increased markedly, and this availability facilitates more detailed and comprehensive hazard impact assessments.

  19. Kauai Test Facility hazards assessment document

    SciTech Connect

    Swihart, A

    1995-05-01

    Department of Energy (DOE) Order 5500.3A requires that a facility-specific hazards assessment be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the Kauai Test Facility, Barking Sands, Kauai, Hawaii. The Kauai Test Facility's chemical and radiological inventories were screened according to potential airborne impact to onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance to the Early Severe Health Effects threshold is 4.2 kilometers. The highest emergency classification is a General Emergency at the "Main Complex" and a Site Area Emergency at the Kokole Point Launch Site. The Emergency Planning Zone for the "Main Complex" is 5 kilometers. The Emergency Planning Zone for the Kokole Point Launch Site is the Pacific Missile Range Facility's site boundary.

  20. Assessing hazards along our Nation's coasts

    USGS Publications Warehouse

    Hapke, Cheryl J.; Brenner, Owen; Henderson, Rachel E.; Reynolds, B.J.

    2013-01-01

    Coastal areas are essential to the economic, cultural, and environmental health of the Nation, yet by nature coastal areas are constantly changing due to a variety of events and processes. Extreme storms can cause dramatic changes to our shorelines in a matter of hours, while sea-level rise can profoundly alter coastal environments over decades. These changes can have a devastating impact on coastal communities, such as the loss of homes built on retreating sea cliffs or protective dunes eroded by storm waves. Sometimes, however, the changes can be positive, such as new habitat created by storm deposits. The U.S. Geological Survey (USGS) is meeting the need for scientific understanding of how our coasts respond to different hazards with continued assessments of current and future changes along U.S. coastlines. Through the National Assessment of Coastal Change Hazards (NACCH), the USGS carries out the unique task of quantifying coastal change hazards along open-ocean coasts in the United States and its territories. Residents of coastal communities, emergency managers, and other stakeholders can use science-based data, tools, models, and other products to improve planning and enhance resilience.

  1. Debris flows: behavior and hazard assessment

    USGS Publications Warehouse

    Iverson, Richard M.

    2014-01-01

    Debris flows are water-laden masses of soil and fragmented rock that rush down mountainsides, funnel into stream channels, entrain objects in their paths, and form lobate deposits when they spill onto valley floors. Because they have volumetric sediment concentrations that exceed 40 percent, maximum speeds that surpass 10 m/s, and sizes that can range up to ~10⁹ m³, debris flows can denude slopes, bury floodplains, and devastate people and property. Computational models can accurately represent the physics of debris-flow initiation, motion and deposition by simulating evolution of flow mass and momentum while accounting for interactions of debris' solid and fluid constituents. The use of physically based models for hazard forecasting can be limited by imprecise knowledge of initial and boundary conditions and material properties, however. Therefore, empirical methods continue to play an important role in debris-flow hazard assessment.

  2. Remote sensing and landslide hazard assessment

    NASA Technical Reports Server (NTRS)

    Mckean, J.; Buechel, S.; Gaydos, L.

    1991-01-01

    Remotely acquired multispectral data are used to improve landslide hazard assessments at all scales of investigation. A vegetation map produced from automated interpretation of TM data is used in a GIS context to explore the effect of vegetation type on debris flow occurrence in preparation for inclusion in debris flow hazard modeling. Spectral vegetation indices map spatial patterns of grass senescence which are found to be correlated with soil thickness variations on hillslopes. Grassland senescence is delayed over deeper, wetter soils that are likely debris flow source areas. Prediction of actual soil depths using vegetation indices may be possible up to some limiting depth greater than the grass rooting zone. On forested earthflows, the slow slide movement disrupts the overhead timber canopy, exposes understory vegetation and soils, and alters site spectral characteristics. Both spectral and textural measures from broad band multispectral data are successful at detecting an earthflow within an undisturbed old-growth forest.

  3. Final Report: Seismic Hazard Assessment at the PGDP

    SciTech Connect

    Wang, Zhinmeng

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it not only depends on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.

  4. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  5. Earthquakes

    MedlinePlus

    ... Earthquakes are sudden rolling or shaking events caused ... at any time of the year. Before An Earthquake Look around places where you spend time. Identify ...

  6. How detailed should earthquake hazard maps be: comparing the performance of Japan's maps to uniform, randomized, and smoothed maps

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Spencer, Bruce; Liu, Mian

    2016-04-01

    Earthquake hazard maps forecast future shaking via assumptions about where, when, and how large future earthquakes will be. These assumptions involve the known earthquake history, models of fault geometry and motion, and geodetic data. Maps are made more detailed as additional data and more complicated models become available. However, the extent to which this process produces better forecasts of shaking is unknown. We explore this issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted shaking should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. Similarly, by the squared misfit metric, map performance improves up to a ~75-150 km smoothing window, and then decreases with further smoothing. Because the maps were made by using other data and models to try to predict future earthquake shaking, rather than by fitting past shaking data, these results are probably not an artifact of hindcasting rather than forecasting. They suggest that hazard models and the resulting maps can be over-parameterized, in that including too high a level of detail to describe past earthquakes may lower the maps' ability to forecast what will occur in the future. For example in Nepal, where GPS data show no significant variation in coupling between areas that have had recent large earthquakes and those that have not, past earthquakes likely do not show which parts are more at risk, and the entire area can be regarded as equally hazardous.
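
    The two metrics compared in this abstract can be written down in a few lines; the arrays below are invented for illustration and are not the Japanese shaking data. The first metric is the fraction of sites at which observed shaking exceeded the mapped value, and the second is the squared misfit between observed and predicted shaking.

        import numpy as np

        observed  = np.array([0.12, 0.45, 0.30, 0.08, 0.60])  # hypothetical maximum shaking per site
        predicted = np.array([0.20, 0.40, 0.35, 0.10, 0.50])  # hypothetical mapped values

        frac_exceeded = np.mean(observed > predicted)          # fraction of sites exceeding the map
        squared_misfit = np.sum((observed - predicted) ** 2)   # fit between map and observations

        print(frac_exceeded, squared_misfit)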

  7. Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS

    NASA Astrophysics Data System (ADS)

    Sabuncu, A.; Garagon Dogru, A.; Ozener, H.

    2013-12-01

    Natural hazards are natural phenomena occurring in the Earth's system that include geological and meteorological events such as earthquakes, floods, landslides, droughts, fires and tsunamis. Metropolitan cities are vulnerable to natural hazards due to their population densities, industrial facilities and properties. The urban layout of megacities is complex, since industrial facilities are interspersed with residential areas. The Marmara region, located in northwestern Turkey, has suffered from natural hazards (earthquakes, floods, etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as these systems provide more efficient and reliable analysis and evaluation of data, as well as convenient and better solutions for decision making before, during and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers. Additionally, satellite data are used to understand the changes before and after natural hazards. GIS is powerful software for combining different types of digital data. A natural hazard database for the Marmara region provides all these different types of digital data to users. Proper data collection, processing and analysis are critical to evaluate and identify hazards. The natural hazard database allows users to monitor, analyze and query past and recent disasters in the Marmara region. The long-term aim of this study is to develop a geodatabase and identify the natural hazard vulnerabilities of the metropolitan cities.

  8. Assessment of External Hazards at Radioactive Waste and Used Fuel Management Facilities - 13505

    SciTech Connect

    Gerchikov, Mark; Schneider, Glenn; Khan, Badi; Alderson, Elizabeth

    2013-07-01

    One of the key lessons from the Fukushima accident is the importance of having a comprehensive identification and evaluation of the risks posed by external events to nuclear facilities. While the primary focus has been on nuclear power plants, the Canadian nuclear industry has also been updating hazard assessments for radioactive waste and used fuel management facilities to ensure that lessons learnt from Fukushima are addressed. External events are events that originate either physically outside the nuclear site or outside its control. They include natural events, such as high winds, lightning, earthquakes or flooding due to extreme rainfall. The approaches that have been applied to the identification and assessment of external hazards in Canada are presented and analyzed. Specific aspects and considerations concerning hazards posed to radioactive waste and used fuel management operations are identified. Relevant hazard identification techniques are described, which draw upon available regulatory guidance and standard assessment techniques such as Hazard and Operability Studies (HAZOPs) and 'What-if' analysis. Consideration is given to ensuring that hazard combinations (for example: high winds and flooding due to rainfall) are properly taken into account. Approaches that can be used to screen out external hazards, through a combination of frequency and impact assessments, are summarized. For those hazards that cannot be screened out, a brief overview of methods that can be used to conduct more detailed hazard assessments is also provided. The lessons learnt from the Fukushima accident have had a significant impact on specific aspects of the approaches used for hazard assessment of waste management facilities. Practical examples of these effects are provided. (authors)

  9. Kinematics, mechanics, and potential earthquake hazards for faults in Pottawatomie County, Kansas, USA

    USGS Publications Warehouse

    Ohlmacher, G.C.; Berendsen, P.

    2005-01-01

    Many stable continental regions have subregions with poorly defined earthquake hazards. Analysis of minor structures (folds and faults) in these subregions can improve our understanding of the tectonics and earthquake hazards. Detailed structural mapping in Pottawatomie County has revealed a suite consisting of two uplifted blocks aligned along a northeast trend and surrounded by faults. The first uplift is located southwest of the second. The northwest and southeast sides of these uplifts are bounded by northeast-trending right-lateral faults. To the east, both uplifts are bounded by north-trending reverse faults, and the first uplift is bounded by a north-trending high-angle fault to the west. The structural suite occurs above a basement fault that is part of a series of north-northeast-trending faults that delineate the Humboldt Fault Zone of eastern Kansas, an integral part of the Midcontinent Rift System. The favored kinematic model is a contractional stepover (push-up) between echelon strike-slip faults. Mechanical modeling using the boundary element method supports the interpretation of the uplifts as contractional stepovers and indicates that an approximately east-northeast maximum compressive stress trajectory is responsible for the formation of the structural suite. This stress trajectory suggests potential activity during the Laramide Orogeny, which agrees with the age of kimberlite emplacement in adjacent Riley County. The current stress field in Kansas has a N85°W maximum compressive stress trajectory that could potentially produce earthquakes along the basement faults. Several epicenters of seismic events (

  10. The earthquake hazard associated with ground motion amplification in the lower Portneuf River Valley, Idaho

    NASA Astrophysics Data System (ADS)

    Peterson, Brian K.

    This thesis describes an investigation of the effect of the soil deposit in modifying earthquake-induced bedrock motion in the Pocatello South quadrangle, Idaho. The purpose of the investigation was to provide results regarding site period, peak horizontal acceleration, and amplification of peak horizontal bedrock acceleration at the surface of the soil deposit for use in local earthquake hazard planning and mitigation. An essentially deterministic approach to earthquake hazard analysis was taken. First, bedrock motion in the study area was characterized by identifying seismic sources in terms of magnitude and epicentral distance, which provided parameters for specifying simulated accelerograms. Second, the soil deposit response in the study area was characterized by determining the material properties and stratigraphy of the soil from geotechnical and geological data. Third, the bedrock motion and soil deposit input data were subjected to an equivalent linear elastic method of analysis using the SHAKE91 computer program. Finally, the resulting site period and acceleration data were used to construct earthquake hazard maps depicting site period and amplification of bedrock acceleration in the study area. The results of the analysis indicated (1) a mean site period of about 1.5 s, (2) the predominant period of short-period, high-amplitude bedrock acceleration was increased by a factor of about six, while the peak horizontal acceleration at the surface was decreased by a factor of about 0.90, and (3) the predominant period of long-period, low-amplitude bedrock acceleration was increased by a factor of about three, while the peak horizontal acceleration at the surface was increased by a factor of about two. It may be concluded from the results of the analysis that (1) amplification of bedrock acceleration in the study area may be expected at periods of input motion exceeding about 0.5 second and at peak bedrock accelerations of less than about 0.4g, and (2
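
    For context, a common first-order check on a reported site period (distinct from the SHAKE91 analysis used in the thesis) is the quarter-wavelength relation T = 4H/Vs for a uniform soil layer over bedrock; with the assumed, purely illustrative values below it gives a period of the same order as the ~1.5 s mean site period reported.

        def site_period(thickness_m, vs_m_per_s):
            """Fundamental period of a uniform soil layer over rigid bedrock: T = 4H / Vs."""
            return 4.0 * thickness_m / vs_m_per_s

        # Assumed (illustrative) values: 150 m of sediment with an average Vs of 400 m/s.
        print(f"T ~ {site_period(150.0, 400.0):.1f} s")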

  11. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. The seismic hazard at Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which provides two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, as they provide detailed information about the location, value and vulnerability classification of the exposed elements. The results of this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
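
    The return periods quoted above follow from the usual Poisson assumption relating an exceedance probability over an exposure time to a mean return period, T = -t / ln(1 - p); the sketch below simply checks the 475- and 2475-year values.

        import math

        def return_period(prob, years):
            """Return period for a given exceedance probability over a time window (Poisson model)."""
            return -years / math.log(1.0 - prob)

        print(round(return_period(0.10, 50)))  # ~475 years
        print(round(return_period(0.02, 50)))  # ~2475 years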

  12. Seismic hazard along a crude oil pipeline in the event of an 1811-1812 type New Madrid earthquake. Technical report

    SciTech Connect

    Hwang, H.H.M.; Chen, C.H.S.

    1990-04-16

    This report examines the seismic hazard that exists along the major crude oil pipeline running through the New Madrid seismic zone from southeastern Louisiana to Patoka, Illinois. An 1811-1812 type New Madrid earthquake with moment magnitude 8.2 is assumed to occur at three locations where large historical earthquakes have occurred. Six pipeline crossings of the major rivers in West Tennessee are chosen as the sites for hazard evaluation because of the liquefaction potential at these sites. A seismologically based model is used to predict the bedrock accelerations. Uncertainties in three model parameters, i.e., the stress parameter, cutoff frequency, and strong-motion duration, are included in the analysis. Each parameter is represented by three typical values. From the combination of these typical values, a total of 27 earthquake time histories can be generated for each selected site due to an 1811-1812 type New Madrid earthquake occurring at a postulated seismic source.
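
    The 27 time histories per site follow from taking every combination of three typical values for each of the three uncertain parameters (3 x 3 x 3 = 27); the sketch below enumerates such a set with placeholder values that are not taken from the report.

        from itertools import product

        stress_parameters = (100, 150, 200)   # bars; placeholder values
        cutoff_frequencies = (10, 30, 50)     # Hz; placeholder values
        durations = (20, 40, 60)              # s; placeholder values

        combinations = list(product(stress_parameters, cutoff_frequencies, durations))
        print(len(combinations))  # 27 parameter sets, one simulated time history each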

  13. Assessment of Landslide Hazards using Geophysical Tomography

    NASA Astrophysics Data System (ADS)

    Kostyanev, S.; Iliev, I.; Stefanov, P.; Stoeva, P.

    2003-04-01

    Landslides and unstable slopes are among the major natural and man-made hazards affecting mankind, and yet their causes, their consequences for human life and property, and possible strategies for mitigating their effects are not very well understood. We note that in Bulgaria alone there are over a thousand active landslides in populated and health-resort areas. The material and social losses have not yet been calculated, but preliminary data indicate that they are enormous. Landslides and unstable slopes are also numerous and dangerous in opencast coal mines. In this paper we offer methods for the combined application of high-resolution electrical resistivity tomography and seismic ray tomography for the characterization of landslide hazards and unstable slopes. The major aim here is to predict where and when landsliding will occur, establishing its variability in space and time and appraising its impact on the natural and socio-economic environment. The above methods are applied to the study of a landslide on the Bulgarian Black Sea coast and to some unstable slopes in an opencast coal mine in the Maritza-Iztok area. This combined application of electrical and seismic tomography for assessment of landslide hazard is very useful.

  14. Methods for probabilistic assessments of geologic hazards

    SciTech Connect

    Mann, C.J.

    1987-01-01

    Although risk analysis today is considered to include three separate aspects: (1) identifying sources of risk, (2) estimating probabilities quantitatively, and (3) evaluating consequences of risk, here, only estimation of probabilities for natural geologic events, processes, and phenomena is addressed. Ideally, evaluation of potential future hazards includes an objective determination of probabilities that has been derived from past occurrences of identical events or components contributing to complex processes or phenomena. In practice, however, data which would permit objective estimation of those probabilities of interest may not be adequate, or may not even exist. Another problem that arises normally, regardless of the extent of data, is that risk assessments involve estimating extreme values. Rarely are extreme values accurately predictable even when an empirical frequency distribution is established well by data. In the absence of objective methods for estimating probabilities of natural events or processes, subjective probabilities for the hazard must be established through Bayesian methods, expert opinion, or Delphi methods. Uncertainty of every probability determination must be stated for each component of an event, process, or phenomenon. These uncertainties also must be propagated through the quantitative analysis so that a realistic estimate of total uncertainty can be associated with each final probability estimate for a geologic hazard.
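
    One standard way to propagate component uncertainties into a total uncertainty on a hazard probability, in the spirit of the propagation requirement described above, is Monte Carlo sampling; the sketch below is a generic illustration with assumed placeholder distributions, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Placeholder component probabilities, each with its own (assumed) uncertainty.
        p_trigger = rng.beta(2, 50, n)   # annual probability a triggering event occurs
        p_given   = rng.beta(5, 20, n)   # probability the hazard follows, given the trigger

        p_hazard = p_trigger * p_given   # combined annual probability, sample by sample

        print(f"mean = {p_hazard.mean():.4f}, 5-95% = "
              f"{np.percentile(p_hazard, 5):.4f} - {np.percentile(p_hazard, 95):.4f}")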

  15. Characterizing soils for hazardous waste site assessments.

    PubMed

    Breckenridge, R P; Keck, J F; Williams, J R

    1994-04-01

    This paper provides a review and justification of the minimum data needed to characterize soils for hazardous waste site assessments and to comply with the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA). Scientists and managers within the regulatory agency and the liable party need to know which soil characteristics are needed to make decisions about risk assessment, which areas need remediation, and what remediation options are available. If all parties involved in characterizing a hazardous waste site can agree on the required soils data set prior to starting a site investigation, data can be collected in a more efficient and less costly manner. Having the proper data will aid in reaching decisions on how to address concerns at, and close out, hazardous waste sites. This paper was prepared to address two specific concerns related to soil characterization for CERCLA remedial response. The first concern is the applicability of traditional soil classification methods to CERCLA soil characterization. The second is the identification of the soil characterization data types required for CERCLA risk assessment and analysis of remedial alternatives. These concerns are related, in that the Data Quality Objective (DQO) process addresses both. The DQO process was developed in part to assist CERCLA decision-makers in identifying the data types, data quality, and data quantity required to support decisions that must be made during the remedial investigation/feasibility study (RI/FS) process. Data Quality Objectives for Remedial Response Activities: Development Process (US EPA, 1987a) is a guidebook on developing DQOs. This process, as it relates to CERCLA soil characterization, is discussed in the Data Quality Objective section of this paper.

  16. Probabilistic Storm Surge Hazard Assessment in Martinique

    NASA Astrophysics Data System (ADS)

    Krien, Yann; Dudon, Bernard; Sansorgne, Eliot; Roger, Jean; Zahibo, Narcisse; Roquelaure, Stevie

    2013-04-01

    Located at the center of the Lesser Antilles, Martinique is under the threat of hurricanes formed over the warm tropical waters of the Atlantic Ocean and Caribbean Sea. These events can be extremely costly in terms of human, property, and economic losses. Storm surge hazard studies are hence required to provide guidance to emergency managers and decision-makers. A few studies have been conducted so far in the French Lesser Antilles, but they mainly rely on scarce historical data of extreme sea levels or on numerical models with coarse resolutions. Recent progress in statistical techniques for generating large numbers of synthetic hurricanes, together with the availability of high-resolution topographic and bathymetric data (LIDAR) and improved numerical models, enables us today to conduct storm surge hazard assessment studies with much more accuracy. Here we present a methodology to assess cyclonic surge hazard in Martinique at both regional and local scales. We first simulate the storm surges that would be induced by a large set of potential events generated by the statistical/deterministic models of Emanuel et al. [2006]. We use the coupled ADCIRC-SWAN models (Dietrich et al., 2012) to simulate inland inundation with grid resolutions of up to 50-100 m in the coastal area for the whole island. These models are validated against observations from past events such as Hurricane Dean in 2007. The outputs can then be used at specific sites to force higher resolution models for crisis management and local risk assessment studies. This work is supported by the INTERREG IV « Caribbean » program TSUNAHOULE.

  17. Determination of Bedrock Variations and S-wave Velocity Structure in the NW part of Turkey for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Ozel, A. O.; Arslan, M. S.; Aksahin, B. B.; Genc, T.; Isseven, T.; Tuncer, M. K.

    2015-12-01

    The Tekirdag region (NW Turkey) is quite close to the North Anatolian Fault, which is capable of producing a large earthquake. Therefore, earthquake hazard mitigation studies are important for urban areas close to major faults. From this point of view, the integration of different geophysical methods plays an important role in the study of seismic hazard problems, including seismotectonic zoning. On the other hand, geological mapping and determination of the subsurface structure, which are key to managing newly developed areas, converting current urban areas, and assessing urban geological hazards, can be performed with integrated geophysical methods. This study has been performed in the frame of a national project that complements the cooperative project between Turkey and Japan (JICA&JST) named "Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education". With this principal aim, the study is focused on Tekirdag and its surrounding region (NW Turkey), where some uncertainties in subsurface knowledge (maps of bedrock depth, thickness of Quaternary sediments, basin geometry and seismic velocity structure) need to be resolved. Several geophysical methods (microgravity, magnetic, and single-station and array microtremor measurements) are applied, and the results are evaluated to characterize lithological changes in the region. Array microtremor measurements with several radii are taken at 30 locations, and 1D S-wave velocity structures are determined by inversion of the phase velocities of surface waves; the resulting 1D structures are verified by theoretical Rayleigh wave modelling. Following the array measurements, single-station microtremor measurements are implemented at 75 locations to determine the predominant frequency distribution. The predominant frequencies in the study area range from 0.5 Hz to 8 Hz. On the other hand, microgravity and magnetic measurements are performed on

  18. Evaluating earthquake hazards in the Los Angeles region; an earth-science perspective

    USGS Publications Warehouse

    Ziony, Joseph I.

    1985-01-01

    Potentially destructive earthquakes are inevitable in the Los Angeles region of California, but hazards prediction can provide a basis for reducing damage and loss. This volume identifies the principal geologically controlled earthquake hazards of the region (surface faulting, strong shaking, ground failure, and tsunamis), summarizes methods for characterizing their extent and severity, and suggests opportunities for their reduction. Two systems of active faults generate earthquakes in the Los Angeles region: northwest-trending, chiefly horizontal-slip faults, such as the San Andreas, and west-trending, chiefly vertical-slip faults, such as those of the Transverse Ranges. Faults in these two systems have produced more than 40 damaging earthquakes since 1800. Ninety-five faults have slipped in late Quaternary time (approximately the past 750,000 yr) and are judged capable of generating future moderate to large earthquakes and displacing the ground surface. Average rates of late Quaternary slip or separation along these faults provide an index of their relative activity. The San Andreas and San Jacinto faults have slip rates measured in tens of millimeters per year, but most other faults have rates of about 1 mm/yr or less. Intermediate rates of as much as 6 mm/yr characterize a belt of Transverse Ranges faults that extends from near Santa Barbara to near San Bernardino. The dimensions of late Quaternary faults provide a basis for estimating the maximum sizes of likely future earthquakes in the Los Angeles region: moment magnitude (M) 8 for the San Andreas, M 7 for the other northwest-trending elements of that fault system, and M 7.5 for the Transverse Ranges faults. Geologic and seismologic evidence along these faults, however, suggests that, for planning and designing noncritical facilities, appropriate sizes would be M 8 for the San Andreas, M 7 for the San Jacinto, M 6.5 for other northwest-trending faults, and M 6.5 to 7 for the Transverse Ranges faults. The

  19. Earthquake hazard analysis for the different regions in and around Aǧrı

    NASA Astrophysics Data System (ADS)

    Bayrak, Erdem; Yilmaz, Şeyda; Bayrak, Yusuf

    2016-04-01

    We investigated earthquake hazard parameters for the eastern part of Turkey by determining the a and b parameters of the Gutenberg-Richter magnitude-frequency relationship. For this purpose, the study area was divided into seven different source zones based on their tectonic and seismotectonic regimes. The database used in this work was compiled from different sources and catalogues, such as TURKNET, the International Seismological Centre (ISC), the Incorporated Research Institutions for Seismology (IRIS) and The Scientific and Technological Research Council of Turkey (TUBITAK), for the instrumental period. We calculated the a value and the b value, which is the slope of the Gutenberg-Richter frequency-magnitude relationship, using the maximum likelihood method (ML). We also estimated the mean return periods, the most probable maximum magnitude in a time period of t years, and the probability of occurrence of an earthquake with magnitude ≥ M during a time span of t years. We used the Zmap software to calculate these parameters. The lowest b value was calculated in Region 1, which covers the Cobandede Fault Zone. We obtained the highest a value in Region 2, which covers the Kagizman Fault Zone. This conclusion is strongly supported by the probability value, which shows the largest value (87%) for an earthquake with magnitude greater than or equal to 6.0. The mean return period for such a magnitude is the lowest in this region (49 years). The most probable magnitude in the next 100 years was calculated, and we determined the highest value around the Cobandede Fault Zone. According to these parameters, Region 1, which covers the Cobandede Fault Zone, is the most dangerous area in the eastern part of Turkey.
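
    The quantities described above can be sketched compactly; the catalogue, completeness magnitude and observation span below are synthetic placeholders, not the Turkish data. The sketch uses the Aki (1965) maximum-likelihood b-value, the corresponding a-value, the mean return period of a magnitude >= M event, and its Poisson occurrence probability over t years.

        import math

        magnitudes = [4.1, 4.3, 4.0, 4.6, 5.1, 4.2, 4.8, 4.0, 4.4, 5.4]  # synthetic catalogue
        m_min = 4.0                    # assumed completeness magnitude
        years_of_catalogue = 50.0      # assumed observation span

        # Maximum-likelihood b-value (Aki, 1965) and matching a-value (annual rates).
        mean_m = sum(magnitudes) / len(magnitudes)
        b = math.log10(math.e) / (mean_m - m_min)
        a = math.log10(len(magnitudes) / years_of_catalogue) + b * m_min

        def return_period(m):
            """Mean return period (years) of events with magnitude >= m."""
            return 1.0 / 10 ** (a - b * m)

        def occurrence_probability(m, t_years):
            """Poisson probability of at least one magnitude >= m event in t years."""
            return 1.0 - math.exp(-t_years / return_period(m))

        print(f"b = {b:.2f}, return period(M6) = {return_period(6.0):.0f} yr, "
              f"P(M>=6, 100 yr) = {occurrence_probability(6.0, 100):.2f}")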

  20. An Earthquake Source Ontology for Seismic Hazard Analysis and Ground Motion Simulation

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Jordan, T. H.; Gil, Y.; Ratnakar, V.

    2005-12-01

    Representation of the earthquake source is an important element in seismic hazard analysis and earthquake simulations. Source models span a range of conceptual complexity - from simple time-independent point sources to extended fault slip distributions. Further computational complexity arises because the seismological community has established so many source description formats and variations thereof; what this means is that conceptually equivalent source models are often expressed in different ways. Despite the resultant practical difficulties, there exists a rich semantic vocabulary for working with earthquake sources. For these reasons, we feel it is appropriate to create a semantic model of earthquake sources using an ontology, a computer science tool from the field of knowledge representation. Unlike the domain of most ontology work to date, earthquake sources can be described by a very precise mathematical framework. Another uniqueness associated with developing such an ontology is that earthquake sources are often used as computational objects. A seismologist generally wants more than to simply construct a source and have it be well-formed and properly described; additionally, the source will be used for performing calculations. Representation and manipulation of complex mathematical objects presents a challenge to the ontology development community. In order to enable simulations involving many different types of source models, we have completed preliminary development of a seismic point source ontology. The use of an ontology to represent knowledge provides machine interpretability and the ability to validate logical consistency and completeness. Our ontology, encoded using the OWL Web Ontology Language - a standard from the World Wide Web Consortium, contains the conceptual definitions and relationships necessary for source translation services. For example, specification of strike, dip, rake, and seismic moment will automatically translate into a double
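
    As an illustration of the kind of translation the ontology is meant to support (strike, dip, rake and seismic moment into a double-couple moment tensor), the sketch below uses the Aki & Richards convention with north-east-down axes; it is a generic implementation written for this summary, not part of the ontology described.

        import math

        def double_couple_tensor(strike_deg, dip_deg, rake_deg, m0):
            """Double-couple moment tensor (N, E, D axes) from strike/dip/rake and scalar moment.

            Follows the Aki & Richards convention; returns the six independent components.
            """
            s, d, r = (math.radians(x) for x in (strike_deg, dip_deg, rake_deg))
            mxx = -m0 * (math.sin(d) * math.cos(r) * math.sin(2 * s)
                         + math.sin(2 * d) * math.sin(r) * math.sin(s) ** 2)
            mxy =  m0 * (math.sin(d) * math.cos(r) * math.cos(2 * s)
                         + 0.5 * math.sin(2 * d) * math.sin(r) * math.sin(2 * s))
            mxz = -m0 * (math.cos(d) * math.cos(r) * math.cos(s)
                         + math.cos(2 * d) * math.sin(r) * math.sin(s))
            myy =  m0 * (math.sin(d) * math.cos(r) * math.sin(2 * s)
                         - math.sin(2 * d) * math.sin(r) * math.cos(s) ** 2)
            myz = -m0 * (math.cos(d) * math.cos(r) * math.sin(s)
                         - math.cos(2 * d) * math.sin(r) * math.cos(s))
            mzz =  m0 * math.sin(2 * d) * math.sin(r)
            return {"Mxx": mxx, "Myy": myy, "Mzz": mzz, "Mxy": mxy, "Mxz": mxz, "Myz": myz}

        # Example: a vertical strike-slip point source (strike 0, dip 90, rake 0) with M0 = 1e17 N*m.
        print(double_couple_tensor(0.0, 90.0, 0.0, 1.0e17))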

  1. Earthquake hazard and vulnerability in the northeastern Mediterranean: the Corinth earthquake sequence of February-March 1981.

    PubMed

    Ambraseys, N N; Jackson, J A

    1981-12-01

    Population density, building type and earthquake magnitude are the main factors on which the total building damage following an earthquake depends. For the Greece-Turkey region, quantitative relations between these factors have been developed which, in spite of the inaccuracies in the available data, allow crude estimates of the damage following an earthquake of a particular size to be made. This is demonstrated retrospectively for the Gulf of Corinth earthquakes of 1981.

  2. RiskScape Volcano: Development of a risk assessment tool for volcanic hazards

    NASA Astrophysics Data System (ADS)

    Deligne, Natalia; King, Andrew; Jolly, Gill; Wilson, Grant; Wilson, Tom; Lindsay, Jan

    2013-04-01

    RiskScape is a multi-hazard risk assessment tool developed by GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand that models the risk and impact of various natural hazards on a given built environment. RiskScape has a modular structure: the hazard module models hazard exposure (e.g., ash thickness at a given location), the asset module catalogues assets (built environment, infrastructure, and people) and their attributes exposed to the hazard, and the vulnerability module models the consequences of asset exposure to the hazard. Hazards presently included in RiskScape are earthquakes, river floods, tsunamis, windstorms, and ash from volcanic eruptions (specifically from Ruapehu). Here we present our framework for incorporating other volcanic hazards (e.g., pyroclastic density currents, lava flows, lahars, ground deformation) into RiskScape along with our approach for assessing asset vulnerability. We also will discuss the challenges of evaluating risk for 'point source' (e.g., stratovolcanoes) vs 'diffuse' (e.g., volcanic fields) volcanism using Ruapehu and the Auckland volcanic field as examples. Once operational, RiskScape Volcano will be a valuable resource both in New Zealand and internationally as a practical tool for evaluating risk and also as an example for how to predict the consequences of volcanic eruptions on both rural and urban environments.

  3. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety workplace hazards through appropriate workplace monitoring; (2) Document assessment for chemical, physical... hazards; (6) Perform routine job activity-level hazard analyses; (7) Review site safety and...

  4. Influence of the Great Megathrust Earthquakes of the Past Decade on Risk Assessment and Outreach Programs

    NASA Astrophysics Data System (ADS)

    Dengler, L. A.

    2014-12-01

    Four subduction zone earthquakes of magnitude ≥ 8.6 occurred between 2004 and 2013. No earthquakes of this size were reported anywhere in the world in the preceding 36 years. The wealth of seismic, geodetic, geologic and tsunami data from these great megathrust events has advanced the understanding of subduction zones and challenged a number of previously accepted ideas. This talk focuses on how they have also influenced risk assessment and preparedness programs. Megathrust earthquakes differ from other large damaging earthquakes. The size of the megathrust source means a much larger area may be impacted by earthquake shaking, affecting not only the amount of damage but also posing response and recovery challenges. A second factor is tsunami generation. About a third of the 760,000 casualties in the decade were caused by the four mega-earthquakes. All four produced deadly tsunamis, and over 95% of the death toll was attributed to tsunami. Even when the extraordinarily deadly 2004 Andaman-Sumatra tsunami is removed from the data set, 85% of the casualties in the remaining three earthquakes were caused by tsunami. In contrast, the non-megathrust events caused over two-thirds of the decade's casualties, but less than 1% of those were caused by tsunami. The Cascadia subduction zone along the coast of northern California, Oregon, Washington and southern British Columbia is the only location in the contiguous 48 states where a great megathrust earthquake will someday occur. Assessing the risk posed by Cascadia and developing effective preparedness programs pose a number of challenges. Awareness of Cascadia is relatively recent, and assessing the magnitude, recurrence and nature of past events depends primarily on paleoseismology. The megathrust events of the past decade provide a proxy for, and a general picture of, the likely impacts of a future Cascadia earthquake and have influenced preparedness efforts throughout the Cascadia region. The recent events have also posed problems for

  5. Flood hazard assessment for French NPPs

    NASA Astrophysics Data System (ADS)

    Rebour, Vincent; Duluc, Claire-Marie; Guimier, Laurent

    2015-04-01

    This paper presents the approach for flood hazard assessment for NPPs that is ongoing in France in the framework of post-Fukushima activities. These activities were initially defined considering both the European "stress tests" of NPPs pursuant to the request of the European Council and the French safety audit of civilian nuclear facilities in the light of the Fukushima Daiichi accident. The main actors in that process are the utility (EDF is, to date, the sole NPP operator in France), the regulatory authority (ASN) and its technical support organization (IRSN). This paper was prepared by IRSN, considering the official positions of the other main actors in the current review process; it was not officially endorsed by them. In France, the flood hazard to be considered for design basis definition (for new NPPs and for existing NPPs in periodic safety reviews conducted every 10 years) was revised before the Fukushima Daiichi accident, following the Le Blayais NPP experience of December 1999 (partial site flooding and loss of some safety-classified systems). The first part of the paper presents an overview of the revised guidance for the design basis flood. In order to address design extension conditions (conditions that could result from natural events exceeding the design basis events), a set of flooding scenarios has been defined by adding margins to the scenarios that are considered for the design. Owing to the diversity of phenomena to be considered for the flooding hazard, the margin assessment is specific to each flooding scenario, in terms of the parameter to be penalized and the degree of variation of this parameter. The general approach to addressing design extension conditions is presented in the second part of the paper. The remaining parts present the approach for five flooding scenarios, including the design basis scenario and the additional margin used to define design extension scenarios.

  6. Assessment of Liquefaction Susceptibility of Kutahya Soils Based on Recent Earthquakes in Turkey

    NASA Astrophysics Data System (ADS)

    Zengin, Enes; Abiddin Erguler, Zeynal

    2014-05-01

    The plate tectonic setting of Turkey has resulted in many destructive earthquakes with magnitudes higher than 7 in several cities situated close to fault systems. The city of Kutahya and its surrounding counties are notable examples located in this earthquake-prone region, and several earthquakes have recently been recorded there, particularly in its Simav district. A significant part of the residential area of Kutahya is founded on alluvial deposits dominated by silt- and fine-sand-size materials, and its southern boundary is controlled by the Kutahya fault zone (KFZ), which extends parallel to the city settlement. In this study, considering the possibility of a potential destructive earthquake in the future, as well as the increasing demand for new buildings driven by population growth, the liquefaction potential of these soils is investigated for use in earthquake risk mitigation strategies. For this purpose, physical properties, groundwater conditions and standard penetration test (SPT) results from 283 boreholes spread over a wide area were examined to understand the behaviour of these soils under earthquake-induced dynamic loading. The total assessed drilling depth is about 2140 m. The required corrections were applied to all SPT data to obtain SPT-(N1)60 values for the liquefaction analyses. Estimating a representative magnitude, epicentral depth and maximum ground acceleration (amax), based on previous earthquakes and the faulting characteristics of the KFZ, was an initial step toward an accurate assessment of the liquefaction phenomena in this city. For the determination of amax in this region, in addition to an attenuation relationship based on Turkish strong ground motion data, individual measurements from earthquake stations close to the study site were also collected. As a result of all analyses and a review of previous earthquake records in this region, earthquake magnitudes varying between 5.0 and 7.4 and amax values ranging between 400 and 800 gal were used in liquefaction
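
    As a hedged sketch of the kind of simplified screening calculation this abstract refers to (not the authors' actual analysis, whose details are not given), the code below applies an overburden correction to a raw SPT blow count and evaluates the Seed-Idriss cyclic stress ratio; the coefficient values and the example soil column are illustrative assumptions.

```python
import math

def corrected_blow_count(n_field, sigma_v_eff_kpa, ce=1.0, cb=1.0, cr=1.0, cs=1.0):
    """Approximate SPT-(N1)60: the field blow count corrected for overburden (CN,
    capped at 1.7) and for energy, borehole, rod length and sampler (CE, CB, CR, CS)."""
    cn = min(math.sqrt(100.0 / sigma_v_eff_kpa), 1.7)  # 100 kPa reference stress
    return n_field * cn * ce * cb * cr * cs

def cyclic_stress_ratio(amax_g, sigma_v_kpa, sigma_v_eff_kpa, depth_m):
    """Seed-Idriss simplified cyclic stress ratio with a linear stress-reduction
    factor rd, an approximation reasonable only for shallow depths (< ~9 m)."""
    rd = 1.0 - 0.00765 * depth_m
    return 0.65 * amax_g * (sigma_v_kpa / sigma_v_eff_kpa) * rd

# Hypothetical silty-sand layer at 6 m depth, water table at 2 m, amax = 0.4 g (~400 gal).
print(corrected_blow_count(n_field=12, sigma_v_eff_kpa=75.0))
print(cyclic_stress_ratio(amax_g=0.4, sigma_v_kpa=110.0, sigma_v_eff_kpa=75.0, depth_m=6.0))
```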

  7. Past earthquake history and seismic hazard in Fucino region, Central Italy

    NASA Astrophysics Data System (ADS)

    Schlagenhauf, A.; Manighetti, I.; Benedetti, L. C.; Gaudemer, Y.; Pou, K.

    2009-12-01

    a single major fault within the WFN system, the maximum magnitudes estimated for those earthquakes are 6.5-6.9. In the Fucino plain, the adjacent Trasacco fault shows a similar behavior, as it primarily broke during two 2-3 kyr-long periods of paroxysmal activity, at 14.5-12 and 8.5-6.5 ka, which thus do not coincide in time with those recognized on the Magnola-Velino system. Assuming that the faults reload at a constant rate (mean slip rate estimated from our measurements), our data suggest that the faults have entered a paroxysmal phase when they reached a certain threshold of cumulative strain. Although the Magnola-Velino fault system has not broken for a long time (about 1 kyr), the cumulative strain it has accommodated since then is still below the threshold discussed above. By contrast, although part of the Trasacco fault broke less than a century ago (1915), the fault is approaching the cumulative strain threshold at which it may enter a paroxysmal phase. Although these results need further refinement, they suggest that the seismic hazard in the Fucino region is high.

  8. Assessing community vulnerabilities to natural hazards on the Island of Hawaii

    NASA Astrophysics Data System (ADS)

    Nishioka, Chris; Delparte, Donna

    2010-05-01

    The island of Hawaii is susceptible to numerous natural hazards such as tsunamis, flooding, lava flows, earthquakes, hurricanes, landslides, wildfires and storm surge. The impact of a natural disaster on the island's communities has the potential to endanger people's lives and threaten critical infrastructure, homes, businesses and economic drivers such as tourism. A Geographic Information System (GIS) has the ability to assess community vulnerabilities by examining the spatial relationships between hazard zones, socioeconomic infrastructure and demographic data. By drawing together existing datasets, GIS was used to examine a number of community vulnerabilities. Key areas of interest were government services, utilities, property assets, industry and transportation. GIS was also used to investigate population dynamics in hazard zones. The identification of community vulnerabilities through GIS analysis can support mitigation measures and assist planning and response to natural hazards.

  9. Probabilistic Seismic Hazard Assessment for Iraq

    SciTech Connect

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq; Shakir, Ammar M.; Mahdi, Hanan; Numan, Nazar M.S.; Al-Shukri, Haydar; Chlaib, Hussein K.; Ameen, Taher H.; Abd, Najah A.

    2016-05-06

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997; an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code was considering referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: (a) more than 15 years out of date; (b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design; and (c) given for a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.
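
    As a brief illustration of the probability levels mentioned above, assuming the usual Poisson occurrence model (which the abstract does not spell out), the sketch below converts an exceedance probability over an exposure time into the corresponding return period.

```python
import math

def return_period(p_exceedance, exposure_years):
    """Return period (years) of ground motion with probability p_exceedance
    of being exceeded at least once in exposure_years, assuming Poisson arrivals:
    p = 1 - exp(-t / T)  =>  T = -t / ln(1 - p)."""
    return -exposure_years / math.log(1.0 - p_exceedance)

print(round(return_period(0.10, 50)))  # ~475 years (10% in 50 years)
print(round(return_period(0.02, 50)))  # ~2475 years (2% in 50 years)
```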

  10. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  11. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  12. Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada

    NASA Astrophysics Data System (ADS)

    Eisses, A.; Kell, A. M.; Kent, G.; Driscoll, N. W.; Karlin, R. E.; Baskin, R. L.; Louie, J. N.; Smith, K. D.; Pullammanappallil, S.

    2011-12-01

    Preliminary slip rates measured across the East Pyramid Lake fault, or Lake Range fault, help provide new estimates of extension across the Pyramid Lake basin. Multiple stratigraphic horizons spanning 48 ka were tracked throughout the lake, with layer offsets measured across all significant faults in the basin. A chronostratigraphic framework acquired from four sediment cores allows slip rates of the Lake Range and other faults to be calculated accurately. This region of the northern Walker Lane, strategically placed between the right-lateral strike-slip faults of Honey and Eagle Lakes to the north and the normal-fault-bounded basins to the southwest (e.g., Tahoe, Carson), is critical for understanding the underlying structural complexity that is necessary not only for geothermal exploration but also for earthquake hazard assessment, given the proximity of the Reno-Sparks metropolitan area. In addition, our seismic CHIRP imaging with submeter resolution allows the construction of the first fault map of Pyramid Lake. The Lake Range fault can be clearly traced west of Anaho Island, extending north along the east end of the lake in numerous CHIRP lines. Initial drafts of the fault map reveal active transtension through a series of numerous small, northwest-striking, oblique-slip faults in the north end of the lake. A previously field-mapped northwest-striking fault near Sutcliffe can be extended into the west end of Pyramid Lake. This fault map, along with the calculated slip rates of the Lake Range and potentially multiple other faults, gives a clearer picture of the geothermal potential, tectonic regime and earthquake hazards in the Pyramid Lake basin and the northern Walker Lane. These new results have also been merged with seismicity maps and focal mechanisms for the larger events, to begin extending our fault map to depth.
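
    A minimal sketch of the slip-rate arithmetic implied above, i.e. the offset of a dated stratigraphic horizon divided by its age; the numbers are hypothetical, not values from the survey.

```python
def slip_rate_mm_per_yr(offset_m, age_ka):
    """Average slip rate from the offset of a dated stratigraphic horizon."""
    return offset_m * 1000.0 / (age_ka * 1000.0)

# Hypothetical example: a 35 ka horizon offset by 24 m across a basin-bounding fault.
print(slip_rate_mm_per_yr(24.0, 35.0))  # about 0.69 mm/yr
```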

  13. Broadband Ground Motion Simulation Recipe for Scenario Hazard Assessment in Japan

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Fujiwara, H.; Irikura, K.

    2014-12-01

    codes using ground motions from the 2005 Fukuoka earthquake. Irikura and Miyake (2011) summarized the latter validations, concluding that the ground motions were successfully simulated as shown in the figure. This indicates that the recipe has enough potential to generate broadband ground motions for scenario hazard assessment in Japan.

  14. Geologic Hazards Associated with Longmen Shan Fault zone, During and After the Mw 8.0, May 12, 2008 Earthquake

    NASA Astrophysics Data System (ADS)

    Xu, X.; Kusky, T.; Li, Z.

    2008-12-01

    A magnitude 8.0 earthquake shook the northeastern margin of the Tibetan plateau on May 12, 2008 along the Longmen Shan orogenic belt that marks the boundary between the Songpan Ganzi terrane and the Yangtze block. The Tibetan plateau is expanding eastwards, and GPS observations show that surface motions are directed northeast relative to the Sichuan basin where the earthquake occurred. This sense of motion of the crustal blocks is the reason why the main faults in Longmen Shan are oblique thrust-dextral strike-slip faults. There are three main parallel thrust/dextral-slip faults in Longmen Shan. All three faults strike northeast and dip to the northwest. The May 12 rupture extends 270 km along the fault zone, and the epicenter of the magnitude 8.0 earthquake was located in Wenchuan, 90 km WNW of Chengdu, Sichuan, China. The devastating earthquake killed at least 87,652 people and destroyed all the buildings in the epicentral area. The victims in the earthquake zone want to rebuild their homes immediately, but they need more guidance about the geologic hazards to help them withstand possible future earthquakes. After the earthquake, we therefore went to the disaster areas from July 5th to 10th to collect first-hand field data, including observations of surface ruptures, landslides, features of X joints on damaged buildings, and parameters of the active faults and landslides. If we depend only on field data from accessible locations, we know only about the ruptures at those positions and cannot learn about the whole area affected by the earthquake. The earthquake zone shows surface rupture features of both thrust and strike-slip fault activity, indicating oblique slip followed by thrusting during the May 12 earthquake. In this talk, I will show the general regional geological disaster information obtained by processing pre- and post-earthquake satellite data. We then combine the raw field data and regional geology as constraints to determine the

  15. Long aftershock sequences in North China and Central US: implications for hazard assessment in mid-continents

    NASA Astrophysics Data System (ADS)

    Liu, Mian; Luo, Gang; Wang, Hui; Stein, Seth

    2014-02-01

    Because seismic activity within mid-continents is usually much lower than that along plate boundary zones, even small earthquakes can cause widespread concern, especially when these events occur in the source regions of previous large earthquakes. However, these small earthquakes may be just aftershocks that continue for decades or even longer. The recent seismicity in the Tangshan region in North China likely consists of aftershocks of the 1976 Great Tangshan earthquake. The current earthquake sequence in the New Madrid seismic zone in the central United States, which includes a cluster of M ~ 7.0 events in 1811-1812 and a number of similar events in the past millennium, is believed to result from recent fault reactivation that releases pre-stored strain energy in the crust. If so, this earthquake sequence is similar to aftershocks in that the rate of energy release should decay with time and the sequence of earthquakes will eventually end. We use simple physical analysis and numerical simulations to show that the current sequence of large earthquakes in the New Madrid fault zone is likely ending or has ended. Recognizing that mid-continental earthquakes have long aftershock sequences and complex spatiotemporal occurrences is critical to improving hazard assessments.
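
    A minimal sketch of the aftershock-decay reasoning invoked above, using the modified Omori law; the parameter values are illustrative and not fitted to the Tangshan or New Madrid sequences.

```python
def omori_rate(t_days, k=100.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate n(t) = K / (c + t)**p (events/day)."""
    return k / (c + t_days) ** p

# Decades after a mainshock the predicted rate is small but non-zero, which is
# why lingering aftershocks can be mistaken for renewed background activity.
for t in (1, 365, 365 * 38):  # one day, one year, roughly 1976 to 2014
    print(t, omori_rate(t))
```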

  16. Hazards assessment for the Waste Experimental Reduction Facility

    SciTech Connect

    Calley, M.B.; Jones, J.L. Jr.

    1994-09-19

    This report documents the hazards assessment for the Waste Experimental Reduction Facility (WERF) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. This hazards assessment describes the WERF, the area surrounding WERF, associated buildings and structures at WERF, and the processes performed at WERF. All radiological and nonradiological hazardous materials stored, used, or produced at WERF were identified and screened. Even though the screening process indicated that the hazardous materials could be screened from further analysis, because the inventory of radiological and nonradiological hazardous materials was below the screening thresholds specified by DOE and DOE-ID guidance for DOE Order 5500.3A, the nonradiological hazardous materials were analyzed further because it was felt that the nonradiological hazardous material screening thresholds were too high.

  17. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    NASA Technical Reports Server (NTRS)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET); 2) establishing three North Anatolian fault crossing profiles (10 sites per profile) at locations that experienced major surface-fault earthquakes at different times in the past, to examine strain accumulation as a function of time in the earthquake cycle (2004); 3) repeat observations of selected sites in the fault-crossing profiles (2005); 4) repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation; 5) refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area; 6) continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping in close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us with estimates of gravity change during the period 2003-2005.

  18. Introducing ShakeMap Atlas 2.0: An improved suite of recent historical earthquake ShakeMaps for global hazard analyses

    USGS Publications Warehouse

    Garcia, D.; Mah, R.T.; Johnson, K.L.; Worden, B.C.; Hearne, M.G.; Marano, K.D.; Lin, K.; Wald, D.J.

    2011-01-01

    The U.S. Geological Survey (USGS) ShakeMap system is a widely used tool for assessing ground motion during an earthquake, both in near-real-time applications and for past events and seismic scenarios. The ShakeMap Atlas (Allen et al., 2008) is a compilation of nearly 5,000 ShakeMaps of global events that comprises the most damaging and potentially damaging earthquakes between 1973 and 2007. The Atlas is an invaluable resource for investigating strong ground motion near the source, and it is also used for calibrating the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system. Here we present an extensively revised version of the Atlas, which includes as new features: (1) a new version of ShakeMap; (2) an updated source catalog; (3) a refined ground-motion prediction equation (GMPE) selection; and (4) many more macroseismic intensity and ground-motion data. The new version of ShakeMap (V3.5; Worden et al., 2010) treats native and converted data separately when mapping each intensity measure (MMI, PGA, PGV, and PSA). This is especially important for intensity observations, which are the main data source in the aftermath of most global events. ShakeMap V3.5 also allows for the inclusion of intensity prediction equations and makes use of improved mapping techniques and uncertainty estimations. Global earthquake hypocenters have been replaced, when possible, with regional locations and, in some cases, with finite source models not included before. The Atlas span has been extended through mid-2010, and some older events have also been added for the 1973-2007 period. In order to improve the adequacy of the GMPE used by ShakeMap to estimate the ground shaking for a given event where data are not available, we use a new global selection scheme to discriminate between different types of earthquakes (García et al., 2011). Finally, we have included a large amount of recently available observations from national and regional networks. All these

  19. Seismic hazard assessment of Sub-Saharan Africa using geodetic strain rate models

    NASA Astrophysics Data System (ADS)

    Poggi, Valerio; Pagani, Marco; Weatherill, Graeme; Garcia, Julio; Durrheim, Raymond J.; Mavonga Tuluka, Georges

    2016-04-01

    The East African Rift System (EARS) is the major active tectonic feature of the Sub-Saharan Africa (SSA) region. Although the seismicity level of such a divergent plate boundary can be described as moderate, several earthquakes have been reported in historical times causing a non-negligible level of damage, albeit mostly due to the high vulnerability of the local buildings and structures. Formulation and enforcement of national seismic codes is therefore an essential future risk mitigation strategy. Nonetheless, a reliable risk assessment cannot be done without the calibration of an updated seismic hazard model for the region. Unfortunately, the major issue in assessing seismic hazard in Sub-Saharan Africa is the lack of the basic information needed to construct source and ground motion models. The historical earthquake record is largely incomplete, while the instrumental catalogue is complete down to a sufficient magnitude only for a relatively short time span. In addition, the mapping of seismogenically active faults is still an ongoing program. Recent studies have identified major seismogenic lineaments, but there is a substantial lack of kinematic information for intermediate-to-small scale tectonic features, information that is essential for the proper calibration of earthquake recurrence models. To compensate for this lack of information, we experiment with the use of a strain rate model recently developed by Stamps et al. (2015) in the framework of an earthquake hazard and risk project along the EARS supported by USAID and jointly carried out by GEM and AfricaArray. We use the inferred geodetic strain rates to derive estimates of the total scalar moment release, subsequently used to constrain earthquake recurrence relationships for both area (distributed seismicity) and fault source models. The rates obtained indirectly from strain rates and those more classically derived from the available seismic catalogues are then compared and combined into a unique mixed earthquake recurrence model
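
    As a hedged sketch of the strain-rate-to-moment-rate step described above (a generic Kostrov-type scaling, not necessarily the exact formulation used in the project), the code below converts a scalar geodetic strain rate over a cell into a total scalar seismic moment rate; all numerical values are hypothetical.

```python
def scalar_moment_rate(strain_rate_per_yr, area_m2, seismogenic_thickness_m,
                       shear_modulus_pa=3.0e10):
    """Kostrov-type conversion of a scalar geodetic strain rate into a total
    seismic moment rate (N m / yr): M0_rate = 2 * mu * H * A * strain_rate."""
    return 2.0 * shear_modulus_pa * seismogenic_thickness_m * area_m2 * strain_rate_per_yr

# Hypothetical rift cell: 100 km x 100 km, 20 km seismogenic thickness,
# strain rate of 1e-8 per year (about 10 nanostrain/yr).
print(f"{scalar_moment_rate(1.0e-8, 1.0e10, 2.0e4):.2e} N m / yr")
```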

  20. Past earthquake history and seismic hazard in Fucino region, Central Italy

    NASA Astrophysics Data System (ADS)

    Schlagenhauf, Aloe; Manighetti, Isabelle; Benedetti, Lucilla; Gaudemer, Yves; Pou, Khemrak

    2010-05-01

    coincide in time with those recognized on the Magnola-Velino system. Assuming that the faults reload at a constant rate (mean slip rate estimated from our measurements), our data suggest that the faults have entered a paroxysmal phase when they reached a certain threshold of cumulative strain. Although the Magnola-Velino fault system has not broken for a long time (≈1.1 kyr), the cumulative strain it has accommodated since then is still far below the threshold discussed above. By contrast, although part of the Trasacco fault broke less than a century ago (1915), the fault is approaching the cumulative strain threshold at which it may enter a paroxysmal phase. Although these results need further refinement, they suggest that the seismic hazard in the Fucino region is high. Reference: Schlagenhauf A., Gaudemer Y., Benedetti L., Manighetti I., Palumbo L., Schimmelpfennig I., Finkel R., and Pou K., Using in-situ Chlorine-36 cosmonuclide to recover past earthquake histories on limestone normal fault scarps: a reappraisal of methodology and interpretations; in revision, Geophys. J. Int., 2009.

  1. Earthquake induced landslide hazard: a multidisciplinary field observatory in the Marmara SUPERSITE

    NASA Astrophysics Data System (ADS)

    Bigarré, Pascal

    2014-05-01

    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Considering only disasters from the last fifteen years, among them the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. Those resulted in a staggering death toll and considerable damage, affecting the regional landscape including its main hydrological features. Despite a strong impetus in research during past decades, knowledge of these geohazards is still fragmentary, and databases of high-quality observational data are lacking. These phenomena call for further collaborative research aiming eventually to enhance preparedness and crisis management. As one of the three SUPERSITE-concept FP7 projects dealing with long-term, high-level monitoring of major natural hazards at the European level, the MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara region, one of the most densely populated parts of Europe, rated at a high seismic risk level since the devastating 1999 Izmit and Duzce earthquakes. Besides the seismic threat, landslides in Turkey and in this region constitute an important source of loss. The 1999 earthquake caused extensive landslides, while tsunami effects were observed during the post-event surveys in several places along the coasts of Izmit Bay. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest. First, the Cekmece-Avcilar peninsula, located west of Istanbul, is a highly urbanized, landslide-prone area showing high susceptibility to rainfall while also being affected by very significant seismic site effects. Second, the off-shore entrance of the Izmit Gulf, close to the termination of the surface rupture of the 1999 earthquake

  2. Preliminary assessment of landslides resulting from the earthquake of 23rd November 1980 in Southern Italy.

    PubMed

    Alexander, D

    1981-12-01

    This paper examines the hazards, mechanisms and effects of landsliding provoked by the 1980 earthquake in the Campania and Basilicata regions, Southern Italy. The effects of seismically-induced mass movement are assessed with respect to slope stability and damage to both settlements and roads. Whereas the mechanism of cyclic loading of soils, which can give rise to landslides, is different from the pore-pressure, gravity-loading and strength-reduction mechanisms that normally cause slope failure, the morphology of the slides is often indistinguishable, and this made it difficult to identify which slides were directly caused by the earthquake. However, creep in potential shear planes undoubtedly became more widespread, and the incidence of small, bowl-shaped slides increased as a direct result of the earthquake. Although variations in the detailed stress pattern within individual slopes meant that some very mobile soil and rock masses did not move, 36 settlements reported landslide damage and 29 roads were affected by landslides occurring during the earthquake and its immediate aftermath. A full assessment of the disaster, together with an explanation of the geography of the disaster area, can be found in Alexander (1982).

  3. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to a situation in which the frequent causal relationships between different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe, or MATRIX, project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, an examination of the consequences of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program, involving national platforms for disaster management as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  4. Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada

    SciTech Connect

    Eisses, A.; Kell, A.; Kent, G.; Driscoll, N.; Karlin, R.; Baskin, R.; Louie, J.; Pullammanappallil, S.

    2016-08-01

    Amy Eisses, Annie M. Kell, Graham Kent, Neal W. Driscoll, Robert E. Karlin, Robert L. Baskin, John N. Louie, Kenneth D. Smith, Sathish Pullammanappallil, 2011, Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada: presented at American Geophysical Union Fall Meeting, San Francisco, Dec. 5-9, abstract NS14A-08.

  5. Premium Rating and Risk Assessment in Earthquake Insurance

    NASA Astrophysics Data System (ADS)

    Jimenez-Huerta, D.

    2005-12-01

    Assessing earthquake risk in a given asset portfolio involves a synthesis of results from two areas of research. The first is knowledge of the earthquake sources that are likely to affect the assets: where they are, how large they are likely to be and how often earthquakes are likely to occur; this issue is addressed via a doubly stochastic Poisson-gamma marked point process model for earthquake occurrence, accounting for the spatial and temporal distribution of seismicity. The second is knowledge of the likely severity of loss that will arise given the occurrence of an earthquake. A beta-regression model is used to relate observed (conditional) losses to site conditions and earthquake characteristics. The calculation of expected losses and associated quantities of interest in an insurance portfolio lies at the interface of the above-mentioned two factors and is the aim of this paper. Of particular interest is the approximation of the aggregate loss distribution, from which any actuarial analysis stems.
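
    As a simplified, hedged sketch of the aggregate-loss idea described above (a plain Poisson frequency model and a fixed beta severity model, not the authors' doubly stochastic Poisson-gamma and beta-regression formulation), the code below approximates an annual aggregate loss distribution by Monte Carlo simulation.

```python
import random

def simulate_annual_losses(n_years, event_rate, exposure,
                           sev_alpha=2.0, sev_beta=8.0, seed=42):
    """Monte Carlo approximation of the aggregate annual loss distribution:
    event counts follow a Poisson process with the given annual rate, and each
    event's damage ratio is drawn from a Beta(sev_alpha, sev_beta) distribution."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_years):
        # Count Poisson arrivals in one year via exponential waiting times.
        count, t = 0, rng.expovariate(event_rate)
        while t < 1.0:
            count += 1
            t += rng.expovariate(event_rate)
        totals.append(sum(rng.betavariate(sev_alpha, sev_beta) * exposure
                          for _ in range(count)))
    return totals

losses = sorted(simulate_annual_losses(n_years=10000, event_rate=0.2, exposure=1.0e8))
print("mean annual loss:", sum(losses) / len(losses))
print("99th percentile :", losses[int(0.99 * len(losses))])
```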

  6. Setting the Stage for Harmonized Risk Assessment by Seismic Hazard Harmonization in Europe (SHARE)

    NASA Astrophysics Data System (ADS)

    Woessner, Jochen; Giardini, Domenico; SHARE Consortium

    2010-05-01

    Probabilistic seismic hazard assessment (PSHA) is arguably one of the most useful products that seismology can offer to society. PSHA characterizes the best available knowledge of the seismic hazard of a study area, ideally taking into account all sources of uncertainty. The results form the baseline for informed decision making, such as building codes or insurance rates, and provide essential input to every risk assessment application. Several large-scale national and international projects have recently been launched with the aim of improving and harmonizing PSHA standards around the globe. SHARE (www.share-eu.org) is the European Commission funded project in the Framework Programme 7 (FP7) that will create an updated, living seismic hazard model for the Euro-Mediterranean region. SHARE is a regional component of the Global Earthquake Model (GEM, www.globalquakemodel.org), a public/private partnership initiated and approved by the Global Science Forum of the OECD-GSF. GEM aims to be the uniform, independent and open-access standard to calculate and communicate earthquake hazard and risk worldwide. SHARE itself will deliver measurable progress in all steps leading to a harmonized assessment of seismic hazard - in the definition of engineering requirements, in the collection of input data, in procedures for hazard assessment, and in engineering applications. SHARE scientists will create a unified framework and computational infrastructure for seismic hazard assessment and produce an integrated European probabilistic seismic hazard assessment (PSHA) model and specific scenario-based modeling tools. The results will deliver long-lasting structural impact in areas of societal and economic relevance, they will serve as the reference for Eurocode 8 (EC8) application, and they will provide homogeneous input for correct seismic safety assessment of critical industry, such as energy infrastructure and the re-insurance sector. SHARE will cover the whole European territory, the

  7. Improving the Level of Seismic Hazard Parameters in Saudi Arabia Using Earthquake Location and Magnitude Calibration

    NASA Astrophysics Data System (ADS)

    Al-Amri, A. M.; Rodgers, A. J.

    2004-05-01

    Saudi Arabia is an area that is very poorly characterized seismically and for which little existing data are available. While most of it, particularly the Arabian Shield and the Arabian Platform, is aseismic, the area is ringed by regional seismic sources in the tectonically active areas of Iran and Turkey to the northeast, the Red Sea Rift bordering the Shield to the southwest, and the Dead Sea Transform fault zone to the north. Therefore, this paper aims to improve the level of seismic hazard parameters by improving earthquake location and magnitude estimates with the Saudi Arabian National Digital Seismic Network (SANDSN). We analyzed earthquake data, travel times and seismic waveform data from the SANDSN. KACST operates the 38-station SANDSN, consisting of 27 broadband and 11 short-period stations. The SANDSN has good signal detection capabilities because the sites are relatively quiet. Noise surveys at a few stations indicate that seismic noise levels at SANDSN stations are quite low for frequencies between 0.1 and 1.0 Hz; however, cultural noise appears to affect some stations at frequencies above 1.0 Hz. Locations of regional earthquakes estimated by KACST were compared with locations from global bulletins. Large differences between KACST and global catalog locations are likely the result of inadequacies of the global average earth model (iasp91) used by the KACST system. While this model is probably adequate for locating distant (teleseismic) events in continental regions, it leads to large location errors, as much as 50-100 km, for regional events. We present detailed analyses of some events and Dead Sea explosions for which we found gross errors in the estimated locations. Velocity models are presented that should improve the estimated locations of regional events in three specific regions: 1. the Gulf of Aqabah - Dead Sea region, 2. the Arabian Shield, and 3. the Arabian Platform. Recently, these models have been applied to the SANDSN to improve local and teleseismic event locations

  8. Earthquake recordings from the 2002 Seattle Seismic Hazard Investigation of Puget Sound (SHIPS), Washington State

    USGS Publications Warehouse

    Pratt, Thomas L.; Meagher, Karen L.; Brocher, Thomas M.; Yelin, Thomas; Norris, Robert; Hultgrien, Lynn; Barnett, Elizabeth; Weaver, Craig S.

    2003-01-01

    This report describes seismic data obtained during the fourth Seismic Hazard Investigation of Puget Sound (SHIPS) experiment, termed Seattle SHIPS. The experiment was designed to study the influence of the Seattle sedimentary basin on ground shaking during earthquakes. To accomplish this, we deployed seismometers over the basin to record local earthquakes, quarry blasts, and teleseisms during the period of January 26 to May 27, 2002. We plan to analyze the recordings to compute spectral amplitudes at each site, to determine the variability of ground motions over the basin. During the Seattle SHIPS experiment, seismometers were deployed at 87 sites in a 110-km-long east-west line, three north-south lines, and a grid throughout the Seattle urban area (Figure 1). At each of these sites, an L-22, 2-Hz velocity transducer was installed and connected to a REF TEK Digital Acquisition System (DAS), both provided by the Program for Array Seismic Studies of the Continental Lithosphere (PASSCAL) of the Incorporated Research Institutions for Seismology (IRIS). The instruments were installed on January 26 and 27, and were retrieved gradually between April 18 and May 27. All instruments continuously sampled all three components of motion (velocity) at a rate of 50 samples/sec. To ensure accurate computations of amplitude, we calibrated the geophones in situ to obtain the instrument responses. In this report, we discuss the acquisition of these data, describe the processing and merging of these data into 1-hour-long traces and into windowed events, discuss the geophone calibration process and its results, and display some of the earthquake recordings.
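
    A minimal sketch of the planned spectral-amplitude comparison, here as a simple Fourier amplitude spectral ratio between a basin site and a reference site; the traces are synthetic placeholders, not SHIPS recordings.

```python
import numpy as np

def spectral_ratio(basin_trace, rock_trace, dt):
    """Amplitude spectral ratio basin/rock as a crude site-response proxy."""
    freqs = np.fft.rfftfreq(len(basin_trace), d=dt)
    ratio = np.abs(np.fft.rfft(basin_trace)) / (np.abs(np.fft.rfft(rock_trace)) + 1e-12)
    return freqs, ratio

# Synthetic traces at the experiment's 50 samples/s; the "basin" trace is simply
# the "rock" trace amplified by a factor of 3 for illustration.
dt = 1.0 / 50.0
rock = np.random.default_rng(0).normal(size=3000)
basin = 3.0 * rock
freqs, ratio = spectral_ratio(basin, rock, dt)
print(freqs[10], ratio[10])  # ratio is ~3 at every frequency in this toy case
```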

  9. Probabilistic Rockfall Hazard Analysis in the area affected by the Christchurch Earthquakes, New Zealand

    NASA Astrophysics Data System (ADS)

    Frattini, P.; Lari, S.; Agliardi, F.; Crosta, G. B.; Salzmann, H.

    2012-04-01

    To limit damage to human lives and property in the case of natural disasters, land planning and zonation, as well as the design of countermeasures, are fundamental tools, requiring however a rigorous quantitative risk analysis. As a consequence of the 3rd September 2010 (Mw 7.1) Darfield earthquake and the 22nd February (Mw 6.2), 16th April 2011 (Mw 5.3) and 13th June 2011 (Mw 6.2) aftershock events, about 6000 rockfalls were triggered in the Port Hills of Christchurch, New Zealand. Five people were killed by falling rocks in the area, and several hundred homes were damaged or evacuated. In this work, we present a probabilistic rockfall hazard analysis for a small area located on the south-eastern slope of Richmond Hill (0.6 km2, Sumner, Christchurch, NZ). For the analysis, we adopted a new methodology (Probabilistic Rockfall Hazard Analysis, PRHA), which makes it possible to quantify the exceedance probability that a given slope location will be affected by a rockfall event with a specific level of kinetic energy, integrating the contributions of different rockfall magnitude (volume) scenarios. The methodology requires the calculation of onset annual frequency, rockfall runout, and spatially varying kinetic energy. Onset annual frequencies for different magnitude scenarios were derived from a frequency-magnitude relationship adapted from the literature. The probability distribution of kinetic energy for a given slope location and volume scenario was obtained by rockfall runout modeling of non-interacting blocks through the 3D Hy-Stone simulation code. The reference simulation was calibrated by back-analysis of rockfall events that occurred during the earthquakes. For each rockfall magnitude scenario, 20 rockfall trajectories were simulated for each source cell using stochastically variable values of the restitution parameters. Finally, a probabilistic analysis integrating over six rockfall magnitude scenarios (ranging from 0.001 m3 to 1000 m3) was carried out to produce
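
    As a hedged sketch of the scenario integration described above (not the actual PRHA implementation), the code below combines onset frequency, reach probability and conditional energy exceedance for several volume scenarios into an annual exceedance probability for one slope cell; all numbers are hypothetical.

```python
import math

def annual_exceedance_probability(scenarios):
    """Annual probability that a slope cell experiences kinetic energy above a
    chosen threshold, combining rockfall magnitude scenarios. Each scenario is
    (onset_frequency_per_yr, p_reach_cell, p_energy_above_threshold)."""
    rate = sum(freq * p_reach * p_energy
               for freq, p_reach, p_energy in scenarios)
    return 1.0 - math.exp(-rate)  # Poisson assumption on onsets

# Hypothetical scenarios for one cell: small blocks are frequent but rarely reach
# it with high energy; large blocks are rare but almost always exceed the threshold.
scenarios = [
    (0.5,   0.05, 0.10),   # ~0.001 m3 class
    (0.05,  0.20, 0.60),   # ~1 m3 class
    (0.002, 0.60, 0.95),   # ~1000 m3 class
]
print(annual_exceedance_probability(scenarios))
```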

  10. Seismic Hazard Assessment for the Baku City and Absheron Peninsula, Azerbaijan

    SciTech Connect

    Babayev, Gulam R.

    2006-03-23

    This paper deals with the seismic hazard assessment of Baku and the Absheron peninsula. The assessment is based on information on the features of earthquake ground motion excitation, seismic wave propagation (attenuation), and site effects. I analyze active faults, seismicity, soil and rock properties, geological cross-sections, borehole data of measured shear-wave velocity, lithology, the amplification factor of each geological unit, geomorphology, topography, and bedrock and surface ground motions. To estimate peak ground acceleration (PGA) at the surface, PGA at bedrock is multiplied by the amplification parameter of each surface layer. Quaternary soft deposits, representing a high risk due to increased PGA values at the surface, are studied in detail. For a near-zone target earthquake, PGA values are compared to intensity on the MSK-64 scale for the Absheron peninsula. The amplification factor for the city of Baku is assessed and provides estimates of the level of seismic motion and seismic intensity of the study area.
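
    A minimal sketch of the surface-PGA calculation described above; the amplification factors are illustrative placeholders, not the values derived in the paper.

```python
def surface_pga(bedrock_pga_g, layer_amplifications):
    """Surface PGA estimated by scaling bedrock PGA by the amplification
    factors of the overlying geological layers."""
    pga = bedrock_pga_g
    for factor in layer_amplifications:
        pga *= factor
    return pga

# Hypothetical site on Quaternary soft deposits over stiffer sediments.
print(surface_pga(0.15, [1.4, 1.2]))  # 0.15 g at bedrock -> ~0.25 g at the surface
```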

  11. A preliminary regional assessment of earthquake-induced landslide susceptibility for Vrancea Seismic Region

    NASA Astrophysics Data System (ADS)

    Micu, Mihai; Balteanu, Dan; Ionescu, Constantin; Havenith, Hans; Radulian, Mircea; van Westen, Cees; Damen, Michiel; Jurchescu, Marta

    2015-04-01

    In seismically active regions, earthquakes may trigger landslides, enhancing short-to-long-term slope denudation and sediment delivery and conditioning the general landscape evolution. Co-seismic slope failures generally present a low-frequency, high-magnitude pattern, which should be addressed accordingly by landslide hazard assessment relative to the generally more frequent precipitation-triggered landslides. The Vrancea Seismic Region, corresponding to the curvature sector of the Eastern Romanian Carpathians, represents the most active sub-crustal (focal depth > 50 km) earthquake province of Europe. It is the main seismic energy source throughout Romania, with significant transboundary effects recorded as far away as Ukraine and Bulgaria. During the last 300 years, the region featured 14 earthquakes with M > 7, among them seven events with magnitude above 7.5 and three between 7.7 and 7.9. Apart from the direct damage, the Vrancea earthquakes are also responsible for numerous other geohazards, such as ground fracturing, groundwater level disturbances and possible deep-seated landslide occurrences (rock slumps, rock-block slides, rock falls, rock avalanches). The older deep-seated landslides (assumed to have been) triggered by earthquakes usually affect the entire slope profile. They often formed landslide dams, strongly influencing the river morphology and representing potential threats (through flash floods) in case of lake outburst. Despite the large potential of this research issue, the correlation between the region's seismotectonic context and landslide predisposing factors has not yet been fully understood. Presently, the geohazard databases for Vrancea lack the information needed to outline the seismic influence on the triggering of slope failures in this region. We only know that the morphology of numerous large, deep-seated and dormant landslides (which can possibly be reactivated in future

  12. An Arduino project to record ground motion and to learn on earthquake hazard at high school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Barnaba, Carla; Clocchiatti, Marco; Zuliani, David

    2015-04-01

    Through multidisciplinary work that integrates technology education with Earth sciences, we implemented an educational program to raise students' awareness of seismic hazard and to disseminate good practices of earthquake safety. Using free software and low-cost open hardware, the students of a senior class of the high school Liceo Paschini in Tolmezzo (NE Italy) implemented a seismograph using the Arduino open-source electronics platform and ADXL345 sensors to emulate a low-cost seismometer (e.g. the O-NAVI sensor of the Quake-Catcher Network, http://qcn.stanford.edu). To accomplish their task, the students were directed to use web resources for technical support and troubleshooting. Shell scripts, running on local computers under Linux, controlled the recording and display of data. The main part of the experiment was documented in DokuWiki style. Some propaedeutic lessons in computer science and electronics were needed to build up the necessary skills of the students and to fill gaps in their background knowledge. In addition, lectures by seismologists and laboratory activities allowed the class to explore different aspects of the physics of earthquakes, particularly of seismic waves, and to become familiar with the topics of seismic hazard through inquiry-based learning. The resulting Arduino seismograph can be used for educational purposes and can display tremors on the school's local network. It can certainly record the ground motion due to a seismic event occurring in the area, but further improvements are necessary for a quantitative analysis of the recorded signals.

  13. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  14. Earthquakes

    MedlinePlus

  15. Numerical testing of certain features of probabilistic aftershock hazard assessment

    NASA Astrophysics Data System (ADS)

    Gallovic, F.; Brokesova, J.

    2005-12-01

    Probabilistic aftershock hazard assessment (PAHA, Wiemer, 2000), provided for California in the frame of the STEP project, is based on a methodology with several features, two of which are addressed in detail here: 1) the independence of the parameter c in Omori's law from the lower magnitude cut-off, and 2) the application of attenuation relations in the expression for the probability of PGA exceedance. Concerning the first point, in STEP, c is assumed constant with respect to magnitude. However, Shcherbakov et al. (2004) conclude that c scales with the lower magnitude cut-off. We show, using Japanese attenuation relations and four different earthquake models, that this modification changes the hazard curves substantially for very early time intervals (<1 day) after the mainshock. For later times (>1 day), the effect is minimal. As regards the second point, we try to replace attenuation relations and their uncertainties with strong ground motion simulations for a set of scenarios. The main advantage of such an approach is that the simulations account for details of the aftershock source effects (faulting style, slip distribution, position of the nucleation point, etc.). Mean PGAs and their variances are retrieved from the simulations and used for the PAHA analysis at the station under study. The method is tested for the Izmit A25 aftershock (Mw = 5.8) that occurred 26 days after the main shock. The resulting PAHA maps are compared with those obtained by the use of attenuation relations. We conclude that the two types of PAHA maps do not differ significantly provided that equal occurrence probability is assigned to each nucleation point location. However, a possible constraint on this location (e.g., occurrence within areas of positive Coulomb stress change) would change the maps considerably.
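
    A minimal sketch of the exceedance-probability term that both approaches feed into, assuming a generic lognormal ground-motion model; the median and sigma below are placeholders, not the Japanese attenuation relations used in the study.

```python
import math

def prob_pga_exceedance(target_pga_g, median_pga_g, sigma_ln):
    """Probability that PGA exceeds a target level for one event, assuming the
    ground-motion model predicts a lognormal distribution about its median."""
    z = (math.log(target_pga_g) - math.log(median_pga_g)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # 1 - standard normal CDF

# Placeholder values: median prediction of 0.12 g with sigma_ln = 0.6,
# probability of exceeding 0.2 g at the site.
print(prob_pga_exceedance(0.2, 0.12, 0.6))
```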

  16. Were the May 2012 Emilia-Romagna earthquakes induced? A coupled flow-geomechanics modeling assessment

    NASA Astrophysics Data System (ADS)

    Juanes, R.; Jha, B.; Hager, B. H.; Shaw, J. H.; Plesch, A.; Astiz, L.; Dieterich, J. H.; Frohlich, C.

    2016-07-01

    Seismicity induced by fluid injection and withdrawal has emerged as a central element of the scientific discussion around subsurface technologies that tap into water and energy resources. Here we present the application of coupled flow-geomechanics simulation technology to the post mortem analysis of a sequence of damaging earthquakes (Mw = 6.0 and 5.8) in May 2012 near the Cavone oil field, in northern Italy. This sequence raised the question of whether these earthquakes might have been triggered by activities due to oil and gas production. Our analysis strongly suggests that the combined effects of fluid production and injection from the Cavone field were not a driver for the observed seismicity. More generally, our study illustrates that computational modeling of coupled flow and geomechanics permits the integration of geologic, seismotectonic, well log, fluid pressure and flow rate, and geodetic data and provides a promising approach for assessing and managing hazards associated with induced seismicity.

  17. Definition of a short-cut methodology for assessing earthquake-related Na-Tech risk.

    PubMed

    Busini, Valentina; Marzo, Enrico; Callioni, Andrea; Rota, Renato

    2011-08-15

    Na-Tech (Natural and Technological) refers to industrial accidents triggered by natural events such as storms, earthquakes, flooding, and lightning. Herein, a qualitative methodology for the initial assessment of earthquake Na-Tech risk has been developed as a screening tool to identify which situations require a much more expensive Quantitative Risk Analysis (QRA). The proposed methodology identifies, through suitable Key Hazard Indicators (KHIs), the Na-Tech risk level associated with a given situation (i.e., a process plant located in a given territory), using the Analytical Hierarchy Process as a multi-criteria decision tool for the evaluation of such KHIs. The developed methodology was validated by comparing its computational results with QRA results for Na-Tech events previously presented in the literature.
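
    As a hedged sketch of the multi-criteria step mentioned above, the code below computes Analytical Hierarchy Process priority weights from a pairwise comparison matrix by power iteration; the matrix values are made up for illustration and are not the paper's KHIs.

```python
import numpy as np

def ahp_weights(pairwise, iterations=100):
    """Priority weights from an AHP pairwise comparison matrix, approximated by
    the principal eigenvector (power iteration), normalised to sum to 1."""
    a = np.asarray(pairwise, dtype=float)
    w = np.ones(a.shape[0]) / a.shape[0]
    for _ in range(iterations):
        w = a @ w
        w /= w.sum()
    return w

# Hypothetical comparison of three Key Hazard Indicators (values for illustration only).
matrix = [[1.0, 3.0, 5.0],
          [1.0 / 3.0, 1.0, 2.0],
          [1.0 / 5.0, 1.0 / 2.0, 1.0]]
print(ahp_weights(matrix))  # approximately [0.65, 0.23, 0.12]
```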

  18. Lateral spread hazard mapping of the northern Salt Lake Valley, Utah, for a M7.0 scenario earthquake

    USGS Publications Warehouse

    Olsen, M.J.; Bartlett, S.F.; Solomon, B.J.

    2007-01-01

    This paper describes the methodology used to develop a lateral spread displacement hazard map for the northern Salt Lake Valley, Utah, using a scenario M7.0 earthquake occurring on the Salt Lake City segment of the Wasatch fault. The mapping effort is supported by a substantial amount of geotechnical, geologic, and topographic data compiled for the Salt Lake Valley, Utah. ArcGIS routines created for the mapping project then input this information to perform site-specific lateral spread analyses, using methods developed by Bartlett and Youd (1992) and Youd et al. (2002), at individual borehole locations. The distributions of predicted lateral spread displacements from the boreholes located spatially within a geologic unit were subsequently used to map the hazard for that particular unit. The mapped displacement zones consist of low hazard (0-0.1 m), moderate hazard (0.1-0.3 m), high hazard (0.3-1.0 m), and very high hazard (> 1.0 m). As expected, the resulting map shows the highest hazard in the alluvial deposits at the center of the valley and in sandy deposits close to the fault. This mapping effort is currently being applied to the southern part of the Salt Lake Valley, Utah, and probabilistic maps are being developed for the entire valley. © 2007, Earthquake Engineering Research Institute.
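
    A minimal sketch of the zone classification used in the map described above; the thresholds are taken from the abstract, while the displacement value itself would come from the Bartlett and Youd regression models, which are not reproduced here.

```python
def hazard_zone(displacement_m):
    """Classify a predicted lateral spread displacement into the map's hazard zones."""
    if displacement_m <= 0.1:
        return "low (0-0.1 m)"
    if displacement_m <= 0.3:
        return "moderate (0.1-0.3 m)"
    if displacement_m <= 1.0:
        return "high (0.3-1.0 m)"
    return "very high (> 1.0 m)"

print(hazard_zone(0.45))  # -> "high (0.3-1.0 m)"
```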

  19. Seismic Hazard and risk assessment for Romania -Bulgaria cross-border region

    NASA Astrophysics Data System (ADS)

    Simeonova, Stela; Solakov, Dimcho; Alexandrova, Irena; Vaseva, Elena; Trifonova, Petya; Raykova, Plamena

    2016-04-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic hazard and vulnerability to earthquakes are steadily increasing as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. The assessment of seismic hazard and risk is particularly important because it provides valuable information for seismic safety and disaster mitigation and supports decision making for the benefit of society. Romania and Bulgaria, situated in the Balkan region as part of the Alpine-Himalayan seismic belt, are characterized by high seismicity and are exposed to a high seismic risk. Over the centuries, both countries have experienced strong earthquakes. The cross-border region encompassing northern Bulgaria and southern Romania is a territory prone to the effects of strong earthquakes. The area is significantly affected by earthquakes that occurred in both countries: on the one hand, the events generated by the Vrancea intermediate-depth seismic source in Romania, and on the other, the crustal seismicity originating in the Shabla (SHB), Dulovo and Gorna Orjahovitza (GO) seismic sources in Bulgaria. The Vrancea seismogenic zone of Romania is a very peculiar seismic source, often described as unique in the world, and it represents a major concern for most of the northern part of Bulgaria as well. In the present study, the seismic hazard for the Romania-Bulgaria cross-border region is assessed on the basis of integrated basic geo-datasets. The hazard results are obtained by applying two alternative approaches - probabilistic and deterministic. The MSK64 intensity (the MSK64 scale is practically equal to the new EMS98) is used as the output parameter for the hazard maps. We prefer to use the macroseismic intensity here instead of PGA, because it is directly related to the degree of damage and, moreover, the epicentral intensity is the original

  20. Development of Rapid Earthquake Loss Assessment Methodologies for Euro-Med Region

    NASA Astrophysics Data System (ADS)

    Erdik, M.

    2009-04-01

    For near-real-time estimation of the ground shaking and losses after a major earthquake in the Euro-Mediterranean region, the JRA-3 component of the EU project entitled "Network of Research Infrastructures for European Seismology, NERIES" foresees: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic database, supported, if and when possible, by the estimation of fault rupture parameters from rapid inversion of data from on-line regional broadband stations. 2. Estimation of the spatial distribution of selected ground motion parameters at engineering bedrock through region-specific ground motion attenuation relationships and/or actual physical simulation of ground motion. 3. Estimation of the spatial distribution of site-specific selected ground motion parameters using a regional geology (or urban geotechnical information) database and appropriate amplification models. 4. Estimation of the losses and uncertainties at various orders of sophistication (buildings, casualties). The main objective of the JRA-3 work package is to develop a methodology for real-time estimation of losses after a major earthquake in the Euro-Mediterranean region. The multi-level methodology being developed together with researchers from Imperial College, NORSAR and ETH-Zurich is capable of incorporating regional variabilities and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. A comprehensive methodology has been developed and the related software, ELER, is under preparation. The applications of the ELER software are presented in the following two accompanying papers: 1. Regional Earthquake Shaking and Loss Estimation; 2. Urban Earthquake Shaking and Loss Assessment.
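
    An illustrative sketch of steps 2 and 3 above: a bedrock ground-motion estimate from a generic attenuation (GMPE-like) functional form, scaled by a site amplification factor. The coefficients and amplification values are hypothetical placeholders, not the region-specific relations used in ELER.

        import math

        def bedrock_pga(magnitude, distance_km, c0=-1.5, c1=0.5, c2=1.1, c3=10.0):
            # Generic form: ln(PGA [g]) = c0 + c1*M - c2*ln(R + c3); coefficients
            # are hypothetical, for illustration only.
            return math.exp(c0 + c1 * magnitude - c2 * math.log(distance_km + c3))

        def surface_pga(pga_rock, site_class):
            # Scale bedrock PGA by an assumed site amplification factor.
            amplification = {"rock": 1.0, "stiff soil": 1.4, "soft soil": 1.8}
            return pga_rock * amplification[site_class]

        pga_rock = bedrock_pga(magnitude=6.5, distance_km=20.0)
        print(round(surface_pga(pga_rock, "soft soil"), 3), "g (illustrative)")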

  1. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential of threatening life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was included in the World Heritage List by UNESCO in 1985 due to its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been addressed in many previous studies, but there are limited studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years for bedrock. In this connection, the seismic hazard at these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes is prepared and used in this study. The seismic sources have been identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for different seismic sources using available attenuation relationships applicable to Turkey. The result of the present study reveals that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach

  2. Volcanic hazard assessment at Deception Island

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Sobradelo, R.; Geyer, A.; Martí, J.

    2012-04-01

    Deception Island is the most active volcano of the South Shetland Islands (Antarctica), with more than twenty eruptions recognised over the past two centuries. The island was formed on the expansion axis of the Central Bransfield Strait and its evolution consists of constructive and destructive phases. An initial shield phase was followed by the construction of a central edifice and formation of the caldera, with final monogenetic volcanism along the caldera rim. The post-caldera magma composition varies from andesitic-basaltic to dacitic. The activity is characterised by monogenetic eruptions of low volume and short duration. The eruptions show a variable degree of explosivity, strombolian or phreatomagmatic, with a VEI of 2 to 4, and have generated a wide variety of pyroclastic deposits and lavas. It is remarkable how many phases of phreatic explosive eruptions are associated with the emission of large ballistic blocks. The tephra record preserved in the glacier ice of Livingston Island and in marine sediments shows the explosive power of the phreatomagmatic phases and the wide dispersal of their finest products along a great variety of prevailing wind directions. It is also important to highlight the presence of different lahar deposits associated with some of these eruptions. In this contribution we present the guidelines to conduct a short-term and long-term volcanic hazard assessment at Deception Island. We apply probabilistic methods to estimate the susceptibility, statistical techniques to determine the eruption recurrence and eruptive scenarios, and we also reproduce the effects of historical eruptions. Volcanic hazard maps and scenarios are obtained using a VORIS-based model tool (Felpeto et al., 2007) in a free Geographical Information System (GIS), Quantum GIS.

  3. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.

  4. Assessment of the impact of strong earthquakes on the global economy by the example of the Tohoku event

    NASA Astrophysics Data System (ADS)

    Tatiana, Skufina; Peter, Skuf'in; Sergey, Baranov; Vera, Samarina; Taisiya, Shatalova

    2016-04-01

    We examine the economic consequences of strong earthquakes using the example of the M9 Tohoku earthquake that occurred on March 11, 2011, off the northeast coast of Honshu, Japan. This earthquake was the strongest in the whole history of seismological observations in this part of the planet. The generated tsunami killed more than 15,700 people and damaged 332,395 buildings and 2,126 roads. The total economic loss in Japan was estimated at US$309 billion. The catastrophe in Japan also impacted the global economy. To estimate its impact, we used regional and global stock indexes, production indexes, stock prices of the main Japanese, European and US companies, import and export dynamics, as well as data provided by the customs service of Japan. We also demonstrated that the catastrophe substantially affected the markets and that, in the short run, its effect on some indicators even exceeded that of the global financial crisis of 2008. The recent strong earthquakes in Nepal (25.04.2015, M7.8) and Chile (16.09.2015, M8.3) have both renewed the relevance of cost assessments of the overall economic impact of seismic hazard. We conclude that it is necessary to treat strong earthquakes as a very important factor that affects the world economy depending on their location. The research was supported by the Russian Foundation for Basic Research (Project 16-06-00056A).

  5. The key role of eyewitnesses in rapid earthquake impact assessment

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

    Uncertainties in rapid earthquake impact models are intrinsically large, even when excluding potential indirect losses (fires, landslides, tsunami…). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building type inventory and vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty are of comparable dimension, about 10-15 km. When such an earthquake strikes close to an urban area, as in 1999 in Athens (M5.9), earthquake location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in 2002 in Molise, Italy (M5.7), in Bingol, Turkey (M6.4) in 2003, or in Christchurch, New Zealand (M6.3), where respectively 23 out of 30, 84 out of 176 and 115 out of 185 of the casualties perished in a single building failure. In contrast, for major earthquakes (M>7), the point-source approximation is no longer valid, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral, bilateral, etc., and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake's occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  6. OpenQuake, a platform for collaborative seismic hazard and risk assessment

    NASA Astrophysics Data System (ADS)

    Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben

    2013-04-01

    Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practices, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide; from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical training sessions were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue over the coming years. Other tools being developed that are of direct interest to the hazard community are: • OpenQuake Modeller; fundamental

  7. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    ,

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

  8. Seismic Hazard Assessment for Western Kentucky, Northeastern Kentucky and Southeastern Ohio

    SciTech Connect

    Cobb, James C; Wang, Zhenming; Woolery, Edward W; Kiefer, John D

    2002-07-01

    Earthquakes pose seismic hazards and risk to the Commonwealth of Kentucky. Furthermore, the seismic hazards and risk vary throughout the Commonwealth. The US Nuclear Regulatory Commission uses the seismic hazard maps developed by the US Geological Survey for seismic safety regulation for nuclear facilities. Under the current US Geological Survey seismic hazard assessment, it is economically unfeasible to build a new uranium plant near Paducah relative to the Portsmouth, Ohio, site. This is not to say that the facility cannot be safely engineered to withstand the present seismic load, but it would be enormously expensive to do so. More than 20 years of observations and research at UK have shown that the US Geological Survey has overestimated seismic hazards in western Kentucky, particularly in the Jackson Purchase area that includes Paducah. Furthermore, our research indicates that seismic hazards are underestimated in northeastern Kentucky and southeastern Ohio. Such overestimation and underestimation could jeopardize possible site selection of PGDP for the new uranium plant. The existing database, research experience, and expertise in UK's Kentucky Geological Survey and Department of Geological Science put this institution in a unique position to conduct a comprehensive seismic hazard evaluation.

  9. Seismic hazard assessments for European nuclear power plants: a review based on the results of the ENSREG Stress Tests

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Hintersberger, Esther

    2015-04-01

    In the aftermath of the Fukushima Daiichi accident, ENSREG and the European Commission reviewed the seismic safety of all European nuclear plants on the basis of a comprehensive and transparent risk and safety assessment ('Stress Tests'). This process resulted in the publication of a large amount of data describing the approaches, methods and results previously used to assess seismic hazards for European NPPs (http://www.ensreg.eu/eu-stress-tests). A review of the published documents reveals considerable differences between the approaches to seismic hazard assessment. Most of the EU countries use probabilistic or a combination of probabilistic and deterministic approaches to estimate hazard. A second group of countries relies on deterministic assessments. Reports from countries adopting probabilistic hazard assessment methodologies reveal a spread of exceedance frequencies defining the design base earthquake (DBE) between 10⁻³ and 10⁻⁵ per year, with a majority of countries referring to a frequency of 10⁻⁴. Deterministic approaches use the maximum earthquake intensities to define the DBE, mostly adding 1° of intensity as a safety margin. In very few cases only 0.5° or even no safety margin was added to the strongest intensity. The hazard levels obtained from the two types of analyses are not comparable to each other, as no benchmark studies appear to exist that define the occurrence probabilities of DBE values established by deterministic methods. The Stress Tests documents do not allow for an in-depth check of the hazard assessments. Assessments for different countries/sites have been performed between the 1970s and 2011. Although it is conceded that all assessments were performed according to the state of the art at the time of their performance, only a part of the hazard assessments can be justified as being compliant with current scientific standards. Due to the time elapsed since their implementation several decades ago, some assessments do not take advantage of

  10. From Seismic Scenarios to Earthquake Risk Assessment: A Case Study for Iquique, Chile.

    NASA Astrophysics Data System (ADS)

    Aguirre, P.; Fortuno, C.; Martin, J. C. D. L. L.; Vasquez, J.

    2015-12-01

    Iquique is a strategic city and economic center in northern Chile, and is located in a large seismic gap where a megathrust earthquake and tsunami are expected. Although it was hit by a Mw 8.2 earthquake on April 1st 2014, which caused moderate damage, geophysical evidence still suggests that there is potential for a larger event, so a thorough risk assessment is key to understanding the physical, social, and economic effects of such a potential event and to devising appropriate mitigation plans. Hence, Iquique has been selected as a prime study case for the implementation of a risk assessment platform in Chile. Our study integrates research on three main elements of risk calculations: hazard evaluation, the exposure model, and physical vulnerabilities. To characterize the hazard field, a set of synthetic seismic scenarios has been developed based on plate interlocking and the residual slip potential that results from subtracting the slip that occurred during the April 1st 2014 rupture, obtained using InSAR+GPS inversion. Additional scenarios were developed based on the fault rupture model of the Maule 2010 Mw 8.8 earthquake and on local plate locking models in northern Chile. These rupture models define a collection of possible realizations of earthquake geometries parameterized in terms of critical variables like slip magnitude, rise time, mean propagation velocity, directivity, and others, which are propagated to obtain a hazard map for Iquique (e.g. PGA, PGV, PGD). Furthermore, a large body of public and local data was used to construct a detailed exposure model for Iquique, including aggregated building counts, demographics, essential facilities, and lifelines. This model, together with the PGA maps for the April 1st 2014 earthquake, is used to calibrate HAZUS outputs against observed damage and to adjust the fragility curves of physical systems according to more detailed analyses of typical Chilean building types and their structural properties, plus historical
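
    A hedged sketch of the fragility-curve element mentioned above: a lognormal fragility gives the probability of reaching a damage state for a given PGA. The median and dispersion values below are hypothetical placeholders, not calibrated HAZUS or Chilean parameters.

        import math

        def fragility(pga_g, median_g, beta):
            # P(damage state reached | PGA) for a lognormal fragility curve.
            z = (math.log(pga_g) - math.log(median_g)) / beta
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        # Hypothetical "moderate damage" curve for an illustrative building class.
        for pga in (0.1, 0.3, 0.6):
            print(pga, "g ->", round(fragility(pga, median_g=0.4, beta=0.6), 2))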

  11. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery (see figure). In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades.

  12. Persistency of rupture directivity in moderate-magnitude earthquakes in Italy: Implications for seismic hazard

    NASA Astrophysics Data System (ADS)

    Rovelli, A.; Calderoni, G.

    2012-12-01

    to L'Aquila faults, was characterized by normal-faulting earthquakes with strike substantially parallel to the Apennine trend. Although the amount of data is not as abundant as for the most recent earthquakes, the available data were already the object of previous studies indicating unilateral rupture propagation in several of the strongest (5.5 < Mw < 6.0) shocks. We show that the effect of directivity is particularly significant in intermontane basins, where long-period (T > 1 sec) ground motions are amplified by soft sediments; the combination of local amplification with source directivity causes spectral ordinates at those periods to exceed the expected values of commonly used GMPEs for soft sites by more than 2 standard deviations. These results raise a concern in terms of seismic hazard because source directivity is found to be a recurrent feature in the Apennines. Moreover, the predominant fault strike and the intermontane basins are both aligned along the Apennine chain, offering a condition potentially favorable to extra amplification at periods relevant to seismic risk.
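
    A minimal sketch of the comparison implied above: an observed long-period spectral ordinate is expressed as a number of standard deviations (epsilon) from a GMPE median in natural-log units, and exceedances above 2 are flagged. The observed and predicted values and the sigma are hypothetical, not taken from the study's GMPEs.

        import math

        def epsilon(sa_observed, sa_predicted_median, sigma_ln):
            # Normalized residual in GMPE log space.
            return (math.log(sa_observed) - math.log(sa_predicted_median)) / sigma_ln

        records = [("basin site A", 0.35, 0.10), ("rock site B", 0.08, 0.07)]
        for name, sa_obs, sa_med in records:
            eps = epsilon(sa_obs, sa_med, sigma_ln=0.6)
            print(name, round(eps, 2), "exceeds 2 sigma" if eps > 2 else "")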

  13. GRC Payload Hazard Assessment: Supporting the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Schoren, William R.; Zampino, Edward J.

    2004-01-01

    A hazard assessment was conducted on the GRC-managed payloads in support of a NASA Headquarters Code Q request to examine STS-107 payloads and determine if they were credible contributors to the Columbia accident. This assessment utilized each payload's Final Flight Safety Data Package for hazard identification. An applicability assessment was performed and most of the hazards were eliminated because they dealt with payload operations or crew interactions. A fault tree was developed for all the hazards deemed applicable, and the safety verification documentation was reviewed for these applicable hazards. At the completion of this hazard assessment, it was concluded that none of the GRC-managed payloads were credible contributors to the Columbia accident.

  14. Miscellaneous High-Resolution Seismic Imaging Investigations in Salt Lake and Utah Valleys for Earthquake Hazards

    USGS Publications Warehouse

    Stephenson, W.J.; Williams, R.A.; Odum, J.K.; Worley, D.M.

    2007-01-01

    Introduction In support of earthquake hazards and ground motion studies by researchers at the Utah Geological Survey, University of Utah, Utah State University, Brigham Young University, and San Diego State University, the U.S. Geological Survey Geologic Hazards Team Intermountain West Project conducted three high-resolution seismic imaging investigations along the Wasatch Front between September 2003 and September 2005. These three investigations include: (1) a proof-of-concept P-wave minivib reflection imaging profile in south-central Salt Lake Valley, (2) a series of seven deep (as deep as 400 m) S-wave reflection/refraction soundings using an S-wave minivib in both Salt Lake and Utah Valleys, and (3) an S-wave (and P-wave) investigation to 30 m at four sites in Utah Valley and at two previously investigated S-wave (Vs) minivib sites. In addition, we present results from a previously unpublished downhole S-wave investigation conducted at four sites in Utah Valley. The locations for each of these investigations are shown in figure 1. Coordinates for the investigation sites are listed in Table 1. With the exception of the P-wave common mid-point (CMP) reflection profile, whose end points are listed, these coordinates are for the midpoint of each velocity sounding. Vs30 and Vs100, also shown in Table 1, are defined as the average shear-wave velocities to depths of 30 and 100 m, respectively, and details of their calculation can be found in Stephenson and others (2005). The information from these studies will be incorporated into components of the urban hazards maps along the Wasatch Front being developed by the U.S. Geological Survey, Utah Geological Survey, and numerous collaborating research institutions.
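
    A small sketch of the time-averaged velocity definition used above: Vs30 (or Vs100) is the target depth divided by the total shear-wave travel time through the layers down to that depth. The layer model below is hypothetical, not one of the reported soundings.

        def time_averaged_vs(layers, target_depth_m):
            # layers: list of (thickness_m, vs_m_per_s), ordered from the surface.
            depth, travel_time = 0.0, 0.0
            for thickness, vs in layers:
                use = min(thickness, target_depth_m - depth)
                if use <= 0:
                    break
                travel_time += use / vs
                depth += use
            return depth / travel_time

        profile = [(5.0, 180.0), (10.0, 250.0), (20.0, 400.0)]  # hypothetical
        print("Vs30 =", round(time_averaged_vs(profile, 30.0), 1), "m/s")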

  15. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  16. Using a geographic information system (GIS) to assess pediatric surge potential after an earthquake.

    PubMed

    Curtis, Jacqueline W; Curtis, Andrew; Upperman, Jeffrey S

    2012-06-01

    Geographic information systems (GIS) and geospatial technology (GT) can help hospitals improve plans for postdisaster surge by assessing numbers of potential patients in a catchment area and providing estimates of special needs populations, such as pediatrics. In this study, census-derived variables are computed for blockgroups within a 3-mile radius from Children's Hospital Los Angeles (CHLA) and from Los Angeles County-University of Southern California (LAC-USC) Medical Center. Landslide and liquefaction zones are overlaid on US Census Bureau blockgroups. Units that intersect with the hazard zones are selected for computation of pediatric surge potential in case of an earthquake. In addition, cartographic visualization and cluster analysis are performed on the entire 3-mile study area to identify hot spots of socially vulnerable populations. The results suggest the need for locally specified vulnerability models for pediatric populations. GIS and GT have untapped potential to contribute local specificity to planning for surge potential after a disaster. Although this case focuses on an earthquake hazard, the methodology is appropriate for an all-hazards approach. With the advent of Google Earth, GIS output can now be easily shared with medical personnel for broader application and improvement in planning.
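
    A hedged sketch of the overlay and selection workflow described above, written with the open-source geopandas library rather than the desktop GIS used in the study. File names, the projection, and the column name "pop_under18" are hypothetical; the buffer distance approximates the 3-mile catchment in a metric projection.

        import geopandas as gpd

        blockgroups = gpd.read_file("blockgroups.shp").to_crs(epsg=26911)
        hazards = gpd.read_file("liquefaction_landslide_zones.shp").to_crs(epsg=26911)
        hospital = gpd.read_file("hospital_point.shp").to_crs(epsg=26911)

        # Keep block groups within ~3 miles (about 4828 m) of the hospital.
        catchment = hospital.buffer(4828).iloc[0]
        nearby = blockgroups[blockgroups.intersects(catchment)]

        # Select those that also intersect a hazard zone; drop duplicates created
        # when a block group touches more than one zone, then sum pediatric counts.
        at_risk = gpd.sjoin(nearby, hazards, predicate="intersects")
        at_risk = at_risk[~at_risk.index.duplicated()]
        print("Pediatric surge potential:", int(at_risk["pop_under18"].sum()))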

  17. Significant earthquakes on the Enriquillo fault system, Hispaniola, 1500-2010: Implications for seismic hazard

    USGS Publications Warehouse

    Bakun, William H.; Flores, Claudia H.; ten Brink, Uri S.

    2012-01-01

    Historical records indicate frequent seismic activity along the north-east Caribbean plate boundary over the past 500 years, particularly on the island of Hispaniola. We use accounts of historical earthquakes to assign intensities and the intensity assignments for the 2010 Haiti earthquakes to derive an intensity attenuation relation for Hispaniola. The intensity assignments and the attenuation relation are used in a grid search to find source locations and magnitudes that best fit the intensity assignments. Here we describe a sequence of devastating earthquakes on the Enriquillo fault system in the eighteenth century. An intensity magnitude MI 6.6 earthquake in 1701 occurred near the location of the 2010 Haiti earthquake, and the accounts of the shaking in the 1701 earthquake are similar to those of the 2010 earthquake. A series of large earthquakes migrating from east to west started with the 18 October 1751 MI 7.4–7.5 earthquake, probably located near the eastern end of the fault in the Dominican Republic, followed by the 21 November 1751 MI 6.6 earthquake near Port-au-Prince, Haiti, and the 3 June 1770 MI 7.5 earthquake west of the 2010 earthquake rupture. The 2010 Haiti earthquake may mark the beginning of a new cycle of large earthquakes on the Enriquillo fault system after 240 years of seismic quiescence. The entire Enriquillo fault system appears to be seismically active; Haiti and the Dominican Republic should prepare for future devastating earthquakes.

  18. A Revised Evaluation of Tsunami Hazards along the Chinese Coast in View of the Tohoku-Oki Earthquake

    NASA Astrophysics Data System (ADS)

    Jing, Huimin Helen; Zhang, Huai; Yuen, David A.; Shi, Yaolin

    2013-01-01

    Japan's 2011 Tohoku-Oki earthquake and the accompanying tsunami have reminded us of the potential tsunami hazards from the Manila and Ryukyu trenches to the South China and East China Seas. Statistics of historical seismic records from nearly the last 4 decades have shown that major earthquakes do not necessarily agree with the local Gutenberg-Richter relationship. The probability of a mega-earthquake may be higher than we have previously estimated. Furthermore, we noted that the percentage of tsunami-associated earthquakes is much higher for major events, and earthquakes with magnitudes equal to or greater than 8.8 have all triggered tsunamis in approximately the past 100 years. We emphasize the importance of a thorough study of possible tsunami scenarios for hazard mitigation. We focus on several hypothetical earthquake-induced tsunamis caused by Mw 8.8 events along the Manila and Ryukyu trenches. We carried out numerical simulations based on the shallow-water equations (SWE) to predict the tsunami dynamics in the South China and East China Seas. By analyzing the computed results we found that the height of the potential surge in China's coastal area caused by earthquake-induced tsunamis may reach a couple of meters. Our preliminary results show that tsunamis generated in the Manila and Ryukyu trenches could pose a significant threat to Chinese coastal cities such as Shanghai, Hong Kong and Macao. However, we did not find the highest tsunami wave at Taiwan, partially because it lies right on the extension of an assumed fault line. Furthermore, we put forward a multi-scale model with higher resolution, which enabled us to investigate the edge waves diffracted around Taiwan Island in closer view.

  19. Seismic hazard and risk assessment for large Romanian dams situated in the Moldavian Platform

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Popescu, Emilia; Otilia Placinta, Anica; Petruta Constantin, Angela; Toma Danila, Dragos; Borleanu, Felix; Emilian Toader, Victorin; Moldoveanu, Traian

    2016-04-01

    Besides periodic technical inspections and the monitoring and surveillance of dam-related structures and infrastructure, there are additional seismic-specific requirements for dam safety. The most important one is the seismic risk assessment, which can be accomplished by rating the dams into seismic risk classes using the theory of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam site - values obtained using probabilistic hazard assessment approaches (Moldovan et al., 2008) - the vulnerability of the structures, and the downstream risk characteristics (human, economic, historic and cultural heritage, etc.) in the areas that might be flooded in the case of a dam failure. Probabilistic seismic hazard (PSH), vulnerability and risk studies will be carried out for dams situated in the Moldavian Platform, starting from the Izvorul Muntelui Dam and continuing downstream along the Bistrita and Siret rivers and their tributaries. The most vulnerable dams will be studied in detail and flooding maps will be drawn to find the most exposed downstream localities, both for risk assessment studies and for warnings. GIS maps that clearly indicate potentially flooded areas are sufficient for these studies, giving information on the number of inhabitants and the goods that may be destroyed. The topography included in geospatial servers is sufficient to produce them; further studies are not necessary for downstream risk assessment. The results will consist of local and regional seismic information, dam-specific characteristics and locations, seismic hazard maps and risk classes for all dam sites (more than 30 dams), inundation maps (for the most vulnerable dams in the region) and the localities possibly affected. The studies realized in this paper have as their final goal to provide the local emergency services with warnings of a potential dam failure and ensuing flood as a result of a large earthquake occurrence, allowing further
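
    An illustrative sketch only of the rating idea described above: a generic weighted index combining site hazard, structural vulnerability and downstream exposure into a qualitative class. This is not the published Bureau and Ballentine weighting; the factor weights, normalization and thresholds below are hypothetical.

        def risk_class(pga_g, vulnerability, downstream_exposure):
            # vulnerability and downstream_exposure scored 0-1 (hypothetical scales).
            hazard = min(pga_g / 0.5, 1.0)           # normalize PGA against 0.5 g
            index = 0.4 * hazard + 0.3 * vulnerability + 0.3 * downstream_exposure
            if index < 0.3:
                return "low"
            elif index < 0.6:
                return "moderate"
            return "high"

        print(risk_class(pga_g=0.32, vulnerability=0.7, downstream_exposure=0.9))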

  20. Assessment of liquefaction potential during earthquakes by Arias intensity

    USGS Publications Warehouse

    Kayen, R.E.; Mitchell, J.K.

    1997-01-01

    An Arias intensity approach to assess the liquefaction potential of soil deposits during earthquakes is proposed, using an energy-based measure of the severity of earthquake-shaking recorded on seismograms of the two horizontal components of ground motion. Values representing the severity of strong motion at depth in the soil column are associated with the liquefaction resistance of that layer, as measured by in situ penetration testing (SPT, CPT). This association results in a magnitude-independent boundary that envelopes initial liquefaction of soil in Arias intensity-normalized penetration resistance space. The Arias intensity approach is simple to apply and has proven to be highly reliable in assessing liquefaction potential. The advantages of using Arias intensity as a measure of earthquake-shaking severity in liquefaction assessment are: Arias intensity is derived from integration of the entire seismogram wave form, incorporating both the amplitude and duration elements of ground motion; all frequencies of recorded motion are considered; and Arias intensity is an appropriate measure to use when evaluating field penetration test methodologies that are inherently energy-based. Predictor equations describing the attenuation of Arias intensity as a function of earthquake magnitude and source distance are presented for rock, deep-stiff alluvium, and soft soil sites.
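
    A sketch of the Arias intensity definition underlying the approach above: Ia = (pi / 2g) times the integral of the squared acceleration over the record, integrated numerically here from a synthetic accelerogram standing in for recorded motion.

        import numpy as np

        def arias_intensity(accel_m_s2, dt, g=9.81):
            # Arias intensity (m/s) from an acceleration time series in m/s^2.
            return np.pi / (2.0 * g) * np.trapz(accel_m_s2 ** 2, dx=dt)

        # Synthetic accelerogram: a decaying sinusoid, for illustration only.
        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        accel = 2.0 * np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 1.5 * t)
        print("Ia =", round(arias_intensity(accel, dt), 3), "m/s")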

  1. Tsunami hazard assessment and monitoring for the Black Sea area

    NASA Astrophysics Data System (ADS)

    Partheniu, Raluca; Ionescu, Constantin; Constantin, Angela; Moldovan, Iren; Diaconescu, Mihail; Marmureanu, Alexandru; Radulian, Mircea; Toader, Victorin

    2016-04-01

    NIEP has recently improved its research on tsunamis in the Black Sea. As part of the routine earthquake and tsunami monitoring activity, the first tsunami early-warning system in the Black Sea was implemented in 2013 and has been active over the last few years. In order to monitor the seismic activity of the Black Sea, NIEP uses a total of 114 real-time stations and 2 seismic arrays, 18 of the stations being located in the Dobrogea area, situated in the vicinity of the Romanian Black Sea shoreline. Moreover, there is a data exchange with the countries surrounding the Black Sea involving the acquisition of real-time data for 17 stations from Bulgaria, Turkey, Georgia and Ukraine. This improves the capability of the Romanian Seismic Network to monitor and more accurately locate the earthquakes occurring in the Black Sea area. For tsunami monitoring and warning, 6 sea-level monitoring stations, 1 infrasound barometer, 3 offshore marine buoys and 7 GPS/GNSS stations are installed in different locations along and near the Romanian shoreline. In the framework of the ASTARTE project, a few objectives regarding the seismic hazard and tsunami wave height assessment for the Black Sea were accomplished. The seismic hazard estimation was based on statistical studies of the seismic sources and their characteristics, compiled using different seismic catalogues. Two probabilistic methods were used for the evaluation of the seismic hazard: the Cornell method, based on the Gutenberg-Richter distribution parameters, and the Gumbel method, based on extreme-value statistics. The results show the maximum values of possible magnitudes and their recurrence periods for each seismic source. Using the Tsunami Analysis Tool (TAT) software, a set of tsunami modelling scenarios has been generated for the Shabla area, the seismic source that could most affect the Romanian shore. These simulations are structured in a database, in order to set the maximum possible tsunami waves that could be
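
    A minimal sketch of the Gutenberg-Richter step behind the Cornell approach mentioned above: the b-value is estimated from a catalogue with the Aki (1965) maximum-likelihood formula and used to report the implied mean recurrence of a target magnitude. The magnitudes, completeness level and catalogue duration are hypothetical, not the Black Sea source parameters.

        import math

        def aki_b_value(magnitudes, completeness_mag, bin_width=0.1):
            mean_m = sum(magnitudes) / len(magnitudes)
            return math.log10(math.e) / (mean_m - (completeness_mag - bin_width / 2.0))

        mags = [4.1, 4.3, 4.0, 4.6, 5.0, 4.2, 4.4, 4.8, 5.3, 4.1]  # M >= Mc only
        mc, years = 4.0, 50.0
        b = aki_b_value(mags, mc)
        a = math.log10(len(mags) / years) + b * mc      # annual a-value at Mc
        rate_m6 = 10.0 ** (a - b * 6.0)                 # annual rate of M >= 6
        print("b =", round(b, 2), "| mean recurrence of M>=6 ~", round(1.0 / rate_m6), "yr")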

  2. Pre-earthquake assessment and recovery planning for the regional transportation system in the San Francisco Bay area

    SciTech Connect

    Perkins, J.B.

    1995-12-31

    In May 1995, ABAG began a cooperative project with Caltrans District 4 to perform a vulnerability analysis of the regional transportation system in the San Francisco Bay Area. This assessment will be used for pre-earthquake planning to speed the recovery process for the transportation system, including both freeways and local roads. The project is using geographic information system (GIS) technology and computer simulation models to assist in the vulnerability analyses, assessment of hazard mitigation strategies, and pre-earthquake planning activities. It is expected that this project will result in improving post-earthquake short-term emergency response as well as in shortening the time for long-term recovery. In addition, this innovative and timely approach should be applicable to other large metropolitan areas of the state, as well as to other metropolitan areas in the nation.

  3. Seismic Hazard and Risk Assessments for Beijing-Tianjin-Tangshan, China, Area

    USGS Publications Warehouse

    Xie, F.; Wang, Z.; Liu, J.

    2011-01-01

    Seismic hazard and risk in the Beijing-Tianjin-Tangshan, China, area were estimated from 500-year intensity observations. First, we digitized the intensity observations (maps) using ArcGIS with a cell size of 0.1° × 0.1°. Second, we performed a statistical analysis on the digitized intensity data, determined an average b value (0.39), and derived the intensity-frequency relationship (hazard curve) for each cell. Finally, based on a Poisson model for earthquake occurrence, we calculated seismic risk in terms of a probability of I ≥ 7, 8, or 9 in 50 years. We also calculated the corresponding 10 percent probability of exceedance of these intensities in 50 years. The advantages of assessing seismic hazard and risk from intensity records are that (1) fewer assumptions (i.e., on earthquake sources and ground motion attenuation) are made, and (2) site effects are included. Our study shows that the area has high seismic hazard and risk. Our study also suggests that the current design peak ground acceleration or intensity for the area may not be adequate. © 2010 Birkhäuser / Springer Basel AG.
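
    A sketch of the final step described above: given an annual rate of shaking at or above a target intensity (taken from a per-cell intensity-frequency relationship), the Poisson model gives the probability of exceedance in a 50-year window. The annual rate below is a hypothetical cell value; a rate of 1/475 per year corresponds to roughly 10 percent in 50 years.

        import math

        def poisson_exceedance_probability(annual_rate, window_years=50.0):
            return 1.0 - math.exp(-annual_rate * window_years)

        annual_rate_I7 = 1.0 / 475.0   # hypothetical: I >= 7 about every 475 years
        p50 = poisson_exceedance_probability(annual_rate_I7)
        print("P(I >= 7 in 50 yr) =", round(p50, 3))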

  4. Volcanic-hazards assessments; past, present, and future

    USGS Publications Warehouse

    Crandell, D.R.

    1991-01-01

    Worldwide interest in volcanic-hazards assessments was greatly stimulated by the 1980 eruption of Mount St. Helens, just 2 years after a hazards assessment of the volcano was published in U.S. Geological Survey Bulletin 1383-C. Many aspects of that activity had been anticipated, including the climactic eruption on May 18, although the extent of the unprecedented and devastating lateral blast was not anticipated.

  5. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can be appropriately used for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  6. Comparison of the historical record of earthquake hazard with seismic-hazard models for New Zealand and the continental United States

    USGS Publications Warehouse

    Stirling, M.; Petersen, M.

    2006-01-01

    We compare the historical record of earthquake hazard experienced at 78 towns and cities (sites) distributed across New Zealand and the continental United States with the hazard estimated from the national probabilistic seismic-hazard (PSH) models for the two countries. The two PSH models are constructed with similar methodologies and data. Our comparisons show a tendency for the PSH models to slightly exceed the historical hazard in New Zealand and westernmost continental United States interplate regions, but show lower hazard than that of the historical record in the continental United States intraplate region. Factors such as non-Poissonian behavior, parameterization of active fault data in the PSH calculations, and uncertainties in estimation of ground-motion levels from historical felt intensity data for the interplate regions may have led to the higher-than-historical levels of hazard at the interplate sites. In contrast, the less-than-historical hazard for the remaining continental United States (intraplate) sites may be largely due to site conditions not having been considered at the intraplate sites, and uncertainties in correlating ground-motion levels to historical felt intensities. The study also highlights the importance of evaluating PSH models at more than one region, because the conclusions reached on the basis of a solely interplate or intraplate study would be very different.

  7. New Directions in Seismic Hazard Assessment through Focused Earth Observation in the MARmara SuperSITE

    NASA Astrophysics Data System (ADS)

    Meral Ozel, Nurcan; Necmioglu, Ocal; Favali, Paolo; Douglas, John; Mathieu, Pierre Philippe; Geli, Louis; Tan, Onur; Ergintav, Semih; Oguz Ozel, A.; Gurbuz, Cemil; Erdik, Mustafa

    2013-04-01

    Among the regions around the Mediterranean Sea for which earthquakes represent a major threat to social and economic development, the area around the Marmara Sea, one of the most densely populated parts of Europe, is subject to a high level of seismic hazard. For this region the MARSITE project is proposed with the aim of assessing the "state of the art" of seismic risk evaluation and management at the European level. This will be the starting point to move a "step forward" towards new concepts of risk mitigation and management through long-term monitoring activities carried out both on land and at sea. MARSITE will serve as the platform for an integrated, multidisciplinary, holistic and articulated framework for dealing with fault zone monitoring, capable of developing the next generation of observatories to study earthquake generation processes. The main progress will be the fusion of ground- and space-based monitoring systems dedicated to geo-hazard monitoring. All data (space/sea-bottom/seismology/borehole/geochemistry) will flow to KOERI and be hosted on and served via a secure server. The MARSITE project aims to coordinate research groups with different scientific skills (from seismology to engineering to gas geochemistry) in a comprehensive monitoring activity developed both in the Marmara Sea and in the surrounding urban and country areas. The project collects multidisciplinary data to be shared, interpreted and merged into consistent theoretical and practical models suitable for the implementation of good practices that move the necessary information to the end users in charge of seismic risk management of the Istanbul-Marmara Sea area. MARSITE is divided into eleven work packages that consider the processes involved in earthquake generation and the physics of short-term seismic transients, 4D deformation to understand earthquake cycle processes, fluid activity monitoring and seismicity under the sea floor using existing autonomous instrumentation, early warning

  8. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... existing and potential workplace hazards and assess the risk of associated workers injury and illness. Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or...

  9. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... existing and potential workplace hazards and assess the risk of associated workers injury and illness. Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or...

  10. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... existing and potential workplace hazards and assess the risk of associated workers injury and illness. Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or...

  11. The Salton Seismic Imaging Project: Investigating Earthquake Hazards in the Salton Trough, Southern California

    NASA Astrophysics Data System (ADS)

    Fuis, G. S.; Goldman, M.; Sickler, R. R.; Catchings, R. D.; Rymer, M. J.; Rose, E. J.; Murphy, J. M.; Butcher, L. A.; Cotton, J. A.; Criley, C. J.; Croker, D. S.; Emmons, I.; Ferguson, A. J.; Gardner, M. A.; Jensen, E. G.; McClearn, R.; Loughran, C. L.; Slayday-Criley, C. J.; Svitek, J. F.; Hole, J. A.; Stock, J. M.; Skinner, S. M.; Driscoll, N. W.; Harding, A. J.; Babcock, J. M.; Kent, G.; Kell, A. M.; Harder, S. H.

    2011-12-01

    The Salton Seismic Imaging Project (SSIP) is a collaborative effort between academia and the U.S. Geological Survey to provide detailed, subsurface 3-D images of the Salton Trough of southern California and northern Mexico. From both active- and passive-source seismic data that were acquired both onshore and offshore (Salton Sea), the resulting images will provide insights into earthquake hazards, rift processes, and rift-transform interaction at the southern end of the San Andreas Fault system. The southernmost San Andreas Fault (SAF) is considered to be at high risk of producing a large damaging earthquake, yet the structure of this and other regional faults and that of adjacent sedimentary basins is not currently well understood. Seismic data were acquired from 2 to 18 March 2011. One hundred and twenty-six borehole explosions (10-1400 kg yield) were detonated along seven profiles in the Salton Trough region, extending from the area of Palm Springs, California, to the southwestern tip of Arizona. Airguns (1500 and 3500 cc) were fired along two profiles in the Salton Sea and at points in a 2-D array in the southern Salton Sea. Approximately 2800 seismometers were deployed at over 4200 locations throughout the Salton Trough region, and 48 ocean-bottom seismometers were deployed at 78 locations beneath the Salton Sea. Many of the onshore explosions were energetic enough to be recorded and located by the Southern California Seismograph Network. The geometry of the SAF has important implications for energy radiation in the next major rupture. Prior potential field, seismicity, and InSAR data indicate that the SAF may dip moderately to the northeast from the Salton Sea to Cajon Pass in the Transverse Ranges. Much of SSIP was designed to test models of this geometry.

  12. Geodynamics and seismic hazard in the Calabrian Arc: towards a Messina earthquake supersite

    NASA Astrophysics Data System (ADS)

    Chiarabba, Claudio; Dell'Acqua, Fabio; Faccenna, Claudio; Lanari, Riccardo; Matteuzzi, Francesco; Mattia, Mario; Neri, Giancarlo; Patané, Domenico; Polonia, Alina; Prati, Claudio; Tinti, Stefano; Zerbini, Susanna; Ozener, Haluk

    2015-04-01

    The Messina region represents a key site of the Mediterranean, where active faulting, seismic shaking, volcanism, rapid uplift and landslides represent the surface manifestation of deep processes. Fast deformation results in one of the highest levels of seismic hazard in the Mediterranean, as testified by historic destructive earthquakes occasionally accompanied by submarine mass flows and tsunamis, events that added death and destruction to the already devastating effects of the earthquakes. Several geophysical and geological studies carried out during the last decades help define the kinematics and the dynamics of the system. The tectonic evolution of the Messina region is strictly linked with the Southern Tyrrhenian and Calabrian Arc system, the retreat of the Ionian slab and the back-arc basin opening. The present-day geometry of the Calabrian slab, well imaged by tomographic analyses and shallow-to-deep seismicity, shows a narrow slab plunging down steeply into the mantle. At 100-150 km depth, the southern edge of the slab is positioned beneath northeastern Sicily, approximately between Tindari and Messina. Within this frame, several relevant questions are still unsolved. For example, it is not clear how the upper plate may deform in response to the differential sinking of the subducting slabs, or how deep mantle flow at the slab edge may influence the pattern of surface deformation. Structural and geodetic data show the first-order pattern of deformation in northeastern Sicily, and define the Tindari-Messina area as the boundary between a region in compression to the west, dominated by the Africa convergence, and a region in extension to the east-northeast, dominated by slab rollback. In addition, geodetic studies also show an increase of crustal motion velocity from Sicily to Calabria with an overall clockwise rotation of the velocity vector. This pattern of surface deformation evidences a sharp extension process active in the Messina region. The elevation of

  13. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at
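
    A hedged sketch of the HVSR measurement mentioned above: the ratio of the mean horizontal to vertical Fourier amplitude spectra of ambient noise. Real processing adds windowing, averaging over many windows and more careful smoothing; the three-component "recording" here is synthetic, with the horizontals boosted near 2 Hz so a peak appears.

        import numpy as np

        def smooth(spectrum, width=11):
            kernel = np.ones(width) / width
            return np.convolve(spectrum, kernel, mode="same")

        def hvsr(north, east, vertical, dt):
            freqs = np.fft.rfftfreq(len(vertical), d=dt)
            n, e, v = (np.abs(np.fft.rfft(x)) ** 2 for x in (north, east, vertical))
            horizontal = np.sqrt(smooth(0.5 * (n + e)))   # quadratic mean of N and E
            return freqs[1:], horizontal[1:] / np.sqrt(smooth(v))[1:]

        dt = 0.01
        t = np.arange(0.0, 60.0, dt)
        rng = np.random.default_rng(0)
        north = rng.normal(size=t.size) + 3.0 * np.sin(2 * np.pi * 2.0 * t)
        east = rng.normal(size=t.size) + 3.0 * np.sin(2 * np.pi * 2.0 * t + 0.7)
        vertical = rng.normal(size=t.size)
        freqs, ratio = hvsr(north, east, vertical, dt)
        print("HVSR peak near", round(freqs[np.argmax(ratio)], 2), "Hz")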

  14. Subduction zone and crustal dynamics of western Washington; a tectonic model for earthquake hazards evaluation

    USGS Publications Warehouse

    Stanley, Dal; Villaseñor, Antonio; Benz, Harley

    1999-01-01

    The Cascadia subduction zone is extremely complex in the western Washington region, involving local deformation of the subducting Juan de Fuca plate and complicated block structures in the crust. It has been postulated that the Cascadia subduction zone could be the source of a large thrust earthquake, possibly as large as M9.0. Large intraplate earthquakes from within the subducting Juan de Fuca plate beneath the Puget Sound region have accounted for most of the energy release in this century, and future large earthquakes of this type are expected. Added to these possible hazards is clear evidence for strong crustal deformation events in the Puget Sound region near faults such as the Seattle fault, which passes through the southern Seattle metropolitan area. In order to understand the nature of these individual earthquake sources and their possible interrelationship, we have conducted an extensive seismotectonic study of the region. We have employed P-wave velocity models developed using local earthquake tomography as a key tool in this research. Other information utilized includes geological, paleoseismic, gravity, magnetic, magnetotelluric, deformation, seismicity, focal mechanism and geodetic data. Neotectonic concepts were tested and augmented through the use of anelastic (creep) deformation models based on thin-plate, finite-element techniques developed by Peter Bird, UCLA. These programs model anelastic strain rate, stress, and velocity fields for given rheological parameters, variable crust and lithosphere thicknesses, heat flow, and elevation. Known faults in western Washington and the main Cascadia subduction thrust were incorporated in the modeling process. Significant results from the velocity models include delineation of a previously studied arch in the subducting Juan de Fuca plate. The axis of the arch is oriented in the direction of current subduction and is asymmetrically deformed due to the effects of a northern buttress mapped in the velocity models. This

  15. Feasibility of anomaly occurrence in aerosols time series obtained from MODIS satellite images during hazardous earthquakes

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi; Jahani Chehrebargh, Fatemeh

    2016-09-01

    Earthquakes are among the most devastating natural disasters, and comprehensive earthquake prediction has not yet been achieved. Remote sensing data can be used to access information closely related to earthquakes. Unusual variations of lithospheric, atmospheric and ionospheric parameters before main shocks are considered earthquake precursors, and various precursors have been proposed to date. This paper examines one parameter that can be derived from satellite imagery, the Aerosol Optical Depth (AOD), and reviews its relationship with earthquakes. AOD can be obtained from AERONET ground stations or retrieved from satellite images via algorithms such as DDV (Dark Dense Vegetation), the Deep Blue algorithm and SYNTAM (SYNergy of Terra and Aqua MODIS). In this paper, by analyzing AOD time series derived from the MODIS sensor on the Terra platform for 16 major earthquakes, seismic anomalies were observed before and after the earthquakes. Before large earthquakes, AOD increases because pre-seismic changes release gaseous molecules; after the earthquake, aftershocks produce a further significant change in AOD due to gaseous molecules and dust. These behaviors suggest a close relationship between earthquakes and unusual AOD variations. Therefore, unusual AOD variations around the time of earthquakes can be introduced as an earthquake precursor.
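
    The abstract does not spell out how an AOD value was declared anomalous; a common, minimal approach is a mean-plus-k-standard-deviations threshold on the time series. The Python sketch below is such a generic test, offered only as an illustration of the idea and not as the paper's actual detection criterion.

    import numpy as np

    def flag_anomalies(aod, k=2.0):
        """Boolean mask of samples lying outside mean +/- k*std of the series."""
        aod = np.asarray(aod, dtype=float)
        mu, sigma = np.nanmean(aod), np.nanstd(aod)
        return (aod > mu + k * sigma) | (aod < mu - k * sigma)

    # Example: a quiet AOD background with one elevated value before a
    # hypothetical event day; index 4 is flagged.
    series = np.array([0.21, 0.19, 0.22, 0.20, 0.55, 0.23, 0.21])
    print(np.flatnonzero(flag_anomalies(series)))   # -> [4]

    In practice the background statistics would be computed from a multi-year reference window for the same location and season, so that ordinary dust or pollution episodes are not mistaken for pre-seismic signals.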

  16. Estimating Seismic Hazards from the Catalog of Taiwan Earthquakes from 1900 to 2014 in Terms of Maximum Magnitude

    NASA Astrophysics Data System (ADS)

    Chen, Kuei-Pao; Chang, Wen-Yen

    2017-02-01

    Maximum expected earthquake magnitude is an important parameter when designing mitigation measures for seismic hazards. This study calculated the maximum magnitude of potential earthquakes for each cell in a 0.1° × 0.1° grid of Taiwan. Two zones vulnerable to maximum magnitudes of Mw ≥ 6.0, which will cause extensive building damage, were identified: one extends from Hsinchu southward to Taichung, Nantou, Chiayi, and Tainan in western Taiwan; the other extends from Ilan southward to Hualian and Taitung in eastern Taiwan. These zones are also characterized by low b values, which are consistent with high peak ground shaking. We also employed an innovative method to calculate (at intervals of Mw 0.5) the bounds and median of recurrence time for earthquakes of magnitude Mw 6.0-8.0 in Taiwan.
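
    For readers unfamiliar with the "low b values" referred to above, the b-value of the Gutenberg-Richter relation is routinely estimated per grid cell with Aki's maximum-likelihood formula. The Python sketch below shows that estimator on a synthetic catalog; it is a generic illustration rather than the method used in the study, and the completeness magnitude mc and binning width dm are assumed inputs.

    import numpy as np

    def b_value(magnitudes, mc, dm=0.1):
        """Aki (1965) maximum-likelihood b-value for events with magnitude >= mc.

        dm is the catalog's magnitude binning width; the mc - dm/2 term is the
        standard correction for binned magnitudes (use dm=0 for continuous values).
        """
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    # Synthetic Gutenberg-Richter catalog with true b close to 1.0.
    rng = np.random.default_rng(1)
    mags = 3.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
    print("estimated b-value: %.2f" % b_value(mags, mc=3.0, dm=0.0))

    Applying such an estimator cell by cell, with a minimum event count per cell, is one way low-b regions like those described above can be mapped.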

  17. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J.; Bryant, W.A.

    2008-01-01

    This report describes the development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. The parameters include descriptions of the geometry and rates of movement of faults throughout the state. They are intended to provide a starting point for the development of more sophisticated deformation models that incorporate known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movement of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps and the time-dependent seismic hazard calculations being developed for the WGCEP. Because of the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS, and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. The database has been implemented in Oracle and supports electronic (on-the-fly) access. A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database and any locked-down official releases (e.g., those used in a published model for calculating earthquake probabilities or seismic shaking hazards) are part of the USGS Quaternary Fault and Fold Database http://earthquake.usgs.gov/regional/qfaults/. CGS has been primarily responsible for updating and editing the fault parameters, with extensive input from USGS and SCEC scientists.

  18. Geophysical setting of the February 21, 2008 Mw 6 Wells earthquake