Sample records for earthquake hazard assessment

  1. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

    Advances in understanding of the earthquake cycle and in assessing earthquake hazards are topics of great importance. Dynamic earthquake hazard assessments resolved for a range of spatial and time scales will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  2. The Active Fault Parameters for Time-Dependent Earthquake Hazard Assessment in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Y.; Cheng, C.; Lin, P.; Shao, K.; Wu, Y.; Shih, C.

    2011-12-01

    Taiwan is located at the boundary between the Philippine Sea Plate and the Eurasian Plate, which converge at ~80 mm/yr in a ~N118E direction. The plate motion is so active that earthquakes are very frequent, and in the Taiwan area disaster-inducing earthquakes often result from active faults. For this reason, understanding the activity and hazard of active faults is an important subject. The active faults in Taiwan are mainly located in the Western Foothills and the Eastern Longitudinal Valley. The active fault distribution map published by the Central Geological Survey (CGS) in 2010 shows 31 active faults on the island of Taiwan, some of which are related to earthquakes. Many researchers have investigated these active faults and continuously update their data and results, but few have integrated them for time-dependent earthquake hazard assessment. In this study, we gather previous research and fieldwork results and integrate them into a table of active-fault parameters for time-dependent earthquake hazard assessment. We gather seismic profiles or relocated earthquakes for each fault and combine them with the fault trace on land to establish a 3D fault geometry model in a GIS system. We collect studies of fault-source scaling in Taiwan and estimate the maximum magnitude from fault length or fault area. We use the characteristic earthquake model to evaluate the earthquake recurrence interval of each active fault. For the remaining parameters, we collect previous studies and historical references to complete our parameter table of active faults in Taiwan. The WG08 carried out a time-dependent earthquake hazard assessment of active faults in California: they established fault models, deformation models, earthquake rate models, and probability models, and then computed rupture probabilities for faults in California. 
    Following these steps, we have preliminarily evaluated the earthquake-related hazard probabilities of certain faults in Taiwan. By completing the active-fault parameter table for Taiwan, we can apply it to time-dependent earthquake hazard assessment. The results can also give engineers a reference for design and can be applied in seismic hazard maps to mitigate disasters.
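    The time-dependent probability this record describes can be sketched with a renewal model. The following is a minimal illustration assuming a lognormal recurrence-interval distribution with invented parameter values (mean recurrence interval, aperiodicity); it is not the WG08 or CGS procedure.

```python
import math

def conditional_rupture_probability(t_elapsed, t_window, mean_ri, cov=0.5):
    """P(rupture within the next t_window years | quiet for t_elapsed years)
    for a fault with a lognormal recurrence-interval distribution.
    mean_ri (years) and cov (aperiodicity) are illustrative inputs."""
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_ri) - 0.5 * sigma ** 2

    def cdf(t):  # lognormal CDF
        if t <= 0.0:
            return 0.0
        return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

    return (cdf(t_elapsed + t_window) - cdf(t_elapsed)) / (1.0 - cdf(t_elapsed))

# Hypothetical fault: mean recurrence 250 yr, 180 yr since the last event,
# probability of rupture in the next 50 yr:
p50 = conditional_rupture_probability(180.0, 50.0, 250.0)
```

    As the elapsed time since the last rupture grows, the conditional probability rises, which is the essential difference from a time-independent Poisson model.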

  3. The 2014 United States National Seismic Hazard Model

    USGS Publications Warehouse

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Mueller, Charles; Haller, Kathleen; Frankel, Arthur; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen; Boyd, Oliver; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nicolas; Wheeler, Russell; Williams, Robert; Olsen, Anna H.

    2015-01-01

    New seismic hazard maps have been developed for the conterminous United States using the latest data, models, and methods available for assessing earthquake hazard. The hazard models incorporate new information on earthquake rupture behavior observed in recent earthquakes; fault studies that use both geologic and geodetic strain rate data; earthquake catalogs through 2012 that include new assessments of locations and magnitudes; earthquake adaptive smoothing models that more fully account for the spatial clustering of earthquakes; and 22 ground motion models, some of which consider more than double the shaking data applied previously. Alternative input models account for larger earthquakes, more complicated ruptures, and more varied ground shaking estimates than assumed in earlier models. The ground motions, for levels applied in building codes, differ from the previous version by less than ±10% over 60% of the country, but can differ by ±50% in localized areas. The models are incorporated in insurance rates, risk assessments, and as input into the U.S. building code provisions for earthquake ground shaking.

  4. Statistical analysis of earthquakes after the 1999 MW 7.7 Chi-Chi, Taiwan, earthquake based on a modified Reasenberg-Jones model

    NASA Astrophysics Data System (ADS)

    Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq

    2015-12-01

    We investigated the temporal-spatial hazard of earthquakes after the 1999 September 21 MW = 7.7 Chi-Chi shock in a continental region of Taiwan. The Reasenberg-Jones (RJ) model (Reasenberg and Jones, 1989, 1994), which combines the frequency-magnitude distribution (Gutenberg and Richter, 1944) with a time-decaying occurrence rate (Utsu et al., 1995), is conventionally employed for assessing earthquake hazard after a large shock. However, we found that the b values in the frequency-magnitude distribution of earthquakes in the study region dropped sharply from background values after the Chi-Chi shock and then gradually recovered. This observation of a time-dependent frequency-magnitude distribution motivated us to propose a modified RJ model (MRJ) for assessing earthquake hazard. To see how the models perform in assessing short-term earthquake hazard, the RJ and MRJ models were used separately to sequentially forecast earthquakes in the study region. To depict the potential rupture area for future earthquakes, we further constructed relative hazard (RH) maps based on the two models. Receiver operating characteristic (ROC) curves (Swets, 1988) demonstrated that the RH map based on the MRJ model was, in general, superior to the one based on the original RJ model for exploring the spatial hazard of earthquakes shortly after the Chi-Chi shock.
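    For reference, the conventional RJ rate combines Gutenberg-Richter magnitude scaling with Omori-Utsu temporal decay. A minimal sketch follows, using generic parameter values rather than the ones fitted in this study, with a hypothetical time-varying b(t) standing in for the paper's modification.

```python
import math

def rj_rate(t_days, m, mainshock_m, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Reasenberg-Jones rate (events/day) of aftershocks with magnitude >= m,
    t_days after a mainshock of magnitude mainshock_m. The a, b, c, p values
    are generic defaults, not the ones estimated for Chi-Chi."""
    return 10.0 ** (a + b * (mainshock_m - m)) * (t_days + c) ** (-p)

def mrj_rate(t_days, m, mainshock_m, b_of_t, **kw):
    """Same rate with a caller-supplied time-varying b(t), mimicking the
    modified-RJ idea of a b value that drops after the shock and recovers."""
    return rj_rate(t_days, m, mainshock_m, b=b_of_t(t_days), **kw)

def recovering_b(t_days):
    """Hypothetical b(t): depressed to ~0.70 right after the shock,
    recovering toward the background value over ~100 days."""
    return 0.91 - 0.21 * math.exp(-t_days / 100.0)

rate = mrj_rate(10.0, 4.0, 7.7, recovering_b)  # events/day, M >= 4, day 10
```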

  5. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, differences in methodology and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removing duplicate events and converting to moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases compiled for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporated the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on these databases and GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates that regions of low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the regions of high probability. The irony is that in areas of low to moderate probability, where building codes usually require less seismic resilience, seismic risk is likely to be greater. 
    Infrastructure damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.
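    The aggregate-probability point is simple Poisson arithmetic: a vast region with a low rate density can accumulate a larger expected number of events than a small active one. A toy illustration with invented rate densities and areas:

```python
import math

def prob_at_least_one(rate_density, area_km2, years):
    """Poisson probability of at least one event in a region over a time
    window, given an annual rate density (events / yr / km^2).
    All numbers used below are illustrative, not regional estimates."""
    lam = rate_density * area_km2 * years  # expected number of events
    return 1.0 - math.exp(-lam)

# A large, quiet region can out-rank a small, active one in aggregate:
p_large = prob_at_least_one(1e-7, 1_000_000, 50)  # vast, low rate density
p_small = prob_at_least_one(5e-6, 10_000, 50)     # small, 50x higher density
```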

  6. Earthquake Emergency Education in Dushanbe, Tajikistan

    ERIC Educational Resources Information Center

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  7. The application of the geography census data in seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Yuan, Shen; Ying, Zhang

    2017-04-01

    Because the basic data in the Sichuan Province earthquake emergency database are not kept current, there is a gap between post-earthquake disaster assessment results and the actual damage. In 2015, Sichuan completed its first provincial geography census, covering topography, traffic networks, vegetation coverage, water areas, desert and bare ground, residents and facilities, geographical units, and geological hazards, as well as town-planning construction and ecological-environment restoration in the Lushan earthquake-stricken area. On this basis, combining existing basic geographic information data and high-resolution imagery, supplemented by remote sensing image interpretation and geological survey, we carried out statistical analysis and information extraction of the distribution and change of exposed elements such as surface coverage, roads, and infrastructure in Lushan County before 2013 and after 2015. At the same time, we achieved the transformation and updating of geographical-conditions census data into earthquake emergency basic data by studying their data types, structures, and relationships. Finally, based on multi-source disaster information, including the changed exposure data and the coseismic displacement field of the Lushan magnitude 7.0 earthquake from the CORS network, we obtained intensity control points through information fusion, corrected the seismic influence field, and re-assessed the earthquake disaster through the Sichuan earthquake-relief headquarters technology platform. Comparison of the new assessment result, the original assessment result, and the actual earthquake disaster loss shows that the revised evaluation result is closer to the actual loss. 
    In the future, normalized updates from geographical-conditions census data to earthquake emergency basic data can ensure the timeliness of the earthquake emergency database while continuously improving the accuracy of earthquake disaster assessment.

  8. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The goal of earthquake hazard mitigation is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of (1) hazard estimation, (2) vulnerability analysis, and (3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic seismic hazard assessment (PSHA) is generally applied to site-specific risk assessments but may involve large areas, as in a national seismic hazard mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction and components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning that avoids hazards. The risk estimate also provides the data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction, but economic losses may be equally significant factors that can motivate proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. 
    Economic collapse may ensue if damaged workplaces, disruption of utilities, and the resulting loss of income produce widespread default on payments. With increased computational power and more complete inventories of exposure, Monte Carlo methods may provide more accurate estimation of severe losses and the opportunity to increase the resilience of vulnerable systems and communities.
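    The closing point about Monte Carlo loss estimation can be sketched as follows: sample a Poisson number of events per year, draw a random damage ratio for each exposed asset, and read off mean and tail losses. All inputs here (event rate, asset values, fragilities) are invented for illustration.

```python
import random

def simulate_annual_losses(n_years, assets, annual_rate, seed=1):
    """Toy Monte Carlo loss model. assets = [(replacement_value, fragility)],
    where fragility caps a uniformly drawn damage ratio per event.
    Returns (mean annual loss, 99th-percentile annual loss)."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_years):
        total = 0.0
        # Count damaging events this year: Poisson process, realized by
        # accumulating exponential inter-arrival times within one year.
        t = rng.expovariate(annual_rate)
        while t <= 1.0:
            for value, fragility in assets:
                total += value * fragility * rng.random()  # damage in [0, fragility)
            t += rng.expovariate(annual_rate)
        losses.append(total)
    losses.sort()
    return sum(losses) / n_years, losses[int(0.99 * n_years)]

# Hypothetical portfolio: two facilities, 0.2 damaging events/yr on average
mean_loss, loss_99 = simulate_annual_losses(
    10_000, [(5e6, 0.3), (2e7, 0.1)], annual_rate=0.2)
```

    The gap between the mean and the 99th-percentile loss is what makes tail-aware planning different from planning around the average year.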

  9. International Collaboration for Strengthening Capacity to Assess Earthquake Hazard in Indonesia

    NASA Astrophysics Data System (ADS)

    Cummins, P. R.; Hidayati, S.; Suhardjono, S.; Meilano, I.; Natawidjaja, D.

    2012-12-01

    Indonesia has experienced a dramatic increase in earthquake risk due to rapid population growth in the 20th century, much of it occurring in areas near the subduction-zone plate boundaries that are prone to earthquake occurrence. While recent seismic hazard assessments have resulted in better building codes that can inform safer building practices, many of the fundamental parameters controlling earthquake occurrence and ground shaking - e.g., fault slip rates, earthquake scaling relations, ground motion prediction equations, and site response - could still be better constrained. In recognition of the need to improve the information on which seismic hazard assessments are based, the Australian Agency for International Development (AusAID) and Indonesia's National Agency for Disaster Management (BNPB), through the Australia-Indonesia Facility for Disaster Reduction, have initiated a 4-year project designed to strengthen the Government of Indonesia's capacity to reliably assess earthquake hazard. This project is a collaboration of Australian institutions, including Geoscience Australia and the Australian National University, with Indonesian government agencies and universities, including the Agency for Meteorology, Climatology and Geophysics, the Geological Agency, the Indonesian Institute of Sciences, and the Bandung Institute of Technology. Effective earthquake hazard assessment requires input from many types of research, including geological studies of active faults, seismological studies of crustal structure, earthquake sources and ground motion, PSHA methodology, and geodetic studies of crustal strain rates. The project is a large and diverse one that spans all of these components, which will be briefly reviewed in this presentation.

  10. 77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... is to discuss engineering needs for existing buildings, to review the National Earthquake Hazards... Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov/ . DATES: The... assesses: Trends and developments in the science and engineering of earthquake hazards reduction; The...

  11. Earthquake Hazard Analysis Methods: A Review

    NASA Astrophysics Data System (ADS)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impacts in terms of risk and damage. Countries such as China, Japan, and Indonesia sit on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering efficiency of time and accuracy of data, remote sensing serves as a reference for assessing earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas can be readily analyzed using remote sensing, and technological developments such as GeoEye-1 add value and capability to remote sensing as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing disaster-management policies in particular, and it can help reduce the risk of natural disasters such as earthquakes in Indonesia.

  12. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. 
    The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes depend on economic forces and public policy decisions regarding the extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.
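    The relation between an annual rate of exceedance and a fixed-window probability, which underlies the choice described above, is the standard Poisson conversion:

```python
import math

def poisson_exceedance_prob(annual_rate, years):
    """Probability of at least one exceedance in a time window, assuming a
    stationary Poisson process: P = 1 - exp(-rate * T). This stationarity
    assumption is exactly what fails for induced seismicity."""
    return 1.0 - math.exp(-annual_rate * years)

# The common 2%-in-50-years design level corresponds to an annual
# exceedance rate of about 1/2475:
p = poisson_exceedance_prob(1.0 / 2475.0, 50.0)  # ~0.02
```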

  13. Updated earthquake catalogue for seismic hazard analysis in Pakistan

    NASA Astrophysics Data System (ADS)

    Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas

    2018-03-01

    A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (a quadrangle bounded by the geographical limits 40-83° E and 20-40° N) includes 36,563 earthquake events of moment magnitude (MW) 4.0-8.3, spanning from 25 AD to 2016. Relationships are developed between moment magnitude and the body-wave and surface-wave magnitude scales to unify the catalogue in terms of MW. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The proposed catalogue is further used to obtain the magnitude of completeness after removal of dependent events by using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
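    Once a catalogue is homogenized to MW, declustered, and trimmed to its completeness magnitude, the Gutenberg-Richter b value can be estimated by maximum likelihood. A minimal sketch of the Aki (1965) estimator for continuous magnitudes follows; it is generic, not one of the specific algorithms used in this study, and binned catalogues would additionally need the Utsu half-bin correction.

```python
import math

def aki_b_value(magnitudes, mc):
    """Maximum-likelihood Gutenberg-Richter b value (Aki, 1965) for a
    catalogue complete above mc, assuming continuous (unbinned) magnitudes:
    b = log10(e) / (mean(M) - mc). For magnitudes binned at width d,
    replace mc with mc - d/2 (Utsu correction)."""
    mags = [m for m in magnitudes if m >= mc]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - mc)
```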

  14. USGS Training in Afghanistan: Modern Earthquake Hazards Assessments

    NASA Astrophysics Data System (ADS)

    Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.

    2007-05-01

    Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking but also from liquefaction and extensive landsliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth Science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006 in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and be actively involved in earthquake hazard assessments themselves. The December 2006 training course was taught by four lecturers, with all lectures and slides presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments," and the second entitled "Applied Geophysics for Mineral Resource Assessments."

  15. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and statistically treat the seismicity catalogue of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research contributes to improving seismic risk management by evaluating seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach, using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the earthquake hazard evaluation procedure developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
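    The Cornell-style core of such a computation, summing over sources the event rate times the probability that ground motion exceeds a threshold, can be sketched as follows. The GMPE coefficients below are invented for illustration and are not a published attenuation model for Algeria.

```python
import math

def hazard_curve_point(pga_threshold, sources):
    """Annual rate of exceeding pga_threshold (in g) at a site, Cornell-style:
    sum over point sources of rate * P(PGA > threshold | event occurs).
    sources = [(annual_rate, magnitude, distance_km)]. The log-linear GMPE
    and its sigma are toy values chosen only to illustrate the structure."""
    total_rate = 0.0
    for rate, magnitude, distance_km in sources:
        # Toy GMPE: ln(PGA) = -3.5 + 0.9*M - 1.2*ln(R + 10), sigma = 0.6
        ln_median = -3.5 + 0.9 * magnitude - 1.2 * math.log(distance_km + 10.0)
        z = (math.log(pga_threshold) - ln_median) / 0.6
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))  # lognormal tail
        total_rate += rate * p_exceed
    return total_rate

# Two hypothetical sources: (annual rate, magnitude, distance in km)
sources = [(0.01, 6.5, 20.0), (0.002, 7.5, 50.0)]
rate_01g = hazard_curve_point(0.1, sources)
```

    Evaluating this rate over a grid of thresholds yields the site's hazard curve; a characteristic-earthquake scenario corresponds to concentrating one source's rate at a single magnitude, as in the hybrid approach described above.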

  16. Documentation for the 2008 Update of the United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Haller, Kathleen M.; Wheeler, Russell L.; Wesson, Robert L.; Zeng, Yuehua; Boyd, Oliver S.; Perkins, David M.; Luco, Nicolas; Field, Edward H.; Wills, Chris J.; Rukstales, Kenneth S.

    2008-01-01

    The 2008 U.S. Geological Survey (USGS) National Seismic Hazard Maps display earthquake ground motions for various probability levels across the United States and are applied in seismic provisions of building codes, insurance rate structures, risk assessments, and other public policy. This update of the maps incorporates new findings on earthquake ground shaking, faults, seismicity, and geodesy. The resulting maps are derived from seismic hazard curves calculated on a grid of sites across the United States that describe the frequency of exceeding a set of ground motions. The USGS National Seismic Hazard Mapping Project developed these maps by incorporating information on potential earthquakes and associated ground shaking obtained from interaction in science and engineering workshops involving hundreds of participants, review by several science organizations and State surveys, and advice from two expert panels. The National Seismic Hazard Maps represent our assessment of the 'best available science' in earthquake hazards estimation for the United States (maps of Alaska and Hawaii as well as further information on hazard across the United States are available on our Web site at http://earthquake.usgs.gov/research/hazmaps/).

  17. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

    In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings, and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database. This resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. 
Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the qualitative assessments based on frequency and amplitude. The second method to assess tsunami hazard involved using the USGS earthquake databases to search for possible earthquake sources near American coastlines to extend the NOAA/NGDC tsunami databases backward in time. The qualitative tsunami hazard assessment based on the results of the NGDC and USGS database searches will be presented.

  18. Satellite Detection of the Convection Generated Stresses in Earth

    NASA Technical Reports Server (NTRS)

    Liu, Han-Shou; Kolenkiewicz, Ronald; Li, Jin-Ling; Chen, Jiz-Hong

    2003-01-01

    We review research developments on satellite detection of the convection generated stresses in the Earth for seismic hazard assessment and Earth resource survey. Particular emphasis is laid upon recent progress and results of stress calculations from which the origin and evolution of the tectonic features on Earth's surface can be scientifically addressed. An important aspect of the recent research development in tectonic stresses relative to earthquakes is the implications for earthquake forecasting and prediction. We have demonstrated that earthquakes occur on the ring of fire around the Pacific in response to the tectonic stresses induced by mantle convection. We propose a systematic global assessment of the seismic hazard based on variations of tectonic stresses in the Earth as observed by satellites. This space geodynamic approach for assessing the seismic hazard is unique in that it can pinpoint the triggering stresses for large earthquakes without ambiguities of geological structures, fault geometries, and other tectonic properties. Also, it is distinct from the probabilistic seismic hazard assessment models in the literature, which are based only on extrapolations of available earthquake data.

  19. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  20. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model

    NASA Astrophysics Data System (ADS)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza

    2017-08-01

    Producing accurate seismic hazard maps and predicting hazardous areas are necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to their interpretability, fuzzy predictors can capture both the nonlinearity and the chaotic behavior of data even when data are limited. Earthquake patterns in the Zagros have been assessed for intervals of 10 and 50 years using the fuzzy rule-based model, and the Molchan statistical procedure has been used to show that the forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. Notably, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan) and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.
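The fuzzy rule-based approach described above can be illustrated with a toy Mamdani-style inference step. This is only a sketch, not the authors' model: the triangular membership functions, the single normalized seismicity-rate input, and the output hazard levels are all invented for the example.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_hazard(seismicity_rate):
    """Toy Mamdani-style inference: map a normalized seismicity-rate
    input in [0, 1] to a hazard score via three rules (LOW/MEDIUM/HIGH)
    and weighted-average defuzzification over output singletons."""
    low = tri(seismicity_rate, -0.5, 0.0, 0.5)   # rule: rate is LOW
    med = tri(seismicity_rate, 0.0, 0.5, 1.0)    # rule: rate is MEDIUM
    high = tri(seismicity_rate, 0.5, 1.0, 1.5)   # rule: rate is HIGH
    # output singletons (hypothetical hazard scores) for each rule
    num = low * 0.2 + med * 0.5 + high * 0.9
    den = low + med + high
    return num / den if den else 0.0
```

Because rule activations overlap, inputs between the rule centers blend the output levels smoothly, which is what lets such systems interpolate where data are sparse.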

  1. Considering potential seismic sources in earthquake hazard assessment for Northern Iran

    NASA Astrophysics Data System (ADS)

    Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila

    2014-07-01

    Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of the Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historical and instrumental seismicity documents the occurrence of severe earthquakes leading to many deaths and large losses in the region. As seismological and tectonic data grow, updated seismic hazard assessment is a worthwhile input to emergency management programs and long-term development plans in urban and rural areas of this region. In the present study, drawing on the up-to-date information required for seismic hazard assessment, including geological data and the active tectonic setting for thorough investigation of active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue, a probabilistic seismic hazard assessment is carried out for the region using three recent ground motion prediction equations. The logic tree method is utilized to capture the epistemic uncertainty of the assessment in the delineation of seismic sources and the selection of attenuation relations. The results are compared with recent code-prescribed seismic hazard practice for the region and are discussed in detail to explore their variation across each branch of the logic tree. Also, seismic hazard maps of peak ground acceleration at rock sites for 475- and 2,475-year return periods are provided for the region.

  2. Technical Report - FINAL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbara Luke, Director, UNLV Engineering Geophysics Laboratory

    2007-04-25

    The project's goal was to improve understanding of the earthquake hazard in the Las Vegas Valley and to assess the state of preparedness of the area's population and structures for the next big earthquake. Its objectives were to: 1. Enhance the seismic monitoring network in the Las Vegas Valley 2. Improve understanding of deep basin structure through active-source seismic refraction and reflection testing 3. Improve understanding of the dynamic response of shallow sediments through seismic testing and correlations with lithology 4. Develop credible earthquake scenarios through laboratory and field studies, literature review and analyses 5. Refine ground motion expectations around the Las Vegas Valley through simulations 6. Assess current building standards in light of improved understanding of hazards 7. Perform risk assessment for structures and infrastructure, with emphasis on lifelines and critical structures 8. Encourage and facilitate broad and open technical interchange regarding earthquake safety in southern Nevada and efforts to inform citizens of earthquake hazards and mitigation opportunities

  3. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    ,

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of the probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

  4. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios, and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, ISC-GEM and global ANSS Comprehensive catalogues, the Seismological Bureau, Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize earthquake parameters from the various catalogue sources, we remove duplicate events and unify magnitudes into the same scale. Our active fault database includes active fault data from previous studies, e.g. the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on parameters derived from these databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined the earthquake recurrence models of the seismogenic sources. To evaluate ground shaking behaviour in different tectonic regimes, we conducted a series of tests matching the felt intensities of historical earthquakes to ground motions modelled with ground motion prediction equations (GMPEs). Incorporating the best-fitting GMPEs and site conditions, we assessed the probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through some major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially cause an M8-class earthquake, brings significant hazard along the western coast of Myanmar and eastern Bangladesh. 
In addition, we identify a notable hazard level in northern Vietnam and at the boundary between Myanmar, Thailand and Laos, due to a series of strike-slip faults that could potentially cause moderate to large earthquakes. Note that although much of the region has a low probability of damaging shaking, low-probability events have recently caused much destruction in SE Asia (e.g. the 2008 Wenchuan and 2015 Sabah earthquakes).
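The Gutenberg-Richter relationship used in such recurrence models, log10 N(>=m) = a - b*m, is commonly fitted by maximum likelihood. Below is a minimal sketch of the standard estimator (Aki, 1965, with Utsu's correction for binned magnitudes); the function names and the simple rate helper are illustrative, not from the paper.

```python
import math

def gutenberg_richter_mle(magnitudes, m_min, dm=0.0):
    """Maximum-likelihood b-value (Aki, 1965); dm is the magnitude-bin
    width for Utsu's correction (use dm=0.0 for continuous magnitudes)."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    b = math.log10(math.e) / (mean_m - (m_min - dm / 2.0))
    # a-value from the total count: log10 N(m >= m_min) = a - b * m_min
    a = math.log10(len(mags)) + b * m_min
    return a, b

def annual_rate(a, b, m, years):
    """Annual rate of events with magnitude >= m, given a- and b-values
    fitted over a catalogue of the stated duration in years."""
    return 10 ** (a - b * m) / years
```

For a characteristic earthquake of magnitude m, the mean recurrence interval is then the reciprocal of the annual rate at that magnitude.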

  5. Earthquake risk reduction in the United States: An assessment of selected user needs and recommendations for the National Earthquake Hazards Reduction Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-31

    This Assessment was conducted to improve the National Earthquake Hazards Reduction Program (NEHRP) by providing NEHRP agencies with information that supports their user-oriented setting of crosscutting priorities in the NEHRP strategic planning process. The primary objective of this Assessment was to take a "snapshot" evaluation of the needs of selected users throughout the major program elements of NEHRP. Secondary objectives were to conduct an assessment of the knowledge that exists (or is being developed by NEHRP) to support earthquake risk reduction, and to begin a process of evaluating how NEHRP is meeting user needs. An identification of NEHRP's strengths also resulted from the effort, since those strengths demonstrate successful methods that may be useful to NEHRP in the future. These strengths are identified in the text, and many of them represent important achievements since the Earthquake Hazards Reduction Act was passed in 1977.

  6. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios that best represent the probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes representing the long-term seismotectonic characteristics of the region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach yields a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between the hazard curves driven by the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology can shorten the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. 
The reduced set is representative of the contributions of all possible earthquakes, yet requires far less computation power. The authors have used this approach in risk assessment aimed at identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios are chosen for this purpose.
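The Monte-Carlo catalogue generation described above can be sketched in miniature: event times from a homogeneous Poisson process and magnitudes drawn from a truncated Gutenberg-Richter distribution by inverse-CDF sampling. This is a simplified stand-in for the paper's simulator (which also draws location, focal depth and fault characteristics); all names and parameter values are illustrative.

```python
import math
import random

def sample_truncated_gr(rng, b, m_min, m_max):
    """Inverse-CDF sample from a Gutenberg-Richter (exponential)
    magnitude distribution truncated to [m_min, m_max)."""
    beta = b * math.log(10)
    c = 1.0 - math.exp(-beta * (m_max - m_min))  # truncation constant
    u = rng.random()
    return m_min - math.log(1.0 - u * c) / beta

def simulate_catalogue(annual_rate, b, m_min, m_max, years, seed=0):
    """Toy synthetic catalogue: exponential inter-event times give a
    homogeneous Poisson process; each event gets a truncated-GR magnitude.
    Returns a list of (time_in_years, magnitude) tuples."""
    rng = random.Random(seed)
    events = []
    t = rng.expovariate(annual_rate)
    while t < years:
        events.append((t, sample_truncated_gr(rng, b, m_min, m_max)))
        t += rng.expovariate(annual_rate)
    return events
```

With an annual rate of 8.4 events per year over 10,000 years, a catalogue of roughly 84,000 events of the kind described in the abstract would result; scenario reduction then operates on such a list.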

  7. Vulnerability of port and harbor communities to earthquake and tsunami hazards: The use of GIS in community hazard planning

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.

    2004-01-01

    Earthquakes and tsunamis pose significant threats to Pacific Northwest coastal port and harbor communities. Developing holistic mitigation and preparedness strategies to reduce the potential for loss of life and property damage requires community-wide vulnerability assessments that transcend traditional site-specific analyses. The ability of a geographic information system (GIS) to integrate natural, socioeconomic, and hazards information makes it an ideal assessment tool to support community hazard planning efforts. This article summarizes how GIS was used to assess the vulnerability of an Oregon port and harbor community to earthquake and tsunami hazards, as part of a larger risk-reduction planning initiative. The primary purposes of the GIS were to highlight community vulnerability issues and to identify areas that both are susceptible to hazards and contain valued port and harbor community resources. Results of the GIS analyses can help decision makers with limited mitigation resources set priorities for increasing community resiliency to natural hazards.

  8. Earthquake hazard assessment after Mexico (1985).

    PubMed

    Degg, M R

    1989-09-01

    The 1985 Mexican earthquake ranks foremost amongst the major earthquake disasters of the twentieth century. One of the few positive aspects of the disaster is that it provided massive quantities of data that would otherwise have been unobtainable. Every opportunity should be taken to incorporate the findings from these data in earthquake hazard assessments. The purpose of this paper is to provide a succinct summary of some of the more important lessons from Mexico. It stems from detailed field investigations, and subsequent analyses, conducted by the author on behalf of reinsurance companies.

  9. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable Seismic History is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can MISLEAD INTO SCIENTIFICALLY GROUNDLESS APPLICATIONS, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION. 
The self-evident shortcomings and failures of GSHAP call on all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, so that there is: (a) a demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete understanding of the actual range of earthquake hazards to local communities and populations; and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically using the available geological, geomorphological, seismic, and tectonic evidence and data, combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over recent decades. This proves that Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such information in advance of extreme catastrophes, which are LOW PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate a shift of the community's mindset from pessimistic disbelief to the optimistic challenge of neo-deterministic Hazard Predictability.

  10. Errors in Seismic Hazard Assessment are Creating Huge Human Losses

    NASA Astrophysics Data System (ADS)

    Bela, J.

    2015-12-01

    The current practice of representing earthquake hazards to the public based upon their perceived likelihood or probability of occurrence is now proven by the global record of actual earthquakes to be not only erroneous and unreliable, but also deadly. Earthquake occurrence is sporadic, and assumptions of earthquake frequency and return period are therefore not only misleading, but categorically false. More than 700,000 people lost their lives between 2000 and 2011, and 11 of the world's deadliest earthquakes in that period occurred in locations where probability-based seismic hazard assessments had predicted only low seismic hazard. Unless seismic hazard assessment and the setting of minimum earthquake design safety standards for buildings and bridges are based on a more realistic deterministic recognition of "what can happen" rather than on what mathematical models suggest is "most likely to happen", such huge human losses can only be expected to continue. The actual earthquake events that did occur were at or near the maximum potential-size event that either had already occurred in the past or was geologically known to be possible. Haiti's M7 earthquake in 2010 (with > 222,000 fatalities) meant the dead could not even be buried with dignity. Japan's catastrophic Tohoku earthquake of 2011, a M9 megathrust event, unleashed a tsunami that not only obliterated coastal communities along the northern Japanese coast, but also claimed > 20,000 lives. This tsunami flooded nuclear reactors at Fukushima, causing four explosions and three reactor meltdowns. But while this history of huge human losses due to erroneous and misleading seismic hazard estimates, despite its wrenching pain, cannot be unlived, if faced with courage and a more realistic deterministic estimate of "what is possible", it need not be lived again. 
An objective testing of the results of global probability-based seismic hazard maps against real occurrences has never been done by the GSHAP team, even though the obvious inadequacy of the GSHAP map could have been established in the course of a simple check before the project's completion. The doctrine of "PSHA exceptionalism" that created the maps can only be expunged by carefully examining the facts, which unfortunately include huge human losses.

  11. Increased Earthquake Rates in the Central and Eastern US Portend Higher Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Rubinstein, J. L.; Ellsworth, W. L.; Mueller, C. S.; Michael, A. J.; McGarr, A.; Petersen, M. D.; Weingarten, M.; Holland, A. A.

    2014-12-01

    Since 2009 the central and eastern United States has experienced an unprecedented increase in the rate of M≥3 earthquakes that is unlikely to be due to natural variation. Where the rates have increased, so has the seismic hazard, making it important to understand these changes. Areas with significant seismicity increases are limited to areas where oil and gas production takes place. By far the largest contributor to the seismicity increase is Oklahoma, where recent studies suggest that these rate changes may be due to fluid injection (e.g., Keranen et al., Geology, 2013; Science, 2014). Moreover, the area of increased seismicity in northern Oklahoma that began in 2013 coincides with the Mississippi Lime play, where well completions greatly increased the year before the seismicity increase. This suggests a link to oil and gas production, either directly or through the disposal of significant amounts of produced water within the play. For the purpose of assessing the hazard due to these earthquakes, should they be treated differently from natural earthquakes? Previous studies suggest that induced seismicity may differ from natural seismicity in clustering characteristics or frequency-magnitude distributions (e.g., Bachmann et al., GJI, 2011; Llenos and Michael, BSSA, 2013). These differences could affect time-independent hazard computations, which typically assume that clustering and size distribution remain constant. In Oklahoma, as well as in other areas of suspected induced seismicity, we find that earthquakes since 2009 tend to be considerably more clustered in space and time than before 2009. However, differences between various regional and national catalogs leave it unclear whether there are significant changes in magnitude distribution. Whether they are due to natural or industrial causes, the increased earthquake rates in these areas could increase the hazard in ways that are not accounted for in current hazard assessment practice. 
Clearly the possibility of induced earthquakes needs to be considered in seismic hazard assessments.

  12. Using the USGS Seismic Risk Web Application to estimate aftershock damage

    USGS Publications Warehouse

    McGowan, Sean M.; Luco, Nicolas

    2014-01-01

    The U.S. Geological Survey (USGS) Engineering Risk Assessment Project has developed the Seismic Risk Web Application to combine earthquake hazard and structural fragility information in order to calculate the risk of earthquake damage to structures. Enabling users to incorporate their own hazard and fragility information into the calculations will make it possible to quantify (in near real-time) the risk of additional damage to structures caused by aftershocks following significant earthquakes. Results can quickly be shared with stakeholders to illustrate the impact of elevated ground motion hazard and earthquake-compromised structural integrity on the risk of damage during a short-term, post-earthquake time horizon.
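Combining a hazard curve with structural fragility, as the application does, amounts to a discrete convolution: the rate of events in each ground-motion bin times the probability of damage at that level. The sketch below assumes a lognormal fragility and a piecewise-discretized hazard curve; it illustrates the general calculation, not the USGS application's actual code, and all parameter values are invented.

```python
import math

def lognormal_fragility(im, median, beta):
    """P[damage | intensity measure im] for a lognormal fragility curve
    with the given median capacity and log-standard-deviation beta."""
    if im <= 0:
        return 0.0
    z = (math.log(im) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def annual_damage_rate(hazard_curve, median, beta):
    """Annual rate of damage from a discretized hazard curve, given as a
    list of (im, annual exceedance rate) pairs with im increasing.
    Each bin contributes P[damage | mid-bin im] * (rate in bin)."""
    total = 0.0
    for (im_lo, r_lo), (im_hi, r_hi) in zip(hazard_curve, hazard_curve[1:]):
        rate_in_bin = r_lo - r_hi          # rate of events inside the bin
        im_mid = 0.5 * (im_lo + im_hi)
        total += lognormal_fragility(im_mid, median, beta) * rate_in_bin
    # tail above the last point, fragility evaluated at the last im
    total += lognormal_fragility(hazard_curve[-1][0], median, beta) * hazard_curve[-1][1]
    return total
```

For aftershock risk, the same calculation would be rerun with an elevated short-term hazard curve and, if the structure is compromised, a lowered fragility median.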

  13. Probabilistic tsunami hazard assessment based on the long-term evaluation of subduction-zone earthquakes along the Sagami Trough, Japan

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Maeda, T.; Matsuyama, H.; Toyama, N.; Kito, T.; Murata, Y.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.; Hakamata, T.

    2017-12-01

    For the forthcoming large earthquakes along the Sagami Trough, where the Philippine Sea Plate is subducting beneath the northeast Japan arc, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2014a) assessed that M7- and M8-class earthquakes will occur there and defined the possible extent of the earthquake source areas. They assessed occurrence probabilities within the next 30 years (from Jan. 1, 2014) of 70% for the M7-class and 0% to 5% for the M8-class earthquakes. First, we set 10 possible earthquake source areas (ESAs) for M8-class earthquakes and 920 ESAs for M7-class earthquakes. Next, we constructed 125 characterized earthquake fault models (CEFMs) for the M8 class and 938 CEFMs for the M7 class, based on the "tsunami recipe" of ERC (2017) (Kitoh et al., 2016, JpGU). All the CEFMs are allowed to have a large slip area to express fault slip heterogeneity. For all the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation using FDM, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. Finally, we re-distributed the occurrence probability over all CEFMs (Abe et al., 2014, JpGU) and gathered exceedance probabilities for variable tsunami heights, calculated from all the CEFMs, at every observation point along the Pacific coast to obtain the PTHA. We incorporated aleatory uncertainties inherent in the tsunami calculation and in earthquake fault slip heterogeneity. We considered two kinds of probabilistic hazard models: a "present-time hazard model", which assumes that earthquake occurrence follows a renewal process based on the BPT distribution when the latest faulting time is known, and a "long-time averaged hazard model", which assumes that earthquake occurrence follows a stationary Poisson process. 
We focused, for example, on the probability that tsunami height will exceed 3 meters at coastal points in the next 30 years (from Jan. 1, 2014). The present-time hazard model showed relatively high probability, over 0.1%, along the Boso Peninsula. The long-time averaged hazard model showed the highest probability, over 3%, along the Boso Peninsula and relatively high probability, over 0.1%, along wide coastal areas on the Pacific side from the Kii Peninsula to Fukushima prefecture.
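The two occurrence models can be made concrete. Under the stationary Poisson model the 30-year probability depends only on the mean recurrence interval, while the BPT (inverse Gaussian) renewal model also conditions on the time elapsed since the last event. A sketch follows; the recurrence and aperiodicity values in the usage are chosen arbitrarily for illustration, not taken from the study.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence mu and aperiodicity alpha."""
    if t <= 0:
        return 0.0
    lam = mu / alpha ** 2                      # inverse-Gaussian shape
    a = math.sqrt(lam / t)
    return (norm_cdf(a * (t / mu - 1.0))
            + math.exp(2.0 * lam / mu) * norm_cdf(-a * (t / mu + 1.0)))

def cond_prob_bpt(elapsed, window, mu, alpha):
    """P[event in (elapsed, elapsed + window] | quiescent since time 0]."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + window, mu, alpha)
    return (f1 - f0) / (1.0 - f0)

def prob_poisson(window, mu):
    """Time-independent probability of at least one event in the window."""
    return 1.0 - math.exp(-window / mu)
```

For a fault with a 200-year mean recurrence, 100 years elapsed, and aperiodicity 0.5, `cond_prob_bpt(100, 30, 200, 0.5)` and `prob_poisson(30, 200)` give the renewal-model and Poisson 30-year probabilities, respectively; their divergence is exactly the difference between the two hazard models in the abstract.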

  14. A Case Study of Geologic Hazards Affecting School Buildings: Evaluating Seismic Structural Vulnerability and Landslide Hazards at Schools in Aizawl, India

    NASA Astrophysics Data System (ADS)

    Perley, M. M.; Guo, J.

    2016-12-01

    India's National School Safety Program (NSSP) aims to assess all government schools in earthquake prone regions of the country. To supplement the Mizoram State Government's recent survey of 141 government schools, we screened an additional 16 private and 4 government schools for structural vulnerabilities due to earthquakes, as well as landslide hazards, in Mizoram's capital of Aizawl. We developed a geomorphologically derived landslide susceptibility matrix, which was cross-checked with Aizawl Municipal Corporation's landslide hazard map (provided by Lettis Consultants International), to determine the geologic hazards at each school. Our research indicates that only 7% of the 22 assessed school buildings are located within low landslide hazard zones; 64% of the school buildings, with approximately 9,500 students, are located within very high or high landslide hazard zones. Rapid Visual Screening (RVS) was used to determine the structural earthquake vulnerability of each school building. RVS is an initial vulnerability assessment procedure used to inventory and rank buildings that may be hazardous during an earthquake. Our study indicates that all of the 22 assessed school buildings have a damageability rating of Grade 3 or higher on the 5-grade EMS scale, suggesting a significant vulnerability and potential for damage in buildings, ranging from widespread cracking of columns and beam column joints to collapse. Additionally, 86% of the schools we visited had reinforced concrete buildings constructed before Aizawl's building regulations were passed in 2007, which can be assumed to lack appropriate seismic reinforcement. Using our findings, we will give recommendations to the Government of Mizoram to prevent unnecessary loss of life by minimizing each school's landslide risk and ensuring schools are earthquake-resistant.

  15. Coulomb stress transfer and accumulation on the Sagaing Fault, Myanmar, over the past 110 years and its implications for seismic hazard

    NASA Astrophysics Data System (ADS)

    Xiong, X.; Shan, B.; Zhou, Y. M.; Wei, S. J.; Li, Y. D.; Wang, R. J.; Zheng, Y.

    2017-05-01

    Myanmar is drawing rapidly increasing attention from the world for its seismic hazard. The Sagaing Fault (SF), an active right-lateral strike-slip fault passing through Myanmar, has long been the source of serious seismic damage in the country. Seismic hazard assessment of this region is therefore of pivotal significance and should take into account the interaction and migration of earthquakes in time and space. We investigated a series of 10 earthquakes with M > 6.5 that have occurred along the SF since 1906. Coulomb failure stress modeling exhibits significant interactions among the earthquakes: after the 1906 earthquake, eight out of nine subsequent earthquakes occurred in zones where stress was enhanced by the preceding earthquakes, verifying that the earthquake-triggering hypothesis is applicable on the SF. Moreover, we identified three positively stressed seismic gaps on the central and southern SF, on which the seismic hazard is increased.

  16. Nationwide tsunami hazard assessment project in Japan

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2014-12-01

    In 2012, we began a project of nationwide Probabilistic Tsunami Hazard Assessment (PTHA) in Japan to support various measures (Fujiwara et al., 2013, JpGU; Hirata et al., 2014, AOGS). The most important strategy in the nationwide PTHA is that aleatory uncertainty predominates in the assessment while the use of epistemic uncertainty is kept to a minimum, because the number of possible combinations of epistemic uncertainties diverges quickly as their number increases; we consider only the type of earthquake occurrence probability distribution as an epistemic uncertainty. We briefly outline the nationwide PTHA as follows. (i) We consider all possible earthquakes in the future, including those that the Headquarters for Earthquake Research Promotion (HERP) of the Japanese Government has already assessed. (ii) We construct a set of simplified earthquake fault models, called "Characterized Earthquake Fault Models (CEFMs)", for all of the earthquakes by following prescribed rules (Toyama et al., 2014, JpGU; Korenaga et al., 2014, JpGU). (iii) For all of the initial water surface distributions caused by the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation using FDM, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. (iv) Finally, we integrate the information about the tsunamis calculated from the numerous CEFMs to obtain nationwide tsunami hazard assessments. One of the most useful representations of the integrated information is a tsunami hazard curve for coastal tsunami heights, incorporating uncertainties inherent in the tsunami simulation and in earthquake fault slip heterogeneity (Abe et al., 2014, JpGU). We will show a PTHA along the eastern coast of Honshu, Japan, based on approximately 1,800 tsunami sources located within the subduction zone along the Japan Trench, as a prototype of the nationwide PTHA. 
This study is supported by part of the research project on research on evaluation of hazard and risk of natural disasters, under the direction of the HERP of Japanese Government.
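    The hazard-curve integration in step (iv) can be sketched generically. This is an illustration only, not the project's implementation: the source annual rates, median heights, and log-standard deviations below are made-up placeholders, and treating each source's coastal height as lognormal is an assumption standing in for the simulation and slip-heterogeneity uncertainties.

```python
import math

def exceedance_rate(h, sources):
    """Annual rate that coastal tsunami height exceeds h metres.

    sources: (annual_rate, median_height_m, log_std) per characterized
    fault model; each source's height is treated as lognormal (an
    assumption for illustration).
    """
    total = 0.0
    for rate, median, sigma in sources:
        z = (math.log(h) - math.log(median)) / (sigma * math.sqrt(2.0))
        total += rate * 0.5 * math.erfc(z)  # lognormal survival function
    return total

# Three placeholder sources standing in for the ~1,800 CEFMs
sources = [(1 / 100, 2.0, 0.5), (1 / 500, 6.0, 0.6), (1 / 1000, 10.0, 0.6)]
curve = [(h, exceedance_rate(h, sources)) for h in (0.5, 1.0, 2.0, 5.0, 10.0)]
```

The hazard curve is this rate evaluated over a grid of heights; at a vanishingly small height the rate approaches the combined occurrence rate of all sources.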

  17. Wicked Problems in Natural Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S.; Steckler, M. S.; Rundle, J. B.; Dixon, T. H.

    2017-12-01

    Social scientists have defined "wicked" problems that are "messy, ill-defined, more complex than we fully grasp, and open to multiple interpretations based on one's point of view... No solution to a wicked problem is permanent or wholly satisfying, which leaves every solution open to easy polemical attack." These contrast with "tame" problems, in which the necessary information is available and solutions - even if difficult and expensive - are straightforward to identify and execute. Updating the U.S.'s aging infrastructure is a tame problem, because what is wrong and how to fix it are clear. In contrast, addressing climate change is a wicked problem because its effects are uncertain and the best strategies to address them are unclear. An analogous approach can be taken to natural hazard problems. In tame problems, we have a good model of the process, good information about past events, and data implying that the model should predict future events. In such cases, we can make a reasonable assessment of the hazard that can be used to develop mitigation strategies. Earthquake hazard mitigation for San Francisco is a relatively tame problem. We understand how the earthquakes result from known plate motions, have information about past earthquakes, and have geodetic data implying that future similar earthquakes will occur. As a result, it is straightforward to develop and implement mitigation strategies. However, in many cases, hazard assessment and mitigation is a wicked problem. How should we prepare for a great earthquake on plate boundaries where tectonics favor such events but we have no evidence that they have occurred, and hence do not know how large they may be or how often to expect them? How should we assess the hazard within plates, for example in the New Madrid seismic zone, where large earthquakes have occurred but we do not understand their causes and geodetic data show no strain accumulating? 
How can we assess the hazard and make sensible policy when the recurrence of earthquakes, floods, or hurricanes seems to be changing with time or is expected to do so due to human activity? A starting approach might be to assess what we know, what we don't know, what we think, and what can be done that might improve this situation. We should draw on what is known in other areas of risk assessment including social science, meteorology, engineering, and economics.

  18. Two examples of earthquake-hazard reduction in southern California.

    USGS Publications Warehouse

    Kockelman, W.J.; Campbell, C.C.

    1983-01-01

    Because California is seismically active, planners and decisionmakers must try to anticipate earthquake hazards there and, where possible, to reduce the hazards. Geologic and seismologic information provides the basis for the necessary plans and actions. Two examples of how such information is used are presented. The first involves assessing the impact of a major earthquake on critical facilities in southern California, and the second involves strengthening or removing unsafe masonry buildings in the Los Angeles area. -from Authors

  19. Probabilistic seismic hazard assessment for the two layer fault system of Antalya (SW Turkey) area

    NASA Astrophysics Data System (ADS)

    Dipova, Nihat; Cangir, Bülent

    2017-09-01

    Southwest Turkey, along the Mediterranean coast, is prone to large earthquakes resulting from subduction of the African plate under the Eurasian plate and from shallow crustal faults. The maximum observed magnitude of subduction earthquakes is Mw = 6.5, whereas that of crustal earthquakes is Mw = 6.6. Crustal earthquakes are sourced from faults associated with the Isparta Angle and Cyprus Arc tectonic structures. The primary goal of this study is to assess seismic hazard for the Antalya area (SW Turkey) using a probabilistic approach. A new earthquake catalog for the Antalya area, with a unified moment magnitude scale, was prepared in the scope of the study. Seismicity of the area has been evaluated with the Gutenberg-Richter recurrence relationship. For hazard computation, the CRISIS2007 software was used following the standard Cornell-McGuire methodology. The attenuation model of Youngs et al. (Seismol Res Lett 68(1):58-73, 1997) was used for deep subduction earthquakes, and the Chiou and Youngs (Earthq Spectra 24(1):173-215, 2008) model for shallow crustal earthquakes. A seismic hazard map was developed for peak ground acceleration on rock, at a hazard level of 10% probability of exceedance in 50 years. Results of the study show that peak ground acceleration values on bedrock range between 0.215 and 0.23 g in the center of Antalya.
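    The Gutenberg-Richter recurrence step can be illustrated with a minimal sketch. The maximum-likelihood b-value estimator (Aki, 1965) is a standard choice, but the synthetic catalog and the magnitude of completeness below are illustrative assumptions, not the study's actual catalog or fitting procedure.

```python
import math
import random

def gr_b_value(mags, m_min):
    """Maximum-likelihood b-value (Aki, 1965) for a catalog complete above m_min."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

def gr_annual_rate(m, a, b):
    """Gutenberg-Richter recurrence: annual number of events with magnitude >= m."""
    return 10.0 ** (a - b * m)

# Synthetic catalog with a true b-value of 1.0 (excess magnitudes are exponential)
random.seed(0)
mags = [4.0 - math.log(random.random()) / (1.0 * math.log(10.0)) for _ in range(20000)]
b_hat = gr_b_value(mags, 4.0)  # recovers a value close to 1.0
```

With a and b fitted per source zone, `gr_annual_rate` supplies the activity rates that a Cornell-McGuire calculation integrates over.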

  20. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a one‐year (2016) probabilistic seismic‐hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes, constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short‐term seismic activity rates (primarily 2014–2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground‐motion models. These alternatives represent uncertainties in how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. Both submodels' hazard maps depict high hazard, and the submodels are combined in the final model. Results depict several ground‐shaking measures as well as intensity and include maps showing a high‐hazard level (1% probability of exceedance in 1 year or greater). 
Ground motions reach 0.6 g horizontal peak ground acceleration (PGA) in north‐central Oklahoma and southern Kansas, and about 0.2 g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in north‐central Texas near Dallas–Fort Worth. The chance of having levels of ground motion corresponding to modified Mercalli intensity (MMI) VI or greater earthquake shaking is 2%–12% per year in north‐central Oklahoma, southern Kansas, and the New Madrid region, similar to the chance of damage at sites in high‐hazard portions of California caused by natural earthquakes. Hazard is also significant in the Raton basin of Colorado/New Mexico; north‐central Arkansas; Dallas–Fort Worth, Texas; and in a few other areas. Hazard probabilities are much lower (by about half or more) for exceeding MMI VII or VIII. Hazard is 3‐ to 10‐fold higher near some areas of active induced earthquakes than in the 2014 USGS National Seismic Hazard Model (NSHM), which did not consider induced earthquakes. This study, in conjunction with the LandScan™ database (2013), indicates that about 8 million people live in areas of active injection wells that have a greater than 1% chance of experiencing damaging ground shaking (MMI≥VI) in 2016. The final model has high uncertainty, and engineers, regulators, and industry should use these assessments cautiously to make informed decisions on mitigating the potential effects of induced and natural earthquakes.

  1. Long aftershock sequences within continents and implications for earthquake hazard assessment.

    PubMed

    Stein, Seth; Liu, Mian

    2009-11-05

    One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere.
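    The paper's central scaling, that aftershock-sequence duration varies inversely with fault loading rate, can be sketched in a few lines. The reference values below (a roughly decade-long sequence on a plate-boundary fault loaded at ~30 mm/yr, and an intraplate fault loaded at ~0.1 mm/yr) are illustrative assumptions, not figures quoted in the paper.

```python
def aftershock_duration_yr(ref_duration_yr, ref_loading_mm_yr, loading_mm_yr):
    """Aftershock-sequence duration scaled inversely with fault loading rate."""
    return ref_duration_yr * ref_loading_mm_yr / loading_mm_yr

# Rescale a decade-long plate-boundary sequence (assumed 30 mm/yr loading)
# to a slowly loaded continental fault (assumed 0.1 mm/yr):
interior = aftershock_duration_yr(10.0, 30.0, 0.1)  # on the order of millennia
```

Under these assumptions a sequence lasting ten years at a plate boundary stretches to thousands of years in a plate interior, which is why recent intraplate seismicity can be aftershocks of earthquakes centuries old.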

  2. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10⁻⁹/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Taken together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than stresses built up by plate-tectonic loading. The latter model generally underlies the basic assumptions made in earthquake hazard assessment: that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.
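    The several-thousand-year recurrence implied by the quoted strain rates can be reproduced with back-of-envelope arithmetic. The deforming-zone width and coseismic slip below are illustrative assumptions, not values from the paper; only the 10⁻⁹/yr strain rate comes from the abstract.

```python
# Back-of-envelope recurrence from strain rate (illustrative values):
strain_rate_per_yr = 1e-9   # order of magnitude quoted for both regions
zone_width_m = 100e3        # ~100 km deforming zone (assumed)
coseismic_slip_m = 4.0      # typical slip in an Mw ~7.7 event (assumed)

# Slip accumulates across the zone at strain rate x width (~0.1 mm/yr here),
# so recurrence is the coseismic slip divided by that accumulation rate.
slip_rate_m_per_yr = strain_rate_per_yr * zone_width_m
recurrence_yr = coseismic_slip_m / slip_rate_m_per_yr  # tens of thousands of years
```

Even with generous choices, the result is far longer than the hundreds of years inferred paleoseismically, which is the discrepancy the abstract highlights.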

  3. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard posed by the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively, by using late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large, important projects, for example dams and nuclear power plants, continued to challenge the map(s). 
The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published results at that time. CDMG eventually published the second edition map in 1992, following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996, utilizing GIS technology to manage data that include a simplified three-dimensional fault geometry and to facilitate efficient corrections and revisions of the data and the map. The spatial relationship of fault hazards with highways, bridges or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to easily incorporate advances in ground motion simulations; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven quite reasonable for practical applications in engineering design when exercised with professional judgment. In the final analysis, DSHA is a reality check for public safety and for PSHA results. Although PSHA has been acclaimed as a better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.
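    The DSHA workflow described above, an MCE magnitude per fault fed through an attenuation relationship with the largest resulting motion governing, can be sketched as below. The functional form and coefficients are placeholders for illustration, not any published relationship, and the fault list is hypothetical.

```python
import math

def median_pga_g(mag, dist_km, c0=-3.5, c1=0.8, c2=1.1, c3=10.0):
    """Illustrative attenuation form ln(PGA) = c0 + c1*M - c2*ln(R + c3).

    The coefficients are placeholders, not a published relationship.
    """
    return math.exp(c0 + c1 * mag - c2 * math.log(dist_km + c3))

# Hypothetical MCE magnitudes and site distances for two nearby faults
faults = [(7.8, 12.0), (6.5, 4.0)]  # (MCE moment magnitude, distance in km)

# Deterministic hazard at the site: the controlling (largest) median motion
design_pga = max(median_pga_g(m, r) for m, r in faults)
```

No occurrence rates enter the calculation, which is exactly the transparency-versus-probability trade-off the abstract discusses.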

  4. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, various methods of fuzzy set theory, also called fuzzy mathematics, have been applied to the quantitative estimation of time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, in addition, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalence relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards over different terms can be estimated. This paper mainly deals with medium- and short-term precursors observed in Japan and China.
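    A minimal example of the fuzzy-set machinery such studies rely on: a trapezoidal membership function grading how strongly an observation belongs to a fuzzy set such as "highly active period". The breakpoints and the activity measure below are illustrative assumptions, not values taken from the paper.

```python
def trapezoid_membership(x, a, b, c, d):
    """Degree of membership in [0, 1]: zero below a, ramping up on [a, b],
    full membership on [b, c], ramping down on [c, d], zero above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# e.g. grading "highly active" by the annual count of M >= 4 events
# (breakpoints are illustrative)
mu = trapezoid_membership(18, a=5, b=15, c=30, d=40)  # full membership here
```

The appeal for hazard work is that borderline periods receive graded memberships instead of a hard active/quiet split, so similarity and clustering can operate on degrees rather than labels.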

  5. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale

    NASA Astrophysics Data System (ADS)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.

    2016-05-01

    The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In such cases, and in unpopulated areas, ESI offers a unique way of assessing a reliable earthquake intensity. Finally, yet importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  6. Regional coseismic landslide hazard assessment without historical landslide inventories: A new approach

    NASA Astrophysics Data System (ADS)

    Kritikos, Theodosios; Robinson, Tom R.; Davies, Tim R. H.

    2015-04-01

    Currently, regional coseismic landslide hazard analyses require comprehensive historical landslide inventories as well as detailed geotechnical data. Consequently, such analyses have not been possible where these data are not available. A new approach is proposed herein to assess coseismic landslide hazard at regional scale for specific earthquake scenarios in areas without historical landslide inventories. The proposed model employs fuzzy logic and geographic information systems to establish relationships between causative factors and coseismic slope failures in regions with well-documented and substantially complete coseismic landslide inventories. These relationships are then utilized to estimate the relative probability of landslide occurrence in regions with neither historical landslide inventories nor detailed geotechnical data. Statistical analyses of inventories from the 1994 Northridge and 2008 Wenchuan earthquakes reveal that shaking intensity, topography, and distance from active faults and streams are the main controls on the spatial distribution of coseismic landslides. Average fuzzy memberships for each factor are developed and aggregated to model the relative coseismic landslide hazard for both earthquakes. The predictive capabilities of the models are assessed and show good-to-excellent model performance for both events. These memberships are then applied to the 1999 Chi-Chi earthquake, using only a digital elevation model, active fault map, and isoseismal data, replicating prediction of a future event in a region lacking historic inventories and/or geotechnical data. This similarly results in excellent model performance, demonstrating the model's predictive potential and confirming it can be meaningfully applied in regions where previous methods could not. 
For such regions, this method may enable a greater ability to analyze coseismic landslide hazard from specific earthquake scenarios, allowing for mitigation measures and emergency response plans to be better informed of earthquake-related hazards.

  7. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters estimated for each of these source zones are the input variables for seismic hazard estimation of the region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation, in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part-I. 
Bureau of Indian Standards, New Delhi, 2002]. Not only have the earthquake catalog and the seismogenic zones been treated holistically, but a higher resolution in the spatial distribution of hazard has also been achieved. COV maps are provided along with the strong ground-motion maps under various conditions to show the confidence in the results obtained. The results of the present study should be helpful for risk assessment and other disaster mitigation-related studies.
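    Under the usual Poisson assumption, the exceedance probabilities and return periods quoted above are related by T = -t / ln(1 - p). A quick check reproduces the familiar 475- and 2475-year levels; note that the quoted 100- and 10,000-year figures instead follow the simpler approximation T ≈ t/p.

```python
import math

def return_period_yr(prob, t_years):
    """Return period for exceedance probability `prob` in `t_years`,
    assuming Poisson (memoryless) earthquake occurrence."""
    return -t_years / math.log(1.0 - prob)

rp_10pct = return_period_yr(0.10, 50)  # ~475 years
rp_2pct = return_period_yr(0.02, 50)   # ~2475 years
```

For small probabilities the exact formula and the t/p approximation agree closely, which is why both conventions appear in the literature.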

  8. Deterministic Seismic Hazard Assessment of Center-East IRAN (55.5-58.5˚ E, 29-31˚ N)

    NASA Astrophysics Data System (ADS)

    Askari, Mina; Neyestani, Behnoosh

    2009-04-01

    Deterministic seismic hazard assessment has been performed for Center-East Iran, covering Kerman and adjacent regions within 100 km. A catalogue of earthquakes in the region, including historical and instrumental events, was compiled. A total of 25 potential seismic source zones in the region were delineated as area sources for seismic hazard assessment on the basis of geological, seismological and geophysical information. The minimum distance from each seismic source to the site (Kerman) and the maximum magnitude of each source were then determined. Finally, using the Abrahamson and Litehiser (1989) attenuation relationship, the maximum acceleration is estimated to be 0.38 g, associated with movement on a blind fault whose maximum magnitude is Ms = 5.5.

  9. 2016 one-year seismic hazard forecast for the Central and Eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-03-28

    The U.S. Geological Survey (USGS) has produced a 1-year seismic hazard forecast for 2016 for the Central and Eastern United States (CEUS) that includes contributions from both induced and natural earthquakes. The model assumes that earthquake rates calculated from several different time windows will remain relatively stationary and can be used to forecast earthquake hazard and damage intensity for the year 2016. This assessment is the first step in developing an operational earthquake forecast for the CEUS, and the analysis could be revised with updated seismicity and model parameters. Consensus input models consider alternative earthquake catalog durations, smoothing parameters, maximum magnitudes, and ground motion estimates, and represent uncertainties in earthquake occurrence and diversity of opinion in the science community. Ground shaking seismic hazard for 1-percent probability of exceedance in 1 year reaches 0.6 g (as a fraction of standard gravity [g]) in northern Oklahoma and southern Kansas, and about 0.2 g in the Raton Basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas. Near some areas of active induced earthquakes, hazard is higher than in the 2014 USGS National Seismic Hazard Model (NSHM) by more than a factor of 3; the 2014 NSHM did not consider induced earthquakes. In some areas, previously observed induced earthquakes have stopped, so the seismic hazard reverts to the 2014 NSHM. Increased seismic activity, whether defined as induced or natural, produces high hazard. Conversion of ground shaking to seismic intensity indicates that some places in Oklahoma, Kansas, Colorado, New Mexico, Texas, and Arkansas may experience damage if the induced seismicity continues unabated. 
The chance of having Modified Mercalli Intensity (MMI) VI or greater (damaging earthquake shaking) is 5–12 percent per year in north-central Oklahoma and southern Kansas, similar to the chance of damage caused by natural earthquakes at sites in parts of California.

  10. Seismic Hazard Assessment of the Sheki-Ismayilli Region, Azerbaijan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyubova, Leyla J.

    2006-03-23

    Seismic hazard assessment is an important factor in disaster management of the Azerbaijan Republic. The Shaki-Ismayilli region is one of the earthquake-prone areas in Azerbaijan. According to the seismic zoning map, the region is located in an intensity IX zone. Large earthquakes in the region take place along the active faults. The seismic activity of the Shaki-Ismayilli region is studied using macroseismic and instrumental data, which cover the period between 1250 and 2003. Several principal parameters of earthquakes are analyzed: maximal magnitude, energetic class, intensity, depth of earthquake hypocenter, and occurrence. The geological structures prone to large earthquakes are determined, and the dependence of magnitude on the fault length is shown. The large earthquakes take place mainly along the active faults. A map of earthquake intensity has been developed for the region, and the potential seismic activity of the Shaki-Ismayilli region has been estimated.

  11. Global assessment of human losses due to earthquakes

    USGS Publications Warehouse

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Recent studies have documented a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses and the location of the most affected regions. Recent developments in global and uniform datasets, such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.
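    The country-level calculation, combining a hazard curve with a vulnerability function and an exposed population to get an average annual loss, can be sketched generically. The hazard curve, the vulnerability model and the exposure below are made-up illustrations, not OpenQuake output or the study's data.

```python
def average_annual_loss(hazard_curve, vulnerability, exposure):
    """Average annual loss by integrating loss over the hazard curve.

    hazard_curve: (intensity, annual exceedance rate) pairs with intensity
    ascending (so rates descend); vulnerability maps intensity to an
    expected loss ratio; exposure is the exposed value (here, population).
    """
    aal = 0.0
    for (im0, r0), (im1, r1) in zip(hazard_curve, hazard_curve[1:]):
        # occurrence rate within the bin times the mid-bin loss ratio
        aal += (r0 - r1) * vulnerability(0.5 * (im0 + im1)) * exposure
    return aal

curve = [(0.1, 0.1), (0.3, 0.01), (0.6, 0.001)]   # PGA (g) vs annual rate
loss_ratio = lambda pga: min(1.0, pga)            # placeholder vulnerability
aal = average_annual_loss(curve, loss_ratio, exposure=1_000_000)
```

Binning the hazard curve into occurrence rates and weighting each bin by its expected loss is the standard discretization; finer intensity grids simply tighten the integral.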

  12. The Global Earthquake Model - Past, Present, Future

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Stein, Ross

    2014-05-01

    The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange. Sharing of data and risk information, best practices, and approaches across the globe are key to assessing risk more effectively. Through consortium driven global projects, open-source IT development and collaborations with more than 10 regions, leading experts are developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. The year 2013 has seen the completion of ten global data sets or components addressing various aspects of earthquake hazard and risk, as well as two GEM-related, but independently managed regional projects SHARE and EMME. Notably, the International Seismological Centre (ISC) led the development of a new ISC-GEM global instrumental earthquake catalogue, which was made publicly available in early 2013. It has set a new standard for global earthquake catalogues and has found widespread acceptance and application in the global earthquake community. By the end of 2014, GEM's OpenQuake computational platform will provide the OpenQuake hazard/risk assessment software and integrate all GEM data and information products. 
    The public release of OpenQuake is planned for the end of 2014 and will comprise the following datasets and models:
    • ISC-GEM Instrumental Earthquake Catalogue (released January 2013)
    • Global Earthquake History Catalogue [1000-1903]
    • Global Geodetic Strain Rate Database and Model
    • Global Active Fault Database
    • Tectonic Regionalisation Model
    • Global Exposure Database
    • Buildings and Population Database
    • Earthquake Consequences Database
    • Physical Vulnerabilities Database
    • Socio-Economic Vulnerability and Resilience Indicators
    • Seismic Source Models
    • Ground Motion (Attenuation) Models
    • Physical Exposure Models
    • Physical Vulnerability Models
    • Composite Index Models (social vulnerability, resilience, indirect loss)
    • Repository of national hazard models
    • Uniform global hazard model
    Armed with these tools and databases, stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and share their findings for joint learning. Earthquake hazard information can then be combined with data on exposure (buildings, population) and on their vulnerability, for risk assessment around the globe. Furthermore, for a truly integrated view of seismic risk, users will be able to add social vulnerability and resilience indices and estimate the costs and benefits of different risk management measures. Having finished its first five-year Work Program at the end of 2013, GEM has entered its second five-year Work Program 2014-2018. Beyond maintaining and enhancing the products developed in Work Program 1, the second phase will have a stronger focus on regional hazard and risk activities, and on seeing GEM products used in risk assessment and risk management practice at regional, national and local scales. 
    Furthermore, GEM intends to partner with similar initiatives underway for other natural perils; together, these are needed to provide the advanced risk assessment methods, tools and data that will underpin global disaster risk reduction efforts under the successor to the Hyogo Framework for Action, to be launched in Sendai, Japan, in spring 2015.

  13. Perceptions of earthquake and tsunami issues in U.S. Pacific Northwest port and harbor communities

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.

    2005-01-01

    Although there is considerable energy focused on assessing natural hazards associated with earthquakes and tsunamis in the U.S. Pacific Northwest, little has been done to understand societal vulnerability to these hazards. Part of understanding societal vulnerability includes assessing the perceptions and priorities of public sector individuals with traditional emergency management responsibilities and of private citizens who could play key roles in community recovery. In response to this knowledge gap, we examine earthquake and tsunami perceptions of stakeholders and decision makers from coastal communities in the U.S. Pacific Northwest, focusing on perceptions of (1) regional hazards and societal vulnerability, (2) the current state of readiness, and (3) priorities for future hazard adjustment efforts. Results of a mailed survey suggest that survey participants believe that earthquakes and tsunamis are credible community threats. Most communities are focusing on regional mitigation and response planning, with less effort devoted to recovery plans or to making individual organizations more resilient. Significant differences in expressed perceptions and priorities were observed between Oregon and Washington respondents, mainly on tsunami issues. Significant perception differences were also observed between private and public sector respondents. Our results suggest the need for further research and for outreach and planning initiatives in the Pacific Northwest to address significant gaps in earthquake and tsunami hazard awareness and readiness.

  14. Probabilistic seismic hazard zonation for the Cuban building code update

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Llanes-Buron, C.

    2013-05-01

    A probabilistic seismic hazard assessment has been performed in response to a revision and update of the Cuban building code (NC-46-99) for earthquake-resistant building construction. The hazard assessment has been done according to the standard probabilistic approach (Cornell, 1968), adopting the procedures used by other nations that have faced the problem of revising and updating their national building codes. Problems of earthquake catalogue treatment, attenuation of peak and spectral ground acceleration, and seismic source definition have been rigorously analyzed, and a logic-tree approach was used to represent the inevitable uncertainties encountered throughout the seismic hazard estimation process. The seismic zonation proposed here consists of a map showing the spectral acceleration values for short (0.2 s) and long (1.0 s) periods on rock conditions with a 1642-year return period, which is considered the maximum credible earthquake (ASCE 7-05). In addition, three other design levels are proposed (severe earthquake: 808-year return period; ordinary earthquake: 475-year return period; minimum earthquake: 225-year return period). The proposed seismic zonation complies with international standards (IBC/ICC) as well as current worldwide practice in this field.
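    A minimal sketch of the relation between the design-level return periods quoted above and probabilities of exceedance, assuming Poissonian occurrence (the standard assumption in this kind of zonation):

```python
import math

def return_period(p_exceed, t_years):
    """Return period T implied by an exceedance probability p_exceed
    over t_years, assuming Poissonian occurrence: p = 1 - exp(-t/T)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))  # 475, the "ordinary earthquake" level
print(round(return_period(0.02, 50)))  # 2475
print(round(return_period(0.03, 50)))  # 1642
```

    Under this relation, the 475-year level corresponds to the familiar 10% in 50 years, and the map's 1642-year level corresponds to roughly a 3% probability of exceedance in 50 years.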

  15. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  16. Knowledge base about earthquakes as a tool to minimize strong events consequences

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of a knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used to calibrate near-real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings, and casualty estimates. Such calibration compensates for some of the factors that influence the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains descriptions of past earthquakes' consequences for the area under study, together with the distribution of the built environment and population at the time of each event's occurrence. Computer simulation of the events recorded in the knowledge base allows determination of sets of regional calibration coefficients, including ratings of seismological surveys, peculiarities of shaking-intensity attenuation, and changes in building stock and population distribution, in order to minimize the error of loss estimates for damaging earthquakes in "emergency" mode. References: 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, Volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS 2010 Conference, Beijing, China, 2010. 3. Frolova, N. I., Larionov, V. I., Bonnin, J., Sushchev, S. P., Ugarov, A. N., Kozlov, M. A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653.

  17. Seismotectonic Map of Afghanistan and Adjacent Areas

    USGS Publications Warehouse

    Wheeler, Russell L.; Rukstales, Kenneth S.

    2007-01-01

    Introduction This map is part of an assessment of Afghanistan's geology, natural resources, and natural hazards. One of those natural hazards is earthquake shaking. One of the tools required to address the shaking hazard is a probabilistic seismic-hazard map, which was made separately. The information on this seismotectonic map was used in the design and computation of the hazard map. A seismotectonic map like this one shows geological, seismological, and other information that previously had been scattered among many sources. The compilation can show spatial relations that might not have been seen by comparing the original sources, and it can suggest hypotheses that might not have occurred to persons who studied those scattered sources. The main map shows the faults and earthquakes of Afghanistan. Plate convergence drives the deformation that causes the earthquakes. Accordingly, smaller maps and text explain the modern plate-tectonic setting of Afghanistan and its evolution, and relate both to the patterns of faults and earthquakes.

  18. East Meets West: An Earthquake in India Helps Hazard Assessment in the Central United States

    USGS Publications Warehouse

    ,

    2002-01-01

    Although geographically distant, the State of Gujarat in India bears many geological similarities to the Mississippi Valley in the Central United States. The Mississippi Valley contains the New Madrid seismic zone that, during the winter of 1811-1812, produced the three largest historical earthquakes ever in the continental United States and remains the most seismically active region east of the Rocky Mountains. Large damaging earthquakes are rare in ‘intraplate’ settings like New Madrid and Gujarat, far from the boundaries of the world’s great tectonic plates. Long-lasting evidence left by these earthquakes is subtle (fig. 1). Thus, each intraplate earthquake provides unique opportunities to make huge advances in our ability to assess and understand the hazards posed by such events.

  19. Perception of Natural Hazards and Risk among University of Washington Students

    NASA Astrophysics Data System (ADS)

    Herr, K.; Brand, B.; Hamlin, N.; Ou, J.; Thomas, B.; Tudor, E.

    2012-12-01

    Familiarity with a given population's perception of natural hazards and the threats they present is vital for the development of effective education prior to, and emergency management response after, a natural event. While much work has been done in other active tectonic regions, the perception of natural hazards and risk among Pacific Northwest (PNW) residents is poorly constrained. The objective of this work is to assess the current perception of earthquake and volcanic hazards and risk in the PNW, and to better understand the factors that drive the public's behavior concerning preparedness and response. We developed a survey to assess knowledge of the natural hazards common to the region, the perception of risk concerning these hazards, and the level of preparedness should a natural hazard occur. The survey was distributed to University of Washington students and employees via an internet link as part of a class project in 'Living with Volcanoes' (ESS 106) in March 2012, returning more than 900 responses. The UW student population was chosen as our first "population" to assess because of its uniqueness as a large, semi-transient population (typical residence of less than 5 years). Only 50% of participants correctly reported their proximity to an active volcano, indicating either a lack of knowledge of active volcanoes in the region or poor spatial awareness. When asked which areas were most at risk from lahars, respondents indicated that all areas close to the hazard source, including topographically elevated regions, were at higher risk than more distal and low-lying localities that are also at high risk, indicating a lack of knowledge concerning the topographic dependence of this hazard. Participants perceived themselves to be better able to cope with an earthquake than with a volcanic event. 
This perception may be due to lack of knowledge of volcanic hazards and their extent or due to a false sense of security concerning earthquakes fostered by regular earthquake drills and long periods of quiescence between large earthquake events. 60% of respondents had participated in earthquake drills; however, less than 45% provided the correct response when asked what they would do if an earthquake were to occur. In summary, knowledge of natural hazards and proximity to hazard sources was found to be low or inaccurate, which corresponds to a low perception of risk. Awareness of evacuation routes, emergency response or coping protocol for natural hazards was also found to be low, suggesting this large, semi-transient population lacks the understanding of proper preparation and response to a natural hazard. These results indicate the need for better education concerning the risks of natural hazards in this region and the steps for better preparedness.

  20. Palaeotsunamis and tsunami hazards in the Eastern Mediterranean.

    PubMed

    England, Philip; Howell, Andrew; Jackson, James; Synolakis, Costas

    2015-10-28

    The dominant uncertainties in assessing tsunami hazard in the Eastern Mediterranean are attached to the location of the sources. Reliable historical reports exist for five tsunamis associated with earthquakes at the Hellenic plate boundary, including two that caused widespread devastation. Because most of the relative motion across this boundary is aseismic, however, the modern record of seismicity provides little or no information about the faults that are likely to generate such earthquakes. Independent geological and geophysical observations of two large historical to prehistorical earthquakes, in Crete and Rhodes, lead to a coherent framework in which large to great earthquakes occurred not on the subduction boundary, but on reverse faults within the overlying crust. We apply this framework to the less complete evidence from the remainder of the Hellenic plate boundary zone, identifying candidate sources for future tsunamigenic earthquakes. Each such source poses a significant hazard to the North African coast of the Eastern Mediterranean. Because modern rates of seismicity are irrelevant to slip on the tsunamigenic faults, and because historical and geological data are too sparse, there is no reliable basis for a probabilistic assessment of this hazard, and a precautionary approach seems advisable. © 2015 The Author(s).

  1. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    NASA Astrophysics Data System (ADS)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-01

    In this paper, a probabilistic seismic hazard assessment of Tehran based on the Arias intensity parameter is presented. Tehran, the capital of Iran, is its most populated city and, from economic, political and social points of view, its most significant. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city in terms of Arias intensity is useful. Iso-intensity contour maps of Tehran, based on different attenuation relationships and different earthquake return periods, are plotted. Maps of iso-intensity points in the Tehran region are presented using appropriate attenuation relationships for rock and soil beds for the two hazard levels of 10% and 2% probability of exceedance in 50 years. Seismicity parameters, based on historical and instrumental earthquakes over a period beginning in the 4th century BC and ending at the present time, are calculated using two methods. For the calculation of the seismicity parameters, an earthquake catalogue covering a radius of 200 km around Tehran has been used, and the SEISRISK III software has been employed. The effects of different inputs, such as seismicity parameters, fault rupture length relationships and attenuation relationships, are considered using a logic tree.

  2. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper, a probabilistic seismic hazard assessment of Tehran based on the Arias intensity parameter is presented. Tehran, the capital of Iran, is its most populated city and, from economic, political and social points of view, its most significant. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city in terms of Arias intensity is useful. Iso-intensity contour maps of Tehran, based on different attenuation relationships and different earthquake return periods, are plotted. Maps of iso-intensity points in the Tehran region are presented using appropriate attenuation relationships for rock and soil beds for the two hazard levels of 10% and 2% probability of exceedance in 50 years. Seismicity parameters, based on historical and instrumental earthquakes over a period beginning in the 4th century BC and ending at the present time, are calculated using two methods. For the calculation of the seismicity parameters, an earthquake catalogue covering a radius of 200 km around Tehran has been used, and the SEISRISK III software has been employed. The effects of different inputs, such as seismicity parameters, fault rupture length relationships and attenuation relationships, are considered using a logic tree.

  3. Developing a global tsunami propagation database and its application for coastal hazard assessments in China

    NASA Astrophysics Data System (ADS)

    Wang, N.; Tang, L.; Titov, V.; Newman, J. C.; Dong, S.; Wei, Y.

    2013-12-01

    The tragedies of the 2004 Indian Ocean and 2011 Japan tsunamis have increased awareness of tsunami hazards for many nations, including China. The low land level and high population density of China's coastal areas place it at high risk for tsunami hazards. Recent research (Komatsubara and Fujiwara, 2007) highlighted concerns of a magnitude 9.0 earthquake on the Nankai trough, which may affect China's coasts not only in the South China Sea, but also in the East China Sea and Yellow Sea. Here we present our work in progress towards developing a global tsunami propagation database that can be used for hazard assessments by many countries. The propagation scenarios are computed using NOAA's MOST numerical model. Each scenario represents a typical Mw 7.5 earthquake with predefined earthquake parameters (Gica et al., 2008). The model grid was interpolated from ETOPO1 at 4 arc-min resolution, covering -80° to 72°N and 0° to 360°E. We use this database for a preliminary tsunami hazard assessment along China's coastlines.

  4. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2x10^5 m^3), produced major damage to buildings and infrastructure and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km^2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7x10^5 m^3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km^3 and 12 km^3 produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with volumes of less than 1x10^3 m^3. The present work aims to define the relationship between the above-described earthquake intensity and the size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provide useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  5. Assessment of liquefaction-induced hazards using Bayesian networks based on standard penetration test data

    NASA Astrophysics Data System (ADS)

    Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan

    2018-05-01

    Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.
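    A toy version of the paper's idea, with a hand-specified network and entirely hypothetical conditional probabilities (the authors' BN is built and calibrated from standard penetration test data and is far richer):

```python
# Toy discrete Bayesian network in the spirit of the paper's model; all
# conditional probabilities here are invented, not the authors' values.
# Nodes: shaking S, SPT-based soil resistance R, liquefaction L,
# and one induced hazard H (e.g. sand boils).

P_L = {(1, 1): 0.2, (1, 0): 0.8,   # P(L=1 | S=s, R=r); r=1 means high SPT
       (0, 1): 0.01, (0, 0): 0.1}
P_H = {1: 0.7, 0: 0.02}            # P(H=1 | L=l)

def p_hazard_given(s, r):
    """P(H=1 | S=s, R=r), marginalizing over the liquefaction state."""
    p_liq = P_L[(s, r)]
    return p_liq * P_H[1] + (1 - p_liq) * P_H[0]

print(round(p_hazard_given(1, 0), 3))  # strong shaking, low-resistance soil: 0.564
print(round(p_hazard_given(1, 1), 3))  # strong shaking, high-resistance soil: 0.156
```

    Conditioning on low soil resistance sharply raises the hazard probability; this kind of chained inference across nodes is what lets the full BN deduce the reaction process of liquefaction-induced hazards and also reason backwards from observed damage.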

  6. Dynamic evaluation of seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2016-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A + B•(6 - M) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L, A characterizes the average annual rate of strong (M = 6) earthquakes, B determines the balance between magnitude ranges, and C estimates the fractal dimension of the seismic locus in projection onto the Earth's surface. The parameters A, B, and C of the USLE are used to assess, first, the expected maximum magnitude in a time interval for each seismically prone cell of a uniform grid covering the region of interest, and then the corresponding expected ground shaking parameters. After rigorous testing against the available seismic evidence from the past (e.g., historically reported macroseismic intensity or paleo data), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure. The hazard maps for a given territory change dramatically when the methodology is applied within a moving time window of a certain size, e.g. about a decade long for an intermediate-term regional assessment, or exponentially increasing intervals for daily local strong-aftershock forecasting. This dynamical assessment of seismic hazard and risks is illustrated by applications to the territory of the Greater Caucasus and Crimea, and to the two-year series of aftershocks of the 11 October 2008 Kurchaloy, Chechnya earthquake, whose case history appears encouraging for further systematic testing as a potential short-term forecasting tool.
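    The USLE relation itself is straightforward to transcribe; the sketch below evaluates the expected annual earthquake count for invented parameter values (A, B and C are fitted per region in the study, so the numbers here are purely illustrative):

```python
import math

# Direct transcription of the USLE relation quoted in the abstract:
# log10 N(M, L) = A + B*(6 - M) + C*log10(L).

def usle_annual_count(M, L_km, A, B, C):
    """Expected annual number N of magnitude-M earthquakes in an area
    of linear dimension L_km."""
    return 10.0 ** (A + B * (6.0 - M) + C * math.log10(L_km))

# Hypothetical parameters: A=-1.0, B=0.9, C=1.2
n_m6 = usle_annual_count(6.0, 100.0, -1.0, 0.9, 1.2)  # 10**1.4, about 25/yr
n_m7 = usle_annual_count(7.0, 100.0, -1.0, 0.9, 1.2)  # a factor 10**0.9 fewer
```

    With these parameters the M 7 rate is about 8 times lower than the M 6 rate, since each unit of magnitude costs a factor of 10**B in annual count.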

  7. Inundation Mapping and Hazard Assessment of Tectonic and Landslide Tsunamis in Southeast Alaska

    NASA Astrophysics Data System (ADS)

    Suleimani, E.; Nicolsky, D.; Koehler, R. D., III

    2014-12-01

    The Alaska Earthquake Center conducts tsunami inundation mapping for coastal communities in Alaska, and is currently focused on the southeastern region and communities of Yakutat, Elfin Cove, Gustavus and Hoonah. This activity provides local emergency officials with tsunami hazard assessment, planning, and mitigation tools. At-risk communities are distributed along several segments of the Alaska coastline, each having a unique seismic history and potential tsunami hazard. Thus, a critical component of our project is accurate identification and characterization of potential tectonic and landslide tsunami sources. The primary tectonic element of Southeast Alaska is the Fairweather - Queen Charlotte fault system, which has ruptured in 5 large strike-slip earthquakes in the past 100 years. The 1958 "Lituya Bay" earthquake triggered a large landslide into Lituya Bay that generated a 540-m-high wave. The M7.7 Haida Gwaii earthquake of October 28, 2012 occurred along the same fault, but was associated with dominantly vertical motion, generating a local tsunami. Communities in Southeast Alaska are also vulnerable to hazards related to locally generated waves, due to proximity of communities to landslide-prone fjords and frequent earthquakes. The primary mechanisms for local tsunami generation are failure of steep rock slopes due to relaxation of internal stresses after deglaciation, and failure of thick unconsolidated sediments accumulated on underwater delta fronts at river mouths. We numerically model potential tsunami waves and inundation extent that may result from future hypothetical far- and near-field earthquakes and landslides. We perform simulations for each source scenario using the Alaska Tsunami Model, which is validated through a set of analytical benchmarks and tested against laboratory and field data. 
Results of numerical modeling combined with historical observations are compiled on inundation maps and used for site-specific tsunami hazard assessment by emergency planners.

  8. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

    The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Whether and how such defenses should be rebuilt is hence a challenging question, because the defenses fared poorly and building ones able to withstand tsunamis as large as March's is too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus, in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Society therefore needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total cost of damage plus mitigation. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. 
    This framework illustrates the role of the uncertainties and the need to candidly assess them, and it can be applied to exploring policies under various hazard scenarios and to mitigating other natural hazards. Total cost, the sum of expected loss and mitigation cost, varies as a function of mitigation level; the optimal level of mitigation, n*, minimizes the total cost. Because the expected loss depends on the hazard model, the better the hazard model, the better the mitigation policy (Stein and Stein, 2012).
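    The cost-benefit balance described above can be sketched in a few lines; the mitigation-cost and expected-loss curves here are hypothetical stand-ins, not the authors' functions:

```python
# Minimal sketch of the framework: pick the mitigation level n* that
# minimizes total cost = mitigation cost + expected loss.

def optimal_level(levels, mitigation_cost, expected_loss):
    """Return the candidate level minimizing total cost."""
    return min(levels, key=lambda n: mitigation_cost(n) + expected_loss(n))

# toy curves: cost grows linearly with n (e.g. raising defenses),
# expected loss decays as mitigation improves
cost = lambda n: 5.0 * n
loss = lambda n: 400.0 * (0.8 ** n)
n_star = optimal_level(range(21), cost, loss)
print(n_star)  # 13 for these toy curves
```

    Underestimating the hazard shifts the loss curve and hence n*, which is why the quality of the hazard model feeds directly into the quality of the mitigation policy.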

  9. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records reaching back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude Mmax, are estimated using a maximum likelihood algorithm that accounts for the incompleteness of the catalog. To compute the hazard values, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years, assuming bedrock conditions. The resulting PGA and SA maps show a significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
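    The logic-tree combination step amounts to a branch-weighted average of hazard curves; the branch weights and rates below are invented for illustration:

```python
# Sketch of the logic-tree step: the mean hazard curve is the weighted
# average of the curves from each combination of source model and ground
# motion prediction equation. All numbers are hypothetical.

def mean_hazard_curve(branches):
    """branches: list of (weight, exceedance_rates), weights summing to 1;
    all rate lists are sampled at the same intensity levels."""
    n = len(branches[0][1])
    return [sum(w * rates[i] for w, rates in branches) for i in range(n)]

branches = [
    (0.5, [0.04, 0.010, 0.002]),  # source model 1 + GMPE A
    (0.3, [0.06, 0.014, 0.003]),  # source model 1 + GMPE B
    (0.2, [0.02, 0.006, 0.001]),  # source model 2 + GMPE A
]
print(mean_hazard_curve(branches))  # ≈ [0.042, 0.0104, 0.0021]
```

    Weighting the branches rather than picking one propagates the epistemic uncertainty in source models and GMPEs into the final hazard values.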

  10. Development of Tools for the Rapid Assessment of Landslide Potential in Areas Exposed to Intense Storms, Earthquakes, and Other Triggering Mechanisms

    NASA Astrophysics Data System (ADS)

    Highland, Lynn

    2014-05-01

    Landslides frequently occur in connection with other types of hazardous phenomena such as earthquake or volcanic activity and intense rainstorms. Strong shaking, for example, often triggers extensive landslides in mountainous areas, which can then complicate response and compound socio-economic impacts over shaking losses alone. The U.S. Geological Survey (USGS) is exploring different ways to add secondary hazards to its Prompt Assessment of Global Earthquakes for Response (PAGER) system, which has been developed to deliver rapid earthquake impact and loss assessments following significant global earthquakes. The PAGER team found that about 22 percent of earthquakes with fatalities have deaths due to secondary causes, and the percentage of economic losses they incur has not been widely studied, but is probably significant. The current approach for rapid assessment and reporting of the potential and distribution of secondary earthquake-induced landslides involves empirical models that consider ground acceleration, slope, and rock-strength. A complementary situational awareness tool being developed is a region-specific landslide database for the U.S. The latter will be able to define, in a narrative form, the landslide types (debris flows, rock avalanches, shallow versus deep) that generally occur in each area, along with the type of soils, geology and meteorological effects that could have a bearing on soil saturation, and thus susceptibility. When a seismic event occurs in the U.S. and the PAGER system generates web-based earthquake information, these landslide narratives will simultaneously be made available, which will help in the assessment of the nature of landslides in that particular region. This landslide profile database could also be applied to landslide events that are not triggered by earthquake shaking, in conjunction with National Weather Service Alerts and other landslide/debris-flow alerting systems. 
Currently, prototypes are being developed for both the slope-based and the narrative assessment of landslide susceptibility and hazard.

  11. Earthquake prediction: the interaction of public policy and science.

    PubMed Central

    Jones, L M

    1996-01-01

    Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, causally related to the physical processes governing failure on a fault, to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors into informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be obtained by assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than just that it is causally related to the earthquake. PMID:11607656

  12. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    NASA Astrophysics Data System (ADS)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000 year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporate clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. 
In some portions of the simulated earthquake history, events would appear quasiperiodic, while at other times, the events can appear more Poissonian. Hence a given paleoseismic or instrumental record may not reflect the long-term seismicity of a fault, which has important implications for hazard assessment.
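A minimal sketch of the idea behind the LTFM model described above: probability grows with accumulated strain and an earthquake releases only part of that strain, so the probability drops but does not reset to zero. This is a toy simulation, not the authors' code; the growth rate, release fraction, and time span are invented for illustration.

```python
import random

def simulate_ltfm(n_steps, growth=0.0005, release=0.7, seed=1):
    """Toy Long-Term Fault Memory simulation (hypothetical parameters).

    Accumulated "strain" raises the per-year event probability; an
    event releases only a fraction of it, so the probability does not
    reset to zero and the fault keeps memory over multiple cycles.
    """
    rng = random.Random(seed)
    strain = 0.0
    events = []
    for year in range(n_steps):
        strain += growth                  # probability grows with time
        if rng.random() < strain:         # an earthquake occurs
            events.append(year)
            strain *= (1.0 - release)     # partial release, not full reset
    return events

events = simulate_ltfm(20000)
```

Plotting the inter-event times of such a simulation shows stretches that look quasiperiodic and stretches that look Poissonian, echoing the abstract's point that a short record may not reflect the long-term behavior.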

  13. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough; the next earthquake is estimated at M8 to 9 class, and the probability (P30) that the next earthquake will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future, in terms of a probabilistic approach, from the next earthquake along the Nankai Trough, on the basis of ERC(2013)'s report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSAs) that ERC(2013) identified. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1,441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom-friction terms by a finite-difference method. Run-up computation on land is included. (3) A time-predictable model gives a recurrence interval for the present seismic cycle of T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with mean T and aperiodicity alpha = 0.24. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, by following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the subgroup. Note that this re-distribution of probability is tentative, because present seismology does not provide knowledge deep enough to constrain it; an epistemic logic-tree approach may be required in the future. 
(5) We synthesize a number of tsunami hazard curves at every evaluation point along the coasts by integrating the 30-year occurrence probabilities P30(i) for all earthquakes (CEFMs) and the calculated maximum coastal tsunami heights. In the synthesis, aleatory uncertainties arising from the incompleteness of the governing equations, CEFM modeling, and bathymetry and topography data are modeled assuming a log-normal distribution. Examples of tsunami hazard curves will be presented.
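Step (3) above, the 30-year conditional probability from a BPT (inverse Gaussian) renewal model, can be sketched directly. This is an illustrative reimplementation, not the authors' code; the elapsed time of ~67 years since the 1944/1946 Nankai events is our assumption, and the result lands near, but not exactly at, the 67% the committee adopted.

```python
import math

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence `mean` and aperiodicity `alpha`."""
    if t <= 0:
        return 0.0
    lam = mean / alpha ** 2                      # shape parameter
    u = math.sqrt(lam / t)
    a = u * (t / mean - 1.0)
    b = u * (t / mean + 1.0)
    phi = lambda x: 0.5 * math.erfc(-x / math.sqrt(2.0))  # standard normal CDF
    # erfc keeps the second term accurate despite the huge exp factor
    return phi(a) + math.exp(2.0 * lam / mean) * (0.5 * math.erfc(b / math.sqrt(2.0)))

def conditional_prob(mean, alpha, elapsed, window):
    """P(event within `window` yr | no event in the `elapsed` yr so far)."""
    F = lambda t: bpt_cdf(t, mean, alpha)
    return (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))

# Values from the abstract: T = 88.2 yr, alpha = 0.24; elapsed time assumed
p30 = conditional_prob(88.2, 0.24, elapsed=67.0, window=30.0)
```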

  14. Earthquake Hazard in the New Madrid Seismic Zone Remains a Concern

    USGS Publications Warehouse

    Frankel, A.D.; Applegate, D.; Tuttle, M.P.; Williams, R.A.

    2009-01-01

    There is broad agreement in the scientific community that a continuing concern exists for a major destructive earthquake in the New Madrid seismic zone. Many structures in Memphis, Tenn., St. Louis, Mo., and other communities in the central Mississippi River Valley region are vulnerable and at risk from severe ground shaking. This assessment is based on decades of research on New Madrid earthquakes and related phenomena by dozens of Federal, university, State, and consulting earth scientists. Considerable interest has developed recently from media reports that the New Madrid seismic zone may be shutting down. These reports stem from published research presenting geodetic measurements of strain in the Earth's crust made with global positioning system (GPS) instruments. Because of a lack of measurable strain at the surface in some areas of the seismic zone over the past 14 years, arguments have been advanced that there is no buildup of stress at depth within the New Madrid seismic zone and that the zone may no longer pose a significant hazard. As part of the consensus-building process used to develop the national seismic hazard maps, the U.S. Geological Survey (USGS) convened a workshop of experts in 2006 to evaluate the latest findings in earthquake hazards in the Eastern United States. These experts considered the GPS data from New Madrid available at that time, which also showed little to no ground movement at the surface. The experts did not find the GPS data to be a convincing reason to lower the assessment of earthquake hazard in the New Madrid region, especially in light of the many other types of data that are used to construct the hazard assessment, several of which are described here.

  15. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  16. 2017 One‐year seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  17. Seismic risk assessment and application in the central United States

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic risk is a somewhat subjective, but important, concept in earthquake engineering and other related decision-making. Another important concept that is closely related to seismic risk is seismic hazard. Although seismic hazard and seismic risk have often been used interchangeably, they are fundamentally different: seismic hazard describes the natural phenomenon or physical property of an earthquake, whereas seismic risk describes the probability of loss or damage that could be caused by a seismic hazard. The distinction between seismic hazard and seismic risk is of practical significance because measures for seismic hazard mitigation may differ from those for seismic risk reduction. Seismic risk assessment is a complicated process and starts with seismic hazard assessment. Although probabilistic seismic hazard analysis (PSHA) is the most widely used method for seismic hazard assessment, recent studies have found that PSHA is not scientifically valid. Use of PSHA will lead to (1) artifactual estimates of seismic risk, (2) misleading use of the annual probability of exceedance (i.e., the probability of exceedance in one year) as a frequency (per year), and (3) numerical creation of extremely high ground motion. An alternative approach, which is similar to those used for flood and wind hazard assessments, has been proposed. © 2011 ASCE.

  18. Earthquake hazard and risk assessment based on Unified Scaling Law for Earthquakes: Greater Caucasus and Crimea

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

    2018-05-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for an empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After a rigorous verification against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on population census and building inventory data). The methodology of seismic hazard and risk assessment is illustrated by application to the territory of Greater Caucasus and Crimea.
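The USLE relationship quoted above is straightforward to evaluate once the region-specific coefficients are known. A minimal sketch, with hypothetical A, B, C values chosen purely for illustration (real values come from fitting regional seismicity):

```python
import math

def usle_annual_rate(M, L, A, B, C):
    """Expected annual number of earthquakes of magnitude M within an
    area of linear dimension L (km), per the USLE:
        log10 N(M, L) = A + B*(5 - M) + C*log10(L).
    A, B, and C are region-specific fitted coefficients."""
    return 10.0 ** (A + B * (5.0 - M) + C * math.log10(L))

# Hypothetical coefficients for illustration only
rate_m6 = usle_annual_rate(6.0, 100.0, A=-1.0, B=0.9, C=1.2)
rate_m7 = usle_annual_rate(7.0, 100.0, A=-1.0, B=0.9, C=1.2)
```

With B > 0 the expected rate falls by a factor of 10^B per unit magnitude, recovering Gutenberg-Richter behavior at fixed L, while C captures how the count scales with the size of the area considered.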

  19. Review Article: A comparison of flood and earthquake vulnerability assessment indicators

    NASA Astrophysics Data System (ADS)

    de Ruiter, Marleen C.; Ward, Philip J.; Daniell, James E.; Aerts, Jeroen C. J. H.

    2017-07-01

    In a cross-disciplinary study, we carried out an extensive literature review to increase understanding of the vulnerability indicators used in the disciplines of earthquake and flood vulnerability assessment. We provide insights into potential improvements in both fields by identifying and comparing quantitative vulnerability indicators grouped into physical and social categories. Next, a selection of index- and curve-based vulnerability models that use these indicators are described, comparing several characteristics such as temporal and spatial aspects. Earthquake vulnerability methods traditionally have a strong focus on object-based physical attributes used in vulnerability curve-based models, while flood vulnerability studies focus more on indicators applied to aggregated land-use classes in curve-based models. In assessing the differences and similarities between indicators used in earthquake and flood vulnerability models, we only include models that separately assess either of the two hazard types. Flood vulnerability studies could be improved using approaches from earthquake studies, such as developing object-based physical vulnerability curve assessments and incorporating time-of-day building-occupation patterns. Likewise, earthquake assessments could learn from flood studies by refining their selection of social vulnerability indicators. Based on the lessons obtained in this study, we recommend that future studies explore risk assessment methodologies across different hazard types.

  20. Multi-hazards risk assessment at different levels

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2012-04-01

    Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to these events increase annually, a trend clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of natural and technological integrated risk assessment and mapping at different levels [1, 2]. At the country level, the most hazardous natural processes that may result in fatalities, injuries, and economic loss in the Russian Federation are considered: earthquakes, landslides, mud flows, floods, storms, and avalanches. A special GIS environment for the country's territory was developed, which includes information about hazard levels and recurrence, impact databases for the last 20 years, as well as models for estimating damage and casualties caused by these hazards. Federal maps of seismic individual and collective risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment taking into account secondary accidents at fire-, explosion-, and chemically-hazardous facilities, and of regional integrated risk assessment, are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations due to scenario earthquakes taking into account accidents triggered by strong events at critical facilities, including fire- and chemically-hazardous facilities and oil pipeline routes located in earthquake-prone areas. The estimates of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures aimed at saving lives and protecting property against future disastrous events. 
The results also make it possible to develop effective emergency response plans that take possible scenario events into account. Given the extent of the oil pipeline systems located in highly active seismic zones, the results of seismic risk computation are used by TRANSNEFT JSC.

  1. Seismic hazard and risk assessment in the intraplate environment: The New Madrid seismic zone of the central United States

    USGS Publications Warehouse

    Wang, Z.

    2007-01-01

    Although the causes of large intraplate earthquakes are still not fully understood, they pose significant hazard and risk to society. Estimating hazard and risk in these regions is difficult because of the lack of earthquake records. The New Madrid seismic zone is one such region, where large and rare intraplate earthquakes (M = 7.0 or greater) pose significant hazard and risk. Many different definitions of hazard and risk have been used, and the resulting estimates differ dramatically. In this paper, seismic hazard is defined as the natural phenomenon generated by earthquakes, such as ground motion, and is quantified by two parameters: a level of hazard and its occurrence frequency or mean recurrence interval; seismic risk is defined as the probability of occurrence of a specific level of seismic hazard over a certain time and is quantified by three parameters: probability, a level of hazard, and exposure time. Probabilistic seismic hazard analysis (PSHA), a commonly used method for estimating seismic hazard and risk, derives a relationship between a ground motion parameter and its return period (hazard curve). The return period is not an independent temporal parameter but a mathematical extrapolation of the recurrence interval of earthquakes and the uncertainty of ground motion. Therefore, it is difficult to understand and use PSHA. A new method is proposed and applied here for estimating seismic hazard in the New Madrid seismic zone. This method provides hazard estimates that are consistent with the state of our knowledge and can be easily applied to other intraplate regions. © 2007 The Geological Society of America.
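The return period the abstract cautions about is conventionally converted to a probability over an exposure time via a Poisson assumption, P = 1 − exp(−t/T). A small sketch of that standard conversion (the design benchmarks shown are the familiar code values, not figures from this paper):

```python
import math

def exceedance_prob(return_period, exposure_time):
    """Probability of at least one exceedance during `exposure_time`
    years, under a Poisson model with the given return period (years).
    Note the return period here is a derived quantity from the hazard
    curve, not an observed earthquake recurrence interval."""
    return 1.0 - math.exp(-exposure_time / return_period)

# Familiar design benchmarks:
p2475 = exceedance_prob(2475.0, 50.0)   # ~2% in 50 years
p475 = exceedance_prob(475.0, 50.0)     # ~10% in 50 years
```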

  2. Program and plans of the U.S. Geological Survey for producing information needed in National Seismic hazards and risk assessment, fiscal years 1980-84

    USGS Publications Warehouse

    Hays, Walter W.

    1979-01-01

    In accordance with the provisions of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124), the U.S. Geological Survey has developed comprehensive plans for producing information needed to assess seismic hazards and risk on a national scale in fiscal years 1980-84. These plans are based on a review of the needs of Federal Government agencies, State and local government agencies, engineers and scientists engaged in consulting and research, professional organizations and societies, model code groups, and others. The Earthquake Hazards Reduction Act provided an unprecedented opportunity for participation in a national program by representatives of State and local governments, business and industry, the design professions, and the research community. The USGS and the NSF (National Science Foundation) have major roles in the national program. The ultimate goal of the program is to reduce losses from earthquakes. Implementation of USGS research in the Earthquake Hazards Reduction Program requires the close coordination of responsibility between Federal, State and local governments. The projected research plan in national seismic hazards and risk for fiscal years 1980-84 will be accomplished by USGS and non-USGS scientists and engineers. The latter group will participate through grants and contracts. The research plan calls for (1) national maps based on existing methods, (2) improved definition of earthquake source zones nationwide, (3) development of improved methodology, (4) regional maps based on the improved methodology, and (5) post-earthquake investigations. Maps and reports designed to meet the needs, priorities, concerns, and recommendations of various user groups will be the products of this research and provide the technical basis for improved implementation.

  3. Tsunami Loss Assessment For Istanbul

    NASA Astrophysics Data System (ADS)

    Hancilar, Ufuk; Cakti, Eser; Zulfikar, Can; Demircioglu, Mine; Erdik, Mustafa

    2010-05-01

    Tsunami risk and loss assessment incorporating inundation mapping for Istanbul and the Marmara Sea region is presented in this study. The city of Istanbul is under the threat of earthquakes expected to originate from the Main Marmara branch of the North Anatolian Fault System. In the Marmara region the earthquake hazard has reached very high levels, with a 2% annual probability of occurrence of a magnitude 7+ earthquake on the Main Marmara Fault. Istanbul is the largest city of the Marmara region and of Turkey, with almost 12 million inhabitants. It is home to 40% of the industrial facilities in Turkey and operates as the financial and trade hub of the country. Past earthquakes have shown that the structural reliability of residential and industrial buildings, as well as that of lifelines including port and harbor structures in the country, is questionable. These facts make the management of earthquake risks imperative for the reduction of physical and socio-economic losses. The level of expected tsunami hazard in Istanbul is low compared to the earthquake hazard, yet the assets at risk along the shores of the city make a thorough assessment of tsunami risk imperative. Important residential and industrial centres exist along the shores of the Marmara Sea. Particularly along the northern and eastern shores we see an uninterrupted settlement pattern with industries, businesses, commercial centres, and ports and harbours in between. Following the inundation maps resulting from deterministic and probabilistic tsunami hazard analyses, vulnerability and risk analyses are presented and the socio-economic losses are estimated. This study is part of the EU-supported FP6 project 'TRANSFER'.

  4. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  5. The Wenchuan, China M8.0 Earthquake: A Lesson and Implication for Seismic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2008-12-01

    The Wenchuan, China M8.0 earthquake caused great damage and enormous casualties: 69,197 people were killed, 374,176 people were injured, and 18,341 people are still missing. The estimated direct economic loss is about 126 billion U.S. dollars. The Wenchuan earthquake again demonstrated that the earthquake itself does not kill people; the built environment and induced hazards, landslides in particular, do. Therefore, it is critical to strengthen the built environment, such as buildings and bridges, and to mitigate the induced hazards in order to avoid such disasters. As a part of the so-called North-South Seismic Zone in China, the Wenchuan earthquake occurred along the Longmen Shan thrust belt, which forms a boundary between the Qinghai-Tibet Plateau and the Sichuan basin, and there is a long history (~4,000 years) of seismicity in the area. The historical records show that the area experienced high intensity (i.e., greater than IX) in the past several thousand years. In other words, the area is well known to have high seismic hazard because of its tectonic setting and seismicity. However, only intensity VII (0.1 to 0.15g PGA) has been considered for seismic design of the built environment in the area. This was one of the main reasons that so many buildings, particularly school buildings, collapsed during the Wenchuan earthquake. It is clear that the seismic design (i.e., the design ground motion or intensity) was not adequate in the Wenchuan earthquake stricken area. A lesson can be learned from the Wenchuan earthquake on seismic hazard and risk assessment. A lesson can also be learned from this earthquake on seismic hazard mitigation and/or seismic risk reduction.

  6. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distribution methods for determining the model best suited to probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated using the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three distribution methods in this region.
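The comparison described above boils down to computing the K-S statistic between the empirical CDF of inter-event times and each candidate model CDF. A self-contained sketch (the inter-event times and trial parameters below are invented for illustration; the study fitted parameters to its own catalogue with Easyfit/Matlab):

```python
import math

def weibull_cdf(t, shape, scale):
    """Two-parameter Weibull CDF."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def frechet_cdf(t, shape, scale):
    """Frechet (inverse Weibull) CDF."""
    return math.exp(-((scale / t) ** shape))

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov statistic: maximum distance between the
    empirical CDF of `data` and the candidate model CDF."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

# Hypothetical inter-event times (years) and trial parameters
times = [12.0, 18.0, 25.0, 30.0, 41.0, 55.0, 62.0, 80.0]
d_weib = ks_statistic(times, lambda t: weibull_cdf(t, shape=1.5, scale=45.0))
d_frec = ks_statistic(times, lambda t: frechet_cdf(t, shape=1.5, scale=25.0))
```

The distribution with the smaller statistic fits better; to accept or reject either model, each statistic would be compared against the K-S critical value for the sample size.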

  7. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different; furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more of a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.

  8. Time dependent data, time independent models: challenges of updating Australia's National Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Griffin, J.; Clark, D.; Allen, T.; Ghasemi, H.; Leonard, M.

    2017-12-01

    Standard probabilistic seismic hazard assessment (PSHA) simulates earthquake occurrence as a time-independent process. However, paleoseismic studies in slowly deforming regions such as Australia show compelling evidence that large earthquakes on individual faults cluster within active periods, followed by long periods of quiescence. Therefore, the instrumental earthquake catalog, which forms the basis of PSHA earthquake recurrence calculations, may only capture the state of the system over the period of the catalog. Together this means that the data informing our PSHA may not be truly time-independent. This poses challenges in developing PSHAs for typical design probabilities (such as 10% in 50 years probability of exceedance): Is the present state observed through the instrumental catalog useful for estimating the next 50 years of earthquake hazard? Can paleo-earthquake data, which show variations in earthquake frequency over timescales of 10,000 years or more, be robustly included in such PSHA models? Can a single PSHA logic tree be useful over a range of different probabilities of exceedance? In developing an updated PSHA for Australia, decadal-scale data based on instrumental earthquake catalogs (i.e., alternative area-based source models and smoothed-seismicity models) are integrated with paleo-earthquake data through inclusion of a fault source model. Use of time-dependent non-homogeneous Poisson models allows earthquake clustering to be modeled on fault sources with sufficient paleo-earthquake data. This study assesses the performance of alternative models by extracting decade-long segments of the instrumental catalog, developing earthquake probability models based on the remaining catalog, and testing performance against the extracted component of the catalog. Although this provides insights into model performance over the short term, for longer timescales it is recognised that model choice is subject to considerable epistemic uncertainty.
Therefore a formal expert elicitation process has been used to assign weights to alternative models for the 2018 update to Australia's national PSHA.

  9. Science should warn people of looming disaster

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2014-05-01

    Contemporary Science is responsible for not coping with the challenging changes in Exposures and their Vulnerability inflicted by a growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for this lack of special knowledge, education, and communication. In fact, it appears that only a few seismic hazard assessment programs and/or methodologies were tested appropriately against real observations before being endorsed for the estimation of earthquake-related risks. The fatal evidence and aftermath of the past decades prove that many of the existing internationally accepted methodologies are grossly misleading and evidently unacceptable for any kind of responsible risk evaluation and knowledgeable disaster prevention. In contrast, the confirmed reliability of pattern recognition aimed at earthquake-prone areas and times of increased probability, along with realistic earthquake scaling and scenario modeling, allows us to conclude that Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering this state-of-the-art knowledge of looming disaster in advance of catastrophic events. In lieu of seismic observations long enough for a reliable probabilistic assessment, or of a comprehensive physical theory of earthquake recurrence, pattern recognition applied to available geophysical and/or geological data sets remains a broad avenue to follow in seismic hazard forecast/prediction. Moreover, a better understanding of the seismic process, in terms of the non-linear dynamics of a hierarchical system of blocks-and-faults and deterministic chaos, opens progress toward new approaches to assessing time-dependent seismic hazard based on multiscale analysis of seismic activity and reproducible intermediate-term earthquake prediction techniques. 
The algorithms, which make use of the multidisciplinary data available and account for the fractal nature of earthquake distributions in space and time, have confirmed their reliability through durable statistical testing in an ongoing, regular real-time application that has lasted for more than 20 years. Geoscientists must initiate a shift of the community's mind from pessimistic disbelief in forecast/prediction products to optimistic, challenging views on Hazard Predictability in space and time, so as not to repeat missed opportunities for disaster preparedness like those in advance of the 2009 M6.3 L'Aquila earthquake in Italy and the 2011 M9.0 mega-thrust off the Pacific coast of the Tōhoku region in Japan.

  10. The New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX) Project - An overview of its major findings

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Zschau, Jochen; Gasparini, Paolo

    2014-05-01

    Recent major natural disasters, such as the 2011 Tōhoku earthquake, tsunami and subsequent Fukushima nuclear accident, have raised awareness of the frequent and potentially far-reaching interconnections between natural hazards. Such interactions occur at the hazard level, where an initial hazard may trigger other events (e.g., an earthquake triggering a tsunami) or several events may occur concurrently (or nearly so), e.g., severe weather around the same time as an earthquake. Interactions also occur at the vulnerability level, where the initial event may make the affected community more susceptible to the negative consequences of another event (e.g., an earthquake weakens buildings, which are then damaged further by windstorms). There is also a temporal element involved, where changes in exposure may alter the total risk to a given area. In short, the total risk estimated when considering multiple hazards and risks and their interactions is likely greater than the sum of its individual parts. It is with these issues in mind that the European Commission, under its FP7 program, supported the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project (10.2010 to 12.2013). MATRIX set out to tackle multiple natural hazards (i.e., those of concern to Europe, namely earthquakes, landslides, volcanoes, tsunamis, wildfires, storms, and fluvial and coastal flooding) and risks within a common theoretical framework. The MATRIX work plan proceeded from an assessment of single-type risk methodologies (including how uncertainties should be treated) to cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and an assessment of how the multi-hazard and risk viewpoint may be integrated into current decision-making and risk-mitigation programs, considering the existing single-hazard and risk focus. 
Three test sites were considered during the project: Naples, Cologne, and the French West Indies. In addition, a software platform, the MATRIX-Common IT sYstem (MATRIX-CITY), was developed to allow the evaluation of characteristic multi-hazard and risk scenarios in comparison to single-type analyses. This presentation therefore outlines the more significant outcomes of the project, in particular those dealing with the harmonization of single-type hazards, cascade event analysis, time-dependent vulnerability changes and the response of the disaster management community to the MATRIX point of view.

  11. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    USGS Publications Warehouse

    Gori, Paula L.

    1993-01-01

    INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation, likelihood of occurrence, location, and severity of potential hazards, and the three elements needed for effective transfer, delivery, assistance, and encouragement, are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. 
    PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans is now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, they recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles. A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington. 
The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a

  12. Real-time Seismicity Evaluation as a Tool for the Earthquake and Tsunami Short-Term Hazard Assessment (Invited)

    NASA Astrophysics Data System (ADS)

    Papadopoulos, G. A.

    2010-12-01

    Seismic activity is a 3-D process varying in the space-time-magnitude domains. When the short-term activity in a target area deviates significantly from the usual (background) seismicity, the modes of activity may include swarms, temporary quiescence, foreshock-mainshock-aftershock sequences, doublets, and multiplets. This implies that making decisions for civil protection purposes requires short-term seismic hazard assessment and evaluation. When a sizable earthquake takes place, the critical question concerns the nature of the event: is it a mainshock, or a foreshock foreshadowing the occurrence of a bigger one? Also, a seismicity increase or decrease in a target area may signify either precursory changes or just transient seismicity variations (e.g. swarms) which do not conclude with a strong earthquake. Therefore, real-time seismicity evaluation is the backbone of short-term hazard assessment. The algorithm FORMA (Foreshock-Mainshock-Aftershock) is presented, which automatically detects and updates, in near real time, significant variations of seismicity from the earthquake data flow of the monitoring center. The detection of seismicity variations is based on an expert system which, for a given target area, indicates the mode of seismicity from the variation of two parameters: the seismicity rate, r, and the b-value of the magnitude-frequency relation. Alert levels are produced according to the significance levels of the changes of r and b. The good performance of FORMA was verified retrospectively in several earthquake cases, e.g. the 2009 L'Aquila, Italy, earthquake sequence (Mmax 6.3) (Papadopoulos et al., 2010). Real-time testing was carried out during January 2010 with the strong earthquake activity (Mmax 5.6) in the Corinth Rift, Central Greece. Evaluation outputs were publicly documented on a nearly daily basis, with successful results. 
    Evaluation of coastal and submarine earthquake activity is also of crucial importance for the short-term hazard assessment of near-field tsunamis, given that the time constraints for early warning are on the order of a few minutes up to less than one hour. It is proposed that warning procedures for near-field tsunamis in a target area may benefit from combining a tsunami decision matrix with short-term seismic hazard evaluation. Simulated procedures incorporating retrospective tests in the Mediterranean Sea proved encouraging.
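The two parameters FORMA tracks, the seismicity rate r and the Gutenberg-Richter b-value, can be estimated with standard formulas; the sketch below uses the classical Aki-Utsu maximum-likelihood b-value estimator on a synthetic catalogue. FORMA's actual alert thresholds and expert-system rules are not given in the abstract and are not reproduced here.

```python
# Estimate the seismicity rate r (events/day) and the b-value of the
# magnitude-frequency relation, the two inputs of the FORMA expert system.
import numpy as np

def b_value_ml(magnitudes, m_c, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value with Utsu's binning correction."""
    m = np.asarray(magnitudes)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

def seismicity_rate(event_days, window_days):
    """Events per day inside a trailing time window."""
    t = np.asarray(event_days)
    return np.sum(t >= t.max() - window_days) / window_days

# Synthetic catalogue: G-R magnitudes with b ~ 1 above completeness M 3.0.
rng = np.random.default_rng(1)
mags = 3.0 + rng.exponential(1.0 / np.log(10), 500)
days = np.sort(rng.uniform(0, 365, 500))

b = b_value_ml(mags, m_c=3.0)
r = seismicity_rate(days, window_days=30)
print(f"b-value ~ {b:.2f}, rate ~ {r:.2f} events/day")
```

In a FORMA-like system these two values would be recomputed on every catalogue update and compared against their background levels to set the alert level.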

  13. Seismic Hazard Assessment at Esfaraen‒Bojnurd Railway, North‒East of Iran

    NASA Astrophysics Data System (ADS)

    Haerifard, S.; Jarahi, H.; Pourkermani, M.; Almasian, M.

    2018-01-01

    The objective of this study is to evaluate the seismic hazard along the Esfarayen-Bojnurd railway using the probabilistic seismic hazard assessment (PSHA) method. The assessment was carried out on a recent data set that takes into account historic seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a proposed seismic source model was presented. Attenuation equations recently recommended by experts, developed from earthquake data obtained in tectonic environments similar to those in and around the studied area, were weighted and used for the assessment of seismic hazard within a logic-tree approach. Considering a grid of 1.2 × 1.2 km covering the study area, the ground acceleration at every node was calculated. Hazard maps at bedrock conditions were produced for peak ground acceleration for return periods of 74, 475 and 2475 years.
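The return periods quoted above map onto exceedance probabilities through the standard Poisson relation p = 1 − exp(−t/T); this is a textbook conversion, not something specific to this paper. A minimal sketch:

```python
# Convert between a return period T and the probability of exceedance p
# in an exposure time t, assuming Poisson (time-independent) occurrence.
import math

def exceedance_probability(return_period_yr, exposure_yr):
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

def return_period(p_exceed, exposure_yr):
    return -exposure_yr / math.log(1.0 - p_exceed)

for T in (74, 475, 2475):
    p = exceedance_probability(T, 50)
    print(f"T = {T} yr  ->  {100 * p:.1f}% in 50 years")

# The familiar design levels recovered the other way round:
print(round(return_period(0.10, 50)))  # ~475 yr for 10% in 50 yr
print(round(return_period(0.02, 50)))  # ~2475 yr for 2% in 50 yr
```

This is why 475- and 2475-year maps are conventionally described as 10% and 2% in 50 years, respectively.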

  14. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

    To assess the seismic hazard with temporal change in Taiwan, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, implemented with the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress-triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published model. It thus offers decision-makers and public officials an adequate basis for rapid evaluation of, and response to, future emergency scenarios such as victim relocation and sheltering.
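The long-term part of the approach can be sketched numerically: the BPT density, and the conditional probability of rupture in a forecast window given the time elapsed since the last event. This is a minimal illustration, not the TEM implementation, and the fault parameters below (mean recurrence, aperiodicity, elapsed time) are invented.

```python
# Brownian Passage Time density and conditional rupture probability.
# mu = mean recurrence interval (yr), alpha = aperiodicity.
import numpy as np

def bpt_pdf(t, mu, alpha):
    return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
        np.exp(-((t - mu) ** 2) / (2.0 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, n=20000):
    """CDF by trapezoidal integration of the density from 0 to t."""
    x = np.linspace(1e-6, t, n)
    y = bpt_pdf(x, mu, alpha)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def conditional_probability(elapsed, horizon, mu, alpha):
    """P(rupture within `horizon` yr | quiet for `elapsed` yr)."""
    f_e = bpt_cdf(elapsed, mu, alpha)
    f_h = bpt_cdf(elapsed + horizon, mu, alpha)
    return (f_h - f_e) / (1.0 - f_e)

# Illustrative fault: 300-yr mean recurrence, aperiodicity 0.5,
# 250 years elapsed since the last rupture, 50-year forecast window.
p = conditional_probability(elapsed=250, horizon=50, mu=300, alpha=0.5)
print(f"conditional 50-yr rupture probability ~ {p:.2f}")
```

The key time-dependent behaviour is visible here: the longer the elapsed time relative to the mean recurrence, the higher the conditional probability, which is what drives the elevated hazard on long-quiet faults described above.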

  15. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: Altai-Sayan Region

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2017-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE is the empirical relationship log10 N(M, L) = A + B·(5 − M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum credible magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, and then map the corresponding expected ground shaking parameters (e.g. peak ground acceleration, PGA, or macroseismic intensity). After rigorous testing against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure (e.g., those based on a census of population, a buildings inventory, etc.). This USLE-based methodology of seismic hazard and risk assessment is applied to the territory of the Altai-Sayan region of Russia. The study was supported by the Russian Science Foundation, Grant No. 15-17-30020.
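The USLE relation quoted above is straightforward to evaluate numerically. The coefficients A, B, and C are region-specific and estimated from data; the values below are assumptions chosen only to show the shape of the relation (B near 1 mimics a Gutenberg-Richter-like magnitude falloff).

```python
# Evaluate the USLE relation log10 N(M, L) = A + B*(5 - M) + C*log10 L
# for illustrative coefficients, and report the implied recurrence times.
import math

def usle_annual_number(m, l_km, a, b, c):
    """Expected annual number of magnitude-M events in an area of linear size L."""
    return 10.0 ** (a + b * (5.0 - m) + c * math.log10(l_km))

A, B, C = -1.0, 1.0, 1.0  # illustrative only, not fitted to any region
for m in (5.0, 6.0, 7.0):
    n = usle_annual_number(m, l_km=100.0, a=A, b=B, c=C)
    print(f"M {m}: ~{n:.3f} events/yr, recurrence ~{1.0 / n:.0f} yr")
```

Note the role of C: unlike plain Gutenberg-Richter, the expected count scales with the linear dimension L of the area, reflecting the fractal distribution of sources.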

  16. Loss Estimations due to Earthquakes and Secondary Technological Hazards

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2009-04-01

    Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measure plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. The mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemically hazardous facilities are considered; these are used in geographical information systems designed for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of estimating the consequences of scenario earthquakes and of individual seismic risk assessment taking into account secondary technological hazards at regional and urban levels. The individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of a natural and/or technological disaster.
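The individual-risk definition given above, the expectation of social losses divided by the number of inhabitants, reduces to a short computation. The scenario probabilities and fatality counts below are invented placeholders, not values from the paper.

```python
# Individual risk as the annual expectation of fatalities per inhabitant,
# summed over the earthquake scenario and its secondary technological accidents.
def individual_risk(scenarios, population):
    """Annual probability of death for a resident of the settlement.

    scenarios: list of (annual_probability, expected_fatalities) pairs
    covering the earthquake and secondary technological accidents."""
    expected_fatalities = sum(p * n for p, n in scenarios)
    return expected_fatalities / population

# Illustrative settlement of 100,000 people:
scenarios = [
    (0.01, 200.0),   # scenario earthquake alone
    (0.002, 500.0),  # earthquake plus fire at a chemically hazardous facility
]
print(f"individual risk ~ {individual_risk(scenarios, 100_000):.1e} per year")
```

The secondary-accident term shows why such hazards matter: a low-probability cascade with high fatalities can contribute as much to individual risk as the primary shaking.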

  17. Spatial earthquake hazard assessment of Evansville, Indiana

    USGS Publications Warehouse

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.

    1997-01-01

    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes, occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding the groundwater elevations made in previous studies are also considered to have had a significant effect on the results.

  18. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
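The risk integral described above couples a hazard curve with a fragility curve. The sketch below evaluates it numerically with invented power-law hazard and lognormal fragility placeholders; it is a generic pre-earthquake form of the integral, without the aftershock and damage adaptations the paper proposes.

```python
# Numerically evaluate the risk integral:
#   collapse rate = integral of P(collapse | im) * |d lambda(im)|,
# where lambda(im) is the annual rate of exceeding intensity im.
import math

def hazard_rate(im, k0=1e-4, k=2.5):
    """Annual rate of exceeding intensity im (power-law placeholder)."""
    return k0 * im ** (-k)

def fragility(im, median=0.6, beta=0.5):
    """P(collapse | IM = im): lognormal CDF placeholder."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

def collapse_rate(im_lo=0.05, im_hi=3.0, n=3000):
    """Sum fragility * |d lambda| over intensity bins."""
    total, step = 0.0, (im_hi - im_lo) / n
    for i in range(n):
        a, b = im_lo + i * step, im_lo + (i + 1) * step
        d_lambda = hazard_rate(a) - hazard_rate(b)  # rate lost across the bin
        total += fragility(0.5 * (a + b)) * d_lambda
    return total

lam = collapse_rate()
print(f"mean annual collapse rate ~ {lam:.2e}")
```

In the post-mainshock methodology, `hazard_rate` would be replaced by a time-decaying aftershock hazard and `fragility` by a curve shifted to reflect mainshock damage; the integral itself is unchanged.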

  19. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers, and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes in Exposures and their Vulnerability inflicted by a growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M, L) = A − B·(M − 6) + C·log L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of the USLE are used to estimate, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground shaking parameters, including macroseismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., the historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on the USLE is illustrated by application to the Greater Caucasus seismic region.

  20. Local soil effects on the Ground Motion Prediction model for the Racha region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Shengelia, I.; Otinashvili, M.; Tvaliashvili, A.

    2016-12-01

    The Caucasus is a region of numerous natural hazards and ensuing disasters. Analysis of the losses due to past disasters indicates that the most catastrophic in the region have historically been due to strong earthquakes. Estimation of expected ground motion is fundamental to earthquake hazard assessment. Peak ground acceleration, the parameter most commonly used in attenuation relations, was selected for the analysis because it provides useful information for seismic hazard assessment. Site ground conditions are the main issue of the study because they strongly influence earthquake records: the same earthquake recorded at the same distance may cause different damage depending on ground conditions. Earthquake records were selected for the Racha region in Georgia, which has the highest seismic activity in the region. New ground motion prediction (GMP) models were then obtained from new digital data recorded in the same area. After removing the site effect, the earthquake records for the rock site were obtained. Thus, two GMP models were derived: one for the ground surface and the other for the rock site. Finally, the two models were compared in order to analyze the influence of local soil conditions on the GMP model.

  1. Sedimentary evidence of historical and prehistorical earthquakes along the Venta de Bravo Fault System, Acambay Graben (Central Mexico)

    NASA Astrophysics Data System (ADS)

    Lacan, Pierre; Ortuño, María; Audin, Laurence; Perea, Hector; Baize, Stephane; Aguirre-Díaz, Gerardo; Zúñiga, F. Ramón

    2018-03-01

    The Venta de Bravo normal fault is one of the longest structures in the intra-arc fault system of the Trans-Mexican Volcanic Belt. It defines, together with the Pastores Fault, the 80 km long southern margin of the Acambay Graben. We focus on the westernmost segment of the Venta de Bravo Fault and provide new paleoseismological information, evaluate its earthquake history, and assess the related seismic hazard. We analyzed five trenches, distributed at three different sites, in which Holocene surface faulting offsets interbedded volcanoclastic, fluvio-lacustrine and colluvial deposits. Despite the lack of known historical destructive earthquakes along this fault, we found evidence of at least eight earthquakes during the late Quaternary. Our results indicate that this is one of the major seismic sources of the Acambay Graben, capable of producing by itself earthquakes with magnitudes (MW) up to 6.9, with a slip rate of 0.22-0.24 mm/yr and a recurrence interval between 1940 and 2390 years. In addition, a possible multi-fault rupture of the Venta de Bravo Fault together with other faults of the Acambay Graben could result in a MW > 7 earthquake. These new slip rates, earthquake recurrence rates, and estimates of slip per event help advance our understanding of the seismic hazard posed by the Venta de Bravo Fault and provide new parameters for further hazard assessment.
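The quoted slip rate and recurrence interval imply an average slip per event (slip rate × recurrence interval); the per-event slips computed below are derived quantities for illustration, not numbers from the paper.

```python
# Consistency check: average slip per event = slip rate * recurrence interval,
# spanning the ranges quoted in the abstract (0.22-0.24 mm/yr, 1940-2390 yr).
slips = []
for slip_rate_mm_yr in (0.22, 0.24):
    for recurrence_yr in (1940, 2390):
        slip_m = slip_rate_mm_yr * recurrence_yr / 1000.0  # mm -> m
        slips.append(slip_m)
        print(f"{slip_rate_mm_yr} mm/yr x {recurrence_yr} yr -> ~{slip_m:.2f} m/event")
```

Roughly half a metre of slip per event is a plausible order of magnitude for a normal-faulting earthquake near MW 6.9, which is consistent with the paper's maximum-magnitude estimate.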

  2. Probabilistic Tsunami Hazard Assessment along Nankai Trough (2) a comprehensive assessment including a variety of earthquake source areas other than those that the Earthquake Research Committee, Japanese government (2013) showed

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2016-12-01

    For the forthcoming Nankai earthquake of M8 to M9 class, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2013) showed 15 examples of earthquake source areas (ESAs) as possible combinations of 18 sub-regions (6 segments along the trough and 3 segments normal to the trough) and assessed that the occurrence probability within the next 30 years (from Jan. 1, 2013) was 60% to 70%. Hirata et al. (2015, AGU) presented a Probabilistic Tsunami Hazard Assessment (PTHA) along the Nankai Trough for the case where the diversity of the next event's ESA is modeled by only the 15 ESAs. In this study, we newly set 70 ESAs in addition to the previous 15, so that a total of 85 ESAs are considered. By producing tens of fault models, with various slip distribution patterns, for each of the 85 ESAs, we obtain 2500 fault models in addition to the previous 1400, so that a total of 3900 fault models are considered to model the diversity of the next Nankai earthquake rupture (Toyama et al., 2015, JpGU). For the PTHA, the occurrence probability of the next Nankai earthquake is distributed among the possible 3900 fault models from the viewpoint of similarity to the 15 ESAs' extents (Abe et al., 2015, JpGU). The major concept of the occurrence probability distribution is: (i) earthquakes rupturing on any of the 15 ESAs that the ERC (2013) showed are most likely to occur; (ii) earthquakes rupturing on any ESA whose along-trough extent is the same as one of the 15 ESAs but whose trough-normal extent differs are second most likely; (iii) earthquakes rupturing on any ESA whose along-trough and trough-normal extents both differ from the 15 ESAs rarely occur. Procedures for tsunami simulation and probabilistic tsunami hazard synthesis are the same as in Hirata et al. (2015). 
A tsunami hazard map, synthesized under the assumption that the Nankai earthquakes can be modeled as a renewal process based on the BPT distribution with a mean recurrence interval of 88.2 years (ERC, 2013) and an aperiodicity of 0.22 (the median of the values, 0.20 to 0.24, that the ERC (2013) recommended), suggests that several coastal segments along the southwest coast of Shikoku Island, the southeast coast of the Kii Peninsula, and the west coast of the Izu Peninsula show over 26% exceedance probability that the maximum water rise exceeds 10 meters at any coastal point within the next 30 years.
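The synthesis step can be sketched as follows: once the total 30-year occurrence probability has been distributed among the fault models (treated here as mutually exclusive alternatives, following the abstract), the exceedance probability at a coastal point is the sum of the probability shares of the models whose simulated tsunami exceeds the threshold there. The model list below is invented and far smaller than the paper's 3900 models.

```python
# Site exceedance probability from per-fault-model probability shares.
def site_exceedance(prob_and_exceeds):
    """Sum the probability shares of the fault models whose simulated
    tsunami exceeds 10 m at the site; models are mutually exclusive
    alternatives sharing the total occurrence probability."""
    return sum(p for p, exceeds_10m in prob_and_exceeds if exceeds_10m)

# Toy example: four fault models sharing a 0.70 total occurrence
# probability, two of which exceed 10 m at this coastal point.
models = [(0.10, True), (0.25, False), (0.05, True), (0.30, False)]
print(f"P(>10 m rise in 30 yr) = {site_exceedance(models):.2f}")
```

Repeating this at every coastal point, with the BPT-derived total probability and the full 3900-model set, yields a hazard map of the kind described above.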

  3. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  4. The wicked problem of earthquake hazard in developing countries: the example of Bangladesh

    NASA Astrophysics Data System (ADS)

    Steckler, M. S.; Akhter, S. H.; Stein, S.; Seeber, L.

    2017-12-01

    Many developing nations in earthquake-prone areas confront a tough problem: how much of their limited resources should they devote to mitigating earthquake hazards? This decision is difficult because it is unclear when an infrequent major earthquake may happen, how big it could be, and how much harm it may cause. The issue confronts nations with profound immediate needs and ongoing rapid urbanization. Earthquake hazard mitigation in Bangladesh is a wicked problem. It is the world's most densely populated nation, with 160 million people in an area the size of Iowa. Complex geology and sparse data make assessing a possibly large earthquake hazard difficult. Hence it is hard to decide how much of the limited resources available should be used for earthquake hazard mitigation, given other more immediate needs. Per capita GDP is $1200, so Bangladesh is committed to economic growth, and resources are needed to address many critical challenges and hazards. In their subtropical environment, rural Bangladeshis traditionally relied on modest mud or bamboo homes. Their rapidly growing, crowded capital, Dhaka, is filled with multistory concrete buildings likely to be vulnerable to earthquakes. The risk is compounded by the potential collapse of services and accessibility after a major temblor. However, extensive construction as the population shifts from rural to urban provides an opportunity for earthquake-risk reduction. While this situation seems daunting, it is not hopeless. Robust risk management is practical, even for developing nations. It involves recognizing uncertainties and developing policies that should give a reasonable outcome for a range of possible hazard and loss scenarios. Over decades, Bangladesh has achieved a thousandfold reduction in risk from tropical cyclones by building shelters and setting up a warning system. Similar efforts are underway for earthquakes. Smart investments can be very effective, even if modest. 
Hence, we suggest strategies consistent with high uncertainty and limited resources. The most crucial steps are enforcing building codes and educating the public on earthquake risk reduction. Requiring moderate investments that increase building costs by 5-10% can substantially improve safety and is a cost-effective strategy. Over time, natural building turnover will make communities more resilient.

  5. Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS

    NASA Astrophysics Data System (ADS)

    Ahmad, Raed; Adris, Ahmad; Singh, Ramesh

    2016-07-01

    In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for evaluating seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of Syria with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for evaluating seismic hazard were identified, and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme was developed for spatial data analysis using GIS to rank the parameters included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into relative susceptibility classes: high, moderate, low, and very low. The potential earthquake map was validated by correlating the obtained classes with local probabilities produced using conventional analysis of observed earthquakes. Earthquake data for Syria and peak ground acceleration (PGA) data were then introduced to the model to develop the final seismic hazard map, based on Gutenberg-Richter parameters (a- and b-values) and the concepts of local probability and recurrence time. The application of the proposed technique in the Syrian region indicates that this method provides a good estimate of the seismic hazard compared to maps developed with traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time, we have combined numerous remote sensing and GIS parameters in the preparation of a seismic hazard map, which is found to be very realistic.
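
The Gutenberg-Richter a- and b-values mentioned above are commonly estimated from a catalog with Aki's maximum-likelihood formula. A minimal sketch on synthetic data follows; the catalog, completeness threshold, and estimator choice are illustrative, not the study's actual procedure.

```python
import math
import random

def gutenberg_richter_fit(mags, m_c):
    """Maximum-likelihood Gutenberg-Richter fit above completeness Mc.

    Uses Aki's (1965) estimator for continuous magnitudes; for magnitudes
    binned at width dM, replace m_c by (m_c - dM/2) (Utsu correction).
    """
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    b = math.log10(math.e) / (mean_m - m_c)
    # a-value chosen so that N(M >= m) = 10**(a - b*m) matches the count at Mc.
    a = math.log10(len(above)) + b * m_c
    return a, b

# Synthetic catalog drawn from a G-R law with b = 1.0 (hypothetical data):
# magnitudes above Mc are exponential with rate ln(10) * b.
random.seed(0)
m_c = 3.0
mags = [m_c + random.expovariate(math.log(10) * 1.0) for _ in range(5000)]
a, b = gutenberg_richter_fit(mags, m_c)
print(f"a = {a:.2f}, b = {b:.2f}")
```

The recovered b-value should be close to the generating value of 1.0, which is a quick sanity check on any such fitting routine.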

  6. Guide to Geologic Hazards in Alaska | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    The Guide to Geologic Hazards in Alaska is a glossary of hazard categories: coastal and river hazards (storm surge, tsunami), earthquake-related hazards (earthquake, subsidence, surface fault rupture, tsunami, uplift), and glacier hazards (avalanche, debris flow).

  7. Assessment of existing and potential landslide hazards resulting from the April 25, 2015 Gorkha, Nepal earthquake sequence

    USGS Publications Warehouse

    Collins, Brian D.; Jibson, Randall W.

    2015-07-28

    This report provides a detailed account of assessments performed in May and June 2015 and focuses on valley-blocking landslides because they have the potential to pose considerable hazard to many villages in Nepal. First, we provide a seismological background of Nepal and then detail the methods used for both external and in-country data collection and interpretation. Our results consist of an overview of landsliding extent, a characterization of all valley-blocking landslides identified during our work, and a description of video resources that provide high resolution coverage of approximately 1,000 kilometers (km) of river valleys and surrounding terrain affected by the Gorkha earthquake sequence. This is followed by a description of site-specific landslide-hazard assessments conducted while in Nepal and includes detailed descriptions of five noteworthy case studies. Finally, we assess the expectation for additional landslide hazards during the 2015 summer monsoon season.

  8. EFEHR - the European Facilities for Earthquake Hazard and Risk: beyond the web-platform

    NASA Astrophysics Data System (ADS)

    Danciu, Laurentiu; Wiemer, Stefan; Haslinger, Florian; Kastli, Philipp; Giardini, Domenico

    2017-04-01

    The European Facilities for Earthquake Hazard and Risk (EFEHR) represent the sustainable community resource for seismic hazard and risk in Europe. The EFEHR web platform is the main gateway to access data, models and tools, and to provide expertise relevant for the assessment of seismic hazard and risk. The main services (databases and web platform) are hosted at ETH Zurich and operated by the Swiss Seismological Service (Schweizerischer Erdbebendienst, SED). The EFEHR web portal (www.efehr.org) collects and displays (i) harmonized datasets necessary for hazard and risk modeling, e.g. seismic catalogues, fault compilations, site amplifications, vulnerabilities, inventories; (ii) extensive seismic hazard products, namely hazard curves, uniform hazard spectra and maps for national and regional assessments; (iii) standardized configuration files for re-computing the regional seismic hazard models; and (iv) relevant documentation of harmonized datasets, models and web services. Today, EFEHR distributes the full output of the 2013 European Seismic Hazard Model, ESHM13, as developed within the SHARE project (http://www.share-eu.org/); the latest results of the 2014 Earthquake Model of the Middle East (EMME14), derived within the EMME Project (www.emme-gem.org); the 2001 Global Seismic Hazard Assessment Project (GSHAP) results; and the 2015 updates of the Swiss Seismic Hazard. New datasets related to either seismic hazard or risk will be incorporated as they become available. We present the current status of the EFEHR platform, with focus on the challenges, summaries of the up-to-date datasets, user experience and feedback, as well as the roadmap to future technological innovation beyond the web platform development. We also show the new services foreseen to fully integrate with the seismological core services of the European Plate Observing System (EPOS).

  9. Trimming the UCERF2 hazard logic tree

    USGS Publications Warehouse

    Porter, Keith A.; Field, Edward H.; Milner, Kevin

    2012-01-01

    The Uniform California Earthquake Rupture Forecast 2 (UCERF2) is a fully time‐dependent earthquake rupture forecast developed with sponsorship of the California Earthquake Authority (Working Group on California Earthquake Probabilities [WGCEP], 2007; Field et al., 2009). UCERF2 contains 480 logic‐tree branches reflecting choices among nine modeling uncertainties in the earthquake rate model shown in Figure 1. For seismic hazard analysis, it is also necessary to choose a ground‐motion‐prediction equation (GMPE) and set its parameters. Choosing among four next‐generation attenuation (NGA) relationships results in a total of 1920 hazard calculations per site. The present work is motivated by a desire to reduce the computational effort involved in a hazard analysis without understating uncertainty. We set out to assess which branching points of the UCERF2 logic tree contribute most to overall uncertainty, and which might be safely ignored (set to only one branch) without significantly biasing results or affecting some useful measure of uncertainty. The trimmed logic tree will have all of the original choices from the branching points that contribute significantly to uncertainty, but only one arbitrarily selected choice from the branching points that do not.
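
The trimming criterion described above can be sketched as follows: enumerate the branch combinations, then measure how much of the weighted variance in hazard disappears when one branching point is collapsed to its weighted mean. The tree below is a hypothetical miniature, not UCERF2's actual 480-branch tree.

```python
from itertools import product

# Hypothetical logic tree: each branching point offers weighted
# alternatives, modeled here as multiplicative factors on a baseline hazard.
branch_points = {
    "magnitude_model": [(0.6, 1.00), (0.4, 1.30)],   # (weight, factor)
    "gmpe":            [(0.5, 0.90), (0.5, 1.10)],
    "recurrence":      [(0.8, 1.00), (0.2, 1.02)],   # nearly inconsequential
}

def tree_mean_var(points):
    """Weighted mean and variance of hazard over all branch combinations."""
    combos = []
    for combo in product(*points.values()):
        w = 1.0
        h = 1.0   # baseline hazard of 1.0, scaled by each chosen factor
        for weight, factor in combo:
            w *= weight
            h *= factor
        combos.append((w, h))
    mean = sum(w * h for w, h in combos)
    var = sum(w * (h - mean) ** 2 for w, h in combos)
    return mean, var

_, full_var = tree_mean_var(branch_points)
for name, branches in branch_points.items():
    # Collapse one branching point to its weighted-mean factor and see
    # how much of the total variance disappears.
    collapsed = dict(branch_points)
    mean_factor = sum(w * f for w, f in branches)
    collapsed[name] = [(1.0, mean_factor)]
    _, v = tree_mean_var(collapsed)
    print(f"{name}: contributes {100 * (full_var - v) / full_var:.1f}% of variance")
```

In this toy tree the "recurrence" branching point contributes a negligible share of the variance, so it is the candidate for trimming to a single branch.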

  10. Seismic Hazard and risk assessment for Romania -Bulgaria cross-border region

    NASA Astrophysics Data System (ADS)

    Simeonova, Stela; Solakov, Dimcho; Alexandrova, Irena; Vaseva, Elena; Trifonova, Petya; Raykova, Plamena

    2016-04-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic hazard and vulnerability to earthquakes are steadily increasing as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. The assessment of seismic hazard and risk is particularly important because it provides valuable information for seismic safety and disaster mitigation, and it supports decision making for the benefit of society. Romania and Bulgaria, situated in the Balkan Region as part of the Alpine-Himalayan seismic belt, are characterized by high seismicity and are exposed to a high seismic risk. Over the centuries, both countries have experienced strong earthquakes. The cross-border region encompassing northern Bulgaria and southern Romania is a territory prone to the effects of strong earthquakes. The area is significantly affected by earthquakes that occur in both countries: on the one hand, events generated by the Vrancea intermediate-depth seismic source in Romania, and on the other hand, crustal seismicity originating in the Shabla (SHB), Dulovo, and Gorna Orjahovitza (GO) seismic sources in Bulgaria. The Vrancea seismogenic zone of Romania is a very peculiar seismic source, often described as unique in the world, and it represents a major concern for most of the northern part of Bulgaria as well. In the present study, the seismic hazard for the Romania-Bulgaria cross-border region is assessed on the basis of integrated basic geo-datasets. The hazard results are obtained by applying two alternative approaches - probabilistic and deterministic. The MSK64 intensity (the MSK64 scale is practically equal to the new EMS98) is used as the output parameter for the hazard maps. 
We prefer to use the macroseismic intensity here instead of PGA because it is directly related to the degree of damage and, moreover, the epicentral intensity is the original parameter in the historical earthquake catalogues. A particular advantage of using intensities is that the very irregular pattern of the attenuation field of the Vrancea intermediate-depth earthquakes can be estimated from the detailed macroseismic observations that are available (in both countries) for the study region. Additionally, de-aggregation of the seismic hazard in intensity for a recurrence period of 475 years (probability of exceedance of 10% in 50 years) was performed for 9 cities (administrative centers) situated in northern Bulgaria. Finally, applying the SELENA software, earthquake risk for the Bulgarian part of the cross-border region is analyzed. The results presented for the Romania-Bulgaria cross-border region are part of the work carried out in the DACEA Project (2010-2013), implemented in the framework of the Romania-Bulgaria Cross Border Cooperation Programme (2007-2013).
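
The correspondence between the 475-year recurrence period and the 10%-in-50-years exceedance probability quoted above follows from the Poisson assumption; a one-line check (not taken from the study itself):

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by exceedance probability p in t years,
    assuming Poisson occurrence: p = 1 - exp(-t / T)."""
    return -t_years / math.log(1.0 - p_exceed)

print(f"{return_period(0.10, 50):.0f} years")
```

The result is approximately 475 years, matching the conventional design-level hazard.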

  11. Using Integrated Earth and Social Science Data for Disaster Risk Assessment

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.; Yetman, G.

    2016-12-01

    Society faces many different risks from both natural and technological hazards. In some cases, disaster risk managers focus on only a few risks, e.g., in regions where a single hazard such as earthquakes dominates. More often, however, disaster risk managers deal with multiple hazards that pose diverse threats to life, infrastructure, and livelihoods. From the viewpoint of scientists, hazards are often studied within traditional disciplines such as seismology, hydrology, climatology, and epidemiology. But from the viewpoint of disaster risk managers, data are needed on all hazards in a specific region and on the exposure and vulnerability of population, infrastructure, and economic resources and activity. Such managers also need to understand how hazards, exposures, and vulnerabilities may interact, and how human and environmental systems respond to hazard events, as in the case of the Fukushima nuclear disaster that followed the Sendai earthquake and tsunami. In this regard, geospatial tools that enable visualization and analysis of both Earth and social science data can support disaster risk managers who need to quickly assess where specific hazard events occur relative to population and critical infrastructure. Such information can help them assess the potential severity of actual or predicted hazard events, identify population centers or key infrastructure at risk, and visualize hazard dynamics, e.g., earthquakes and their aftershocks or the paths of severe storms. This can then inform efforts to mitigate risks across multiple hazards, including reducing exposure and vulnerability, strengthening system resiliency, improving disaster response mechanisms, and targeting mitigation resources to the highest or most critical risks. We report here on initial efforts to develop hazard mapping tools that draw on open web services and support simple spatial queries about population exposure. 
The NASA Socioeconomic Data and Applications Center (SEDAC) Hazards Mapper, a web-based mapping tool, enables users to estimate population living in areas subject to flood or tornado warnings, near recent earthquakes, or around critical infrastructure. The HazPop mobile app, implemented for iOS devices, utilizes location services to support disaster risk managers working in field conditions.

  12. Composite Earthquake Catalog of the Yellow Sea for Seismic Hazard Studies

    NASA Astrophysics Data System (ADS)

    Kang, S. Y.; Kim, K. H.; LI, Z.; Hao, T.

    2017-12-01

    The Yellow Sea (a.k.a. the West Sea in Korea) is an epicontinental and semi-closed sea located between Korea and China. Recent earthquakes in the Yellow Sea, including but not limited to the Seogyuckryulbi-do (1 April 2014, magnitude 5.1), Heuksan-do (21 April 2013, magnitude 4.9), and Baekryung-do (18 May 2013, magnitude 4.9) earthquakes, and the earthquake swarm in the Boryung offshore region in 2013, remind us of the seismic hazards affecting East Asia. This series of earthquakes in the Yellow Sea raised numerous questions. Unfortunately, both governments have trouble monitoring seismicity in the Yellow Sea because the earthquakes occur beyond their seismic networks. For example, the epicenters of the magnitude 5.1 earthquake in the Seogyuckryulbi-do region in 2014 reported by the Korea Meteorological Administration and the China Earthquake Administration differed by approximately 20 km. This illustrates the difficulty of monitoring and locating earthquakes in the region, despite the huge effort made by both governments. A joint effort is required not only to overcome the limits posed by political boundaries and geographical location but also to study the seismicity and the underground structures responsible for it. Although the well-established and developing seismic networks in Korea and China have provided an unprecedented amount and quality of seismic data, the high-quality catalog is limited to the last few decades, far shorter than a major earthquake cycle. Each country's catalog is also biased toward its own territory and cannot provide a complete picture of seismicity in the Yellow Sea. In order to understand seismic hazard and tectonics in the Yellow Sea, a composite earthquake catalog has been developed. We gathered earthquake information covering the last 5,000 years from various sources. There are good reasons to believe that some listings refer to the same earthquake but with different source parameters. 
We established criteria to provide consistent information in the Yellow Sea composite earthquake catalog (YComCat). Since the earthquake catalog plays a critical role in seismic hazard assessment, YComCat provides improved input to reduce uncertainties in seismic hazard estimation.
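
Identifying listings that refer to the same earthquake is typically done by matching origin times and epicenters within tolerance windows. A minimal sketch; the window sizes and the coordinates below are hypothetical, not YComCat's actual criteria.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_duplicate(ev1, ev2, dt_max_s=30.0, dist_max_km=30.0):
    """Flag two catalog entries as the same earthquake if their origin
    times and epicenters agree within the tolerance windows."""
    dt = abs(ev1["time_s"] - ev2["time_s"])
    dist = haversine_km(ev1["lat"], ev1["lon"], ev2["lat"], ev2["lon"])
    return dt <= dt_max_s and dist <= dist_max_km

# Two agency reports of one event with epicenters ~20 km apart
# (coordinates and times are illustrative placeholders):
kma = {"time_s": 0.0, "lat": 37.0, "lon": 124.5}
cea = {"time_s": 4.0, "lat": 37.1, "lon": 124.7}
print(is_duplicate(kma, cea))
```

A real merging pipeline would also reconcile magnitudes and keep provenance for each retained solution, but the windowing logic above is the core of duplicate detection.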

  13. Earthquake Hazard and Risk in Sub-Saharan Africa: current status of the Global Earthquake model (GEM) initiative in the region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay; Midzi, Vunganai; Ateba, Bekoa; Mulabisana, Thifhelimbilu; Marimira, Kwangwari; Hlatywayo, Dumisani J.; Akpan, Ofonime; Amponsah, Paulina; Georges, Tuluka M.; Durrheim, Ray

    2013-04-01

    Large-magnitude earthquakes have been observed in Sub-Saharan Africa in the recent past, such as the Machaze event of 2006 (Mw 7.0) in Mozambique and the 2009 Karonga earthquake (Mw 6.2) in Malawi. The December 13, 1910 earthquake (Ms = 7.3) in the Rukwa rift (Tanzania) is the largest of all instrumentally recorded events known to have occurred in East Africa. The overall earthquake hazard in the region is on the lower side compared to other earthquake-prone areas of the globe. However, the risk level is high enough to deserve the attention of African governments and the donor community. The latest earthquake hazard map for Sub-Saharan Africa was produced in 1999, and an update is long overdue as construction activity is booming all over Sub-Saharan Africa. To this effect, regional seismologists are working together under the GEM (Global Earthquake Model) framework to improve incomplete, inhomogeneous and uncertain catalogues. The working group is also contributing to the UNESCO-IGCP (SIDA) 601 project and assessing all possible sources of data for the catalogue as well as for the seismotectonic characteristics that will help develop a reasonable hazard model for the region. In the current progress, it is noted that the region is more seismically active than previously thought. This demands a coordinated effort by regional experts to systematically compile all available information so as to mitigate earthquake risk in Sub-Saharan Africa.

  14. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence from geological studies and historical catalogs indicates that on some seismic regions and faults, multiple large earthquakes occur in clusters. The clusters are followed by periods of quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to describe series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard representing the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
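
A hazard function with the three-part shape described above can be sketched as follows; the functional forms and parameter values here are hypothetical stand-ins, not those of the paper.

```python
import math

def h_decreasing(t, rate0=0.05, decay=0.1):
    """Decaying hazard following the last large-earthquake cluster
    (exponential decay chosen purely for illustration)."""
    return rate0 * math.exp(-decay * t)

def h_increasing(t, mu=200.0, k=0.02):
    """Hazard growing as the next large-earthquake cluster approaches
    (a logistic ramp centered at t = mu, again illustrative)."""
    return k / (1.0 + math.exp(-(t - mu) / 25.0))

H_BACKGROUND = 0.01  # constant rate of small-to-moderate earthquakes

def total_hazard(t):
    """Sum of the decreasing, increasing, and constant components."""
    return h_decreasing(t) + h_increasing(t) + H_BACKGROUND

for t in (0, 50, 100, 200, 300):
    print(f"t = {t:3d} yr: h(t) = {total_hazard(t):.4f} per yr")
```

The combined curve is high just after a cluster, dips to near the background rate, and rises again as the next cluster becomes due, which is the qualitative behavior the bimodal hybrid model is built to capture.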

  15. Seismic hazard along a crude oil pipeline in the event of an 1811-1812 type New Madrid earthquake. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, H.H.M.; Chen, C.H.S.

    1990-04-16

    An assessment of the seismic hazard that exists along the major crude oil pipeline running through the New Madrid seismic zone from southeastern Louisiana to Patoka, Illinois is examined in the report. An 1811-1812 type New Madrid earthquake with moment magnitude 8.2 is assumed to occur at three locations where large historical earthquakes have occurred. Six pipeline crossings of the major rivers in West Tennessee are chosen as the sites for hazard evaluation because of the liquefaction potential at these sites. A seismologically-based model is used to predict the bedrock accelerations. Uncertainties in three model parameters, i.e., stress parameter, cutoff frequency, and strong-motion duration, are included in the analysis. Each parameter is represented by three typical values. From the combination of these typical values, a total of 27 earthquake time histories can be generated for each selected site for an 1811-1812 type New Madrid earthquake occurring at a postulated seismic source.
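
The 27 time histories per site arise simply from combining three typical values of each of the three uncertain parameters; the specific values below are placeholders, not those of the report.

```python
from itertools import product

# Three representative values per uncertain parameter (values hypothetical).
stress_drop_bars = (50, 100, 200)       # stress parameter
cutoff_freq_hz   = (10.0, 15.0, 20.0)   # cutoff frequency
duration_s       = (20.0, 30.0, 40.0)   # strong-motion duration

# Each combination would drive one simulated bedrock time history.
scenarios = list(product(stress_drop_bars, cutoff_freq_hz, duration_s))
print(f"{len(scenarios)} parameter combinations per site")  # 3 * 3 * 3 = 27
```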

  16. Reevaluation of the Seismicity and seismic hazards of Northeastern Libya

    NASA Astrophysics Data System (ADS)

    Ben Suleman, abdunnur; Aousetta, Fawzi

    2014-05-01

    Libya, located at the northern margin of the African continent, underwent many episodes of orogenic activity that affected and shaped the geological setting of the country. This study is a detailed investigation of the seismicity of northeastern Libya and its implications for earthquake hazards. At the end of 2005, the Libyan National Seismological Network started operating with 15 stations. The seismicity of the area under investigation was reevaluated using data recorded by the recently established network. The Al-Maraj earthquake that occurred on May 22nd, 2005 was analyzed. This earthquake was located in a known seismically active area, the site of the well-known 1963 earthquake that killed over 200 people. Earthquakes were plotted and the resulting maps were interpreted and discussed. The level of seismic activity is higher in some areas, such as the city of Al-Maraj, and the offshore areas north of Al-Maraj appear to have higher seismic activity. It is highly recommended that the recent earthquake activity be considered in seismic hazard assessments for the northeastern part of Libya.

  17. Toward Building a New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China is the one generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to the present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain the corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to the locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest on the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPEs) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonic domains. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplification, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and for business and land-use planning.
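
The tapered Gutenberg-Richter (TGR) distribution used above for zone seismicity rates can be sketched in a few lines; the b-value, threshold, and corner magnitude below are hypothetical, and the moment-magnitude conversion follows Hanks and Kanamori.

```python
import math

def moment_from_mw(mw):
    """Seismic moment (N*m) from moment magnitude (Hanks & Kanamori)."""
    return 10 ** (1.5 * mw + 9.05)

def tgr_survival(mw, mw_min, b, mw_corner):
    """Tapered Gutenberg-Richter survival function: fraction of events
    with magnitude > mw, relative to the threshold mw_min. The G-R
    power law in moment (exponent beta = 2b/3) is tapered by an
    exponential roll-off at the corner moment."""
    beta = 2.0 * b / 3.0
    m, m_t, m_c = (moment_from_mw(x) for x in (mw, mw_min, mw_corner))
    return (m_t / m) ** beta * math.exp((m_t - m) / m_c)

# Hypothetical zone: b = 1.0, threshold Mw 5.0, corner magnitude Mw 8.0.
for mw in (6.0, 7.0, 8.0, 8.5):
    print(f"Mw > {mw}: {tgr_survival(mw, 5.0, 1.0, 8.0):.2e}")
```

Well below the corner magnitude the taper is negligible and the curve follows the plain Gutenberg-Richter power law; above it, rates fall off sharply, which is how the moment-rate constraint caps the largest simulated events.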

  18. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Trugman, Daniel Taylor

    The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the relations between the dynamical source properties of small and large earthquakes obey self-similar scaling. While these simplified models provide useful approximations and encapsulate the first-order statistical features of the historical seismic record, they are inconsistent with the complexity underlying earthquake occurrence and can lead to misleading assessments of seismic hazard when applied in practice. The six principal chapters of this thesis explore the extent to which the behavior of real earthquakes deviates from these simplified models, and the implications that the observed deviations have for our understanding of earthquake rupture processes and seismic hazard. Chapter 1 provides a brief thematic overview and introduction to the scope of this thesis. Chapter 2 examines the complexity of the 2010 M7.2 El Mayor-Cucapah earthquake, focusing on the relation between its unexpected and unprecedented occurrence and anthropogenic stresses from the nearby Cerro Prieto Geothermal Field. Chapter 3 compares long-term changes in seismicity within California's three largest geothermal fields in an effort to characterize the relative influence of natural and anthropogenic stress transients on local seismic hazard. Chapter 4 describes a hybrid, hierarchical clustering algorithm that can be used to relocate earthquakes using waveform cross-correlation, and applies the new algorithm to study the spatiotemporal evolution of two recent seismic swarms in western Nevada. 
Chapter 5 describes a new spectral decomposition technique that can be used to analyze the dynamic source properties of large datasets of earthquakes, and applies this approach to revisit the question of self-similar scaling of southern California seismicity. Chapter 6 builds upon these results and applies the same spectral decomposition technique to examine the source properties of several thousand recent earthquakes in southern Kansas that are likely human-induced by massive oil and gas operations in the region. Chapter 7 studies the connection between source spectral properties and earthquake hazard, focusing on spatial variations in dynamic stress drop and its influence on ground motion amplitudes. Finally, Chapter 8 provides a summary of the key findings of and relations between these studies, and outlines potential avenues of future research.

  19. Integrating population dynamics into mapping human exposure to seismic hazard

    NASA Astrophysics Data System (ADS)

    Freire, S.; Aubrecht, C.

    2012-11-01

    Disaster risk is not fully characterized without taking into account vulnerability and population exposure. Assessment of earthquake risk in urban areas would benefit from considering the variation of population distribution at more detailed spatial and temporal scales, and from a more explicit integration of this improved demographic data with existing seismic hazard maps. In the present work, "intelligent" dasymetric mapping is used to model population dynamics at high spatial resolution in order to benefit the analysis of spatio-temporal exposure to earthquake hazard in a metropolitan area. These night- and daytime-specific population densities are then classified and combined with seismic intensity levels to derive new spatially-explicit four-class-composite maps of human exposure. The presented approach enables a more thorough assessment of population exposure to earthquake hazard. Results show that there are significantly more people potentially at risk in the daytime period, demonstrating the shifting nature of population exposure in the daily cycle and the need to move beyond conventional residence-based demographic data sources to improve risk analyses. The proposed fine-scale maps of human exposure to seismic intensity are mainly aimed at benefiting visualization and communication of earthquake risk, but can be valuable in all phases of the disaster management process where knowledge of population densities is relevant for decision-making.
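
Combining classified population densities with seismic intensity levels into a four-class composite exposure map can be sketched as a simple lookup; the class thresholds and scoring below are illustrative, not the paper's actual scheme.

```python
# Ordinal scores for the input classes (hypothetical classification).
DENSITY_CLASSES = {"low": 1, "medium": 2, "high": 3}
INTENSITY_CLASSES = {"VII": 1, "VIII": 2, "IX": 3}

def exposure_class(density, intensity):
    """Combine a population-density class and a seismic-intensity class
    into one of four composite exposure classes."""
    score = DENSITY_CLASSES[density] * INTENSITY_CLASSES[intensity]
    if score >= 6:
        return "very high"
    if score >= 4:
        return "high"
    if score >= 2:
        return "moderate"
    return "low"

# Daytime vs. nighttime exposure for the same grid cell: the hazard is
# fixed, but the population class shifts with the daily cycle.
print(exposure_class("high", "VIII"))   # daytime commuter influx
print(exposure_class("low", "VIII"))    # nighttime residential
```

The same cell changes exposure class between day and night, which is the shifting-exposure effect the dasymetric day/night population layers are designed to reveal.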

  20. Global earthquake casualties due to secondary effects: A quantitative analysis for improving PAGER losses

    USGS Publications Warehouse

    Wald, David J.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey’s (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER’s overall assessment of earthquakes losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra–Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference of the Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability.

  1. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    USGS Publications Warehouse

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  2. Assessing the earthquake hazards in urban areas

    USGS Publications Warehouse

    Hays, W.W.; Gori, P.L.; Kockelman, W.J.

    1988-01-01

    Major urban areas in widely scattered geographic locations across the United States are at varying degrees of risk from earthquakes. The locations of these urban areas include Charleston, South Carolina; Memphis, Tennessee; St. Louis, Missouri; Salt Lake City, Utah; Seattle-Tacoma, Washington; Portland, Oregon; and Anchorage, Alaska; even Boston, Massachusetts, and Buffalo, New York, have a history of large earthquakes. Cooperative research during the past decade has focused on assessing the nature and degree of the risk, or seismic hazard, in the broad geographic regions around each urban area. The strategy since the 1970s has been to bring together local, State, and Federal resources to solve the problem of assessing seismic risk. Successful cooperative programs have been launched in the San Francisco Bay and Los Angeles regions in California and the Wasatch Front region in Utah. 

  3. 75 FR 8042 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-23

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction.... FOR FURTHER INFORMATION CONTACT: Dr. Jack Hayes, National Earthquake Hazards Reduction Program...

  4. Implications from palaeoseismological investigations at the Markgrafneusiedl Fault (Vienna Basin, Austria) for seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Hintersberger, Esther; Decker, Kurt; Lomax, Johanna; Lüthgens, Christopher

    2018-02-01

    Intraplate regions characterized by low rates of seismicity are challenging for seismic hazard assessment, mainly for two reasons. Firstly, evaluation of historic earthquake catalogues may not reveal all active faults that contribute to regional seismic hazard. Secondly, slip rate determination is limited by sparse geomorphic preservation of slowly moving faults. In the Vienna Basin (Austria), moderate historical seismicity (Imax,obs/Mmax,obs = 8/5.2) concentrates along the left-lateral strike-slip Vienna Basin Transfer Fault (VBTF). In contrast, several normal faults branching out from the VBTF show neither historical nor instrumental earthquake records, although geomorphological data indicate Quaternary displacement along those faults. Here we present a palaeoseismological dataset of three trenches that cross one of these splay faults, the Markgrafneusiedl Fault (MF), located about 15 km outside Vienna, the Austrian capital, in order to evaluate its seismic potential. Comparing the observations of the different trenches, we found evidence for five to six surface-breaking earthquakes during the last 120 kyr, with the youngest event occurring at around 14 ka. The derived surface displacements lead to magnitude estimates ranging between 6.2 ± 0.5 and 6.8 ± 0.4. Data can be interpreted by two possible slip models, with slip model 1 showing more regular recurrence intervals of about 20-25 kyr between the earthquakes with M ≥ 6.5 and slip model 2 indicating that such earthquakes cluster in two time intervals in the last 120 kyr. Direct correlation between trenches favours slip model 2 as the more plausible option. Trench observations also show that structural and sedimentological records of strong earthquakes with small surface offset have only low preservation potential. Therefore, the earthquake frequency for magnitudes between 6 and 6.5 cannot be constrained by the trenching records. 
Vertical slip rates of 0.02-0.05 mm a⁻¹ derived from the trenches compare well to geomorphically derived slip rates of 0.02-0.09 mm a⁻¹. Magnitude estimates from fault dimensions suggest that the largest earthquakes observed in the trenches activated the entire fault surface of the MF including the basal detachment that links the normal fault with the VBTF. The most important implications of these palaeoseismological results for seismic hazard assessment are as follows. (1) The MF is an active seismic source, capable of rupturing the surface despite the lack of historical earthquakes. (2) The MF is kinematically and geologically equivalent to a number of other splay faults of the VBTF. It is reasonable to assume that these faults are potential sources of large earthquakes as well. The frequency of strong earthquakes near Vienna is therefore expected to be significantly higher than the earthquake frequency reconstructed for the MF alone. (3) Although such events are rare, the potential for earthquake magnitudes equal to or greater than M = 7.0 in the Vienna Basin should be considered in seismic hazard studies.
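
    The consistency between the trench-derived slip rates and the recurrence intervals of slip model 1 follows from simple arithmetic: mean recurrence ≈ characteristic per-event displacement divided by long-term slip rate. A sketch, assuming an illustrative per-event vertical offset of ~1 m (an assumed value for illustration, not one reported in the abstract):

```python
def recurrence_interval_yr(displacement_per_event_m, slip_rate_mm_per_yr):
    """Mean recurrence interval (years) implied by a characteristic
    per-event displacement and a long-term slip rate."""
    return displacement_per_event_m * 1000.0 / slip_rate_mm_per_yr

# The assumed ~1 m offset with the trench-derived 0.02-0.05 mm/yr rates
# yields intervals of the same order as the 20-25 kyr of slip model 1
for rate in (0.02, 0.05):
    print(rate, "mm/yr ->", recurrence_interval_yr(1.0, rate), "yr")
```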

  5. Space geodetic tools provide early warnings for earthquakes and volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Aoki, Yosuke

    2017-04-01

    The development of space geodetic techniques such as the Global Navigation Satellite System and Synthetic Aperture Radar in the last few decades allows us to monitor deformation of the Earth's surface at unprecedented spatial and temporal resolution. These observations, combined with fast data transmission and quick data processing, enable us to quickly detect and locate earthquakes and volcanic eruptions and assess potential hazards such as strong earthquake shaking, tsunamis, and volcanic eruptions. These techniques thus are key parts of early warning systems, help identify some hazards before a cataclysmic event, and improve the response to the consequent damage.

  6. 77 FR 19224 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... should be sent to National Earthquake Hazards Reduction Program Director, National Institute of Standards...

  7. 77 FR 27439 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... should be sent to National Earthquake Hazards Reduction Program Director, National Institute of Standards...

  8. 75 FR 75457 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-03

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... meeting should be sent to National Earthquake Hazards Reduction Program Director, National Institute of...

  9. 76 FR 64325 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... relationship of Presidential Policy Directive/PPD-8: National Preparedness to National Earthquake Hazards...

  10. 76 FR 72905 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction.... ADDRESSES: Questions regarding the meeting should be sent to National Earthquake Hazards Reduction Program...

  11. 76 FR 8712 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... Committee's 2011 Annual Report of the Effectiveness of the National Earthquake Hazards Reduction Program...

  12. 77 FR 18792 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... should be sent to National Earthquake Hazards Reduction Program Director, National Institute of Standards...

  13. 75 FR 18787 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... sent to National Earthquake Hazards Reduction Program Director, National Institute of Standards and...

  14. Space-Time Earthquake Rate Models for One-Year Hazard Forecasts in Oklahoma

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Michael, A. J.

    2017-12-01

    The recent one-year seismic hazard assessments for natural and induced seismicity in the central and eastern US (CEUS) (Petersen et al., 2016, 2017) rely on earthquake rate models based on declustered catalogs (i.e., catalogs with foreshocks and aftershocks removed), as is common practice in probabilistic seismic hazard analysis. However, standard declustering can remove over 90% of some induced sequences in the CEUS. Some of these earthquakes may still be capable of causing damage or concern (Petersen et al., 2015, 2016). The choices of whether and how to decluster can lead to seismicity rate estimates that vary by up to factors of 10-20 (Llenos and Michael, AGU, 2016). Therefore, in order to improve the accuracy of hazard assessments, we are exploring ways to make forecasts based on full, rather than declustered, catalogs. We focus on Oklahoma, where earthquake rates began increasing in late 2009 mainly in central Oklahoma and ramped up substantially in 2013 with the expansion of seismicity into northern Oklahoma and southern Kansas. We develop earthquake rate models using the space-time Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988; Ogata, AISM, 1998; Zhuang et al., JASA, 2002), which characterizes both the background seismicity rate as well as aftershock triggering. We examine changes in the model parameters over time, focusing particularly on background rate, which reflects earthquakes that are triggered by external driving forces such as fluid injection rather than other earthquakes. After the model parameters are fit to the seismicity data from a given year, forecasts of the full catalog for the following year can then be made using a suite of 100,000 ETAS model simulations based on those parameters. To evaluate this approach, we develop pseudo-prospective yearly forecasts for Oklahoma from 2013-2016 and compare them with the observations using standard Collaboratory for the Study of Earthquake Predictability tests for consistency.
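
    The ETAS model used in the record above expresses the seismicity rate as a background term plus Omori-type triggering contributions from every past event. A minimal temporal sketch (the cited study fits the full space-time form; all parameter values below are illustrative, not fitted):

```python
import math

def etas_rate(t, catalog, mu, K, alpha, c, p, m_c):
    """Conditional intensity lambda(t) of a temporal ETAS model:
    background rate mu plus a modified-Omori triggering term, scaled by
    the magnitude of each past event, summed over the catalog."""
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_c)) / (t - t_i + c) ** p
    return rate

# Illustrative catalog: (time in days, magnitude); m_c is the completeness magnitude
catalog = [(0.0, 5.0), (2.0, 4.2)]
print(etas_rate(10.0, catalog, mu=0.5, K=0.8, alpha=1.0, c=0.01, p=1.1, m_c=3.0))
```

Forecasts like those described can then be built by simulating many synthetic catalogs from the fitted intensity and counting events per magnitude bin.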

  15. Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.

    2010-09-24

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that existing non-seismic NPH assessments are consistent with current design methodology and site specific data.

  16. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-01-01

    Egypt is located in the northeastern corner of Africa within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of African, Eurasian, and Arabian plates. The study area is characterized by northward increasing sediment thickness leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions. So, it is important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. The probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternate seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were amended within a logic tree formulation to compute and develop the regional exposure on a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity grading from the west to the eastern part of the country. The uniform hazard spectra are estimated at some important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used in seismic microzonation, risk mitigation, and earthquake engineering purposes.
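
    The return periods quoted above map onto exceedance probabilities through the Poisson assumption standard in probabilistic seismic hazard assessment: P = 1 − exp(−t/T) for an exposure time t and mean return period T. A quick check (the 475-year level is the conventional "10% in 50 years" hazard level):

```python
import math

def exceedance_probability(t_years, return_period_years):
    """Poisson probability of at least one exceedance of a hazard level
    with the given mean return period during an exposure of t_years."""
    return 1.0 - math.exp(-t_years / return_period_years)

print(round(exceedance_probability(50, 475), 3))   # ~0.1, i.e. 10% in 50 years
print(round(exceedance_probability(50, 100), 3))   # the 100-year level is far likelier
```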

  17. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-05-01

    Egypt is located in the northeastern corner of Africa within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of African, Eurasian, and Arabian plates. The study area is characterized by northward increasing sediment thickness leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions. So, it is important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. The probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternate seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were amended within a logic tree formulation to compute and develop the regional exposure on a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity grading from the west to the eastern part of the country. The uniform hazard spectra are estimated at some important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used in seismic microzonation, risk mitigation, and earthquake engineering purposes.

  18. 78 FR 8109 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... meeting on the National Earthquake Hazards Reduction Program (NEHRP) web site at http://nehrp.gov...

  19. 77 FR 75610 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... meeting on the National Earthquake Hazards Reduction Program (NEHRP) Web site at http://nehrp.gov...

  20. Probabilistic tsunami hazard assessment for the Makran region with focus on maximum magnitude assumption

    NASA Astrophysics Data System (ADS)

    Hoechner, Andreas; Babeyko, Andrey Y.; Zamora, Natalia

    2016-06-01

    Despite having been rather seismically quiescent in recent decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunamis. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on variation of the Gutenberg-Richter parameters and especially on the maximum magnitude assumption.
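
    Synthetic catalogs like those described can be drawn from a magnitude-frequency relation by inverse-transform sampling. A minimal sketch for a doubly truncated Gutenberg-Richter distribution (the b-value and magnitude bounds below are illustrative assumptions, not the study's values):

```python
import math
import random

def sample_gr_magnitude(rng, b=1.0, m_min=5.0, m_max=9.0):
    """Draw one magnitude from a doubly truncated Gutenberg-Richter
    distribution, F(m) = (1 - 10^(-b(m - m_min))) / (1 - 10^(-b(m_max - m_min))),
    by inverting the CDF at a uniform random deviate."""
    u = rng.random()
    span = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return m_min - math.log10(1.0 - u * span) / b

rng = random.Random(42)
catalog = [sample_gr_magnitude(rng) for _ in range(10000)]
print(min(catalog), max(catalog))  # all samples fall within [m_min, m_max]
```

Raising `m_max` fattens the tail of large events, which is why the hazard curves in the abstract are sensitive to the maximum magnitude assumption.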

  1. Probabilistic tsunami hazard assessment for the Makran region with focus on maximum magnitude assumption

    NASA Astrophysics Data System (ADS)

    Hoechner, A.; Babeyko, A. Y.; Zamora, N.

    2015-09-01

    Despite having been rather seismically quiescent in recent decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunamis. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on variation of the Gutenberg-Richter parameters and especially on the maximum magnitude assumption.

  2. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    NASA Astrophysics Data System (ADS)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes - such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may allow cost-effective policies to be formulated in some specific applications, provided that statistically sound recurrence estimates are used, which is typically not the case for PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure to associate the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full-waveform modeling. From the synthetic seismograms the estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and it does not supply the frequency of occurrence of the expected ground shaking. A recent enhanced variant of NDSHA that reliably accounts for recurrence has been developed and applied to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g. multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods. This permits a straightforward comparison between NDSHA and PSHA maps.

  3. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes

    NASA Astrophysics Data System (ADS)

    Williams, Jack G.; Rosser, Nick J.; Kincey, Mark E.; Benjamin, Jessica; Oven, Katie J.; Densmore, Alexander L.; Milledge, David G.; Robinson, Tom R.; Jordan, Colm A.; Dijkstra, Tom A.

    2018-01-01

    Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days) humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM) can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  4. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  5. Oregon Hazard Explorer for Lifelines Program (OHELP): A web-based geographic information system tool for assessing potential Cascadia earthquake hazard

    NASA Astrophysics Data System (ADS)

    Sharifi Mood, M.; Olsen, M. J.; Gillins, D. T.; Javadnejad, F.

    2016-12-01

    The Cascadia Subduction Zone (CSZ) is capable of generating earthquakes as powerful as moment magnitude 9, which would cause a great amount of damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4, and M8.1 scenarios, which involve persistent, long-lasting shaking together with other geological threats such as landslides, liquefaction-induced ground deformation, fault-rupture vertical displacement, and tsunamis. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines, and other lifelines. Lifeline providers in Oregon, including the private and public organizations responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built prior to a full understanding of this extreme seismic risk. As recently experienced in Chile and Japan, the three- to five-minute-long earthquake expected in Oregon necessitates a wholly different method of risk mitigation for these major lifelines than those created for shorter shaking from crustal earthquakes. A web-based geographic information system (GIS) tool has been developed to fully assess the potential hazard from the multiple threats posed by Cascadia subduction zone earthquakes in the region. The purpose of this website is to provide easy access to the latest and best available hazard information over the web, including work completed in the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). The tool is designed for engineers, planners, geologists, and others who need this information to make appropriate decisions, and it requires only minimal knowledge of GIS.

  6. Comparative risk assessments for the city of Pointe-à-Pitre (French West Indies): earthquakes and storm surge

    NASA Astrophysics Data System (ADS)

    Reveillere, A. R.; Bertil, D. B.; Douglas, J. D.; Grisanti, L. G.; Lecacheux, S. L.; Monfort, D. M.; Modaressi, H. M.; Müller, H. M.; Rohmer, J. R.; Sedan, O. S.

    2012-04-01

    In France, risk assessments for natural hazards are usually carried out separately and decision makers lack comprehensive information. Moreover, since the cause of the hazard (e.g. meteorological, geological) and the physical phenomenon that causes damage (e.g. inundation, ground shaking) may be fundamentally different, the quantitative comparison of single risk assessments that were not conducted in a compatible framework is not straightforward. Comprehensive comparative risk assessments exist in a few other countries. For instance, the Risk Map Germany project has developed and applied a methodology for quantitatively comparing the risk of relevant natural hazards at various scales (city, state) in Germany. The present on-going work applies a similar methodology to the Pointe-à-Pitre urban area, which represents more than half of the population of Guadeloupe, an overseas region in the French West Indies. Relevant hazards as well as hazard intensity levels differ from continental Europe, which will lead to different conclusions. French West Indies are prone to a large number of hazards, among which hurricanes, volcanic eruptions and earthquakes dominate. Hurricanes cause damage through three phenomena: wind, heavy rainfall and storm surge, the latter having had a preeminent role during the largest historical event in 1928. Seismic risk is characterized by many induced phenomena, among which earthquake shocks dominate. This study proposes a comparison of earthquake and cyclonic storm surge risks. Losses corresponding to hazard intensities having the same probability of occurrence are calculated. They are quantified in a common loss unit, chosen to be the direct economic losses. Intangible or indirect losses are not considered. The methodology therefore relies on (i) a probabilistic hazard assessment, (ii) a loss ratio estimation for the exposed elements and (iii) an economic estimation of these assets. 
Storm surge hazard assessment is based on the selection of relevant historical cyclones and on the simulation of the associated wave and cyclonic surge. The combined local sea elevations, called "set-up", are then fitted with a statistical distribution in order to obtain its return-time characteristics. Several run-ups are then extracted, the inundation areas are calculated and the relative losses of the affected assets are deduced. The Probabilistic Seismic Hazard Assessment and the exposed elements location and seismic vulnerability result from past public risk assessment studies. The loss estimations are computed for several return time periods, measured as the percentage of buildings in a given EMS-98 damage state per grid block, which are then converted into loss ratios. In parallel, an asset estimation is conducted. It is mainly focused on private housing, but it considers some major public infrastructures as well. The final outcome of this work is a direct economic loss-frequency plot for earthquake and storm surge. The Probable Maximum Loss and the Average Annual Loss are derived from this risk curve. In addition, different sources of uncertainty are identified through the loss estimation process. The full propagation of these uncertainties can provide an interval of confidence, which can be assigned to the risk curve, and we show how such additional information can be useful for risk comparison.

  7. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural resource exploitation and economic growth versus geomechanical pollution and the risk of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and can trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes have been documented worldwide with a seismic moment magnitude of 4.5

  8. Global Omori law decay of triggered earthquakes: large aftershocks outside the classical aftershock zone

    USGS Publications Warehouse

    Parsons, Tom

    2002-01-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 earthquakes in El Salvador. In this study, earthquakes with Ms ≥ 7.0 from the Harvard centroid moment tensor (CMT) catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near (defined as having shear stress change ∣Δτ∣ ≥ 0.01 MPa) the Ms ≥ 7.0 shocks are associated with calculated shear stress increases, while ∼39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, these triggered earthquakes obey an Omori law rate decay that lasts between ∼7–11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main shock centroid. Omori's law is one of the few time-predictable patterns evident in the global occurrence of earthquakes. If large triggered earthquakes habitually obey Omori's law, then their hazard can be more readily assessed. The characteristic rate change with time and spatial distribution can be used to rapidly assess the likelihood of triggered earthquakes following events of Ms ≥ 7.0. I show an example application to the M = 7.7 13 January 2001 El Salvador earthquake where use of global statistics appears to provide a better rapid hazard estimate than Coulomb stress change calculations.
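The Omori-law rate decay invoked in this record has the standard modified form n(t) = K/(t + c)^p. The sketch below evaluates it and, for the special case p = 1, the analytic expected count over the ~7-11 yr decay window; the parameter values are invented for illustration, not the paper's fitted values.

```python
import math

# Modified Omori law for the rate of triggered events after a main shock.
# K, c, p below are hypothetical, not values fitted to the CMT catalog.
K, c, p = 12.0, 0.05, 1.0

def omori_rate(t):
    """Triggered-event rate (events/yr) at time t (yr) after the main shock."""
    return K / (t + c) ** p

# For p = 1 the expected number of triggered events up to time T is analytic:
#   N(T) = K * ln((T + c) / c)
def expected_count(T):
    return K * math.log((T + c) / c)

print(omori_rate(0.0), expected_count(10.0))
```

With a decay law in hand, the likelihood of triggered events following a new Ms ≥ 7.0 shock can be updated as a function of elapsed time, which is the rapid-assessment use the record describes.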

  9. Global Omori law decay of triggered earthquakes: Large aftershocks outside the classical aftershock zone

    USGS Publications Warehouse

    Parsons, T.

    2002-01-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 earthquakes in El Salvador. In this study, earthquakes with Ms ≥ 7.0 from the Harvard centroid moment tensor (CMT) catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near (defined as having shear stress change |Δτ| ≥ 0.01 MPa) the Ms ≥ 7.0 shocks are associated with calculated shear stress increases, while ~39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, these triggered earthquakes obey an Omori law rate decay that lasts between ~7-11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main shock centroid. Omori's law is one of the few time-predictable patterns evident in the global occurrence of earthquakes. If large triggered earthquakes habitually obey Omori's law, then their hazard can be more readily assessed. The characteristic rate change with time and spatial distribution can be used to rapidly assess the likelihood of triggered earthquakes following events of Ms ≥ 7.0. I show an example application to the M = 7.7 13 January 2001 El Salvador earthquake where use of global statistics appears to provide a better rapid hazard estimate than Coulomb stress change calculations.

  10. Global Omori law decay of triggered earthquakes: Large aftershocks outside the classical aftershock zone

    NASA Astrophysics Data System (ADS)

    Parsons, Tom

    2002-09-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 earthquakes in El Salvador. In this study, earthquakes with Ms ≥ 7.0 from the Harvard centroid moment tensor (CMT) catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near (defined as having shear stress change ∣Δτ∣ ≥ 0.01 MPa) the Ms ≥ 7.0 shocks are associated with calculated shear stress increases, while ~39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, these triggered earthquakes obey an Omori law rate decay that lasts between ~7-11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main shock centroid. Omori's law is one of the few time-predictable patterns evident in the global occurrence of earthquakes. If large triggered earthquakes habitually obey Omori's law, then their hazard can be more readily assessed. The characteristic rate change with time and spatial distribution can be used to rapidly assess the likelihood of triggered earthquakes following events of Ms ≥ 7.0. I show an example application to the M = 7.7 13 January 2001 El Salvador earthquake where use of global statistics appears to provide a better rapid hazard estimate than Coulomb stress change calculations.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha

    Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that analyzes the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fires. MASTODON currently focuses on the simulation of seismic events and can perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed into a dynamic probabilistic risk assessment framework that enables analysts not only to perform deterministic analyses, but also to easily perform probabilistic or stochastic simulations for the purpose of risk assessment.

  12. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks, which are generally omitted from hazard assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates makes them difficult to forecast even on timescales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations, and changes in rate can be detected by applying change point analysis in ETAS-transformed time with methods already developed for Poisson processes.
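The ETAS conditional intensity underlying such rate models (Ogata, 1988) can be sketched directly from its definition. The parameter values and the three-event mini-catalog below are invented for illustration; they are not calibrated to any induced seismicity zone.

```python
import math

# ETAS conditional intensity:
#   lambda(t) = mu + sum_{t_i < t} K * exp(alpha * (M_i - Mc)) / (t - t_i + c)**p
# All parameter values and the mini-catalog are illustrative only.
mu, K, alpha, c, p, Mc = 0.2, 0.02, 1.0, 0.01, 1.1, 3.0

catalog = [(0.0, 4.5), (1.2, 3.4), (3.7, 5.1)]  # (time in days, magnitude)

def etas_intensity(t, catalog):
    """Conditional earthquake rate at time t, given the past events."""
    rate = mu  # background rate
    for ti, mi in catalog:
        if ti < t:
            rate += K * math.exp(alpha * (mi - Mc)) / (t - ti + c) ** p
    return rate

print(etas_intensity(4.0, catalog))  # elevated above the background mu
```

In the record's approach, mu would be fitted per zone while the triggering parameters (K, alpha, c, p) are shared across zones; change-point analysis then monitors departures of the observed rate from this model.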

  13. Using SAR and GPS for Hazard Management and Response: Progress and Examples from the Advanced Rapid Imaging and Analysis (ARIA) Project

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Simons, M.; Hua, H.; Yun, S. H.; Agram, P. S.; Milillo, P.; Sacco, G. F.; Webb, F.; Rosen, P. A.; Lundgren, P.; Milillo, G.; Manipon, G. J. M.; Moore, A. W.; Liu, Z.; Polet, J.; Cruz, J.

    2014-12-01

    ARIA is a joint JPL/Caltech project to automate synthetic aperture radar (SAR) and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. We have built a prototype SAR and GPS data system that forms the foundation for hazard monitoring and response capability, as well as providing imaging capabilities important for science studies. Together, InSAR and GPS have the ability to capture surface deformation in high spatial and temporal resolution. For earthquakes, this deformation provides information that is complementary to seismic data on location, geometry and magnitude of earthquakes. Accurate location information is critical for understanding the regions affected by damaging shaking. Regular surface deformation measurements from SAR and GPS are useful for monitoring changes related to many processes that are important for hazard and resource management such as volcanic deformation, groundwater withdrawal, and landsliding. Observations of SAR coherence change have a demonstrated use for damage assessment for hazards such as earthquakes, tsunamis, hurricanes, and volcanic eruptions. These damage assessment maps can be made from imagery taken day or night and are not affected by clouds, making them valuable complements to optical imagery. The coherence change caused by the damage from hazards (building collapse, flooding, ash fall) is also detectable with intelligent algorithms, allowing for rapid generation of damage assessment maps over large areas at fine resolution, down to the spatial scale of single family homes. We will present the progress and results we have made on automating the analysis of SAR data for hazard monitoring and response using data from the Italian Space Agency's (ASI) COSMO-SkyMed constellation of X-band SAR satellites. Since the beginning of our project with ASI, our team has imaged deformation and coherence change caused by many natural hazard events around the world. 
We will present progress on our data system technology that enables rapid and reliable production of imagery. Lastly, we participated in the March 2014 FEMA exercise based on a repeat of the 1964 M9.2 Alaska earthquake, providing simulated data products for use in this hazard response exercise. We will present lessons learned from this and other simulation exercises.

  14. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    NASA Astrophysics Data System (ADS)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most other Malaysian states, experiences frequent seismic activity: an earthquake of moderate magnitude occurs roughly every 20 years, originating mainly from two types of source, either local (e.g. Ranau and Lahad Datu) or regional (e.g. the Kalimantan and South Philippines subduction zones). The seismicity map of Sabah shows two zones of distinctive seismicity, one near Ranau (near Kota Kinabalu) and one near Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Centre), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area, so its seismicity is modeled as line sources along these faults. Two main fault systems are believed to be the source of the activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important and much-needed study for the extensive development projects in Sabah, especially given the earthquake activity. A probabilistic seismic hazard assessment is adopted for the present work since it provides the probabilities of various ground motion levels expected from future large earthquakes. The results are presented as spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first complete hazard study for the area, the output will serve as a baseline and standard for any future strategic plans in the area.

  15. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    NASA Astrophysics Data System (ADS)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. Such studies are appropriate for evaluation and for developing mitigation plans in landslide-prone areas. Several techniques are available for landslide hazard research at a regional scale; generally, they fall into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. Quantitative methods, on the other hand, are objective and are commonly used because of the correlation between instability factors and the location of landslides. Within this group, statistical approaches and newer heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis for assessing landslide hazard over large regions, although they depend on the qualitative and quantitative data, scale, types of movement and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application was developed for El Salvador, a country in Central America where earthquake-triggered landslides are a common phenomenon. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake, calibrating the models with data from the landslide inventory for this scenario. These analyses require input variables representing the physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is the dependent variable. 
The results of the landslide susceptibility analysis are checked against landslide location data and show a high concordance between the landslide inventory and the estimated high-susceptibility zone, with an adjustment of 95.1% for the ANN model and 89.4% for the LR model. In addition, we make a comparative analysis of both techniques using the Receiver Operating Characteristic (ROC) curve, a graphical plot of sensitivity vs. (1 - specificity) for a binary classifier as a function of its discrimination threshold, and calculate the Area Under the ROC curve (AUROC) for each model. Finally, the previous models are used to develop a new probabilistic landslide hazard map for future events, obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
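The AUROC used to compare the two classifiers can be computed directly from rank statistics via the Mann-Whitney identity. The scores and labels below are synthetic placeholders, not the El Salvador inventory data.

```python
import numpy as np

def auroc(scores, labels):
    """AUROC = P(positive score > negative score); ties count as 0.5."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    # pairwise comparison: fraction of (positive, negative) pairs ranked correctly
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)

# Synthetic susceptibility scores for 8 terrain cells (1 = landslide observed).
labels = [1, 1, 1, 0, 0, 0, 0, 1]
ann_scores = [0.9, 0.8, 0.75, 0.4, 0.3, 0.35, 0.6, 0.7]  # hypothetical ANN output
lr_scores = [0.7, 0.6, 0.8, 0.5, 0.2, 0.4, 0.65, 0.55]   # hypothetical LR output

print(auroc(ann_scores, labels), auroc(lr_scores, labels))
```

A model whose AUROC is closer to 1.0 separates landslide from non-landslide cells better across all thresholds, which is why the record uses it alongside the single-threshold adjustment percentages.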

  16. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method within a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. 
Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake lies along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
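The core of a performance-based liquefaction calculation is integrating a conditional probability of liquefaction over the discretized seismic hazard curve, in the spirit of Kramer and Mayfield (2007). The hazard curve and the lognormal "fragility" parameters below are illustrative assumptions, not values from the Dunaharaszti study, and the magnitude dimension of the disaggregation is omitted for brevity.

```python
from math import erf, log, sqrt

# Illustrative site hazard curve: annual rates of exceeding each PGA level.
pga = [0.05, 0.1, 0.2, 0.3, 0.5]                # peak ground acceleration (g)
annual_rate = [0.2, 0.05, 0.01, 0.003, 0.0005]  # annual exceedance rates (1/yr)

def p_liq(a, median=0.25, beta=0.5):
    """Hypothetical lognormal model for P(liquefaction | PGA = a)."""
    return 0.5 * (1.0 + erf(log(a / median) / (beta * sqrt(2.0))))

# incremental rate of shaking falling in each PGA bin
d_rate = [r1 - r2 for r1, r2 in zip(annual_rate, annual_rate[1:] + [0.0])]

# annual rate of liquefaction and the corresponding return period
rate_liq = sum(p_liq(a) * dr for a, dr in zip(pga, d_rate))
print(f"return period of liquefaction ≈ {1.0 / rate_liq:.0f} yr")
```

Swapping in a different conditional-probability model (e.g. SPT- versus CPT-based) changes only `p_liq`, which is how the software described in the record can compare the return periods produced by the different simplified methods.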

  17. Challenges in Assessing Seismic Hazard in Intraplate Europe

    NASA Astrophysics Data System (ADS)

    Hintersberger, E.; Kuebler, S.; Landgraf, A.; Stein, S. A.

    2014-12-01

    Intraplate regions are often characterized by scattered, clustered and migrating seismicity and by the occurrence of low-strain areas next to high-strain ones. Increasing evidence for large paleoearthquakes in such regions, together with population growth and the development of critical facilities, calls for better assessments of earthquake hazards. Existing seismic hazard assessment for intraplate Europe is based on instrumental and historical seismicity of the past 1000 years, as well as some active fault data. These observations face important limitations due to the quantity and quality of the available databases. Even considering the long record of historical events in some populated areas of Europe, this time span of a thousand years likely fails to capture some faults' typical large-event recurrence intervals, which are on the order of tens of thousands of years. Paleoseismology helps lengthen the observation window, but only produces point measurements, and preferentially in regions suspected to be seismically active. As a result, the expected maximum magnitudes of future earthquakes are quite uncertain, likely to be underestimated, and earthquakes are likely to occur in unexpected locations. These issues arise in particular in the heavily populated Rhine Graben and Vienna Basin areas, and in considering the hazard posed to critical facilities like nuclear power plants by low-probability events.

  18. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental-scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. The resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 years. Ground-shaking soil amplification at each site is calculated either by considering the uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. The resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from the effects of future earthquake strong ground shaking. 
National models should be developed by scientists and engineers in each country using the best available science.
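The exceedance probabilities quoted in such hazard maps translate to return periods under a Poisson occurrence model, a standard conversion sketched below.

```python
import math

# Under a Poisson occurrence model, P = 1 - exp(-t/T) links the probability
# of exceedance P in an exposure time t to the return period T.
def return_period(p_exceed, t_years=50.0):
    return -t_years / math.log(1.0 - p_exceed)

for p in (0.02, 0.10, 0.50):
    print(f"{p:.0%} in 50 yr -> T ≈ {return_period(p):.0f} yr")
# 2%, 10%, and 50% in 50 yr correspond to roughly 2475, 475, and 72 yr.
```

These are the familiar return periods behind map probability levels: design codes often reference the ~2475 yr (2% in 50 yr) and ~475 yr (10% in 50 yr) ground motions.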

  19. Insights into earthquake hazard map performance from shaking history simulations

    NASA Astrophysics Data System (ADS)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. 
As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher-than-mapped shaking — arises by chance or reflects biases in the map. Due to this problem, there are limits to how well we can expect hazard maps to predict future shaking, as well as to our ability to test the performance of a hazard map based on available observations.
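The shaking-history idea can be sketched as a Monte Carlo experiment: simulate many synthetic observation windows and compare each window's maximum shaking to a mapped value. The occurrence rate, the lognormal shaking model, and the precomputed normal quantile below are all illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative site model: Poisson earthquake occurrence at rate lam (1/yr);
# each event produces a lognormally distributed shaking at the site.
lam, mu, sigma = 0.05, np.log(0.1), 0.6   # rate, ln-median PGA (g), ln-std
years, n_hist = 50, 20000

# Mapped value with 10% probability of exceedance in 50 yr under this model:
# solve lam * (1 - Phi(z)) = -ln(0.9)/50 for z (quantile precomputed here).
z = 1.7264
mapped = np.exp(mu + z * sigma)

# Simulate many 50-yr shaking histories and record each history's maximum.
maxima = np.zeros(n_hist)
for i, n in enumerate(rng.poisson(lam * years, size=n_hist)):
    if n > 0:
        maxima[i] = np.exp(mu + sigma * rng.standard_normal(n)).max()

# Across many histories the exceedance fraction matches the map (~10%),
# but any single history can fall well above or below it.
frac = (maxima > mapped).mean()
print(frac)
```

This reproduces the record's verification point in miniature: the ensemble is consistent with the map even though individual histories scatter, so a single observed history exceeding the map does not by itself invalidate it.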

  20. United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.

    2008-01-01

    The U.S. Geological Survey's maps of earthquake shaking hazards provide information essential to creating and updating the seismic design provisions of building codes and insurance rates used in the United States. Periodic revisions of these maps incorporate the results of new research. Buildings, bridges, highways, and utilities built to meet modern seismic design provisions are better able to withstand earthquakes, not only saving lives but also enabling critical activities to continue with less disruption. These maps can also help people assess the hazard to their homes or places of work and can inform insurance rates.

  1. Challenges in assessing seismic hazard in intraplate Europe

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Liu, Mian; Camelbeeck, Thierry; Merino, Miguel; Landgraf, Angela; Hintersberger, Esther; Kübler, Simon

    2016-04-01

    Intraplate seismicity is often characterized by episodic, clustered and migrating earthquakes and extended aftershock sequences. Can these observations - primarily from North America, China and Australia - usefully be applied to seismic hazard assessment for intraplate Europe? Existing assessments are based on instrumental and historical seismicity of the past c. 1000 years, as well as some data for active faults. This time span probably fails to capture typical large-event recurrence intervals of the order of tens of thousands of years. Palaeoseismology helps to lengthen the observation window, but preferentially produces data in regions suspected to be seismically active. Thus the expected maximum magnitudes of future earthquakes are fairly uncertain, possibly underestimated, and earthquakes are likely to occur in unexpected locations. These issues particularly arise in considering the hazards posed by low-probability events to both heavily populated areas and critical facilities. For example, are the variations in seismicity (and thus assumed seismic hazard) along the Rhine Graben a result of short sampling, or are they real? In addition to a better assessment of hazards with new data and models, it is important to recognize and communicate uncertainties in hazard estimates. The more users know about how much confidence to place in hazard maps, the more effectively the maps can be used.

  2. Broadband Ground Motion Simulation Recipe for Scenario Hazard Assessment in Japan

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Fujiwara, H.; Irikura, K.

    2014-12-01

    The National Seismic Hazard Maps for Japan, which consist of probabilistic seismic hazard maps (PSHMs) and scenario earthquake shaking maps (SESMs), have been published every year since 2005 by the Earthquake Research Committee (ERC) of the Headquarters for Earthquake Research Promotion, which was established in the Japanese government after the 1995 Kobe earthquake. Publication was interrupted due to problems in the PSHMs revealed by the 2011 Tohoku earthquake, and the Subcommittee for Evaluations of Strong Ground Motions ('Subcommittee') has been examining the problems for two and a half years (ERC, 2013; Fujiwara, 2014). However, the SESMs and the broadband ground motion simulation recipe used in them are still valid, at least for crustal earthquakes. Here, we outline this recipe and show the results of validation tests for it. Irikura and Miyake (2001) and Irikura (2004) developed a recipe for simulating strong ground motions from future crustal earthquakes based on a characterization of their source models (the Irikura recipe). The result of the characterization is called a characterized source model, in which a rectangular fault includes a few rectangular asperities; each asperity, and the background area surrounding the asperities, has its own uniform stress drop. The Irikura recipe defines the parameters of the fault and asperities, and how to simulate broadband ground motions from the characterized source model. The recipe for the SESMs was constructed following the Irikura recipe (ERC, 2005). The National Research Institute for Earth Science and Disaster Prevention (NIED) then wrote simulation codes following this recipe to generate the SESMs (Fujiwara et al., 2006; Morikawa et al., 2011). In 2002, the Subcommittee validated a preliminary version of the SESM recipe by comparing simulated and observed ground motions for the 2000 Tottori earthquake. 
In 2007 and 2008, the Subcommittee carried out detailed validations of the current version of the SESM recipe and the NIED codes using ground motions from the 2005 Fukuoka earthquake. Irikura and Miyake (2011) summarized the latter validations, concluding that the ground motions were successfully simulated as shown in the figure. This indicates that the recipe has enough potential to generate broadband ground motions for scenario hazard assessment in Japan.

  3. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI), held by the International Atomic Energy Agency (IAEA) on 18-20 November 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from the workshop as well as several new contributions. A total of 17 papers have been selected, on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's function approaches) to the engineering application of simulated ground motion in analyzing the seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers in this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, and for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  4. Geomodels of coseismic landslides environments in Central Chile.

    NASA Astrophysics Data System (ADS)

    Serey, A.; Sepulveda, S. A.; Murphy, W.; Petley, D. N.

    2017-12-01

    Landslides are a major source of fatalities and damage during strong earthquakes in mountainous areas. Detailed geomodels of coseismic landslide environments are an essential part of seismic landslide hazard analysis. The development of a site-specific geological model is required, based on the regional and local geological and geomorphological history and the current ground surface conditions. An engineering geological model is an approximation of the geological conditions, at varying scales, created for the purpose of solving an engineering problem. In our case, the objective is a methodology for earthquake-induced landslide hazard assessment, adapted to Chilean tectonic conditions, that is applicable to urban and territorial planning and to disaster prevention strategies at a regional scale. We have developed the only two complete inventories of landslides triggered by earthquakes in Chile: the first from the Mw 6.2 shallow crustal Aysén earthquake of 2007, and the second from the Mw 8.8 megathrust subduction Maule earthquake of 2010. By comparing these two inventories with others from abroad, and by analysing large prehistoric landslide inventories proposed as likely seismically induced, we have determined the topographic, geomorphological, geological, and seismic factors controlling the occurrence of earthquake-triggered landslides. With the information collected we have defined different environments for the generation of coseismic landslides, based on the construction of geomodels. As a result, we have built several geomodels of the Santiago Cordillera in central Chile (33°S), based upon the San Ramón Fault, a west-vergent reverse fault that crops out at the edge of the Santiago basin, recently found to be active and a likely source of future seismicity, with the potential to trigger landslides along the Santiago mountain front as well as inland in the Mapocho and Maipo Cordilleran valleys. 
In conclusion, these geomodels are a powerful tool for earthquake-induced landslide hazard assessment: they allow landslide-prone areas to be identified, different seismic scenarios to be distinguished, and the related potential hazards to be described, including burial and river damming by large rock slides and rock avalanches.

  5. How well should probabilistic seismic hazard maps work?

    NASA Astrophysics Data System (ADS)

    Vanneste, K.; Stein, S.; Camelbeeck, T.; Vleminckx, B.

    2016-12-01

    Recent large earthquakes that produced shaking much stronger than shown in earthquake hazard maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification assesses how well the algorithm used to produce hazard maps implements the conceptual PSHA model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating the shaking history of an area with an assumed distribution of earthquakes, frequency-magnitude relation, temporal occurrence model, and ground-motion prediction equation. We compare the "observed" shaking at many sites over time to that predicted by a hazard map generated for the same set of parameters. PSHA predicts that the fraction of sites at which shaking will exceed the mapped value is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. This implies that shaking stronger than that mapped is expected at some sites, as has occurred in a number of large earthquakes. A large number of simulated earthquake histories yield distributions of shaking consistent with this forecast, with a scatter about this value that decreases as t/T increases. The median results are somewhat lower than predicted for small values of t/T and approach the predicted value for larger values of t/T. Hence, the algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. 
Validation is more complicated because a real observed earthquake history can yield a fractional exceedance significantly higher or lower than that predicted while still being consistent with the hazard map in question. As a result, given that in the real world we have only a single sample, it is hard to assess whether a misfit between a map and observations arises by chance or reflects a biased map.
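
The verification test described above can be sketched with a small Monte Carlo simulation. This is a minimal illustration, not the authors' code: the return period, observation window, and site count are arbitrary example values. Under the Poisson occurrence model, the number of exceedances of the mapped shaking level at a site in t years is Poisson with mean t/T, so the fraction of sites with at least one exceedance should approach p = 1 - exp(-t/T):

```python
import numpy as np

rng = np.random.default_rng(42)

T = 475.0        # map return period in years (arbitrary example value)
t = 50.0         # duration of observations in years
n_sites = 100_000

# Under the Poisson occurrence model, the number of times shaking exceeds
# the mapped level at a given site during t years is Poisson with mean t/T.
exceed_counts = rng.poisson(t / T, size=n_sites)

observed_fraction = np.mean(exceed_counts > 0)   # fraction of sites exceeded
predicted_fraction = 1.0 - np.exp(-t / T)        # p = 1 - exp(-t/T), about 0.10

print(f"observed {observed_fraction:.3f} vs predicted {predicted_fraction:.3f}")
```

As t/T grows, the scatter of the observed fraction about the predicted value shrinks, which is the convergence behavior the abstract reports.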

  6. Surficial Seismology: Landslides, Glaciers, and Volcanoes in the Pacific Northwest through a Seismic Lens

    NASA Astrophysics Data System (ADS)

    Allstadt, Kate

    The following work is focused on the use of both traditional and novel seismological tools, combined with concepts from other disciplines, to investigate shallow seismic sources and hazards. The study area is the dynamic landscape of the Pacific Northwest and its wide-ranging earthquake, landslide, glacier, and volcano-related hazards. The first chapter focuses on landsliding triggered by earthquakes, with a shallow crustal earthquake in Seattle as a case study. The study demonstrates that utilizing broadband synthetic seismograms and rigorously incorporating 3D basin amplification, 1D site effects, and fault directivity allows for a more complete assessment of regional seismically induced landslide hazard. The study shows that the hazard is severe for Seattle, and provides a framework for future probabilistic maps and near-real-time hazard assessment. The second chapter focuses on landslides that generate seismic waves and how these signals can be harnessed to better understand landslide dynamics, demonstrated using two contrasting Pacific Northwest landslides. The 2010 Mount Meager, BC, landslide generated strong long-period waves; new full waveform inversion methods reveal the time history of the forces the landslide exerted on the earth, which is used to quantify event dynamics. Despite having a similar volume (~10⁷ m³), the 2009 Nile Valley, WA, landslide did not generate observable long-period motions because of its smaller accelerations, but pulses of higher-frequency waves were valuable in piecing together the complex sequence of events. The final chapter details the difficulties of monitoring glacier-clad volcanoes. The focus is on small, repeating, low-frequency earthquakes at Mount Rainier that resemble volcanic earthquakes. However, based on this investigation, they are actually glacial in origin: most likely stick-slip sliding of glaciers triggered by snow loading. 
Identification of the source offers a view of basal glacier processes, helps discriminate these signals from potentially alarming volcanic ones, and has implications for repeating earthquakes in tectonic environments. This body of work demonstrates that by combining methods and concepts from seismology and other disciplines in new ways, we can obtain a better understanding and a fresh perspective of the physics behind the shallow seismic sources and hazards that threaten the Pacific Northwest.

  7. Assessing the seismic risk potential of South America

    USGS Publications Warehouse

    Jaiswal, Kishor; Petersen, Mark D.; Harmsen, Stephen; Smoczyk, Gregory M.

    2016-01-01

    We present here a simplified approach to quantifying regional seismic risk. The seismic risk for a given region can be expressed as the average annual loss (AAL), the expected long-term value of earthquake losses in any one year from the long-term seismic hazard. AAL is commonly measured in terms of earthquake shaking-induced deaths, direct economic impacts, or indirect losses due to loss of functionality. For the South American subcontinent, the analysis makes use of readily available public data on seismicity and population exposure, together with hazard and vulnerability models for the region. The seismic hazard model was derived using available seismic catalogs, fault databases, and hazard methodologies analogous to the U.S. Geological Survey's national seismic hazard mapping process. The Prompt Assessment of Global Earthquakes for Response (PAGER) system's direct empirical vulnerability functions for fatalities and economic impact were used for the exposure and risk analyses. The broad findings and risk maps presented herein are preliminary, yet they offer important insights into the underlying zones of high and low seismic risk in the South American subcontinent. A more detailed analysis of risk may be warranted, engaging local experts, especially in some of the high-risk zones identified through the present investigation.
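
The AAL concept used above can be illustrated with a toy event-based calculation. The scenario rates and losses below are invented for illustration and are not from the PAGER model; the point is only that the expected loss in any one year is the rate-weighted sum of per-event losses:

```python
import numpy as np

# Hypothetical scenario set: annual occurrence rate and mean loss per event.
rates = np.array([0.10, 0.02, 0.004, 0.001])   # events per year
losses = np.array([1.0, 10.0, 50.0, 200.0])    # loss per event, $M

# Average annual loss: the expected loss in any one year is the
# rate-weighted sum of per-event losses.
aal = float(np.sum(rates * losses))
print(aal)  # → 0.7 ($M per year)
```

In practice the per-event losses themselves come from convolving hazard, exposure, and vulnerability, but the aggregation into AAL has this simple form.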

  8. Source Spectra and Site Response for Two Indonesian Earthquakes: the Tasikmalaya and Kerinci Events of 2009

    NASA Astrophysics Data System (ADS)

    Gunawan, I.; Cummins, P. R.; Ghasemi, H.; Suhardjono, S.

    2012-12-01

    Indonesia is very prone to natural disasters, especially earthquakes, due to its location in a tectonically active region. In September-October 2009 alone, intraslab and crustal earthquakes caused the deaths of thousands of people, severe infrastructure destruction, and considerable economic loss. Thus, both intraslab and crustal earthquakes are important sources of earthquake hazard in Indonesia. Analysis of response spectra for these intraslab and crustal earthquakes is needed to yield more detail about earthquake properties. For both types of earthquakes, we have analysed available Indonesian seismic waveform data to constrain source and path parameters - i.e., low-frequency spectral level, Q, and corner frequency - at reference stations that appear to be little influenced by site response. We have carried out these analyses for the main shocks as well as several aftershocks, and we obtain corner frequencies that are reasonably consistent with the constant stress drop hypothesis. Using these results, we consider extracting information about site response from other stations of the Indonesian strong motion network that appear to be strongly affected by it. Such site response data, as well as earthquake source parameters, are important for assessing earthquake hazard in Indonesia.

  9. 44 CFR 361.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards... Federal Emergency Management Agency (FEMA) and States in the administration of FEMA's earthquake hazards...

  10. 44 CFR 361.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards... Federal Emergency Management Agency (FEMA) and States in the administration of FEMA's earthquake hazards...

  11. 44 CFR 361.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards... Federal Emergency Management Agency (FEMA) and States in the administration of FEMA's earthquake hazards...

  12. 44 CFR 361.1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards... Federal Emergency Management Agency (FEMA) and States in the administration of FEMA's earthquake hazards...

  13. 44 CFR 361.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards... Federal Emergency Management Agency (FEMA) and States in the administration of FEMA's earthquake hazards...

  14. Neo-deterministic seismic hazard assessment in North Africa

    NASA Astrophysics Data System (ADS)

    Mourabit, T.; Abou Elenean, K. M.; Ayadi, A.; Benouar, D.; Ben Suleman, A.; Bezzeghoud, M.; Cheddadi, A.; Chourak, M.; ElGabry, M. N.; Harbi, A.; Hfaiedh, M.; Hussein, H. M.; Kacem, J.; Ksentini, A.; Jabour, N.; Magrin, A.; Maouche, S.; Meghraoui, M.; Ousadou, F.; Panza, G. F.; Peresan, A.; Romdhane, N.; Vaccari, F.; Zuccolo, E.

    2014-04-01

    North Africa is one of the most earthquake-prone areas of the Mediterranean. Many devastating earthquakes, some of them tsunamigenic, have inflicted heavy loss of life and considerable economic damage on the region. In order to mitigate the destructive impact of earthquakes, the regional seismic hazard in North Africa is assessed using the neo-deterministic, multi-scenario methodology (NDSHA), based on the computation of synthetic seismograms, using the modal summation technique, at a regular grid of 0.2° × 0.2°. This is the first study aimed at producing NDSHA maps of North Africa covering five countries: Morocco, Algeria, Tunisia, Libya, and Egypt. The key input data for the NDSHA algorithm are earthquake sources, seismotectonic zonation, and structural models. In preparing the input data, it was essential to go beyond national borders and adopt a coherent strategy over the whole area. Thanks to the collaborative efforts of the teams involved, it has been possible to properly merge the earthquake catalogues available for each country and to define, with homogeneous criteria, the seismogenic zones, the characteristic focal mechanism associated with each of them, and the structural models used to model wave propagation from the sources to the sites. As a result, reliable seismic hazard maps are produced in terms of maximum displacement (Dmax), maximum velocity (Vmax), and design ground acceleration.

  15. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. Founded in the tenth century (969 AD), the city is 1046 years old and has long been a center of the region's political and cultural life. Earthquake risk assessment for Cairo is therefore of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input for risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach, within a logic tree framework. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated on a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, uniform hazard spectra have been calculated at the same return periods. The contour maps show that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).
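
Return periods like those listed above are conventionally tied to exceedance probabilities over a design life through the Poisson relation T = -t / ln(1 - p). The specific probability/time pairings used in this study are not stated in the abstract, so the values below are common design choices shown only to illustrate the conversion:

```python
import math

def return_period(p_exceed: float, t_years: float) -> float:
    """Return period T for probability p_exceed of at least one
    exceedance in t_years, under the Poisson model: T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

# 10% in 50 years gives the familiar 475-year hazard map:
print(round(return_period(0.10, 50)))  # → 475
# 20% in 50 years gives roughly the shortest return period listed above:
print(round(return_period(0.20, 50)))  # → 224
```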

  16. Assessment of tsunami hazard to the U.S. Atlantic margin

    USGS Publications Warehouse

    ten Brink, Uri S.; Chaytor, Jason; Geist, Eric L.; Brothers, Daniel S.; Andrews, Brian D.

    2014-01-01

    Tsunamis caused by atmospheric disturbances and by coastal earthquakes may be more frequent than those generated by landslides, but their amplitudes are probably smaller. Among the possible far-field earthquake sources, only earthquakes located within the Gulf of Cadiz or west of the Tore-Madeira Rise are likely to affect the U.S. coast. It is questionable whether earthquakes on the Puerto Rico Trench are capable of producing a large enough tsunami that will affect the U.S. Atlantic coast. More information is needed to evaluate the seismic potential of the northern Cuba fold-and-thrust belt. The hazard from a volcano flank collapse in the Canary Islands is likely smaller than originally stated, and there is not enough information to evaluate the magnitude and frequency of flank collapse from the Azores Islands. Both deterministic and probabilistic methods to evaluate the tsunami hazard from the margin are available for application to the Atlantic margin, but their implementation requires more information than is currently available.

  17. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can appropriately be used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India, and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in identifying the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity actually observed is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e., ground motion prediction equations, GMPEs) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which permits a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M > 5, which underpins the reliability of the analysis. By analysing the seismicity of the Vrancea region in some detail, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g., the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.

  18. The Fujairah, United Arab Emirates (UAE) (ML = 5.1) earthquake of March 11, 2002: a reminder of the immediate need to develop and implement a national hazard mitigation strategy

    NASA Astrophysics Data System (ADS)

    Al-Homoud, A.

    2003-04-01

    On March 11, 2002, at midnight, the Masafi region of Fujairah in the UAE was shaken by a shallow earthquake of local magnitude ML = 5.1. The earthquake occurred on the Dibba fault in the UAE, with its epicenter 20 km NW of Fujairah city and a focal depth of just 10 km. It was felt in most parts of the northern emirates: Dubai, Sharjah, Ajman, Ras Al-Khaimah, and Umm Al-Quwain. The "main shock" was followed over the following weeks by more than twenty-five earthquakes with local magnitudes ranging from ML = 4 to ML = 4.8. These earthquakes were located along the Zagros reverse faulting system on the Iranian side of the Arabian Gulf, opposite the shores of the UAE. Most of them were also shallow and were actually felt by the population. However, another strong earthquake struck the same Masafi region in early April 2002 with local magnitude ML = 5.1 and a focal depth of 30 km, and was therefore not felt by residents of the northern emirates. No major structural damage to buildings or lifeline systems was reported in the cities located in the vicinity of the epicenter. The very small ground accelerations were not enough to test the structural integrity of tall buildings and major infrastructure. Future major earthquakes anticipated in close vicinity of the northern emirates, given the noticeable local site effects of the emirates' sandy soils with high water tables, will put these newly constructed buildings to a real test. 
Although there were no casualties in the March 11th event, there was widespread fear, caused by the loud sound of rock rupture heard in the mountains close to Masafi, the noticeable disturbance of animals and birds minutes before and during the event, cracks in a good number of Masafi buildings, and major damage to "old" buildings in the Fujairah Masafi area, the closest settlement to the epicenter. Indeed, the March 11, 2002 event and its "aftershocks" scared the citizens of Masafi and the surrounding region and drew the attention of the public and government to earthquake hazard, especially as this earthquake came one year after the nearby destructive Indian earthquake of m = 6.5. The recent destructive m = 6.2 earthquake of June 22 that struck northwest Iran has again reminded the UAE public and government of the need to take quick and concrete measures to mitigate any anticipated earthquake hazard. This study reflects in some detail on the following aspects of the region and its vicinity: geological and tectonic setting, seismicity, the earthquake activity database, and seismic hazard assessment. Moreover, it documents the following aspects of the March 11, 2002 earthquake: tectonics, seismology, instrumental seismic data, aftershocks, strong motion recordings, response spectra and local site effect analysis, and geotechnical effects and structural observations in the affected region. The study identifies local ground amplification effects and liquefaction hazard potential in some parts of the UAE. It also reflects on the coverage of the event in the media, the public and government response, the state of earthquake engineering practice in the UAE construction industry, and national preparedness and public awareness. 
For this event, however, it is concluded that the mild damage in the Masafi region was due to poor construction quality and underestimation of the design base shear. Practical recommendations are offered to help the authorities avoid damage to newly constructed buildings and lifelines in future, stronger earthquakes, together with recommendations for a national earthquake hazard mitigation strategy for the UAE, which is still missing. The recommendations include the development and implementation of a design code for earthquake loading in the UAE, development of macro- and micro-seismic hazard maps, development of local site effect and liquefaction hazard maps, installation of a national earthquake monitoring network, assessment of the vulnerability of critical structures and lifeline facilities, public awareness campaigns, training of civil defense rescue teams, etc.

  19. Unexpected earthquake hazard revealed by Holocene rupture on the Kenchreai Fault (central Greece): Implications for weak sub-fault shear zones

    NASA Astrophysics Data System (ADS)

    Copley, Alex; Grützner, Christoph; Howell, Andy; Jackson, James; Penney, Camilla; Wimpenny, Sam

    2018-03-01

    High-resolution elevation models, palaeoseismic trenching, and Quaternary dating demonstrate that the Kenchreai Fault in the eastern Gulf of Corinth (Greece) has ruptured in the Holocene. Along with the adjacent Pisia and Heraion Faults (which ruptured in 1981), our results indicate the presence of closely-spaced and parallel normal faults that are simultaneously active, but at different rates. Such a configuration allows us to address one of the major questions in understanding the earthquake cycle, specifically what controls the distribution of interseismic strain accumulation? Our results imply that the interseismic loading and subsequent earthquakes on these faults are governed by weak shear zones in the underlying ductile crust. In addition, the identification of significant earthquake slip on a fault that does not dominate the late Quaternary geomorphology or vertical coastal motions in the region provides an important lesson in earthquake hazard assessment.

  20. Uncertainties in evaluation of hazard and seismic risk

    NASA Astrophysics Data System (ADS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru; Ortanza Cioflan, Carmen; Manea, Elena-Florinela

    2015-04-01

    Two methods are commonly used for seismic hazard assessment: probabilistic (PSHA) and deterministic (DSHA) seismic hazard analysis. Selection of a ground motion for engineering design requires a clear understanding of seismic hazard and risk among stakeholders, seismologists, and engineers. What is wrong with traditional PSHA or DSHA? The PSHA commonly used in engineering rests on four assumptions developed by Cornell in 1968: (1) a constant-in-time average occurrence rate of earthquakes; (2) a single point source; (3) independent variability of ground motion at a site; and (4) Poisson (or "memoryless") behavior of earthquake occurrences. It is a probabilistic method, and "when the causality dies, its place is taken by probability, a prestigious term meant to define our inability to predict the course of nature" (Niels Bohr). DSHA was used for the original design of Fukushima Daiichi, but the Japanese authorities moved to probabilistic assessment methods, and the probability of exceeding the design basis acceleration was expected to be 10⁻⁴-10⁻⁶. It was exceeded, in violation of the principles of deterministic hazard analysis (ignoring historical events) (Klügel, J.U., EGU, 2014). PSHA was developed from mathematical statistics and is not based on earthquake science (invalid physical models: point source and Poisson distribution; invalid mathematics; misinterpretation of the annual probability of exceedance or return period, etc.), and has become a purely numerical "creation" (Wang, PAGEOPH, 168 (2011), 11-25). A key component of seismic hazard assessment, in both PSHA and DSHA, is the ground motion attenuation relationship, or so-called ground motion prediction equation (GMPE), which relates a ground motion parameter (e.g., PGA, MMI) to earthquake magnitude M, source-to-site distance R, and an uncertainty term. So far, no one takes into consideration the strongly nonlinear behavior of soils during strong earthquakes. 
But how many cities, villages, and metropolitan areas in seismic regions are constructed on rock? Most are located on soil deposits. A soil is of basic type sand or gravel (termed coarse soils) or silt or clay (termed fine soils), etc. The effect of nonlinearity is very large. For example, if we maintain the same spectral amplification factor (SAF = 5.8942) as for the relatively strong earthquake of May 3, 1990 (Mw = 6.4), then at the Bacău seismic station the peak acceleration for the Vrancea earthquake of May 30, 1990 (Mw = 6.9) should have been a*max = 0.154 g, whereas the recorded value was only amax = 0.135 g (-14.16%). Likewise, for the Vrancea earthquake of August 30, 1986 (Mw = 7.1), the peak acceleration should have been a*max = 0.107 g instead of the recorded value of 0.0736 g (-45.57%). Similar data exist for more than 60 seismic stations. There is a strong nonlinear dependence of the SAF on earthquake magnitude at each site. The authors propose an alternative approach, using "real spectral amplification factors" instead of GMPEs, for the whole extra-Carpathian area, where all cities and villages are located on soil deposits. Key words: probabilistic seismic hazard; uncertainties; nonlinear seismology; spectral amplification factors (SAF).
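
The discrepancies quoted above appear to compare a linearly extrapolated peak acceleration (holding the weaker event's SAF fixed) with the recorded one, with percentages taken relative to the recorded value. A minimal sketch of that arithmetic follows; it is an interpretation of the abstract's convention, and because it uses the rounded accelerations quoted there, the percentages differ slightly from the quoted -14.16% and -45.57%:

```python
# (predicted_g, recorded_g) pairs from the abstract, in units of g.
cases = {
    "Vrancea 1990-05-30 (Mw 6.9)": (0.154, 0.135),
    "Vrancea 1986-08-30 (Mw 7.1)": (0.107, 0.0736),
}

pct_diff = {}
for name, (predicted_g, recorded_g) in cases.items():
    # Shortfall of the linear prediction, expressed relative to the recorded peak.
    pct_diff[name] = 100.0 * (recorded_g - predicted_g) / recorded_g
    print(f"{name}: {pct_diff[name]:+.1f}%")
```

The recorded peaks falling well below the linear extrapolation is the signature of the nonlinear soil de-amplification the authors describe.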

  1. An Atlas of ShakeMaps and population exposure catalog for earthquake loss modeling

    USGS Publications Warehouse

    Allen, T.I.; Wald, D.J.; Earle, P.S.; Marano, K.D.; Hotovec, A.J.; Lin, K.; Hearne, M.G.

    2009-01-01

    We present an Atlas of ShakeMaps and a catalog of human population exposures to moderate-to-strong ground shaking (EXPO-CAT) for recent historical earthquakes (1973-2007). The common purpose of the Atlas and exposure catalog is to calibrate earthquake loss models to be used in the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER). The full ShakeMap Atlas currently comprises over 5,600 earthquakes from January 1973 through December 2007, with almost 500 of these maps constrained, to varying degrees, by instrumental ground motions, macroseismic intensity data, community internet intensity observations, and published earthquake rupture models. The catalog of human exposures is derived using current PAGER methodologies: exposure to discrete levels of shaking intensity is obtained by correlating Atlas ShakeMaps with a global population database. Combining this population exposure dataset with historical earthquake loss data, such as PAGER-CAT, provides a useful resource for calibrating loss methodologies against a systematically derived set of ShakeMap hazard outputs. We illustrate two example uses for EXPO-CAT: (1) simple objective ranking of country vulnerability to earthquakes, and (2) the influence of time of day on earthquake mortality. In general, we observe that countries in similar geographic regions with similar construction practices tend to cluster spatially in terms of relative vulnerability. We also find little quantitative evidence to suggest that time of day is a significant factor in earthquake mortality; rather, earthquake mortality appears to be more systematically linked to the population exposed to severe ground shaking (Modified Mercalli Intensity VIII+). Finally, equipped with the full Atlas of ShakeMaps, we merge all of these maps to find the maximum estimated peak ground acceleration at any grid point in the world over the past 35 years. 
We subsequently compare this "composite ShakeMap" with existing global hazard models, calculating the spatial area of the existing hazard maps exceeded by the combined ShakeMap ground motions. In general, these analyses suggest that existing global, and regional, hazard maps tend to overestimate hazard. Both the Atlas of ShakeMaps and EXPO-CAT have many potential uses for examining earthquake risk and epidemiology. All of the datasets discussed herein are available for download on the PAGER Web page ( http://earthquake.usgs.gov/eqcenter/pager/prodandref/ ). © 2009 Springer Science+Business Media B.V.

  2. Post-disaster Risk Assessment for Hilly Terrain exposed to Seismic Loading

    NASA Astrophysics Data System (ADS)

    Yates, Katherine; Villeneuve, Marlene; Wilson, Thomas

    2013-04-01

    The 2010-present Canterbury earthquake sequence in the central South Island of New Zealand has identified and highlighted the value of practical, standardised and coordinated geotechnical risk assessment guidelines for inhabited structures in the aftermath of a geotechnical disaster. The lack of such guidelines and provisions to enforce risk assessments was a major gap which hindered coordinated, timely and transparent management of geotechnical risk. The earthquake sequence initiated a series of rockfall, cliff collapse and landslide events around the Port Hills southeast of Christchurch. This was particularly the case with the 22 February 2011 earthquakes, which put thousands of people inhabiting the area at risk. Lives were lost and thousands of houses and critical infrastructure were damaged. Given the highly seismic environment in New Zealand and a significant number of active faults near population centres, it is prudent to develop such guidelines to ensure response mechanisms and geotechnical risk assessment are effective following an earthquake rupture in a heavily populated urban environment. For response and associated risk assessments to be effective, the mechanisms of the geotechnical failure should be taken into consideration as part of the life-safety assessment. This is to ensure that the hazard's potential risk is fully assessed and encompassed in decisions regarding life safety. This paper examines the event sequence, slope failure mechanisms and the geotechnical risk management approach that developed immediately post-earthquake. It highlights experiences from key municipal, management and operational stakeholders who were involved in geotechnical risk assessment during the Canterbury earthquake sequence, sheds light on how the information needed evolved through time during the emergency response, and identifies the hard-won lessons. 
It then discusses what is needed for post-earthquake life-safety assessment and for creating awareness of potential geotechnical hazards. This is important not only for New Zealand but also internationally, as many other regions of the world are subject to high seismic risk.

  3. Seismic hazard analysis with PSHA method in four cities in Java.

    NASA Astrophysics Data System (ADS)

    Elistyawati, Y.; Palupi, I. R.; Suharsono

    2016-11-01

    In this study, tectonic earthquake hazard was evaluated in terms of peak ground acceleration (PGA) using the PSHA method, with the earthquake sources divided into zones. The study applied earthquake data from 1965-2015, first analysed for catalogue completeness; the study area covers the whole of Java, with emphasis on four large earthquake-prone cities. The results comprise hazard maps for return periods of 500 and 2500 years, together with hazard curves for the four major cities (Jakarta, Bandung, Yogyakarta, and Banyuwangi). The 500-year PGA hazard map of Java shows values ranging from 0 g to ≥ 0.5 g, while the 2500-year map ranges from 0 g to ≥ 0.8 g. The hazard curves indicate that for Jakarta the most influential earthquake source is a background fault source such as the Cimandiri fault, and for Bandung the controlling sources are likewise background fault sources. For Yogyakarta, the most influential source is the background source of the Opak fault, and for Banyuwangi it is the Java and Sumba megathrust sources.
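    The 500- and 2500-year return periods quoted above are tied to exceedance probabilities over an exposure window by the standard Poisson relation used in PSHA. A minimal sketch (the probability levels below are the conventional 10%- and 2%-in-50-years values, not figures from this study):

```python
import math

def return_period(p_exceed, exposure_years):
    """Poisson return period T giving exceedance probability p_exceed
    over an exposure window of exposure_years."""
    return -exposure_years / math.log(1.0 - p_exceed)

def prob_exceed(return_period_years, exposure_years):
    """Inverse relation: exceedance probability over exposure_years."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# ~10% and ~2% in 50 years correspond to the familiar ~475- and ~2475-year maps;
# 500 and 2500 years, as in this study, are rounded versions of the same levels
print(round(return_period(0.10, 50)))    # ≈ 475
print(round(prob_exceed(2500, 50), 3))   # ≈ 0.02
```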

  4. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  5. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  6. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-17

    ... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... on NEHRP earthquake related activities and to gather information for the 2011 Annual Report of the...

  7. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  8. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  9. OpenQuake, a platform for collaborative seismic hazard and risk assessment

    NASA Astrophysics Data System (ADS)

    Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben

    2013-04-01

    Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide; from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. 
Other tools under development of direct interest to the hazard community are: • OpenQuake Modeller: a suite of tools (the Hazard Modeller's Toolkit) for creating the seismogenic input models required by the OpenQuake Engine, i.e. for characterizing the seismogenic sources of earthquakes and their models of earthquake recurrence. An earthquake-catalogue homogenization tool, for integration, statistical comparison and user-defined harmonization of multiple earthquake catalogues, is also included in the OpenQuake modelling tools. • A data-capture tool for active faults: a tool that allows geologists to draw (new) fault discoveries on a map in an intuitive GIS environment and add details on the fault. These data, once quality-checked, can then be integrated with the global active-faults database, which will increase in value with every new fault insertion. Building on many ongoing efforts and the knowledge of scientists worldwide, GEM will for the first time integrate state-of-the-art data, models, results and open-source tools into a single platform. The platform will continue to increase in value, in particular for use in local contexts, through contributions from and collaborations with scientists and organisations worldwide. This presentation will showcase the OpenQuake Platform, focusing on the IT solutions that have been adopted as well as the added value that the platform will bring to scientists worldwide.
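    Characterizing models of earthquake recurrence from a catalogue, as the Hazard Modeller's Toolkit does, typically starts from the Gutenberg-Richter relation log10 N = a - b·M. A minimal sketch of the classic maximum-likelihood b-value estimator (Aki, 1965), independent of the actual OpenQuake API and using made-up magnitudes:

```python
import math

def aki_b_value(magnitudes, m_c, bin_width=0.1):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965),
    with the usual half-bin correction to the completeness magnitude m_c."""
    above = [m for m in magnitudes if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - bin_width / 2.0))

# Made-up catalogue magnitudes, assumed complete above m_c = 4.0
mags = [4.0, 4.1, 4.3, 4.0, 4.6, 5.0, 4.2, 4.4, 4.1, 4.8]
print(round(aki_b_value(mags, 4.0), 2))  # → 1.09
```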

  10. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    NASA Astrophysics Data System (ADS)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  11. A Guide to School Vulnerability Assessments: Key Principles for Safe Schools

    ERIC Educational Resources Information Center

    Office of Safe and Drug-Free Schools, US Department of Education, 2008

    2008-01-01

    Crises affect schools across the country every day. While natural hazards such as tornadoes, floods, hurricanes, and earthquakes may be thought of more commonly as emergencies, schools are also at risk from other hazards such as school violence, infectious disease, and terrorist threats. Through the vulnerability assessment process, schools can…

  12. Seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir; Nekrasova, Anastasia

    2014-05-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers and the public, of the three components of Risk: Hazard, Exposure, and Vulnerability. Contemporary Science has not coped with the challenging changes of Exposure and Vulnerability driven by growing population, its concentration, etc., which result in a steady increase of losses from natural hazards. Scientists owe Society better knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing natural hazards, assessing risks, and delivering such knowledge in advance of catastrophic events. Any kind of risk estimate R(g) at location g results from a convolution of the natural hazard H(g) with the exposed object under consideration O(g) along with its vulnerability V(O(g)). Note that g could be a point, a line, or a cell on or under the Earth's surface, and that the distribution of hazards, as well as objects of concern and their vulnerability, could be time-dependent. There exist many different risk estimates even if the same object of risk and the same hazard are involved. This may result from different laws of convolution, as well as from different kinds of vulnerability of an object of risk under specific environments and conditions. Both conceptual issues must be resolved through multidisciplinary, problem-oriented research performed by specialists in the fields of hazard, objects of risk, and object vulnerability, i.e. specialists in earthquake engineering, social sciences and economics. To illustrate this general concept, we first construct seismic hazard assessment maps based on the Unified Scaling Law for Earthquakes (USLE). The parameters A, B, and C of USLE, i.e. 
log N(M,L) = A - B•(M-6) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within an area of linear size L, are used to estimate the expected maximum magnitude in 50 years and the corresponding expected ground-shaking intensity in a cell g of a uniform grid of the region of interest. Such a seismic hazard map is then used to generate earthquake risk maps based on the exposed population density. Some oversimplified convolutions R(g) = H(g)•gP•F(gP) of the seismic hazard assessment maps H(g) are applied in a few regions, where g is a cell of a uniform grid, gP is the integral of the population density P over the cell g, and V = F(gP) is the vulnerability.
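    The USLE rate and the oversimplified risk convolution above can be sketched directly; the parameter values, the cell population, and the vulnerability function below are illustrative placeholders, not those estimated in the study:

```python
import math

def usle_rate(A, B, C, M, L):
    """Expected annual number N(M, L) of magnitude-M earthquakes in a cell of
    linear size L, from log N(M, L) = A - B*(M - 6) + C*log L."""
    return 10.0 ** (A - B * (M - 6.0) + C * math.log10(L))

def risk(hazard, gP, F):
    """Oversimplified convolution R(g) = H(g) * gP * F(gP) for one grid cell,
    with gP the population integrated over the cell and F its vulnerability."""
    return hazard * gP * F(gP)

# Illustrative placeholder values (A, B, C vary by region in USLE studies)
n = usle_rate(A=0.5, B=1.0, C=1.2, M=6.5, L=50.0)
r = risk(hazard=0.3, gP=200.0, F=lambda p: math.log10(1.0 + p))
```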

  13. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thenhaus, P.C.

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-12 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in US history. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern US are generally regarded as only historical phenomena. A fundamental problem in the Eastern US, therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the tools that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term earthquake risk, on the other hand, refers to aspects of the expected damage to manmade structures and to lifelines as a result of the earthquake hazard.

  14. How well can we test probabilistic seismic hazard maps?

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart

    2017-04-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T, increasing observation time and increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. 
This scatter is due to the variability of earthquake recurrence, and so decreases as the largest earthquakes occur in more simulations. Our results are important for evaluating the performance of a hazard map based on misfits in fractional exceedance, and for assessing whether such misfit arises by chance or reflects a bias in the map. More specifically, we determined for a broad range of Gutenberg-Richter a-values theoretical confidence intervals on allowed misfits in fractional exceedance and on the percentage of hazard-map bias that can thus be detected by comparison with observed shaking histories. Given that in the real world we only have one shaking history for an area, these results indicate that even if a hazard map does not fit the observations, it is very difficult to assess its veracity, especially for low-to-moderate-seismicity regions. Because our model is a simplified version of reality, any additional uncertainty or complexity will tend to widen these confidence intervals.
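    The verification test described above can be reproduced in miniature: at many independent sites, simulate the waiting time to the first map-exceeding event and compare the fraction of sites exceeded within t years against the Poisson prediction p = 1 - exp(-t/T). A sketch with an arbitrary site count and the conventional 475-year return period (not the authors' simulation code):

```python
import math
import random

def simulate_fraction_exceeding(n_sites, t, T, seed=0):
    """Monte Carlo counterpart of p = 1 - exp(-t/T): at each independent site,
    draw the exponential waiting time (mean T) to the first event exceeding
    the mapped shaking, and count sites where it falls within t years."""
    rng = random.Random(seed)
    exceeded = sum(1 for _ in range(n_sites) if rng.expovariate(1.0 / T) <= t)
    return exceeded / n_sites

t, T = 50.0, 475.0                   # observation span and map return period
p_expected = 1.0 - math.exp(-t / T)  # ~0.10, the familiar 10%-in-50-years level
p_simulated = simulate_fraction_exceeding(100_000, t, T)
```

With a single 50-year history per site, the simulated fraction scatters around the predicted value, which is the internal-consistency ("verification") check the abstract describes.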

  15. An Ensemble Approach for Improved Short-to-Intermediate-Term Seismic Potential Evaluation

    NASA Astrophysics Data System (ADS)

    Yu, Huaizhong; Zhu, Qingyong; Zhou, Faren; Tian, Lei; Zhang, Yongxian

    2017-06-01

    Pattern informatics (PI), load/unload response ratio (LURR), state vector (SV), and accelerating moment release (AMR) are four previously unrelated methods, which are sensitive, in varying ways, to the earthquake's source. Previous studies have indicated that the spatial extent of the stress perturbation caused by an earthquake scales with the moment of the event, allowing us to combine these methods for seismic hazard evaluation. The long-range earthquake forecasting method PI is applied to search for seismic hotspots and identify areas where large earthquakes could be expected. The LURR and SV methods are then adopted to assess short-to-intermediate-term seismic potential in each of the critical regions derived from the PI hotspots, while the AMR method is used to provide asymptotic estimates of the time and magnitude of the potential earthquakes. This new approach, combining the LURR, SV and AMR methods with the identified areas of PI hotspots, is devised to augment current techniques for seismic hazard estimation. Using the approach, we tested the strong earthquakes that occurred in the Yunnan-Sichuan region, China, between January 1, 2013 and December 31, 2014. We found that most of the large earthquakes, especially those with magnitude greater than 6.0, occurred in the predicted seismic hazard regions. Similar results have been obtained in the prediction of annual earthquake tendency in the Chinese mainland in 2014 and 2015. These studies show that the ensemble approach could be a useful tool to detect short-to-intermediate-term precursory information about future large earthquakes.

  16. The 1909 Taipei earthquake: implication for seismic hazard in Taipei

    USGS Publications Warehouse

    Kanamori, Hiroo; Lee, William H.K.; Ma, Kuo-Fong

    2012-01-01

    The 1909 April 14 Taiwan earthquake caused significant damage in Taipei. Most of the information on this earthquake available until now comes from written reports on its macroseismic effects and from seismic station bulletins. In view of the importance of this event for assessing the shaking hazard in present-day Taipei, we collected historical seismograms and station bulletins of this event and investigated them in conjunction with other seismological data. We compared the observed seismograms with those from recent earthquakes in similar tectonic environments to characterize the 1909 earthquake. Despite the inevitably large uncertainties associated with old data, we conclude that the 1909 Taipei earthquake was a relatively deep (50–100 km) intraplate earthquake that occurred within the subducting Philippine Sea Plate beneath Taipei, with an estimated M_W of 7 ± 0.3. Some intraplate events elsewhere in the world are enriched in high-frequency energy, and the resulting ground motions can be very strong. Thus, despite its relatively large depth and a moderately large magnitude, it would be prudent to review the safety of existing structures in Taipei against large intraplate earthquakes like the 1909 Taipei earthquake.

  17. Hazard Assessment and Early Warning of Tsunamis: Lessons from the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2012-12-01

    The March 11, 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and the best-recorded subduction-zone earthquake in the world. In particular, various offshore geophysical observations revealed large horizontal and vertical seafloor movements, and the tsunami was recorded on high-quality, high-sampling gauges. Analysis of such tsunami waveforms shows a temporal and spatial slip distribution during the 2011 Tohoku earthquake. The fault rupture started near the hypocenter and propagated into both deep and shallow parts of the plate interface. Very large slip, ~25 m, off Miyagi on the deep part of the plate interface corresponds to an interplate earthquake of M 8.8, similar in location and size to the 869 Jogan earthquake model, and was responsible for the large tsunami inundation in the Sendai and Ishinomaki plains. Huge slip, more than 50 m, occurred on the shallow part near the trench axis ~3 min after the earthquake origin time. This delayed shallow rupture (M 8.8) was similar to the 1896 "tsunami earthquake," and was responsible for the large tsunami on the northern Sanriku coast, measured ~100 km north of the largest slip. Thus the Tohoku earthquake can be decomposed into an interplate earthquake and a triggered "tsunami earthquake." The Japan Meteorological Agency issued a tsunami warning 3 minutes after the earthquake, and saved many lives. However, its initial estimate of tsunami height was too low, because the earthquake magnitude was initially estimated as M 7.9, hence the computed tsunami heights were lower. The JMA is attempting to improve the tsunami warning system, including technical developments to estimate the earthquake size within a few minutes using varied and redundant information, to deploy and utilize offshore tsunami observations, and to issue a warning based on the worst-case scenario if the possibility of a giant earthquake exists. Predicting the triggering of another large earthquake would still be a challenge. 
Tsunami hazard assessments and long-term earthquake forecasts have not considered such triggering or simultaneous occurrence of different types of earthquakes. The large tsunami at the Fukushima nuclear power station was due to the combination of the deep and shallow slip. Disaster prevention for low-frequency but large-scale hazards must be considered. The Japanese government established a general policy for two levels of tsunami: L1 and L2. The L2 tsunamis are the largest possible tsunamis, with a low frequency of occurrence, but cause devastating disaster once they occur. For such events, saving people's lives is the first priority, and soft measures such as tsunami hazard maps, evacuation facilities and disaster education will be prepared. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared to protect the lives and properties of residents as well as economic and industrial activities.

  18. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent M 7.0 earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences, instead being in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. Particularly in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability has, however, generally been lagging behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling, one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability accords with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype for a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high risk of earthquake, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map, and only contemplates resident population from the census as a proxy for human exposure. 
In the present work we map and analyze the spatio-temporal distribution of population in the daily cycle to re-assess exposure to earthquake hazard in the Lisbon Metropolitan Area, home to almost three million people. New high-resolution (50 m grids) daytime and nighttime population distribution maps are developed using dasymetric mapping. The modeling approach uses areal interpolation to combine best-available census data and statistics with land use and land cover data. Mobility statistics are considered for mapping daytime distribution, and empirical parameters used for interpolation are obtained from a previous effort in high resolution population mapping of part of the study area. Finally, the population distribution maps are combined with the Seismic Hazard Intensity map to: (1) quantify and compare human exposure to seismic intensity levels in the daytime and nighttime periods, and (2) derive nighttime and daytime overall Earthquake Risk maps. This novel approach yields previously unavailable spatio-temporal population distribution information for the study area, enabling refined and more accurate earthquake risk mapping and assessment. Additionally, such population exposure datasets can be combined with different hazard maps to improve spatio-temporal assessment and risk mapping for any type of hazard, natural or man-made. We believe this improved characterization of vulnerability and risk can benefit all phases of the disaster management process where human exposure has to be considered, namely in emergency planning, risk mitigation, preparedness, and response to an event.
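    Dasymetric mapping with areal interpolation, as used above to disaggregate census totals onto a 50 m grid, can be sketched as weight-proportional allocation; the cell areas and land-cover weights below are hypothetical:

```python
def dasymetric_disaggregate(census_total, cells):
    """Distribute a census-unit population total over grid cells in proportion
    to area x land-cover weight (a minimal sketch of dasymetric mapping with
    areal interpolation; weight 0 excludes a cell, e.g. water)."""
    mass = sum(area * weight for _, area, weight in cells)
    return {cid: census_total * area * weight / mass for cid, area, weight in cells}

# Hypothetical cells: (id, area in km^2, land-cover weight)
cells = [("res", 2.0, 1.0), ("ind", 1.0, 0.5), ("water", 1.0, 0.0)]
alloc = dasymetric_disaggregate(1200, cells)
print(alloc)  # → {'res': 960.0, 'ind': 240.0, 'water': 0.0}
```

Daytime and nighttime maps would simply use different weight sets (e.g. boosting workplace land covers by day), which is the mechanism behind the time-dependent exposure estimates described above.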

  19. Effects of Strike-Slip Fault Segmentation on Earthquake Energy and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Madden, E. H.; Cooke, M. L.; Savage, H. M.; McBeck, J.

    2014-12-01

    Many major strike-slip faults are segmented along strike, including those along plate boundaries in California and Turkey. Failure of distinct fault segments at depth may be the source of multiple pulses of seismic radiation observed for single earthquakes. However, how and when segmentation affects fault behavior and energy release is the basis of many outstanding questions related to the physics of faulting and seismic hazard. These include the probability for a single earthquake to rupture multiple fault segments and the effects of segmentation on earthquake magnitude, radiated seismic energy, and ground motions. Using numerical models, we quantify components of the earthquake energy budget, including the tectonic work acting externally on the system, the energy of internal rock strain, the energy required to overcome fault strength and initiate slip, the energy required to overcome frictional resistance during slip, and the radiated seismic energy. We compare the energy budgets of systems of two en echelon fault segments with various spacing that include both releasing and restraining steps. First, we allow the fault segments to fail simultaneously and capture the effects of segmentation geometry on the earthquake energy budget and on the efficiency with which applied displacement is accommodated. Assuming that higher efficiency correlates with higher probability for a single, larger earthquake, this approach has utility for assessing the seismic hazard of segmented faults. Second, we nucleate slip along a weak portion of one fault segment and let the quasi-static rupture propagate across the system. Allowing fractures to form near faults in these models shows that damage develops within releasing steps and promotes slip along the second fault, while damage develops outside of restraining steps and can prohibit slip along the second fault. 
Work is consumed both in the propagation of these new fractures and in frictional slip along them, impacting the energy available for further slip and for subsequent earthquakes. This suite of models reveals that efficiency may be a useful tool for determining the relative seismic hazard of different segmented fault systems, while accounting for coseismic damage-zone production is critical in assessing fault interactions and the associated energy budgets of specific systems.
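The energy-budget components enumerated above suggest a schematic balance. The following is an illustrative formulation of that bookkeeping, not the authors' exact equations:

```latex
W_{\mathrm{tect}} = \Delta U_{\mathrm{int}} + W_{\mathrm{init}} + W_{\mathrm{fric}} + E_{\mathrm{rad}}
```

where \(W_{\mathrm{tect}}\) is the external tectonic work, \(\Delta U_{\mathrm{int}}\) the change in internal rock-strain energy, \(W_{\mathrm{init}}\) the work to overcome fault strength and initiate slip, \(W_{\mathrm{fric}}\) the frictional work during slip, and \(E_{\mathrm{rad}}\) the radiated seismic energy. Coseismic fracture growth draws from the same budget, reducing what remains for \(E_{\mathrm{rad}}\) and for slip on neighboring segments.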

  20. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but has practical differences. Maps with return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years; a one-year model, by contrast, must be evaluated against the single year it covers. Several features stand out when assessing the USGS 2016 seismic hazard model for induced and natural earthquakes in the central and eastern United States. First, the model can be assessed as a forecast after just one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because the model projects forward from the previous year's seismicity, implicitly assuming that fluid-injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true both for the central and eastern United States as a whole and for the most seismically active region within it, Oklahoma and its surrounding area. 
The model performed least well in northern Texas, overstating the hazard, presumably because lower oil and gas prices and regulatory action reduced the volume of injected water relative to the previous year. These results imply that such hazard maps have the potential to be valuable tools for policy makers and regulators in managing the seismic risks associated with unconventional oil and gas production.
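The exceedance metric described above can be made concrete with a short sketch. This is illustrative only; the function names and the example probability level are assumptions, not the authors' code:

```python
import numpy as np

def fractional_exceedance(observed_max, mapped_value):
    """Fraction of sites whose observed maximum shaking exceeded the mapped level."""
    obs = np.asarray(observed_max, dtype=float)
    mapped = np.asarray(mapped_value, dtype=float)
    return float((obs > mapped).mean())

def map_misfit(observed_max, mapped_value, expected_fraction):
    """Difference between the observed exceedance fraction and the fraction
    implied by the map's stated probability level (e.g. 0.01 for a 1%-in-one-year map)."""
    return fractional_exceedance(observed_max, mapped_value) - expected_fraction
```

A well-calibrated map yields a misfit near zero: roughly the stated fraction of sites should see shaking above the mapped value over the map's exposure period.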

  1. Seismic hazard assessment of the Kivu rift segment based on a new seismotectonic zonation model (western branch, East African Rift system)

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Mulumba, Jean-Luc; Sebagenzi, Mwene Ntabwoba Stanislas; Bondo, Silvanos Fiama; Kervyn, François; Havenith, Hans-Balder

    2017-10-01

In the frame of the Belgian GeoRisCA multi-risk assessment project, which focuses on the Kivu and northern Tanganyika rift region in Central Africa, a new probabilistic seismic hazard assessment has been performed for the Kivu rift segment in the central part of the western branch of the East African rift system. As the geological and tectonic setting of this region is incompletely known, especially the part lying in the Democratic Republic of the Congo, we compiled homogeneous cross-border tectonic and neotectonic maps. The hazard assessment is based on a new earthquake catalogue built from the ISC reviewed earthquake catalogue, supplemented by other local catalogues and new macroseismic epicenter data, spanning 126 years with 1068 events. The magnitudes have been homogenized to Mw and aftershocks removed. The final catalogue used for the seismic hazard assessment spans 60 years, from 1955 to 2015, with 359 events and a magnitude of completeness of 4.4. The seismotectonic zonation into 7 seismic source areas was based on the regional geological structure, neotectonic fault systems, basin architecture, and the distribution of thermal springs and earthquake epicenters. The Gutenberg-Richter seismic hazard parameters were determined by least-squares linear fitting and by the maximum-likelihood method. Seismic hazard maps have been computed with existing attenuation laws using the CRISIS 2012 software. We obtained higher PGA values (475-year return period) for the Kivu rift region than previous estimates. The values also vary laterally as a function of the tectonic setting: lowest in the volcanically active Virunga - Rutshuru zone, highest in the currently non-volcanic parts of Lake Kivu, the Rusizi valley, and the North Tanganyika rift zone, and intermediate in the regions flanking the axial rift zone.
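The two Gutenberg-Richter fitting procedures mentioned above (least-squares and maximum-likelihood) can be sketched as follows. This is a minimal illustration, not the study's code; the function names are assumptions, and the completeness magnitude of 4.4 is taken from the abstract:

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value for events with M >= mc.

    dm is the magnitude bin width; pass dm=0 for continuous magnitudes."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def gr_least_squares(mags, mc, dm=0.1):
    """Least-squares fit of log10 N(>=M) = a - b*M to the cumulative counts."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    bins = np.arange(mc, m.max(), dm)
    n_cum = np.array([(m >= lo).sum() for lo in bins])
    keep = n_cum >= 10  # drop the noisy tail of the cumulative curve
    slope, a = np.polyfit(bins[keep], np.log10(n_cum[keep]), 1)
    return a, -slope
```

The maximum-likelihood estimator is generally preferred because least squares on cumulative counts uses correlated points; comparing the two, as done here, is a common consistency check.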

  2. Sensitivity analysis of the FEMA HAZUS-MH MR4 Earthquake Model using seismic events affecting King County Washington

    NASA Astrophysics Data System (ADS)

    Neighbors, C.; Noriega, G. R.; Caras, Y.; Cochran, E. S.

    2010-12-01

HAZUS-MH MR4 (HAZards U.S. Multi-Hazard Maintenance Release 4) is risk-estimation software developed by FEMA to calculate potential losses due to natural disasters. Federal, state, regional, and local governments use the HAZUS-MH Earthquake Model for earthquake risk mitigation, preparedness, response, and recovery planning (FEMA, 2003). In this study, we examine several parameters used by the HAZUS-MH Earthquake Model methodology to understand how modifying the user-defined settings affects ground motion analysis, seismic risk assessment, and earthquake loss estimates. This analysis focuses on both shallow crustal and deep intraslab events in the American Pacific Northwest. Specifically, the historic 1949 Mw 6.8 Olympia, 1965 Mw 6.6 Seattle-Tacoma, and 2001 Mw 6.8 Nisqually normal-fault intraslab events and scenario large-magnitude Seattle reverse-fault crustal events are modeled. Inputs analyzed include variations of deterministic event scenarios combined with hazard maps and USGS ShakeMaps. This approach utilizes the capacity of the HAZUS-MH Earthquake Model to define landslide- and liquefaction-susceptibility hazards with local groundwater level and slope-stability information. Where ShakeMap inputs are not used, events are run in combination with NEHRP soil classifications to determine site-amplification effects. The earthquake component of HAZUS-MH applies a series of empirical ground-motion attenuation relationships developed from source parameters of both regional and global historical earthquakes to estimate strong ground motion. Ground motion and resulting ground failure due to earthquakes are then used to calculate direct physical damage for general building stock, essential facilities, and lifelines, including transportation and utility systems. Earthquake losses are expressed in structural, economic, and social terms. 
Where available, comparisons between recorded earthquake losses and HAZUS-MH earthquake losses are used to determine how region coordinators can most effectively utilize their resources for earthquake risk mitigation. This study is being conducted in collaboration with King County, WA officials to determine the best model inputs necessary to generate robust HAZUS-MH models for the Pacific Northwest.

  3. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones

    NASA Astrophysics Data System (ADS)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana

    2018-04-01

After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, the South Sandwich Islands, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  4. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones

    NASA Astrophysics Data System (ADS)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana

    2017-11-01

After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, the South Sandwich Islands, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  5. A stochastic automata network for earthquake simulation and hazard estimation

    NASA Astrophysics Data System (ADS)

    Belubekian, Maya Ernest

    1998-11-01

This research develops a model for simulation of earthquakes on seismic faults with available earthquake catalog data. The model allows estimation of the seismic hazard at a site of interest and assessment of the potential damage and loss in a region. There are two approaches for studying earthquakes: mechanistic and stochastic. In the mechanistic approach, seismic processes, such as changes in stress or slip on faults, are studied in detail. In the stochastic approach, earthquake occurrences are simulated as realizations of a certain stochastic process. In this dissertation, a stochastic earthquake occurrence model is developed that uses the results from dislocation theory for the estimation of slip released in earthquakes. The slip accumulation and release laws and the event-scheduling mechanism adopted in the model result in a memoryless Poisson process for small and moderate events and in a time- and space-dependent process for large events. The minimum and maximum of the hazard are estimated by the model when the initial conditions along the faults correspond to the situations right after the largest event and after a long seismic gap, respectively. These estimates are compared with those obtained from a Poisson model. The Poisson model overestimates the hazard after the maximum event and underestimates it in a period of long seismic quiescence. The earthquake occurrence model is formulated as a stochastic automata network. Each fault is divided into cells, or automata, that interact by means of information exchange. The model uses the bootstrap, a statistical resampling method, for the evaluation of the confidence bounds on its results. The parameters of the model are adjusted to the target magnitude patterns obtained from the catalog. A case study is presented for the city of Palo Alto, where the hazard is controlled by the San Andreas, Hayward and Calaveras faults. 
The results of the model are used to evaluate the damage and loss distribution in Palo Alto. The sensitivity analysis of the model results to the variation in basic parameters shows that the maximum magnitude has the most significant impact on the hazard, especially for long forecast periods.
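The contrast the dissertation draws with the Poisson model can be illustrated with a minimal sketch. The lognormal renewal model and all parameter values below are assumptions chosen for illustration; the automata model itself is far more detailed:

```python
import math

def poisson_prob(t, mean_recurrence):
    """Time-independent (Poisson) probability of at least one event in the next t years."""
    return 1.0 - math.exp(-t / mean_recurrence)

def lognormal_conditional_prob(t, elapsed, mean_recurrence, cov=0.5):
    """Conditional probability of an event in the next t years, given `elapsed`
    years since the last one, under a lognormal renewal model (an assumed,
    commonly used time-dependent alternative)."""
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_recurrence) - 0.5 * sigma ** 2

    def cdf(x):
        # lognormal CDF via the error function
        if x <= 0.0:
            return 0.0
        return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

    f_now = cdf(elapsed)
    f_later = cdf(elapsed + t)
    return (f_later - f_now) / (1.0 - f_now)
```

Shortly after the largest event the renewal probability falls below the Poisson value, and after a long quiescence it rises above it, mirroring the over- and underestimation described in the abstract.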

  6. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in February underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. 
Strengthening that interaction is an opportunity for the next generation of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  7. Continuing Megathrust Earthquake Potential in northern Chile after the 2014 Iquique Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Hayes, G. P.; Herman, M. W.; Barnhart, W. D.; Furlong, K. P.; Riquelme, S.; Benz, H.; Bergman, E.; Barrientos, S. E.; Earle, P. S.; Samsonov, S. V.

    2014-12-01

The seismic gap theory, which identifies regions of elevated hazard based on a lack of recent seismicity in comparison to other portions of a fault, has successfully explained past earthquakes and is useful for qualitatively describing where future large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which until recently had not ruptured in a megathrust earthquake since an M~8.8 event in 1877. On 1 April 2014, an M 8.2 earthquake occurred within this northern Chile seismic gap, offshore of the city of Iquique; the size and spatial extent of the rupture indicate it was not the earthquake that had been anticipated. Here, we present a rapid assessment of the seismotectonics of the March-April 2014 seismic sequence offshore northern Chile, including analyses of earthquake (fore- and aftershock) relocations, moment tensors, finite fault models, moment deficit calculations, and cumulative Coulomb stress transfer calculations over the duration of the sequence. This ensemble of information allows us to place the current sequence within the context of historic seismicity in the region, and to assess areas of remaining and/or elevated hazard. Our results indicate that while accumulated strain has been released for a portion of the northern Chile seismic gap, significant sections have not ruptured in almost 150 years. These observations suggest that large-to-great megathrust earthquakes will occur north and south of the 2014 Iquique sequence sooner than might be expected had the 2014 events ruptured the entire seismic gap.

  8. InSAR Evidence for an active shallow thrust fault beneath the city of Spokane Washington, USA

    USGS Publications Warehouse

    Wicks, Charles W.; Weaver, Craig S.; Bodin, Paul; Sherrod, Brian

    2013-01-01

In 2001, a nearly five-month-long sequence of shallow, mostly small-magnitude earthquakes occurred beneath the city of Spokane, a city with a population of about 200,000, in the state of Washington. During most of the sequence, the earthquakes were not well located because seismic instrumentation was sparse. Despite the poor-quality locations, the earthquake hypocenters were likely very shallow, because residents near the city center both heard and felt many of the earthquakes. The combination of poor earthquake locations and a lack of known surface faults with recent movement makes assessing the seismic hazards related to the earthquake swarm difficult. However, the potential for destruction from a shallow, moderate-sized earthquake is high, as demonstrated by Christchurch, New Zealand, in 2011, so assessing the hazard potential of a seismic structure involved in the Spokane earthquake sequence is important. Using interferometric synthetic aperture radar (InSAR) data from the European Space Agency ERS-2 and ENVISAT satellites and the Canadian Space Agency RADARSAT-1 satellite, we are able to show that slip on a shallow, previously unknown thrust fault, which we name the Spokane Fault, was the source of the earthquake sequence. The part of the Spokane Fault that slipped during the 2001 earthquake sequence underlies the north part of the city, and slip on the fault was concentrated between ~0.3 and 2 km depth. Projecting the buried fault plane to the surface gives a possible surface trace for the Spokane Fault that strikes northeast from the city center into north Spokane.

  9. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    NASA Astrophysics Data System (ADS)

    Wyss, Max

    2013-04-01

An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven people who had participated in an emergency meeting on March 30 that assessed the probability of a major event following the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of their jobs, loss of retirement pensions, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take their usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists who were not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings about seismology. They must be carefully prepared by experts. The more significant lesson is that the approach to calm the population and the standard probabilistic hazard and risk assessment, as practiced by GSHAP, are misleading. 
The latter has been criticized as incorrect for scientific reasons, and here I argue that it is also ineffective for psychological reasons. Instead of being calmed, or having the hazard in strongly active areas underestimated by the GSHAP approach, people should be told quantitatively the consequences of the reasonably worst case and be motivated to prepare for it, whether or not it may hit the present or the next generation. In a worst-case scenario for L'Aquila, the number of expected fatalities and injured should have been calculated for an event in the range of M6.5 to M7, as I did for a civil defense exercise in Umbria, Italy. With the prospect that approximately 500 people may die in an earthquake in the immediate or distant future, some residents might have built themselves an earthquake closet (similar to a simple tornado shelter) in a corner of their apartment, into which they might have dashed to safety at the onset of the P-wave before the destructive S-wave arrived. I conclude that in earthquake-prone areas quantitative loss estimates due to a reasonable worst-case earthquake should replace probabilistic hazard and risk estimates. This is a service that experts owe the community. Insurance companies and academics may still find use for probabilistic estimates of losses, especially in areas of low seismic hazard, where the worst-case scenario approach is less appropriate.

  10. Reducing Vulnerability of Ports and Harbors to Earthquake and Tsunami Hazards

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

Recent scientific research suggests the Pacific Northwest could experience catastrophic earthquakes in the near future, both from distant and local sources, posing a significant threat to coastal communities. Damage could result from numerous earthquake-related hazards, such as severe ground shaking, soil liquefaction, landslides, land subsidence/uplift, and tsunami inundation. Because of their geographic location, ports and harbors are especially vulnerable to these hazards. Ports and harbors, however, are important components of many coastal communities, supporting numerous activities critical to the local and regional economy and possibly serving as vital post-event, response-recovery transportation links. A collaborative, multi-year initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to earthquake and tsunami hazards, involving Oregon Sea Grant (OSG), Washington Sea Grant (WSG), the National Oceanic and Atmospheric Administration Coastal Services Center (CSC), and the U.S. Geological Survey Center for Science Policy (CSP). Specific products of this research, planning, and outreach initiative include a regional stakeholder issues-and-needs assessment, a community-based mitigation planning process, a Geographic Information System (GIS)-based vulnerability assessment methodology, an educational website, and a regional data archive. This paper summarizes these efforts, including results of two pilot port-harbor community projects, one in Yaquina Bay, Oregon, and the other in Sinclair Inlet, Washington. Finally, plans are outlined for outreach to other port and harbor communities in the Pacific Northwest and beyond, using "getting started" workshops and a web-based tutorial.

  11. Understanding risk and resilience to natural hazards

    USGS Publications Warehouse

    Wood, Nathan

    2011-01-01

    Natural hazards threaten the safety and economic wellbeing of communities. These hazards include sudden-onset hazards, such as earthquakes, and slowly emerging, chronic hazards, such as those associated with climate change. To help public officials, emergency and other managers, the business community, and at-risk individuals reduce the risks posed by such hazards, the USGS Western Geographic Science Center is developing new ways to assess and communicate societal risk and resilience to catastrophic and chronic natural hazards.

  12. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury sequence of earthquakes, beginning with the magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks, or triggered events, two of which exceeded Mw 6.0, continued with the February and June 2011 events causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures will be discussed, including the use of rapid response teams, the selection of indicator buildings to monitor damage following aftershocks, risk assessments for the demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area, and the process of demolition. Through the post-event safety assessment program that operated throughout the Canterbury sequence of earthquakes, many important lessons can be learned that will benefit future responses to natural hazards with the potential to damage structures.

  13. Volunteers in the earthquake hazard reduction program

    USGS Publications Warehouse

    Ward, P.L.

    1978-01-01

With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers. 

  14. Italian Case Studies Modelling Complex Earthquake Sources In PSHA

    NASA Astrophysics Data System (ADS)

    Gee, Robin; Peruzza, Laura; Pagani, Marco

    2017-04-01

This study presents two examples of modelling complex seismic sources in Italy, done in the framework of regional probabilistic seismic hazard assessment (PSHA). The first case study is for an area centred around Collalto Stoccaggio, a natural gas storage facility in Northern Italy, located within a system of potentially seismogenic thrust faults in the Venetian Plain. The storage exploits a depleted natural gas reservoir located within an actively growing anticline, which is likely driven by the Montello Fault, the underlying blind thrust. This fault has been well identified by microseismic activity (M<2) detected by a local seismometric network installed in 2012 (http://rete-collalto.crs.inogs.it/). At this time, no correlation can be identified between the gas storage activity and local seismicity, so we proceed with a PSHA that considers only natural seismicity, where the rates of earthquakes are assumed to be time-independent. The source model consists of faults and distributed seismicity to account for earthquakes that cannot be associated with specific structures. All potentially active faults within 50 km of the site are considered, and are modelled as 3D listric surfaces, consistent with the proposed geometry of the Montello Fault. Slip rates are constrained using available geological, geophysical and seismological information. We explore the sensitivity of the hazard results to various parameters affected by epistemic uncertainty, such as ground motion prediction equations with different rupture-to-site distance metrics, fault geometry, and maximum magnitude. The second case is an innovative study, in which we perform aftershock probabilistic seismic hazard assessment (APSHA) in Central Italy, following the Amatrice M6.1 earthquake of August 24th, 2016 (298 casualties) and the subsequent earthquakes of October 26th and 30th (M6.1 and M6.6, respectively; no deaths). 
The aftershock hazard is modelled using a fault source with complex geometry, based on literature data and field evidence associated with the August mainshock. Earthquake activity rates during the very first weeks after the deadly earthquake were used to calibrate an Omori-Utsu decay curve, and the magnitude distribution of aftershocks is assumed to follow a Gutenberg-Richter distribution. We apply uniform and non-uniform spatial distributions of the seismicity across the fault source by modulating the rates as a decreasing function of distance from the mainshock. The hazard results are computed for short exposure periods (1 month, before the occurrence of the October earthquakes) and compared to the background hazard given by law (MPS04) and to observations at some reference sites. We also show the results of disaggregation computed for the city of Amatrice. Finally, we attempt to update the results in light of the new "main" events that occurred afterwards in the region. All source modelling and hazard calculations are performed using the OpenQuake engine. We discuss the novelties of these works, and the benefits and limitations of both analyses, particularly in such different contexts of seismic hazard.
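The Omori-Utsu decay and Gutenberg-Richter magnitude distribution used in the APSHA above can be sketched as follows. The parameter values (K, c, p, b, and the minimum magnitude) are illustrative placeholders, not values calibrated in the study:

```python
def omori_rate(t, K=100.0, c=0.05, p=1.1):
    """Omori-Utsu aftershock rate (events/day) t days after the mainshock."""
    return K / (c + t) ** p

def expected_count(t1, t2, K=100.0, c=0.05, p=1.1):
    """Expected number of aftershocks between t1 and t2 days (closed form, p != 1)."""
    F = lambda t: (c + t) ** (1.0 - p) / (1.0 - p)
    return K * (F(t2) - F(t1))

def count_above(n_total, m, m_min=3.0, b=1.0):
    """Gutenberg-Richter share of n_total aftershocks with magnitude >= m."""
    return n_total * 10.0 ** (-b * (m - m_min))
```

Combining the two, the expected number of damaging aftershocks in an exposure window is the Omori-Utsu count scaled by the Gutenberg-Richter fraction above the magnitude of interest, which is the rate input a short-exposure hazard calculation needs.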

  15. New seafloor map of the Puerto Rico trench helps assess earthquake and tsunami hazards

    NASA Astrophysics Data System (ADS)

    Brink, Uri ten; Danforth, William; Polloni, Christopher; Andrews, Brian; Llanes, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-09-01

The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands [McCann et al., 2004]. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918 [Mercado and McCann, 1998]. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico [Mercado et al., 2002; Schwab et al., 1991], although their ages are unknown.

  16. New seafloor map of the Puerto Rico Trench helps assess earthquake and tsunami hazards

    USGS Publications Warehouse

    ten Brink, Uri S.; Danforth, William; Polloni, Christopher; Andrews, Brian D.; Llanes Estrada, Pilar; Smith, Shepard; Parker, Eugene; Uozumi, Toshihiko

    2004-01-01

The Puerto Rico Trench, the deepest part of the Atlantic Ocean, is located where the North American (NOAM) plate is subducting under the Caribbean plate (Figure 1). The trench region may pose significant seismic and tsunami hazards to Puerto Rico and the U.S. Virgin Islands, where 4 million U.S. citizens reside. Widespread damage in Puerto Rico and Hispaniola from an earthquake in 1787 was estimated to be the result of a magnitude 8 earthquake north of the islands [McCann et al., 2004]. A tsunami killed 40 people in NW Puerto Rico following a magnitude 7.3 earthquake in 1918 [Mercado and McCann, 1998]. Large landslide escarpments have been mapped on the seafloor north of Puerto Rico [Mercado et al., 2002; Schwab et al., 1991], although their ages are unknown.

  17. Seismicity and Seismic Hazard along the Western part of the Eurasia-Nubia plate boundary

    NASA Astrophysics Data System (ADS)

    Bezzeghoud, Mourad; Fontiela, João; Ferrão, Celia; Borges, José Fernando; Caldeira, Bento; Dib, Assia; Ousadou, Farida

    2016-04-01

The seismic phenomenon is the most damaging natural hazard known in the Mediterranean area. The western part of the Eurasia-Nubia plate boundary extends from the Azores to the Mediterranean region. The oceanic part of the plate boundary is well delimited from the Azores Islands, along the Azores-Gibraltar fault to approximately 12°W (west of the Strait of Gibraltar). From 12°W to 3.5°E, including the Iberia-Nubia region and extending to the western part of Algeria, the boundary is more diffuse and forms a wider area of deformation. The boundary between the Iberia and Nubia plates is the most complex part of the margin. This region corresponds to the transition from an oceanic boundary to a continental boundary, where Iberia and Nubia collide. Although most earthquakes along this plate boundary are shallow and generally have magnitudes less than 5.5, there have been several high-magnitude events. Many devastating earthquakes, some of them tsunami-triggering, inflicted heavy loss and considerable economic damage to the region. From 1920 to present, three earthquakes with magnitudes of about 8.0 (Mw 8.2, 25 November 1941; Ms 8.0, 25 February 1969; and Mw 7.9, 26 May 1975) occurred in the oceanic region, and four earthquakes with magnitudes of about 7.0 (Mw 7.1, 8 May 1939, Santa Maria Island and Mw 7.1, January 1980, Terceira and Graciosa Islands, both in the Azores; Ms 7.1, 20 May 1931, Azores-Gibraltar fracture zone; and Mw 7.3, 10 October 1980, El Asnam, Algeria) occurred along the western part of the Eurasia-Nubia plate boundary. In general, large earthquakes (M ≥7) occur within the oceanic region, with the exception of the El Asnam (Algeria) earthquakes. Some of these events caused extensive damage. The 1755 Lisbon earthquake (~Mw 9) on the Portugal Atlantic margin, about 200 km W-SW of Cape St. Vincent, was followed by a tsunami and fires that caused the near-total destruction of Lisbon and adjacent areas. 
Estimates of the death toll in Lisbon alone (~70,000) make it one of the deadliest earthquakes in history. Measured in lives lost, the 1926, 1980 and 1998 Azores earthquakes (Portugal), the 1954 and 1980 El Asnam earthquakes (North Algeria), the 1994 and 2004 Alhoceima earthquakes (North Morocco), and the 2003 Boumerdes earthquakes (North Algeria) were the worst earthquakes in the past 120 years in the study area. Hence, this region has experienced many large and damaging earthquakes. The city of Cairo (Egypt) was struck in October 1992 by an Mw 5.8 earthquake, which caused extensive damage. In 1935, the Syrte region in Libya experienced an M6.9 earthquake that caused severe damage. Generally, North Africa has experienced moderate earthquakes. However, the region remains vulnerable due to the shallow seismicity, the poor mechanical properties of its soil and local site conditions, and the consequent strength of the ground shaking. Knowing the behaviour of a seismogenic area, particularly the fault zone, will lead us to better assess the hazard and risk in and around large urban areas. In order to mitigate the destructive impact of earthquakes, the regional seismic hazard in North Africa is assessed using different approaches (e.g., deterministic and probabilistic) based on historical and instrumental seismicity, earthquake sources, seismotectonic zonation, structural models and attenuation laws. As a result, reliable seismic hazard maps are produced in terms of maximum displacement and maximum intensity. This research is funded by the Fundação para a Ciência e a Tecnologia (FCT, Portugal) under the project ICT-UID/GEO/04683/2013. This study also was conducted within the scope of the MEDYNA FP7-PEOPLE-2013-IRSES project, WP-1: Present-day Kinematics and seismic hazards, funded by the Seventh Framework European Programme.

  18. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake.

    PubMed

    Hayes, Gavin P; Herman, Matthew W; Barnhart, William D; Furlong, Kevin P; Riquelme, Sebástian; Benz, Harley M; Bergman, Eric; Barrientos, Sergio; Earle, Paul S; Samsonov, Sergey

    2014-08-21

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which had not ruptured in a megathrust earthquake since a M ∼8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March-April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  19. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake

    USGS Publications Warehouse

    Hayes, Gavin P.; Herman, Matthew W.; Barnhart, William D.; Furlong, Kevin P.; Riquelme, Sebástian; Benz, Harley M.; Bergman, Eric; Barrientos, Sergio; Earle, Paul S.; Samsonov, Sergey

    2014-01-01

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile which had not ruptured in a megathrust earthquake since a M ~8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March–April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  20. Extreme seismicity and disaster risks: Hazard versus vulnerability (Invited)

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2013-12-01

Although the extreme nature of earthquakes has been known for millennia, owing to the devastation many of them have caused, the vulnerability of our civilization to extreme seismic events is still growing. This is partly because of the increase in the number of high-risk objects and the clustering of populations and infrastructure in areas prone to seismic hazards. Today an earthquake may affect several hundred thousand lives and cause damage of up to a hundred billion dollars; it can trigger an ecological catastrophe if it occurs in the close vicinity of a nuclear power plant. Two types of extreme natural events can be distinguished: (i) large-magnitude, low-probability events, and (ii) events leading to disasters. Although the first type may affect earthquake-prone countries directly or indirectly (through tsunamis, landslides, etc.), the second type occurs mainly in economically less-developed countries, where vulnerability is high and resilience is low. Although earthquake hazards cannot be reduced, vulnerability to extreme events can be diminished by monitoring human systems and by relevant laws preventing an increase in vulnerability. Significant new knowledge should be gained on extreme seismicity through observations, monitoring, analysis, modeling, comprehensive hazard assessment, prediction, and interpretation to assist in disaster risk analysis. Advanced disaster-risk communication skills should be developed to link scientists, emergency management authorities, and the public. Natural, social, economic, and political factors leading to disasters due to earthquakes will be discussed.

  1. Hazard Evaluation in Valparaíso: the MAR VASTO Project

    NASA Astrophysics Data System (ADS)

    Indirli, Maurizio; Razafindrakoto, Hoby; Romanelli, Fabio; Puglisi, Claudio; Lanzoni, Luca; Milani, Enrico; Munari, Marco; Apablaza, Sotero

    2011-03-01

The Project "MAR VASTO" (Risk Management in Valparaíso/Manejo de Riesgos en Valparaíso), funded by BID/IADB (Banco InterAmericano de Desarrollo/InterAmerican Development Bank), has been managed by ENEA with a joint Italian/Chilean partnership and the support of local institutions. Valparaíso tells the never-ending story of a tight interaction between society and environment, and the city has been a UNESCO World Heritage Site since 2003. The main goals of the project have been to evaluate the impact of the main hazards (earthquake, tsunami, fire, and landslide) in the Valparaíso urban area, defining scenarios and maps on a geo-referenced GIS database. In particular, for earthquake hazard assessment the realistic modelling of ground motion is a very important base of knowledge for the preparation of groundshaking scenarios, which serve as a valid and economical tool for civil engineers, supplying a particularly powerful tool for the prevention aspects of Civil Defense. When numerical modelling is successfully compared with records (as in the case of the Valparaíso 1985 earthquake), the resulting synthetic seismograms permit the generation of groundshaking maps based upon a set of possible scenario earthquakes. Where no recordings are available for the scenario event, synthetic signals can be used to estimate ground motion without having to wait for a strong earthquake to occur (pre-disaster microzonation). For the tsunami hazard, the available reports [e.g., SHOA (1999) Carta de Inundacion por Tsunami para la bahia de Valparaíso, Chile, http://www.shoa.cl/servicios/citsu/citsu.php] have been used as the reference documents for the hazard assessment at the Valparaíso site. 
The deep and detailed studies already carried out by SHOA have been complemented with (a) sets of parametric studies of the tsunamigenic potential of the 1985 and 1906 scenario earthquakes; and (b) analytical modelling of tsunami waveforms for different scenarios, in order to provide a complementary dataset to be used for the tsunami hazard assessment at Valparaíso. In addition, other targeted activities have been carried out, such as architectonic/urban planning studies/vulnerability evaluation for a pilot building stock in a historic area and a vulnerability analysis for three monumental churches. In this paper, a general description of the work is given, taking into account the in situ work that drove the suggestion of guidelines for mitigation actions.

  2. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario

    NASA Astrophysics Data System (ADS)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.

    2018-01-01

Liquefaction-induced ground failure is among the leading causes of infrastructure damage due to the impact of large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Brahmaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has moderately shallow groundwater conditions, especially in the post-monsoon seasons. In view of its burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, induced Modified Mercalli Intensities of IV-V and VI-VII, respectively, in the city, reportedly triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. With the motivation of assessing the seismic hazard, vulnerability, and risk of the city of Kolkata through consorted federal funding stipulated for all the metros and upstart urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading, employing modern multivariate techniques, and also to predict a deterministic liquefaction scenario of the city in the event of a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. 
The stochastically synthesized bedrock ground motions for both the 1897 and 1934 earthquakes, propagated through non-linear analysis of local site conditions using the DEEPSOIL geotechnical analysis package, yield surface-level peak ground accelerations of the order of 0.05-0.14 g for the 1934 Bihar-Nepal earthquake, while for the 1897 Shillong earthquake they are found to be in the range of 0.03-0.11 g. The factor of safety (FOS) against liquefaction, the probability of liquefaction (PL), the liquefaction potential index (LPI), and the liquefaction risk index are estimated under the influence of these two earthquakes, wherein the city is classified into severe (LPI > 15), high (5 < LPI ≤ 15), moderate (0 < LPI ≤ 5), and non-liquefiable (LPI = 0) susceptibility zones. While the 1934 Bihar-Nepal earthquake induced moderate to severe liquefaction hazard in the city, mostly in the deltaic plain and interdistributary marsh geomorphologic units, with 13.5% of sites exhibiting moderate hazard with a median LPI of 1.8, 8.5% of sites depicting high hazard with a median LPI of 9.1, and 4% of sites with a median LPI of 18.9 exhibiting severe hazard, the 1897 Shillong earthquake induced mostly non-liquefaction conditions, with very few sites depicting moderate and high liquefaction hazard. A conservative liquefaction hazard scenario of the city, on the other hand, estimated through a deterministic approach for 10% probability of exceedance in 50 years, predicts a high hazard zone in the 3.5-19 m depth region with FOS < 1 and PL > 65%, comprising coarse-grained sediments of sand, silty sand, and clayey silty sand, mostly in the deltaic plain geomorphologic unit, with 39.1% of sites depicting severe liquefaction hazard with a median LPI of 28.3. A non-linear regression analysis on both the historical and deterministic liquefaction scenarios in the PL versus LPI domain with ± 1 standard deviation confidence bound generated a cubic polynomial relationship between the two liquefaction hazard proxies. 
This study, considered a benchmark for other cities in the country and elsewhere, forms an integral part of the mega-seismic microzonation endeavors undertaken in earthquake-prone countries around the world.
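The LPI severity bands quoted above map directly to a small classification routine; the following sketch (illustrative only, not part of the original study) encodes the thresholds used in the abstract:

```python
def lpi_zone(lpi: float) -> str:
    """Classify a liquefaction potential index (LPI) value into the
    susceptibility zones used in the abstract: non-liquefiable (LPI = 0),
    moderate (0 < LPI <= 5), high (5 < LPI <= 15), severe (LPI > 15)."""
    if lpi < 0:
        raise ValueError("LPI cannot be negative")
    if lpi == 0:
        return "non-liquefiable"
    if lpi <= 5:
        return "moderate"
    if lpi <= 15:
        return "high"
    return "severe"

# The median LPI values reported for the 1934 scenario fall into
# the moderate (1.8), high (9.1) and severe (18.9) bands respectively.
```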

  3. Earthquake scenarios based on lessons from the past

    NASA Astrophysics Data System (ADS)

    Solakov, Dimcho; Simeonova, Stella; Aleksandrova, Irena; Popova, Iliana

    2010-05-01

Earthquakes are the most deadly of the natural disasters affecting the human environment; indeed, catastrophic earthquakes have marked the whole of human history. Global seismic hazard and vulnerability to earthquakes are increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. The implementation of earthquake scenarios into policies for seismic risk reduction will allow focusing on the prevention of earthquake effects rather than on intervention following the disasters. The territory of Bulgaria (situated in the eastern part of the Balkan Peninsula) represents a typical example of a high seismic risk area. Over the centuries, Bulgaria has experienced strong earthquakes. At the beginning of the 20th century (from 1901 to 1928), five earthquakes with magnitude larger than or equal to MS=7.0 occurred in Bulgaria. However, no such large earthquakes have occurred in Bulgaria since 1928, which may lead non-professionals to underestimate the earthquake risk. The 1986 earthquake of magnitude MS=5.7 that occurred in central northern Bulgaria (near the town of Strazhitsa) is the strongest quake since 1928. Moreover, the seismicity of the neighboring countries, like Greece, Turkey, former Yugoslavia and Romania (especially Vrancea-Romania intermediate earthquakes), influences the seismic hazard in Bulgaria. In the present study, deterministic scenarios (expressed in seismic intensity) for two Bulgarian cities (Rouse and Plovdiv) are presented. 
The work on scenarios was guided by the perception that usable and realistic (also in the sense of being compatible with seismic histories of cities that are several centuries long) ground motion maps had to be produced for urban areas. By deterministic scenario is meant a representation of the severity of ground shaking over an urban area, using one or more hazard descriptors. Such a representation can be obtained either from the assumption of a "reference earthquake" specified by a magnitude or an epicentral intensity, associated with a particular earthquake source, or directly, by showing values of local macroseismic intensity generated by damaging real earthquakes of the past. In the study we chose the second method, using the values of macroseismic intensity caused by damaging historical earthquakes (the 1928 quakes in southern Bulgaria; the 1940 and 1977 Vrancea intermediate earthquakes) - lessons from the past. Such scenarios are intended as a basic input for developing detailed earthquake damage scenarios for the cities and can be used in earthquake-safe town and infrastructure planning.

  4. Neo-Deterministic Seismic Hazard Assessment at Watts Bar Nuclear Power Plant Site, Tennessee, USA

    NASA Astrophysics Data System (ADS)

    Brandmayr, E.; Cameron, C.; Vaccari, F.; Fasan, M.; Romanelli, F.; Magrin, A.; Vlahovic, G.

    2017-12-01

Watts Bar Nuclear Power Plant (WBNPP) is located within the Eastern Tennessee Seismic Zone (ETSZ), the second most naturally active seismic zone in the US east of the Rocky Mountains. The largest instrumental earthquakes in the ETSZ are M 4.6, although paleoseismic evidence supports events of M≥6.5. Events are mainly strike-slip and occur on steeply dipping planes at an average depth of 13 km. In this work, we apply neo-deterministic seismic hazard assessment to estimate the potential seismic input at the plant site, which has recently been targeted by the Nuclear Regulatory Commission for a seismic hazard reevaluation. First, we perform a parametric test on some seismic source characteristics (i.e. distance, depth, strike, dip and rake) using a one-dimensional regional bedrock model to define the most conservative scenario earthquakes. Then, for the selected scenario earthquakes, the estimate of the ground motion input at WBNPP is refined using a two-dimensional local structural model (based on the plant operator's documentation) with topography, thus looking for site amplification and different possible rupture processes at the source. WBNPP features a safe shutdown earthquake (SSE) design with a PGA of 0.18 g and a maximum spectral acceleration (SA, 5% damped) of 0.46 g (at periods between 0.15 and 0.5 s). Our results suggest that, although for most of the considered scenarios the PGA is relatively low, SSE values can be reached and exceeded in the case of the most conservative scenario earthquakes.

  5. Reassessing the New Madrid Seismic Zone

    NASA Astrophysics Data System (ADS)

Atkinson, Gail; Bakun, Bill; Bodin, Paul; Boore, David; Cramer, Chris; Frankel, Art; Gasperini, Paolo; Gomberg, Joan; Hanks, Tom; Herrmann, Bob; Hough, Susan; Johnston, Arch; Kenner, Shelley; Langston, Chuck; Linker, Mark; Mayne, Paul; Petersen, Mark; Powell, Christine; Prescott, Will; Schweig, Eugene; Segall, Paul; Stein, Seth; Stuart, Bill; Tuttle, Martitia; VanArsdale, Roy

The central enigma of the mid-continent region in the United States known as the New Madrid seismic zone (NMSZ; Figure 1) involves the mechanisms that give rise to recurrent great earthquakes far from plate boundaries. Given the lack of significant topographic relief that is the hallmark of tectonic activity in most actively deforming regions, most of us feel a need to "pinch ourselves to see if we're dreaming" when confronted with evidence that, at some probability levels, the earthquake hazard throughout the NMSZ is comparable to that estimated for the San Francisco Bay region. Although assessing the hazard in the NMSZ is in many ways more challenging than in the western United States, and the uncertainties are much greater, careful scientific study has led to a consensus on the issues most critical to seismic hazard assessment.

  6. On the adaptive daily forecasting of seismic aftershock hazard

    NASA Astrophysics Data System (ADS)

    Ebrahimian, Hossein; Jalayer, Fatemeh; Asprone, Domenico; Lombardi, Anna Maria; Marzocchi, Warner; Prota, Andrea; Manfredi, Gaetano

    2013-04-01

Post-earthquake ground motion hazard assessment is a fundamental initial step towards time-dependent seismic risk assessment for buildings in a post-mainshock environment. Therefore, operative forecasting of seismic aftershock hazard forms a viable support basis for decision-making regarding search and rescue, inspection, repair, and re-occupation in a post-mainshock environment. Arguably, an adaptive procedure for integrating the aftershock occurrence rate together with suitable ground motion prediction relations is key to Probabilistic Seismic Aftershock Hazard Assessment (PSAHA). In the short term, the seismic hazard may vary significantly (Jordan et al., 2011), particularly after the occurrence of a high-magnitude earthquake. Hence, PSAHA requires a reliable model that is able to track the time evolution of the earthquake occurrence rates together with suitable ground motion prediction relations. This work focuses on providing adaptive daily forecasts of the mean daily rate of exceeding various spectral acceleration values (the aftershock hazard). Two well-established earthquake occurrence models suitable for daily seismicity forecasts associated with the evolution of an aftershock sequence, namely the modified Omori aftershock model and the Epidemic Type Aftershock Sequence (ETAS), are adopted. The parameters of the modified Omori model are updated on a daily basis through Bayesian updating, based on the data provided by the ongoing aftershock sequence, following the methodology originally proposed by Jalayer et al. (2011). Bayesian updating is also used to provide sequence-based parameter estimates for a given ground motion prediction model; i.e., the aftershock events in an ongoing sequence are exploited to update, in an adaptive manner, the parameters of an existing ground motion prediction model. 
As a numerical example, the mean daily rates of exceeding specific spectral acceleration values are estimated adaptively for the L'Aquila 2009 aftershock catalog. The parameters of the modified Omori model are estimated in an adaptive manner using Bayesian updating, based on the aftershock events that had occurred up to each elapsed day and using the Italian generic sequence (Lolli and Gasperini 2003) as prior information. For the ETAS model, the real-time daily forecast of the spatio-temporal evolution of the L'Aquila sequence provided to the Italian Civil Protection for managing the emergency (Marzocchi and Lombardi, 2009) is utilized. Moreover, the parameters of the ground motion prediction relation proposed by Sabetta and Pugliese (1996) are updated adaptively and on a daily basis using Bayesian updating based on the ongoing aftershock sequence. Finally, the forecasted daily rates of exceeding (first-mode) spectral acceleration values are compared with observed rates of exceedance calculated from the waveforms actually recorded. References Jalayer, F., Asprone, D., Prota, A., Manfredi, G. (2011). A decision support system for post-earthquake reliability assessment of structures subjected to after-shocks: an application to L'Aquila earthquake, 2009. Bull. Earthquake Eng. 9(4) 997-1014. Jordan, T.H., Chen Y-T., Gasparini P., Madariaga R., Main I., Marzocchi W., Papadopoulos G., Sobolev G., Yamaoka K., and J. Zschau (2011). Operational earthquake forecasting: State of knowledge and guidelines for implementation, Ann. Geophys. 54(4) 315-391, doi 10.4401/ag-5350. Lolli, B., and P. Gasperini (2003). Aftershocks hazard in Italy part I: estimation of time-magnitude distribution model parameters and computation of probabilities of occurrence. Journal of Seismology 7(2) 235-257. Marzocchi, W., and A.M. Lombardi (2009). Real-time forecasting following a damaging earthquake, Geophys. Res. Lett. 36, L21302, doi: 10.1029/2009GL040233. 
Sabetta F., A. Pugliese (1996) Estimation of response spectra and simulation of nonstationary earthquake ground motions. Bull Seismol Soc Am 86(2) 337-352.
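A minimal sketch of the core quantity being forecast here: the mean daily rate of exceeding a spectral-acceleration threshold, obtained by combining a daily aftershock count (e.g., from a modified Omori forecast) with a lognormal ground motion prediction model. The parameter values below are placeholders, not those estimated in the paper:

```python
import math

def daily_exceedance_rate(n_events_per_day, ln_sa_median, sigma_ln, sa_threshold):
    """Mean daily rate of exceeding sa_threshold: the forecast number of
    aftershocks per day times the probability, under a lognormal ground
    motion prediction model, that a single event exceeds the threshold."""
    z = (math.log(sa_threshold) - ln_sa_median) / sigma_ln
    p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))  # P[ln Sa > ln(threshold)]
    return n_events_per_day * p_exceed

# Placeholder values: 15 aftershocks/day, median Sa = 0.1 g, sigma_lnSa = 0.6
rate = daily_exceedance_rate(15.0, math.log(0.1), 0.6, 0.2)
```

In the adaptive scheme described above, both the daily event count and the ground-motion parameters (median and sigma) would be re-estimated each day via Bayesian updating.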

  7. Deterministic Tectonic Origin Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.

    2014-12-01

Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes a priori accurate assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a "maximum earthquake" predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° bin for 0-40 km depth (total of 310 bins) and for 40-100 km depth (total of 92 bins) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with coarse (2 arc-min) and medium (1 arc-min) grid resolutions have been run at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source defined, using the shallow water finite-difference SWAN code (Mader, 2004), for the magnitude range from 6.5 to the Mwmax defined for that bin, with an Mw increment of 0.1. 
Results show that not only earthquakes resembling the well-known historical earthquakes, such as the AD 365 or AD 1303 events in the Hellenic Arc, but also earthquakes with lower magnitudes contribute to the tsunami hazard in the study area.
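For illustration, the magnitude discretization described above (from 6.5 up to each bin's assigned Mwmax, in 0.1 increments) can be enumerated as follows; the function name and the example Mwmax value are ours, not from the study:

```python
def scenario_magnitudes(mw_max, mw_min=6.5, step=0.1):
    """List the scenario magnitudes simulated for one source bin,
    from mw_min up to the bin's assigned Mwmax in increments of `step`."""
    n_steps = int(round((mw_max - mw_min) / step))
    return [round(mw_min + i * step, 1) for i in range(n_steps + 1)]

# A hypothetical bin with Mwmax = 7.2 yields eight scenario magnitudes
mags = scenario_magnitudes(7.2)
```

Each magnitude in the list would correspond to one simulated tsunami scenario for that bin's characteristic source geometry.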

  8. 44 CFR 361.7 - General eligible expenditures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.7 General eligible expenditures... specifically for carrying out earthquake hazards reduction activities are eligible when engaged in the...

  9. 44 CFR 361.7 - General eligible expenditures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.7 General eligible expenditures... specifically for carrying out earthquake hazards reduction activities are eligible when engaged in the...

  10. 44 CFR 361.7 - General eligible expenditures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.7 General eligible expenditures... specifically for carrying out earthquake hazards reduction activities are eligible when engaged in the...

  11. 44 CFR 361.7 - General eligible expenditures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.7 General eligible expenditures... specifically for carrying out earthquake hazards reduction activities are eligible when engaged in the...

  12. 44 CFR 361.7 - General eligible expenditures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.7 General eligible expenditures... specifically for carrying out earthquake hazards reduction activities are eligible when engaged in the...

  13. Contribution to the assessment of the imminent seismic hazard: Geophysical, statistical (and more) challenges in the territory of Greece

    NASA Astrophysics Data System (ADS)

    Adamaki, Angeliki K.; Papadimitriou, Eleftheria E.; Karakostas, Vassilis G.; Tsaklidis, George M.

    2013-04-01

    The need for imminent seismic hazard assessment stems from a strong social component: people seek to understand nature more fully, whether out of curiosity or out of an instinct for self-preservation against physical phenomena that humankind cannot control. Following this path, many seismologists have focused on forecasting the temporal and spatial distribution of earthquakes on short time scales. The possibility of knowing, with some degree of certainty, how an earthquake sequence evolves proves to be an important object of research. More specifically, the present work summarizes applications of seismicity and statistical models to seismic catalogues of areas specified by their tectonic structures and past seismicity, providing information on the temporal and spatial evolution of local seismic activity, which can reveal seismicity rate "irregularities" or changes as precursors of strong events, whether a main shock or a strong aftershock. To study these rate changes both preceding and following a strong earthquake, seismicity models are applied to estimate the Coulomb stress changes resulting from the occurrence of a strong earthquake, and their results are combined with the application of a Restricted Epidemic Type Aftershock Sequence model. There are many active tectonic structures in the territory of Greece related to the occurrence of strong earthquakes, especially near populated areas, and the aim of this work is to contribute to the assessment of the imminent seismic hazard by applying the aforementioned models and techniques and by studying the temporal evolution of several seismic sequences that occurred in the Aegean area in the recent past.
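As a rough illustration of the epidemic-type aftershock modeling mentioned above, a minimal ETAS-style conditional intensity can be written as below; the parameter values and event list are purely illustrative, and this simple temporal form is not the Restricted ETAS variant used in the study:

```python
import math

# Minimal sketch of an ETAS-type conditional intensity: a background rate mu
# plus Omori-law contributions from each past event, scaled by its magnitude.
# All parameter values here are illustrative, not calibrated.
def etas_rate(t, events, mu=0.02, K=0.05, alpha=1.5, c=0.01, p=1.1, m_c=3.0):
    """Expected event rate at time t (days) given past (t_i, M_i) events."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_c)) / (t - t_i + c) ** p
    return rate

past = [(0.0, 6.4), (1.2, 5.0)]  # hypothetical mainshock and aftershock
r = etas_rate(2.0, past)         # total rate two days into the sequence
```

Rate "irregularities" would show up as a mismatch between observed seismicity and the rate predicted by such a model.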

  14. Vulnerability assessment of a port and harbor community to earthquake and tsunami hazards: Integrating technical expert and stakeholder input

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

    Research suggests that the Pacific Northwest could experience catastrophic earthquakes and tsunamis in the near future, posing a significant threat to the numerous ports and harbors along the coast. A collaborative, multiagency initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to these hazards, involving Oregon Sea Grant, Washington Sea Grant, the National Oceanic and Atmospheric Administration Coastal Services Center, and the U.S. Geological Survey Center for Science Policy. One element of this research, planning, and outreach initiative is a natural hazard mitigation and emergency preparedness planning process that combines technical expertise with local stakeholder values and perceptions. This paper summarizes and examines one component of the process, the vulnerability assessment methodology, used in the pilot port and harbor community of Yaquina River, Oregon, as a case study of assessing vulnerability at the local level. In this community, stakeholders were most concerned with potential life loss and other nonstructural vulnerability issues, such as inadequate hazard awareness, communication, and response logistics, rather than structural issues, such as damage to specific buildings or infrastructure.

  15. What caused a large number of fatalities in the Tohoku earthquake?

    NASA Astrophysics Data System (ADS)

    Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.

    2012-04-01

    The Mw 9.0 earthquake caused 20,000 deaths and missing persons in northeastern Japan. In the 115 years prior to this event, three historical tsunamis struck the region, one of which, a "tsunami earthquake", resulted in a death toll of 22,000. Since then, numerous breakwaters were constructed along the entire northeastern coast, tsunami evacuation drills were carried out, and hazard maps were distributed to local residents in numerous communities. Despite these constructions and preparedness efforts, however, the March 11 Tohoku earthquake caused numerous fatalities. The strong shaking lasted three minutes or longer, so all residents recognized it as the strongest and longest earthquake they had ever experienced. The tsunami inundated an enormous area of about 560 km2 across 35 cities along the coast of northeast Japan. To find out the reasons behind the high number of fatalities due to the March 11 tsunami, we interviewed 150 tsunami survivors at public evacuation shelters in 7 cities, mainly in Iwate Prefecture, in mid-April and early June 2011. Interviews lasted about 30 minutes or longer and focused on the survivors' evacuation behavior and on the behavior they observed in others. On the basis of the interviews, we found that residents' decisions not to evacuate immediately were partly influenced by earthquake science results. Some of the factors that affected residents' decisions were the following. 1. Earthquake hazard assessments turned out to be incorrect: the earthquake magnitudes and resultant hazards for northeastern Japan assessed and publicized by the government were significantly smaller than the actual Tohoku earthquake. 2. Many residents did not receive accurate tsunami warnings: the first tsunami warnings were far smaller than the actual tsunami heights. 3. Previous frequent warnings with overestimated tsunami heights influenced the behavior of the residents. 4. 
Many local residents over 55 years old had experienced the 1960 Chile tsunami, which was significantly smaller than the 11 March tsunami; this sense of "knowing" put their lives at high risk. 5. Some local residents believed that, with the presence of a breakwater, only slight flooding would occur. 6. Many people did not understand how tsunamis are generated under the sea, so the link between earthquakes and tsunamis was unclear to them. These interviews made it clear that many deaths resulted because current technology and earthquake science underestimated tsunami heights, warning systems failed, and breakwaters were not strong or high enough. However, even if these problems recur in future earthquakes, better knowledge of earthquakes and tsunami hazards could save more lives. It is therefore important for children to learn the basic mechanism of tsunami generation in elementary school, while they are most receptive.

  16. Detecting Faults in Southern California using Computer-Vision Techniques and Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) Interferometry

    NASA Astrophysics Data System (ADS)

    Barba, M.; Rains, C.; von Dassow, W.; Parker, J. W.; Glasscoe, M. T.

    2013-12-01

    Knowing the location and behavior of active faults is essential for earthquake hazard assessment and disaster response. In Interferometric Synthetic Aperture Radar (InSAR) images, faults are revealed as linear discontinuities. Currently, interferograms are manually inspected to locate faults. During the summer of 2013, the NASA-JPL DEVELOP California Disasters team contributed to the development of a method to expedite fault detection in California using remote-sensing technology. The team utilized InSAR images created from polarimetric L-band data from NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) project. A computer-vision technique known as 'edge-detection' was used to automate the fault-identification process. We tested and refined an edge-detection algorithm under development through NASA's Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) project. To optimize the algorithm we used both UAVSAR interferograms and synthetic interferograms generated through Disloc, a web-based modeling program available through NASA's QuakeSim project. The edge-detection algorithm detected seismic, aseismic, and co-seismic slip along faults that were identified and compared with databases of known fault systems. Our optimization process was the first step toward integration of the edge-detection code into E-DECIDER to provide decision support for earthquake preparation and disaster management. E-DECIDER partners that will use the edge-detection code include the California Earthquake Clearinghouse and the US Department of Homeland Security through delivery of products using the Unified Incident Command and Decision Support (UICDS) service. Through these partnerships, researchers, earthquake disaster response teams, and policy-makers will be able to use this new methodology to examine the details of ground and fault motions for moderate to large earthquakes. 
Following an earthquake, the newly discovered faults can be paired with infrastructure overlays, allowing emergency response teams to identify sites that may have been exposed to damage. The faults will also be incorporated into a database for future integration into fault models and earthquake simulations, improving future earthquake hazard assessment. As new faults are mapped, they will further understanding of the complex fault systems and earthquake hazards within the seismically dynamic state of California.
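A toy version of the edge-detection idea, locating a linear discontinuity in a synthetic interferogram-like array with plain Sobel gradients, might look like this (a generic sketch, not the actual E-DECIDER algorithm):

```python
import numpy as np

# Toy illustration of edge detection for locating a linear discontinuity,
# as in the InSAR fault-detection workflow described above. Pure NumPy,
# no external image libraries.
def sobel_edges(img):
    """Return the gradient magnitude of a 2-D array via Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")  # replicate borders
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy)

# Synthetic "interferogram": a step in line-of-sight displacement across a fault
img = np.zeros((20, 20))
img[:, 10:] = 1.0
mag = sobel_edges(img)
# The strongest gradients line up along the synthetic fault trace at the step.
```

Thresholding `mag` and tracing the high-gradient pixels would yield candidate fault lineaments for comparison against known fault databases.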

  17. Assessment of seismic risk in Tashkent, Uzbekistan and Bishkek, Kyrgyz Republic

    USGS Publications Warehouse

    Erdik, M.; Rashidov, T.; Safak, E.; Turdukulov, A.

    2005-01-01

    The impact of earthquakes on urban centers prone to disastrous events necessitates analysis of the associated risk for the rational formulation of contingency plans and mitigation strategies. In urban centers, seismic risk is best quantified and portrayed through the preparation of earthquake damage and loss scenarios. The components of such scenarios are assessments of the hazard, the inventories, and the vulnerabilities of the elements at risk. For the development of earthquake risk scenarios in Tashkent, Uzbekistan, and Bishkek, Kyrgyzstan, an approach based on spectral displacements is utilized. This paper presents the important features of a comprehensive study, highlights the methodology, discusses the results, and provides insights into future developments. © 2005 Elsevier Ltd. All rights reserved.

  18. New Intensity Attenuation in Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.; Tibaldi, A.; Bonali, F.; Gogoladze, Z.; Kvavadze, N.; Kvedelidze, I.

    2016-12-01

    In seismically prone zones, increasing urbanization and infrastructure in turn increase seismic risk, which is mainly related to the level of seismic hazard itself, the seismic resistance of dwellings, and many other factors. The relevant objective of the present work is to improve the regional seismic hazard maps of Georgia by implementing state-of-the-art probabilistic seismic hazard assessment techniques and outputs from recent national and international collaborations. Seismic zoning is the identification of zones of similar levels of earthquake hazard. With reference to seismic zoning by ground-motion assessment, the shaking intensity essentially depends on (i) regional seismicity, (ii) attenuation of ground motion with distance, and (iii) local site effects on ground motion. In the last decade, seismic hazard assessment has been presented in terms of Peak Ground Acceleration (PGA), Peak Ground Velocity (PGV), or other recorded parameters, but strong-motion datasets for Georgia are very limited. Furthermore, the vulnerability of buildings is still estimated in terms of intensity, and there is no information about the correlation between the distribution of recorded ground-motion parameters and damage. Macroseismic intensity therefore remains a very important parameter for strong ground motion evaluation. In the present work, we calibrated intensity prediction equations (IPEs) for the Georgian dataset based on about 78 reviewed earthquakes. Metadata for intensity (MSK-64 scale) were constrained, and prediction equations for various types of distance (epicentral and hypocentral distance, Joyner-Boore distance, closest distance to the fault rupture plane) were calibrated. Relations between intensity and PGA values were derived. For this we used a hybrid-empirical ground-motion equation derived for Georgia and ran scenario earthquakes for events with macroseismic data.
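One common IPE functional form, I = a + b*Mw + c*log10(R), can be calibrated by least squares as sketched below; the form, coefficients, and data are illustrative and are not the calibrated Georgian model:

```python
import numpy as np

# Sketch of calibrating an intensity prediction equation of a commonly used
# form, I = a + b*Mw + c*log10(R), by ordinary least squares. The data below
# are synthetic, generated from known coefficients, so the fit recovers them.
def fit_ipe(mw, r_km, intensity):
    """Least-squares fit of I = a + b*Mw + c*log10(R); returns (a, b, c)."""
    A = np.column_stack([np.ones_like(mw), mw, np.log10(r_km)])
    coeffs, *_ = np.linalg.lstsq(A, intensity, rcond=None)
    return coeffs

# Synthetic "observations" (illustrative magnitudes and distances)
mw = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 6.2, 5.8])
r = np.array([10.0, 20.0, 15.0, 30.0, 25.0, 12.0, 40.0])
obs = 1.0 + 1.5 * mw - 3.0 * np.log10(r)  # true a=1.0, b=1.5, c=-3.0

a, b, c = fit_ipe(mw, r, obs)
```

With real macroseismic data the residuals would also be inspected per distance metric (epicentral, hypocentral, Joyner-Boore) to choose the best-performing form.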

  19. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    NASA Astrophysics Data System (ADS)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The almost-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project "Network of Research Infrastructures for European Seismology (NERIES)". The approach consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters through rapid inversion of data from on-line regional broadband stations. It also includes estimation of the spatial distribution of selected site-specific ground-motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. Using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time loss estimation is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements exposed to earthquake hazard, and the associated vulnerability relationships.

  20. 44 CFR 361.6 - Documentation of matching contributions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.6 Documentation of matching... comprising its earthquake hazards reduction project, including the project budget, shall reflect a level of...

  1. 44 CFR 361.6 - Documentation of matching contributions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.6 Documentation of matching... comprising its earthquake hazards reduction project, including the project budget, shall reflect a level of...

  2. 44 CFR 361.6 - Documentation of matching contributions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.6 Documentation of matching... comprising its earthquake hazards reduction project, including the project budget, shall reflect a level of...

  3. 78 FR 12780 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-25

    ... INFORMATION CONTACT: To request additional information about this ICR, Elizabeth Lemersal, Earthquake Hazards... . SUPPLEMENTARY INFORMATION: Title: Earthquake Hazards Program Research and Monitoring. OMB Control Number: 1028... findings are essential to fulfilling USGS's responsibility under the Earthquake Hazards Reduction Act to...

  4. 44 CFR 361.6 - Documentation of matching contributions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.6 Documentation of matching... comprising its earthquake hazards reduction project, including the project budget, shall reflect a level of...

  5. 44 CFR 361.6 - Documentation of matching contributions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.6 Documentation of matching... comprising its earthquake hazards reduction project, including the project budget, shall reflect a level of...

  6. Late Holocene megathrust earthquakes in south central Chile

    NASA Astrophysics Data System (ADS)

    Garrett, Ed; Shennan, Ian; Gulliver, Pauline; Woodroffe, Sarah

    2013-04-01

    A lack of comprehensive understanding of the seismic hazards associated with a subduction zone can lead to inadequate anticipation of earthquake and tsunami magnitudes. Four hundred and fifty years of Chilean historical documents record the effects of numerous great earthquakes; however, with recurrence intervals between the largest megathrust earthquakes approaching 300 years, seismic hazard assessment requires longer chronologies. This research seeks to verify and extend the historical records in south central Chile using a relative sea-level approach to palaeoseismology. Our quantitative, diatom-based approaches to relative sea-level reconstruction are successful in reconstructing the magnitude of coseismic deformation during recent, well-documented Chilean earthquakes. The few disparities between our estimates and independent data highlight the possibility of shaking-induced sediment consolidation in tidal marshes. Following this encouraging confirmation of the approach, we quantify land-level changes in longer sedimentary records from the centre of the rupture zone of the 1960 Valdivia earthquake. Here, laterally extensive marsh soils abruptly overlain by low intertidal sediments attest to the occurrence of four megathrust earthquakes. The sites preserve evidence of the 1960 and 1575 earthquakes, and we constrain the timing of two predecessors to 1270-1410 and 1050-1200. The sediments and biostratigraphy lack evidence for the historically documented 1737 and 1837 earthquakes.

  7. Monitoring the Dead Sea Region by Multi-Parameter Stations

    NASA Astrophysics Data System (ADS)

    Mohsen, A.; Weber, M. H.; Kottmeier, C.; Asch, G.

    2015-12-01

    The Dead Sea region is an exceptional ecosystem whose seismic activity has influenced all facets of its development, from groundwater availability to human evolution. Israelis, Palestinians and Jordanians living in the Dead Sea region are exposed to severe earthquake hazard. Repeated large earthquakes (e.g. in 1927, magnitude 6.0; Ambraseys, 2009) have shaken the whole Dead Sea region, proving that earthquake hazard knows no borders and that damaging seismic events can strike at any time. Combined with the high vulnerability of cities in the region and the enormous concentration of historical assets, this natural hazard results in an extreme earthquake risk. Thus, an integration of earthquake parameters at all scales (size and time) and their combination with infrastructure data are needed, with the specific aim of providing a state-of-the-art seismic hazard assessment for the Dead Sea region as well as a first quantitative estimate of vulnerability and risk. A strong motivation for our research is the lack of reliable multi-parameter ground-based geophysical information on earthquakes in the Dead Sea region. The proposed set-up of a number of observatories with on-line data access will enable derivation of the present-day seismicity and deformation pattern in the Dead Sea region. The first multi-parameter stations were installed in Jordan, Israel and Palestine for long-term monitoring, and all partners will jointly use these locations. All stations will have an open data policy, with the Deutsches GeoForschungsZentrum (GFZ, Potsdam, Germany) providing the hardware and software for real-time data transmission via satellite to Germany, where all partners can access the data via standard data protocols.

  8. Advanced National Seismic System—Current status, development opportunities, and priorities for 2017–2027

    USGS Publications Warehouse

    ,

    2017-05-25

    Summary: Earthquakes pose a threat to the safety of over 143 million people living in the United States. Earthquake impacts can be significantly reduced if communities understand their risk and take proactive steps to mitigate it. The Advanced National Seismic System (ANSS) is a cooperative effort to collect and analyze seismic and geodetic data on earthquakes, issue timely and reliable notifications of their occurrence and impacts, and provide data for earthquake research and for the hazard and risk assessments that are the foundation for creating an earthquake-resilient nation.

  9. Geotechnical effects of the 2015 magnitude 7.8 Gorkha, Nepal, earthquake and aftershocks

    USGS Publications Warehouse

    Moss, Robb E. S.; Thompson, Eric M.; Kieffer, D Scott; Tiwari, Binod; Hashash, Youssef M A; Acharya, Indra; Adhikari, Basanta; Asimaki, Domniki; Clahan, Kevin B.; Collins, Brian D.; Dahal, Sachindra; Jibson, Randall W.; Khadka, Diwakar; Macdonald, Amy; Madugo, Chris L M; Mason, H Benjamin; Pehlivan, Menzer; Rayamajhi, Deepak; Uprety, Sital

    2015-01-01

    This article summarizes the geotechnical effects of the 25 April 2015 M 7.8 Gorkha, Nepal, earthquake and aftershocks, as documented by a reconnaissance team that undertook a broad engineering and scientific assessment of the damage and collected perishable data for future analysis. Brief descriptions are provided of ground shaking, surface fault rupture, landsliding, soil failure, and infrastructure performance. The goal of this reconnaissance effort, led by Geotechnical Extreme Events Reconnaissance, is to learn from earthquakes and mitigate hazards in future earthquakes.

  10. Report on the Aseismic Slip, Tremor, and Earthquakes Workshop

    USGS Publications Warehouse

    Gomberg, Joan; Roeloffs, Evelyn; Trehu, Anne; Dragert, Herb; Meertens, Charles

    2008-01-01

    This report summarizes the discussions and information presented during the workshop on Aseismic Slip, Tremor, and Earthquakes. Workshop goals included improving coordination among those involved in conducting research related to these phenomena, assessing the implications for earthquake hazard assessment, and identifying ways to capitalize on the education and outreach opportunities presented by these phenomena. Research activities of focus included making, disseminating, and analyzing relevant measurements; the relationships among tremor, aseismic or 'slow-slip', and earthquakes; and discovering the underlying causative physical processes. More than 52 participants contributed to the workshop, held February 25-28, 2008 in Sidney, British Columbia. The workshop was sponsored by the U.S. Geological Survey, the National Science Foundation's Earthscope Program and UNAVCO Consortium, and the Geological Survey of Canada. This report has five parts. In the first part, we integrate the information exchanged at the workshop as it relates to advancing our understanding of earthquake generation and hazard. In the second part, we summarize the ideas and concerns discussed in workshop working groups on Opportunities for Education and Outreach, Data and Instrumentation, User and Public Needs, and Research Coordination. The third part presents summaries of the oral presentations. The oral presentations are grouped as they were at the workshop in the categories of phenomenology, underlying physical processes, and implications for earthquake hazards. The fourth part contains the meeting program and the fifth part lists the workshop participants. References noted in parentheses refer to the authors of presentations made at the workshop, and published references are noted in square brackets and listed in the Reference section. 
Appendix A contains abstracts of all participant presentations and posters, which also have been posted online, along with presentations and author contact information at http://www.earthscope.org/science/cascadia.

  11. The Saguenay Fjord, Quebec, Canada: Integrating marine geotechnical and geophysical data for spatial seismic slope stability and hazard assessment

    USGS Publications Warehouse

    Urgeles, R.; Locat, J.; Lee, H.J.; Martin, F.

    2002-01-01

    In 1996 a major flood occurred in the Saguenay region, Quebec, Canada, delivering several km3 of sediment to the Saguenay Fjord. These sediments covered large areas of the until-then largely contaminated fjord bottom, thus providing a natural capping layer. Recent swath bathymetry data have also shown that sediment landslides are widespread in the upper section of the Saguenay Fjord; therefore, should a new event occur, it would probably expose the old contaminated sediments. Landslides in the upper Saguenay Fjord are most probably due to earthquakes, given the fjord's proximity to the Charlevoix seismic region and to the source of the 1988 Saguenay earthquake. In consequence, this study characterizes the permanent ground deformations induced by different earthquake scenarios from which shallow sediment landslides could be triggered. The study follows a Newmark analysis in which, first, the seismic slope performance is assessed; second, the seismic hazard is analyzed; and finally, the seismic landslide hazard is evaluated. The study is based on slope gradients obtained from EM1000 multibeam bathymetry data as well as water content and undrained shear strength measurements from box and gravity cores. Ground motions integrating local site conditions were simulated using synthetic time histories. The study assumes the region of the 1988 Saguenay earthquake as the most likely source area for earthquakes capable of inducing large ground motions in the upper Saguenay region. Accordingly, we analyzed several shaking intensities and deduced that generalized sediment displacements begin to occur when moment magnitudes exceed 6. Major displacements, failure, and subsequent landslides could occur only for earthquake moment magnitudes exceeding 6.75. © 2002 Elsevier Science B.V. All rights reserved.
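The Newmark sliding-block analysis referred to above can be illustrated with a minimal sketch: ground acceleration in excess of a critical (yield) value is double-integrated to give permanent displacement. The input record and critical acceleration here are synthetic, not the Saguenay time histories:

```python
# Minimal Newmark sliding-block sketch: integrate the acceleration in excess
# of a critical (yield) acceleration to get block velocity, then integrate
# velocity to get permanent displacement. Input values are synthetic.
def newmark_displacement(acc, dt, a_crit):
    """acc in m/s^2 sampled every dt seconds; returns displacement in metres."""
    v = 0.0   # sliding-block velocity relative to the ground
    d = 0.0   # accumulated permanent displacement
    for a in acc:
        if a > a_crit or v > 0.0:
            v += (a - a_crit) * dt  # accelerates while a > a_crit, then decelerates
            v = max(v, 0.0)         # the block cannot slide backwards
        d += v * dt
    return d

# A single crude acceleration pulse exceeding a_crit = 1.0 m/s^2
pulse = [0.0] * 5 + [2.0] * 10 + [0.0] * 20
disp = newmark_displacement(pulse, dt=0.01, a_crit=1.0)  # small positive slip
```

In the actual study the accelerations come from simulated site-specific time histories and a_crit from the measured slope geometry and shear strength.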

  12. National Earthquake Hazards Reduction Program; time to expand

    USGS Publications Warehouse

    Steinbrugge, K.V.

    1990-01-01

    All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role? 

  13. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    NASA Astrophysics Data System (ADS)

    Takarada, S.

    2012-12-01

    The first workshop of the Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management project (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop, during which the G-EVER1 accord was approved. The accord consists of 10 recommendations, such as enhancing collaboration, sharing resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER working groups and task forces were proposed; one working group was tasked with developing the next-generation real-time volcano hazard assessment system. This system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages, and is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions, as is compiling volcanic eruption scenarios after a major eruption. A high-quality volcanic eruption database, containing compilations of eruption dates, volumes, and styles, is also important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which represent only a subset of possible future scenarios. 
Hence, deposit distributions differing from previous ones are commonly observed, owing to differences in vent position, volume, eruption rate, wind direction and topography. Numerical simulations with controlled parameters are therefore needed for more precise volcanic eruption predictions. The next-generation system should enable visualization of past volcanic eruption datasets, such as deposit distributions, eruption volumes and eruption rates, on maps and diagrams using timeline and GIS technology. Similar volcanic eruption scenarios should be easily searchable in the eruption database. Using the volcano hazard assessment system, it should be possible to predict, via numerical simulations, the time and area that would be affected by volcanic eruptions at any location near the volcano. The system should estimate volcanic hazard risks by overlaying the distributions of volcanic deposits on major roads, houses and evacuation areas using a GIS-enabled system. Probabilistic volcanic hazard maps of active volcano sites should be produced based on numerous numerical simulations. The next-generation real-time hazard assessment system will be implemented with a user-friendly interface, making the risk assessment system easily usable and accessible online.

  14. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  15. Seismic Hazard and Risk Assessments for Beijing-Tianjin-Tangshan, China, Area

    USGS Publications Warehouse

    Xie, F.; Wang, Z.; Liu, J.

    2011-01-01

    Seismic hazard and risk in the Beijing-Tianjin-Tangshan, China, area were estimated from 500-year intensity observations. First, we digitized the intensity observations (maps) using ArcGIS with a cell size of 0.1° × 0.1°. Second, we performed a statistical analysis on the digitized intensity data, determined an average b value (0.39), and derived the intensity-frequency relationship (hazard curve) for each cell. Finally, based on a Poisson model for earthquake occurrence, we calculated seismic risk in terms of a probability of I ≥ 7, 8, or 9 in 50 years. We also calculated the intensities with a corresponding 10 percent probability of exceedance in 50 years. The advantages of assessing seismic hazard and risk from intensity records are that (1) fewer assumptions (i.e., about earthquake sources and ground motion attenuation) are made, and (2) site effects are included. Our study shows that the area has high seismic hazard and risk. It also suggests that the current design peak ground acceleration or intensity for the area may not be adequate. © 2010 Birkhäuser / Springer Basel AG.
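The Poisson occurrence model used above maps a per-cell annual exceedance rate (read off the hazard curve) onto a probability over an exposure period, and can be inverted to find the rate corresponding to a target probability such as 10% in 50 years. A minimal sketch; the function names and the 475-year example rate are ours, not from the study:

```python
import math

def poisson_exceedance_prob(annual_rate: float, t_years: float) -> float:
    """P(at least one exceedance in t_years) under a Poisson occurrence model."""
    return 1.0 - math.exp(-annual_rate * t_years)

def rate_for_target_prob(p: float, t_years: float) -> float:
    """Annual rate whose Poisson exceedance probability over t_years equals p."""
    return -math.log(1.0 - p) / t_years

# Example: a cell whose hazard curve gives I >= 7 once every ~475 years on average
p50 = poisson_exceedance_prob(1.0 / 475.0, 50.0)   # close to 0.10
lam = rate_for_target_prob(0.10, 50.0)             # close to 1/475 per year
```

The familiar "10% in 50 years" design level and the ~475-year return period are two views of the same Poisson relationship.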

  16. Multidisciplinary Geo-scientific Hazard Analyses: Istanbul Microzonation Projects

    NASA Astrophysics Data System (ADS)

    Kara, Sema; Baş, Mahmut; Kılıç, Osman; Tarih, Ahmet; Yahya Menteşe, Emin; Duran, Kemal

    2017-04-01

    Istanbul (Turkey) is located on the western edge of the North Anatolian Fault and hence is an earthquake-prone city with a population exceeding 15 million. In addition, the city is still growing as a center of commerce, tourism and culture, which increases the exposure further. Although Istanbul grew faster during the last decade than ever before in its history, precautions against a possible earthquake have also increased steadily. The two big earthquakes (in Kocaeli and Duzce Provinces) that occurred in 1999 near Istanbul became the trigger events that accelerated disaster risk reduction activities in the city. Following a loss estimation study carried out by the Japanese International Cooperation Agency (JICA) in 2001 and the Istanbul Earthquake Master Plan prepared by researchers from four major universities in 2003, it was concluded that understanding and analyzing the geological structure of Istanbul was the main concern. Thereafter, the Istanbul Metropolitan Municipality's Directorate of Earthquake and Ground Research (DEGRE) carried out two major geo-scientific studies, called "microzonation studies", covering 650 km2 of Istanbul's urbanized areas between 2006 and 2009. The studies were called "microzonation" because the analysis resolution was as dense as 250 m grids and included various assessments of hazards such as ground shaking, liquefaction, karstification, landslides, flooding, and surface faulting. After the evaluation of geological, geotechnical and geophysical measurements, earthquake and tsunami hazard maps for all of Istanbul, and slope, engineering geology, ground water level, faulting, ground shaking, inundation, shear wave velocity and soil classification maps for the project areas, were obtained. In the end, "Land Suitability Maps" were derived from the combination of these inputs using a multi-hazard approach. As a result, microzonation is a tool for risk-oriented urban planning, consisting of interdisciplinary multi-hazard risk analyses. 
The outputs of microzonation are used in land development/use plans, in hazard identification for urban transformation, and in determining the routes and characteristics of various types of engineering structures such as highways, tunnels, bridges, railroads, viaducts and ports. Hence, through the use of detailed geo-scientific analyses, the foundations of earthquake-resilient urbanization are secured.
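The multi-hazard combination of per-cell layers into a land-suitability map can be illustrated with a minimal sketch. The class labels and the conservative worst-class rule below are our assumptions for illustration, not the actual DEGRE methodology:

```python
# Each 250 m grid cell carries a class (0 = low, 1 = medium, 2 = high) per hazard.
# A conservative multi-hazard combination takes the worst class across all layers.
CLASSES = ["suitable", "precautionary", "detailed-study-required"]

def land_suitability(cell_hazards: dict) -> str:
    """Map per-hazard classes for one grid cell to a land-suitability label.

    cell_hazards: e.g. {"shaking": 1, "liquefaction": 2, "landslide": 0}
    """
    worst = max(cell_hazards.values())
    return CLASSES[worst]

cell = {"shaking": 1, "liquefaction": 2, "landslide": 0, "flooding": 0}
label = land_suitability(cell)   # worst class is 2 -> "detailed-study-required"
```

Real microzonation combinations typically weight layers rather than taking a simple maximum, but the grid-cell structure is the same.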

  17. Deterministic seismic hazard macrozonation of India

    NASA Astrophysics Data System (ADS)

    Kolathayar, Sreevalsa; Sitharam, T. G.; Vipin, K. S.

    2012-10-01

    Earthquakes are known to have occurred in the Indian subcontinent from ancient times. This paper presents the results of a seismic hazard analysis of India (6°-38°N and 68°-98°E) based on the deterministic approach using the latest seismicity data (up to 2010). The hazard analysis was done using two different source models (linear sources and point sources) and 12 well-recognized attenuation relations covering the varied tectonic provinces in the region. The earthquake data obtained from different sources were homogenized and declustered, and a total of 27,146 earthquakes of moment magnitude 4 and above were listed in the study area. The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones that are associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided into small grids of size 0.1° × 0.1° (approximately 10 × 10 km), and the hazard parameters were calculated at the center of each grid cell by considering all the seismic sources within a radius of 300 to 400 km. Rock-level peak horizontal acceleration (PHA) and spectral accelerations for periods of 0.1 and 1 s were calculated for all the grid points with a deterministic approach using a code written in MATLAB. Epistemic uncertainty in the hazard definition was tackled within a logic-tree framework considering two types of sources and three attenuation models for each grid point. Hazard evaluation without the logic-tree approach was also carried out for comparison of the results. Contour maps showing the spatial variation of the hazard values are presented in the paper.
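The deterministic grid calculation described above, taking at each grid point the largest ground motion over all sources within the search radius, can be sketched as follows. The original code is in MATLAB; Python is used here for brevity, and the attenuation coefficients are invented for illustration, not one of the 12 relations used in the paper:

```python
import math

def pha_g(mag: float, dist_km: float) -> float:
    """Illustrative attenuation relation of the common functional form
    ln(PHA[g]) = c1 + c2*M - c3*ln(R + c4); coefficients are made up."""
    c1, c2, c3, c4 = -3.5, 0.9, 1.2, 10.0
    return math.exp(c1 + c2 * mag - c3 * math.log(dist_km + c4))

def deterministic_pha(site, sources, max_radius_km=300.0):
    """Deterministic hazard at one grid point: the largest PHA over all
    sources within the search radius. Each source is ((lat, lon), Mmax)."""
    best = 0.0
    for (lat, lon), mmax in sources:
        # small-angle distance approximation, ~111 km per degree
        d = 111.0 * math.hypot(lat - site[0],
                               (lon - site[1]) * math.cos(math.radians(site[0])))
        if d <= max_radius_km:
            best = max(best, pha_g(mmax, d))
    return best

# Hypothetical sources: ((lat, lon), maximum magnitude)
sources = [((13.0, 77.5), 6.5), ((14.0, 78.0), 7.2)]
pha = deterministic_pha((13.0, 77.6), sources)
```

Looping this over every 0.1° × 0.1° grid cell yields the values that the contour maps display.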

  18. Assessment of seismic hazard in the North Caucasus

    NASA Astrophysics Data System (ADS)

    Ulomov, V. I.; Danilova, T. I.; Medvedeva, N. S.; Polyakova, T. P.; Shumilina, L. S.

    2007-07-01

    The seismicity of the North Caucasus is the highest in the European part of Russia. The detection of potential seismic sources here and long-term prediction of earthquakes are extremely important for the assessment of seismic hazard and seismic risk in this densely populated and industrially developed region of the country. The seismogenic structures of the Iran-Caucasus-Anatolia and Central Asia regions, adjacent to European Russia, are the subjects of this study. These structures are responsible for the specific features of regional seismicity and for the geodynamic interaction with adjacent areas of the Scythian and Turan platforms. The most probable potential sources of earthquakes with magnitudes M = 7.0 ± 0.2 and 7.5 ± 0.2 in the North Caucasus are located. The possible macroseismic effect of one of them is assessed.

  19. Mega-thrust and Intra-slab Earthquakes Beneath Tokyo Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Sato, H.; Koketsu, K.; Hagiwara, H.; Wu, F.; Okaya, D.; Iwasaki, T.; Kasahara, K.

    2006-12-01

    In central Japan, the Philippine Sea plate (PSP) subducts beneath the Tokyo Metropolitan area, the Kanto region, where it causes mega-thrust earthquakes, such as the 1703 Genroku earthquake (M8.0) and the 1923 Kanto earthquake (M7.9), which caused 105,000 fatalities. The vertical proximity of this downgoing lithospheric plate is of concern because the greater Tokyo urban region has a population of 42 million and is the center of approximately 40% of the nation's economic activities. A M7+ earthquake in this region at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Earthquake Research Committee of Japan evaluates such an M7+ earthquake to have a 70% probability of occurrence within 30 years. In 2002, a consortium of universities and government agencies in Japan started the Special Project for Earthquake Disaster Mitigation in Urban Areas, a project to improve the information needed for seismic hazard analyses of the largest urban centers. Assessment in Kanto of the seismic hazard produced by PSP mega-thrust earthquakes requires identification of all significant faults and possible earthquake scenarios and rupture behavior, regional characterization of the PSP geometry and the physical properties of the overlying Honshu arc (e.g., seismic wave velocities, densities, attenuation), and local near-surface seismic site effects. Our study addresses (1) improved regional characterization of the PSP geometry based on new deep seismic reflection profiles (Sato et al., 2005), reprocessed offshore profiles (Kimura et al., 2005), and a dense seismic array in the Boso Peninsula (Hagiwara et al., 2006), and (2) identification of asperities of the mega-thrust at the top of the PSP. We qualitatively examine the relationship between seismic reflections and asperities inferred from reflection physical properties. 
We also discuss the relation between deformation of the PSP and intra-slab M7+ earthquakes: the PSP is subducting beneath the Honshu arc and also colliding with the Pacific plate. The subduction and collision both contribute to active seismicity in the Kanto region. We present a high-resolution tomographic image showing a low-velocity zone that suggests a possible internal failure of the slab, a source region of M7+ intra-slab earthquakes. Our study contributes a new assessment of the seismic hazard in the Tokyo metropolitan area.

  20. Seismic hazard assessment and pattern recognition of earthquake prone areas in the Po Plain (Italy)

    NASA Astrophysics Data System (ADS)

    Gorshkov, Alexander; Peresan, Antonella; Soloviev, Alexander; Panza, Giuliano F.

    2014-05-01

    A systematic and quantitative assessment, capable of providing first-order consistent information about the sites where large earthquakes may occur, is crucial for knowledgeable seismic hazard evaluation. The methodology for the pattern recognition of areas prone to large earthquakes is based on the morphostructural zoning method (MSZ), which employs topographic data and present-day tectonic structures for the mapping of earthquake-controlling structures (i.e. the nodes formed around lineament intersections) and does not require knowledge of past seismicity. The nodes are assumed to be characterized by a uniform set of topographic, geologic, and geophysical parameters; on the basis of such parameters, the pattern recognition algorithm defines a classification rule to discriminate between seismogenic and non-seismogenic nodes. This methodology has been successfully applied since the early 1970s in a number of regions worldwide, including California, where it permitted the identification of areas that were subsequently struck by strong events and that previously were not considered prone to strong earthquakes. Recent studies on the Iberian Peninsula and the Rhone Valley have demonstrated the applicability of MSZ to basins with a relatively flat topography. In this study, the analysis is applied to the Po Plain (Northern Italy), an area characterized by a flat topography, to allow for the systematic identification of the nodes prone to earthquakes with magnitude M=5.0 or larger. The MSZ method differs from standard morphostructural analysis, where the term "lineament" is used to define the complex of alignments detectable on topographic maps or on satellite images; according to that definition, the lineament is locally defined and its existence does not depend on the surrounding areas. 
In MSZ, the primary element is the block, a relatively homogeneous area, while the lineament is a secondary element of the morphostructure. The identified earthquake-prone areas provide first-order systematic information that may significantly contribute to seismic hazard assessment in the Italian territory. The information about the possible location of strong earthquakes provided by the morphostructural analysis, in fact, can be naturally incorporated in the neo-deterministic procedure for seismic hazard assessment (NDSHA), so as to fill in possible gaps in known seismicity. Moreover, the spatial information about earthquake-prone areas can be fruitfully combined with the space-time information provided by the quantitative analysis of the seismic flow, so as to identify the priority areas (with linear dimensions of a few tens of kilometers), where the probability of a strong earthquake is relatively high, for detailed local-scale studies. The new indications about the seismogenic potential obtained from this study, although less accurate than detailed fault studies, have the advantage of being independent of past seismicity information, since they rely on the systematic and quantitative analysis of the available geological and morphostructural data. Thus, this analysis appears particularly useful in areas where historical information is scarce; special attention should be paid to seismogenic nodes that are not associated with known active faults or past earthquakes.
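The classification rule that discriminates seismogenic from non-seismogenic nodes is learned by a pattern recognition algorithm (typically CORA-3 in this line of work). As a minimal stand-in for the idea, a tiny logistic-regression classifier over hypothetical node parameters:

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Tiny logistic-regression trainer (an illustrative stand-in for the
    CORA-3 algorithm) over per-node morphostructural parameters."""
    w = [0.0] * (len(X[0]) + 1)          # feature weights + bias
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[-1] + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                    # gradient of the log-loss
            for k in range(len(xi)):
                w[k] -= lr * g * xi[k]
            w[-1] -= lr * g
    return w

def predict(w, xi):
    """1 = classified as seismogenic node, 0 = non-seismogenic."""
    z = w[-1] + sum(wj * xj for wj, xj in zip(w, xi))
    return 1 if z > 0 else 0

# Hypothetical per-node features: [relief contrast, lineament rank, distance
# to nearest intersection]; labels: 1 = node that hosted M >= 5.0, 0 = not.
X = [[0.9, 1.0, 0.1], [0.8, 1.0, 0.2], [0.2, 0.0, 0.9], [0.1, 0.0, 0.8]]
y = [1, 1, 0, 0]
w = train_logistic(X, y)
```

The actual method uses discrete-valued parameters and a voting rule rather than a linear score, but the input/output structure, node parameters in, binary seismogenic label out, is the same.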

  1. The Impact Hazard in the Context of Other Natural Hazards and Predictive Science

    NASA Astrophysics Data System (ADS)

    Chapman, C. R.

    1998-09-01

    The hazard due to impact of asteroids and comets has been recognized as analogous, in some ways, to other infrequent but consequential natural hazards (e.g. floods and earthquakes). Yet, until recently, astronomers and space agencies have felt no need to do what their colleagues and analogous agencies must do in order to assess, quantify, and communicate predictions to those with a practical interest in them (e.g. public officials who must assess the threats, prepare for mitigation, etc.). Recent heightened public interest in the impact hazard, combined with increasing numbers of "near misses" (certain to increase as Spaceguard is implemented), requires that astronomers accept the responsibility to place their predictions and assessments in terms that may be appropriately considered. I will report on preliminary results of a multi-year GSA/NCAR study of "Prediction in the Earth Sciences: Use and Misuse in Policy Making" in which I have represented the impact hazard, while others have treated earthquakes, floods, weather, global climate change, nuclear waste disposal, acid rain, etc. The impact hazard presents an end-member example of a natural hazard, helping those dealing with more prosaic issues to learn from an extreme. On the other hand, I bring to the astronomical community some lessons long adopted in other cases: the need to understand the policy purposes of impact predictions, the need to assess potential societal impacts, the requirement to very carefully assess prediction uncertainties, consideration of potential public uses of the predictions, awareness of ethical considerations (e.g. conflicts of interest) that affect predictions and their acceptance, awareness of appropriate means for publicly communicating predictions, and consideration of the international context (especially for a hazard that knows no national boundaries).

  2. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called the OpenQuake engine (http://globalquakemodel.org). In this communication we will provide a tour of the various completed activities, such as the new ISC-GEM Global Instrumental Catalogue, and of ongoing initiatives such as the creation of a suite of tools for the preparation of PSHA input models. Discussion, comments and criticism by colleagues in the audience will be highly appreciated.

  3. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    USGS Publications Warehouse

    Thenhaus, P.C.

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-1812 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in U.S. history. The large population currently living within the damage area of these earthquakes means that widespread destruction and loss of life are likely if the sequence were repeated. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern U.S. in 1755 (Cape Ann, Mass.), 1811-12 (New Madrid, Mo.), 1886 (Charleston, S.C.), and 1897 (Giles County, Va.) are generally regarded as only historical phenomena (fig. 1). The social memory of these earthquakes no longer exists. A fundamental problem in the Eastern U.S., therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the "tools" that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of earthquake recurrence times and potential sizes, and the expected ground motions. The term "earthquake risk," on the other hand, refers to aspects of the expected damage to man-made structures and to lifelines as a result of the earthquake hazard.

  4. Evaluating the Use of Declustering for Induced Seismicity Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Michael, A. J.

    2016-12-01

    The recent dramatic seismicity rate increase in the central and eastern US (CEUS) has motivated the development of seismic hazard assessments for induced seismicity (e.g., Petersen et al., 2016). Standard probabilistic seismic hazard assessment (PSHA) relies fundamentally on the assumption that seismicity is Poissonian (Cornell, BSSA, 1968); therefore, the earthquake catalogs used in PSHA are typically declustered (e.g., Petersen et al., 2014) even though this may remove earthquakes that may cause damage or concern (Petersen et al., 2015; 2016). In some induced earthquake sequences in the CEUS, the standard declustering can remove up to 90% of the sequence, reducing the estimated seismicity rate by a factor of 10 compared to estimates from the complete catalog. In tectonic regions the reduction is often only about a factor of 2. We investigate how three declustering methods treat induced seismicity: the window-based Gardner-Knopoff (GK) algorithm, often used for PSHA (Gardner and Knopoff, BSSA, 1974); the link-based Reasenberg algorithm (Reasenberg, JGR, 1985); and a stochastic declustering method based on a space-time Epidemic-Type Aftershock Sequence model (Ogata, JASA, 1988; Zhuang et al., JASA, 2002). We apply these methods to three catalogs that likely contain some induced seismicity. For the Guy-Greenbrier, AR, earthquake swarm from 2010-2013, declustering reduces the seismicity rate by factors of 6-14, depending on the algorithm. In northern Oklahoma and southern Kansas from 2010-2015, the reduction varies from factors of 1.5-20. In the Salton Trough of southern California from 1975-2013, the rate is reduced by factors of 3-20. Stochastic declustering tends to remove the most events, followed by the GK method, while the Reasenberg method removes the fewest. 
Given that declustering and choice of algorithm have such a large impact on the resulting seismicity rate estimates, we suggest that more accurate hazard assessments may be found using the complete catalog.
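The window-based GK algorithm referenced above removes any event that falls inside a magnitude-dependent space-time window following a larger event. A sketch using the commonly quoted power-law fits to the Gardner-Knopoff (1974) windows; the toy catalog uses flat-earth kilometer coordinates for simplicity:

```python
import math

def gk_windows(mag):
    """Gardner-Knopoff (1974) space-time windows, power-law approximations.
    Returns (distance in km, time in days)."""
    dist = 10 ** (0.1238 * mag + 0.983)
    time = 10 ** (0.032 * mag + 2.7389) if mag >= 6.5 else 10 ** (0.5409 * mag - 0.547)
    return dist, time

def decluster(catalog):
    """Window declustering: process events from largest to smallest and
    remove any smaller event inside a retained event's space-time window.
    catalog: list of (t_days, x_km, y_km, mag)."""
    order = sorted(range(len(catalog)), key=lambda i: -catalog[i][3])
    removed = [False] * len(catalog)
    for i in order:
        if removed[i]:
            continue                      # already flagged as an aftershock
        t0, x0, y0, m0 = catalog[i]
        dkm, tdays = gk_windows(m0)
        for j in range(len(catalog)):
            if j == i or removed[j] or catalog[j][3] > m0:
                continue
            t, x, y, m = catalog[j]
            if 0 <= t - t0 <= tdays and math.hypot(x - x0, y - y0) <= dkm:
                removed[j] = True
    return [ev for ev, r in zip(catalog, removed) if not r]

cat = [(0.0, 0.0, 0.0, 5.5), (1.0, 2.0, 1.0, 4.0),       # mainshock + aftershock
       (3.0, 5.0, -2.0, 3.5), (400.0, 80.0, 90.0, 4.2)]  # aftershock + unrelated
kept = decluster(cat)
```

In this toy catalog the M5.5 window (roughly 46 km and 270 days) removes the two nearby events, illustrating how aggressively window methods can thin a swarm-like sequence.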

  5. 7th U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research: Abstract Volume and Technical Program

    USGS Publications Warehouse

    Detweiler, Shane T.; Ellsworth, William L.

    2008-01-01

    The U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research promotes advanced study toward a more fundamental understanding of the earthquake process and hazard estimation. The Panel promotes basic and applied research to improve our understanding of the causes and effects of earthquakes and to facilitate the transmission of research results to those who implement hazard reduction measures on both sides of the Pacific and around the world. Meetings are held every other year and alternate between the two countries, with short presentations on current research and local field trips as the highlights. The 5th Joint Panel meeting was held at Asilomar, California, in October 2004. The technical sessions featured reports on the September 28, 2004 Parkfield, California, earthquake, progress on earthquake early warning and rapid post-event assessment technology, probabilistic earthquake forecasting, and the newly discovered phenomenon of nonvolcanic tremor. The Panel visited the epicentral region of the M 6.0 Parkfield earthquake and viewed the surface ruptures along the San Andreas Fault. They also visited the San Andreas Fault Observatory at Depth (SAFOD), which had just completed the first phase of drilling into the fault. The 6th Joint Panel meeting was held in Tokushima, Japan, in November 2006. The meeting included very productive exchanges of information on approaches to systematic observation of earthquake processes. Sixty-eight technical papers were presented during the meeting on a wide range of subjects, including interplate earthquakes in subduction zones, slow slip and nonvolcanic tremor, crustal deformation, recent earthquake activity, and hazard mapping. Through our discussions, we reaffirmed the benefits of working together to achieve our common goal of reducing earthquake hazard, and continued cooperation on issues involving the densification of observation networks and the open exchange of data among scientific communities. 
We also reaffirmed the importance of making information public in a timely manner. The Panel visited sites along the east coast of Shikoku that were inundated by the tsunami caused by the 1946 Nankai earthquake where they heard from survivors of the disaster and saw new tsunami shelters and barriers. They also visited the Median Tectonic Line, a major onshore strike-slip fault on Shikoku. The 7th Joint Panel meeting was held in Seattle, Wash., U.S.A. from October 27-30, 2008.

  6. 7 CFR 1792.101 - General.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...

  7. 7 CFR 1792.101 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...

  8. 7 CFR 1792.101 - General.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...

  9. 7 CFR 1792.101 - General.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...

  10. 7 CFR 1792.101 - General.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...

  11. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for the transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries, including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow), are computing national seismic hazard using the OpenQuake engine; in some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models in regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito, Ecuador. 
In agreement with GEM's collaborative approach, all projects are undertaken with strong involvement of local scientific and risk reduction communities. Open-source software and careful documentation of the methodologies create full transparency of the modelling process, so that results can be reproduced any time by third parties.

  12. Developing a Near Real-time System for Earthquake Slip Distribution Inversion

    NASA Astrophysics Data System (ADS)

    Zhao, Li; Hsieh, Ming-Che; Luo, Yan; Ji, Chen

    2016-04-01

    Advances in observational and computational seismology over the past two decades have enabled completely automatic, real-time determination of the focal mechanisms of earthquake point sources. However, seismic radiation from moderate and large earthquakes often exhibits a strong finite-source directivity effect, which is critically important for accurate ground motion estimation and earthquake damage assessment. Therefore, an effective procedure to determine earthquake rupture processes in near real-time is in high demand for hazard mitigation and risk assessment purposes. In this study, we develop an efficient waveform inversion approach for solving for finite-fault models in 3D structure. Full slip distribution inversions are carried out based on the fault planes identified in the point-source solutions. To ensure efficiency in calculating 3D synthetics during slip distribution inversions, a database of strain Green tensors (SGT) is established for a 3D structural model with realistic surface topography. The SGT database enables rapid calculation of accurate synthetic seismograms for waveform inversion on a regular desktop or even a laptop PC. We demonstrate our source inversion approach using two moderate earthquakes (Mw ~6.0) in Taiwan and in mainland China. Our results show that the 3D velocity model provides better waveform fits with more spatially concentrated slip distributions. Our source inversion technique based on the SGT database is effective for semi-automatic, near real-time determination of finite-source solutions for seismic hazard mitigation purposes.
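Once synthetics for unit slip on each subfault are available (e.g., retrieved rapidly from an SGT database), the slip inversion reduces to a linear least-squares problem. A minimal damped least-squares sketch, deliberately omitting the positivity and smoothing constraints a production inversion would add:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def damped_slip_inversion(G, d, eps=0.01):
    """Minimise ||G m - d||^2 + eps ||m||^2 via m = (G^T G + eps I)^-1 G^T d.
    Rows of G are precomputed synthetics for unit slip on each subfault
    (the role the SGT database plays); d holds the observed samples."""
    n = len(G[0])
    GtG = [[sum(G[r][i] * G[r][j] for r in range(len(G))) + (eps if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Gtd = [sum(G[r][i] * d[r] for r in range(len(G))) for i in range(n)]
    return solve(GtG, Gtd)

# Toy problem: two subfaults, three data samples; the true slip is [1.0, 0.5]
G = [[1.0, 0.0], [0.5, 1.0], [0.0, 2.0]]
d = [1.0, 1.0, 1.0]
m = damped_slip_inversion(G, d)
```

The speed of a near real-time system comes from precomputing G: with reciprocity and a stored SGT database, assembling G for a new event requires only table lookups and convolutions with the source time function, not new 3D wavefield simulations.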

  13. Rapid field-based landslide hazard assessment in response to post-earthquake emergency

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Gambini, Stefano; Cancelliere, Giorgio

    2016-04-01

    On April 25, 2015, a Mw 7.8 earthquake occurred 80 km to the northwest of Kathmandu (Nepal). The largest aftershock, which occurred on May 12, 2015, was the Mw 7.3 Nepal earthquake (SE of Zham, China), 80 km to the east of Kathmandu. The earthquakes killed ~9,000 people and severely damaged a 10,000 km2 region in Nepal and neighboring countries. Several thousand landslides were triggered during the events, causing widespread damage to mountain villages and the evacuation of thousands of people. Rasuwa was one of the most damaged districts. This contribution describes the landslide hazard analysis of the Saramthali, Yarsa and Bhorle VDCs (122 km2, Rasuwa district). Hazard is expressed in terms of qualitative classes (low, medium, high), through a simple matrix approach that combines frequency classes and magnitude classes. The hazard analysis is based primarily on the experience gained during a field survey conducted in September 2015. During the survey, local knowledge was systematically exploited through interviews with local people who had experienced the earthquake and the coseismic landslides. People helped us to recognize fractures and active deformations, and allowed us to reconstruct a correct chronicle of landslide events, in order to assign each landslide event to the first shock, the second shock, or the post-earthquake 2015 monsoon. The field experience was complemented with a standard analysis of the relationship between potential controlling factors and the distribution of landslides reported in Kargel et al. (2016). This analysis allowed us to recognize the most important controlling factors. This information was integrated with the field observations to verify the mapped units and to complete the mapping in areas not accessible during fieldwork. 
Finally, the work was completed with the analysis and use of a detailed landslide inventory produced by the University of Milano-Bicocca that covers most of the area affected by coseismic landslides in Nepal (Valagussa et al., 2016). As a result, a 1:10,000 hazard map was produced. About 47% of the area is classified at high hazard, almost 19% at medium and 34% at low hazard. In addition, the hazard map reports 262 polygons of active coseismic or postseismic landslides.
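The matrix approach that combines frequency and magnitude classes into qualitative hazard classes can be written down directly. The class names and matrix entries below are illustrative, not the values used in the study:

```python
# Qualitative hazard matrix: rows index frequency class, columns index
# magnitude class. Entries are the resulting hazard class for a map unit.
FREQ = {"rare": 0, "occasional": 1, "frequent": 2}
MAGN = {"small": 0, "medium": 1, "large": 2}
MATRIX = [
    ["low",    "low",    "medium"],   # rare
    ["low",    "medium", "high"],     # occasional
    ["medium", "high",   "high"],     # frequent
]

def hazard_class(frequency: str, magnitude: str) -> str:
    """Look up the qualitative hazard class for one mapped unit."""
    return MATRIX[FREQ[frequency]][MAGN[magnitude]]

hz = hazard_class("frequent", "medium")   # "high"
```

Applying such a lookup to every mapped polygon is what yields the low/medium/high zonation on the final hazard map.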

  14. Population and business exposure to twenty scenario earthquakes in the State of Washington

    USGS Publications Warehouse

    Wood, Nathan; Ratliff, Jamie

    2011-01-01

    This report documents the results of an initial analysis of population and business exposure to scenario earthquakes in Washington. The analysis was conducted to support the U.S. Geological Survey (USGS) Pacific Northwest Multi-Hazards Demonstration Project (MHDP) and an ongoing collaboration between the State of Washington Emergency Management Division (WEMD) and the USGS on earthquake hazard and vulnerability topics. This report was developed to help WEMD meet internal planning needs; a subsequent report will provide analysis at the community level. The objective of this project was to use scenario ground-motion hazard maps to estimate population and business exposure to twenty Washington earthquakes. The twenty scenario earthquakes were selected by WEMD in consultation with the USGS Earthquake Hazards Program and the Washington Division of Geology and Natural Resources (fig. 1). Hazard maps were then produced by the USGS and placed in the USGS ShakeMap archive.

  15. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto was activated by the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probability calculations, often used for aftershock hazard assessment, which rest on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. We then find good agreement with the OU law with p ˜ 0.5, indicating notably slow aftershock decay. Based on these results, we calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3 yr or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations, combined with results from previous studies, support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn points to the potential for stress redistribution to the surrounding regions. We note the importance of considering hazard variations not only in time but also in space to improve probabilistic seismic hazard assessment for southern Kanto.
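A minimal sketch of this kind of time-dependent probability calculation, in the generic Reasenberg-Jones form that combines the GR and OU laws. Only b ≈ 1 and p ≈ 0.5 come from the abstract; the productivity parameter `a` and the Omori `c` value used in the example are illustrative assumptions, so this is not the authors' exact computation.

```python
import math

def expected_count(a, b, m_main, m_target, t1, t2, c=0.05, p=0.5):
    """Expected number of events >= m_target in the window [t1, t2]
    (days after the mainshock), using the Reasenberg-Jones rate
    lambda(t, M) = 10**(a + b*(m_main - M)) * (t + c)**(-p)."""
    k = 10 ** (a + b * (m_main - m_target))  # GR productivity term
    # Closed-form integral of (t + c)**(-p) dt for p != 1 (here p ~ 0.5).
    integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return k * integral

def prob_at_least_one(a, b, m_main, m_target, t1, t2, c=0.05, p=0.5):
    """Poisson probability of one or more qualifying events in the window."""
    return 1.0 - math.exp(-expected_count(a, b, m_main, m_target, t1, t2, c, p))
```

With p ≈ 0.5 the time integral grows roughly like the square root of the window length, so expected counts keep rising slowly with longer windows, consistent with the slow decay noted in the abstract.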

  16. Rock-solid information

    NASA Astrophysics Data System (ADS)

    The National Science Foundation's Southern California Earthquake Center and the U.S. Geological Survey have collaborated to provide residents of America's most famous earthquake zone with some hard facts about temblors. Putting Down Roots in Earthquake Country, a 32-page handbook on coping with life near the many faults in Southern California, was distributed in October to all public libraries from San Luis Obispo to Tijuana. The book summarizes for lay people what is known about the San Andreas fault and the many others that crisscross California. It also offers guidance on how to prevent earthquake damage, how to retrofit a home, and how to assess earthquake hazards.

  17. Variations in population vulnerability to tectonic and landslide-related tsunami hazards in Alaska

    USGS Publications Warehouse

    Wood, Nathan J.; Peters, Jeff

    2015-01-01

    Effective tsunami risk reduction requires an understanding of how at-risk populations are specifically vulnerable to tsunami threats. Vulnerability assessments primarily have been based on single hazard zones, even though a coastal community may be threatened by multiple tsunami sources that vary locally in terms of inundation extents and wave arrival times. We use the Alaskan coastal communities of Cordova, Kodiak, Seward, Valdez, and Whittier (USA), as a case study to explore population vulnerability to multiple tsunami threats. We use anisotropic pedestrian evacuation models to assess variations in population exposure as a function of travel time out of hazard zones associated with tectonic and landslide-related tsunamis (based on scenarios similar to the 1964 Mw 9.2 Good Friday earthquake and tsunami disaster). Results demonstrate that there are thousands of residents, employees, and business customers in tsunami hazard zones associated with tectonically generated waves, but that at-risk individuals will likely have sufficient time to evacuate to high ground before waves are estimated to arrive 30–60 min after generation. Tsunami hazard zones associated with submarine landslides initiated by a subduction zone earthquake are smaller and contain fewer people, but many at-risk individuals may not have enough time to evacuate as waves are estimated to arrive in 1–2 min and evacuations may need to occur during earthquake ground shaking. For all hazard zones, employees and customers at businesses far outnumber residents at their homes and evacuation travel times are highest on docks and along waterfronts. Results suggest that population vulnerability studies related to tsunami hazards should recognize non-residential populations and differences in wave arrival times if emergency managers are to develop realistic preparedness and outreach efforts.

  18. Seismic-hazard maps and time histories for the commonwealth of Kentucky.

    DOT National Transportation Integrated Search

    2008-06-01

    The ground-motion hazard maps and time histories for three earthquake scenarios, expected earthquakes, probable earthquakes, and maximum credible earthquakes on the free surface in hard rock (shear-wave velocity >1,500 m/s), were derived using the de...

  19. Multi -risk assessment at a national level in Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashvili, Otar; Amiranashvili, Avtandil; Tsereteli, Emili; Elizbarashvili, Elizbar; Saluqvadze, Manana; Dolodze, Jemal

    2013-04-01

    Work presented here was initiated by the national GNSF project "Reducing natural disasters multiple risk: a positive factor for Georgia development" and two international projects: NATO SFP 983038 "Seismic hazard and risk assessment for Southern Caucasus-Eastern Turkey energy corridors" and EMME "Earthquake Model for Middle East Region". A methodology for estimating "general" vulnerability, hazards, and multiple risk from natural hazards (namely earthquakes, landslides, snow avalanches, flash floods, mudflows, drought, hurricanes, frost, and hail) was developed for Georgia. Detailed electronic databases of natural disasters were created; these databases contain the parameters of the hazardous phenomena that caused natural disasters. The magnitude and intensity scales of the mentioned disasters are reviewed, and new magnitude and intensity scales are suggested for disasters for which the corresponding formalization has not yet been performed. The associated economic losses were evaluated and presented in monetary terms for these hazards. Based on the hazard inventory, an approach was developed that allowed the calculation of an overall vulnerability value for each individual hazard type, using Gross Domestic Product per unit area (applied to population) as the indicator for the elements at risk exposed. The correlation between estimated economic losses, physical exposure, and magnitude for each of the six types of hazards was investigated in detail using multiple linear regression analysis. Economic losses for all past events and historical vulnerability were estimated. Finally, the spatial distribution of general vulnerability was assessed, the expected maximum economic loss was calculated, and a multi-risk map was compiled.

  20. Seismic Landslide Hazard for the Cities of Oakland and Piedmont, California

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.

    2001-01-01

    This map describes the possible hazard from earthquake-induced landslides for the cities of Oakland and Piedmont, CA. The hazard depicted by this map was modeled for a scenario corresponding to an M=7.1 earthquake on the Hayward, CA fault. This scenario magnitude is associated with complete rupture of the northern and southern segments of the Hayward fault, an event that has an estimated return period of about 500 years. The modeled hazard also corresponds to completely saturated ground-water conditions resulting from an extreme storm event or series of storm events. This combination of earthquake and ground-water scenarios represents a particularly severe state of hazard for earthquake-induced landslides. For dry ground-water conditions, overall hazard will be less, while relative patterns of hazard are likely to change.

  1. Earthquake Loss Assessment for the Evaluation of the Sovereign Risk and Financial Sustainability of Countries and Cities

    NASA Astrophysics Data System (ADS)

    Cardona, O. D.

    2013-05-01

    Recent earthquakes have struck cities in developing as well as developed countries, revealing significant knowledge gaps and the need to improve the quality of input data and of the assumptions of risk models. The earthquake and tsunami in Japan (2011) and the disasters due to earthquakes in Haiti (2010), Chile (2010), New Zealand (2011), and Spain (2011), to mention only some unexpected impacts in different regions, have left several concerns regarding hazard assessment and the uncertainties associated with estimating future losses. Understanding probable losses and reconstruction costs due to earthquakes creates powerful incentives for countries to develop planning options and tools to cope with sovereign risk, including allocating the sustained budgetary resources necessary to reduce potential damage and safeguard development. Robust risk models are therefore needed to assess future economic impacts, a country's fiscal responsibilities, and the contingent liabilities for governments, and to formulate, justify, and implement risk reduction measures and optimal financial strategies of risk retention and transfer. Special attention should be paid to understanding risk metrics such as the Loss Exceedance Curve (empirical and analytical) and the Expected Annual Loss in the context of conjoint and cascading hazards.

  2. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. 
To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of the Earth’s surface that the fault rupture and shaking will activate.

  3. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses from potential earthquakes and in preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone to the east of the study area. In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather, it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning and earthquake-resistant building design. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). 
Alternatives to the NSHM scenario were developed for the Hilton Creek and Hartley Springs Faults to account for differing opinions on how far these two faults extend into Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice: the deterministic seismic hazard analysis program developed by Art Frankel of the USGS and three Next Generation Attenuation (NGA) ground-motion models. Ground motion calculations incorporated the potential amplification of seismic shaking by near-surface soils, defined by a map of the average shear-wave velocity in the uppermost 30 m (VS30) developed by CGS. In addition to ground shaking and shaking-related ground failure such as liquefaction and earthquake-induced landslides, earthquakes cause surface rupture displacement, which can lead to severe damage of buildings and lifelines. For each earthquake scenario, potential surface fault displacements are estimated using deterministic and probabilistic approaches. Liquefaction occurs when saturated sediments lose their strength because of ground shaking. Zones of potential liquefaction are mapped by identifying areas where loose sandy sediments, shallow groundwater, and strong earthquake shaking coincide in the earthquake scenario. The process for defining zones of potential landslide and rockfall combines rock strength, surface slope, and existing landslides with the ground motions caused by the scenario earthquake. Each scenario is illustrated with maps of seismic shaking potential and of fault displacement, liquefaction, and landslide potential. Seismic shaking is depicted by the distribution of shaking intensity, peak ground acceleration, and 1.0-second spectral acceleration. One-second spectral acceleration correlates well with structural damage to surface facilities. Acceleration greater than 0.2 g is often associated with strong ground shaking and may cause moderate to heavy damage. 
The extent of strong shaking is influenced by subsurface fault dip and near-surface materials. Strong shaking is more widespread in the hanging-wall region of a normal fault. Larger ground motions also occur where young alluvial sediments amplify the shaking. Both of these effects can lead to strong shaking that extends farther from the fault on the valley side than on the hill side. The effect of fault rupture displacements may be localized along the surface trace of the mapped earthquake fault if the fault geometry is simple and the fault traces are accurately located. However, surface displacement hazards can spread over a few hundred meters to a few kilometers if the earthquake fault has numerous splays or branches, such as the Hilton Creek Fault. Faulting displacements are estimated to be about 1 meter along normal faults in the study area and close to 2 meters along the White Mountains Fault Zone. All scenarios show the possibility of widespread ground failure. Liquefaction damage would likely occur in the areas of stronger ground shaking near the faults where there are sandy or silty sediments and the depth to groundwater is 6.1 meters (20 feet) or less. Generally, this means damage is most likely near lakes and streams in the areas of strongest shaking. Landslide potential exists throughout the study region. All steep slopes (>30 degrees) present a potential hazard at any level of shaking. Lesser slopes may have landslides within the areas of higher ground shaking. The landslide hazard zones are also likely sources for snow avalanches during winter months and for large boulders that can be shaken loose and roll hundreds of feet downhill, as happened during the 1980 Mammoth Lakes earthquakes. Whereas the methodologies used in estimating ground shaking, liquefaction, and landslides are well developed and have been applied in published hazard maps, methodologies for estimating surface fault displacement are still being developed. 
Therefore, this report provides a more in-depth and detailed discussion of methodologies used for deterministic and probabilistic fault displacement hazard analyses for this project.
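As a rough illustration of the deterministic scenario ground-motion step described in this record, the sketch below uses a generic attenuation form ln(PGA) = c0 + c1·M − c2·ln(R + c3) with placeholder coefficients and a simple site-amplification multiplier. It stands in for, and is far simpler than, the NGA models and VS30-based amplification actually used.

```python
import math

def ln_pga_g(mw, r_rup_km, c0=-3.5, c1=0.9, c2=1.6, c3=10.0):
    """Toy deterministic attenuation relation of the generic form
    ln(PGA) = c0 + c1*M - c2*ln(R + c3). The coefficients are
    placeholders, not those of any published NGA model."""
    return c0 + c1 * mw - c2 * math.log(r_rup_km + c3)

def scenario_pga(mw, r_rup_km, site_amp=1.0):
    """Median PGA (in g) for a scenario event, with an optional
    site amplification factor (e.g., from a VS30 map) applied."""
    return site_amp * math.exp(ln_pga_g(mw, r_rup_km))

# With these placeholder coefficients, scenario_pga(7.0, 5.0) gives
# roughly 0.2 g near the fault, decaying with rupture distance.
```

Mapping such a function over a grid of sites, with each site's amplification factor, is the basic mechanics behind a scenario shaking map.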

  4. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  5. Evaluation of seismic hazard at the northwestern part of Egypt

    NASA Astrophysics Data System (ADS)

    Ezzelarab, M.; Shokry, M. M. F.; Mohamed, A. M. E.; Helal, A. M. A.; Mohamed, Abuoelela A.; El-Hadidy, M. S.

    2016-01-01

    The objective of this study is to evaluate the seismic hazard in northwestern Egypt using the probabilistic seismic hazard assessment approach. The probabilistic analysis was carried out on a recent data set that takes into account historic seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a proposed seismic source model is presented. The doubly-truncated exponential model was adopted for calculation of the recurrence parameters. Ground-motion prediction equations recently recommended by experts, developed from earthquake data obtained in tectonic environments similar to those in and around the study area, were weighted and used for the assessment of seismic hazard within a logic-tree framework. Considering a grid of 0.2° × 0.2° covering the study area, seismic hazard curves were calculated for every node. Hazard maps at bedrock conditions were produced for peak ground acceleration and for six spectral periods (0.1, 0.2, 0.3, 1.0, 2.0 and 3.0 s) for return periods of 72, 475, and 2475 years. Uniform hazard spectra for two selected rock sites at Alexandria and Mersa Matruh are provided. Finally, the hazard curves were de-aggregated to determine the sources that contribute most to the hazard level with 10% probability of exceedance in 50 years at the selected sites.
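The doubly-truncated exponential (Gutenberg-Richter) recurrence model adopted for the source zones can be written down directly. A minimal sketch; the parameter values used in the example are placeholders, not the study's fitted values.

```python
import math

def truncated_gr_rate(m, m_min, m_max, b, rate_min):
    """Annual rate of events with magnitude >= m under the
    doubly-truncated exponential (Gutenberg-Richter) model:
    the exponential magnitude distribution is cut off at m_min
    and m_max, and rate_min is the annual rate of events >= m_min."""
    beta = b * math.log(10)
    if m < m_min:
        return rate_min
    if m >= m_max:
        return 0.0
    num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
    den = 1.0 - math.exp(-beta * (m_max - m_min))
    return rate_min * num / den
```

The hazard integration then combines these source rates with the weighted ground-motion prediction equations at each grid node to build the hazard curve.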

  6. Natural Resource Assessments in Afghanistan Through High Resolution Digital Elevation Modeling and Multi-spectral Image Analysis

    NASA Technical Reports Server (NTRS)

    Chirico, Peter G.

    2007-01-01

    This viewgraph presentation provides USGS/USAID natural resource assessments in Afghanistan through the mapping of coal, oil and natural gas, minerals, hydrologic resources and earthquake and flood hazards.

  7. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Paradise, T. R.

    2016-12-01

    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East, where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages, placing stress on urban settings and increasing potential exposure to strong shaking. Yet displaced populations are not traditionally captured in the data sources used in earthquake risk analysis or loss estimation. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase in seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine the degree of underestimation that results from omitting migration data in loss modeling. We find that including refugee populations increased casualty estimates by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context, which unites physical, political, cultural, and socio-economic landscapes. 
Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration, Refugees

  8. The Effects of the Passage of Time from the 2011 Tohoku Earthquake on the Public's Anxiety about a Variety of Hazards.

    PubMed

    Nakayachi, Kazuya; Nagaya, Kazuhisa

    2016-08-31

    This research investigated whether the Japanese people's anxiety about a variety of hazards, including earthquakes and nuclear accidents, has changed over time since the Tohoku Earthquake in 2011. Data from three nationwide surveys conducted in 2008, 2012, and 2015 were compared to examine the change in societal levels of anxiety toward 51 types of hazards. The same two-phase stratified random sampling method was used to create the list of participants in each survey. The results showed that anxiety about earthquakes and nuclear accidents increased for a time after the Tohoku Earthquake and then decreased over a four-year period with no severe earthquakes or nuclear accidents. It was also revealed that the anxiety level for some hazards other than earthquakes and nuclear accidents had already decreased ten months after the Earthquake and then remained unchanged over the four years. Ironically, then, a major disaster might decrease public anxiety in general, at least for several years.

  9. Seismotectonic Analysis for the KZN region of South Africa

    NASA Astrophysics Data System (ADS)

    Singh, M.

    2012-04-01

    Recently, devastating earthquakes and tsunamis have shocked the modern world: Japan (11 March 2011, Mw 9.0; 15,457 deaths, 5,389 injured, and an estimated US$300 billion loss (Japanese National Police Agency, 2011)), New Zealand (21 February 2011, Mw 6.3, 148 killed), and Haiti (12 January 2010, Mw 7.0, an estimated 316,000 killed and 300,000 injured). These earthquakes caused large-scale damage to the built environment, not to mention the high number of fatalities. The KZN coastal region is also developing fast, especially to the north of the Durban CBD: Cornubia (a new development near Umhlanga, a R25 billion investment), the Gateway/Umhlanga business district, Moses Mabida Stadium (built at a cost of R3.4 billion), King Shaka International Airport (R6.8 billion), the Dube Tradeport being developed next to the airport (R5 billion), and the Richards Bay Industrial Development Zone. KZN is home to 10 million inhabitants, with a relatively denser population distribution around the Durban and Pietermaritzburg CBDs. With the increasing investment toward the north coast of Durban, the population distribution will migrate to these areas, which thus become vulnerable to rare, infrequent, and potentially devastating natural disasters such as earthquakes. One of the first steps in understanding and planning for earthquake occurrence is a seismic hazard and risk assessment. Seismic hazard and risk methods have been well established since 1968 (see Cornell (1968); Veneziano et al. (1984); Bender and Perkins (1993); McGuire (1993); McGuire and Toro (2008); Kijko and Graham (1998); Kijko and Sellevoll (1989, 1992)). The components of a seismic risk assessment (SRA) include several building blocks, namely: the development of the earthquake catalogue, the seismotectonic model, attenuation models, the seismic hazard assessment (SHA), the vulnerability assessment, and the seismic risk computations. 
The seismotectonic model element will be explored in further detail in this research. Preliminary seismotectonic investigations for the province have been undertaken by Singh et al. (2011). Under the framework of this research, the following tasks are planned for the KZN coastal region: i) development of a historical earthquake catalogue; ii) development of a GeoDatabase for seismotectonic zonation; iii) development of a seismotectonic model; and iv) development of an earthquake recurrence model. The author will present progress made to date on this research.

  10. SCIGN; new Southern California GPS network advances the study of earthquakes

    USGS Publications Warehouse

    Hudnut, Ken; King, Nancy

    2001-01-01

    Southern California is a giant jigsaw puzzle, and scientists are now using GPS satellites to track the pieces. These puzzle pieces are continuously moving, slowly straining the faults in between. That strain is eventually released in earthquakes. The innovative Southern California Integrated GPS Network (SCIGN) tracks the motions of these pieces over most of southern California with unprecedented precision. This new network greatly improves the ability to assess seismic hazards and to quickly measure the larger displacements that occur during and immediately after earthquakes.

  11. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 17. Interpretation of Strong Ground Motion Records.

    DTIC Science & Technology

    1981-10-01

    earthquake. The analysis works from first physical principles and, so far as possible, uses elementary ray theory and kinematic arguments. Nevertheless, elements of the more sophisticated theory of earthquake mechanisms and seismic wave propagation in the near field were taken into account in the interpretation. (Contents excerpt: 4.1 Broad Principles of Interpretation, 163; 4.2 Robust Estimation of Parameters, 171; 4.3 Some Remarks on High-Acceleration Values, 180; 4.4 The Focussing ...)

  12. Public release of the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009)

    USGS Publications Warehouse

    Storchak, Dmitry A.; Di Giacomo, Domenico; Bondára, István; Engdahl, E. Robert; Harris, James; Lee, William H.K.; Villaseñor, Antonio; Bormann, Peter

    2013-01-01

    The International Seismological Centre–Global Earthquake Model (ISC–GEM) Global Instrumental Earthquake Catalogue (1900–2009) is the result of a special effort to substantially extend and improve currently existing global catalogues to serve the requirements of specific user groups who assess and model seismic hazard and risk. The data from the ISC–GEM Catalogue will be used worldwide, but will prove especially essential in those regions where a high seismicity level coincides with a high population density.

  13. Estimation of recurrence interval of large earthquakes on the central Longmen Shan fault zone based on seismic moment accumulation/release model.

    PubMed

    Ren, Junjie; Zhang, Shimin

    2013-01-01

    The recurrence interval of large earthquakes on an active fault zone is an important parameter in assessing seismic hazard. The 2008 Wenchuan earthquake (Mw 7.9) occurred on the central Longmen Shan fault zone and ruptured the Yingxiu-Beichuan fault (YBF) and the Guanxian-Jiangyou fault (GJF). However, there is a considerable discrepancy among recurrence intervals of large earthquakes in preseismic and postseismic estimates based on slip rates and paleoseismologic results. Post-seismic trenches showed that the central Longmen Shan fault zone probably undergoes events similar to the 2008 quake, suggesting a characteristic earthquake model. In this paper, we use the published seismogenic model of the 2008 earthquake based on Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) data and construct a characteristic seismic moment accumulation/release model to estimate the recurrence interval of large earthquakes on the central Longmen Shan fault zone. Our results show that the seismogenic zone accommodates a moment rate of (2.7 ± 0.3) × 10¹⁷ N m/yr, and a recurrence interval of 3900 ± 400 yrs is necessary for accumulation of strain energy equivalent to the 2008 earthquake. This study provides a preferred interval estimate of large earthquakes for seismic hazard analysis in the Longmen Shan region.
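The moment-budget arithmetic behind this kind of estimate is simple: convert Mw to scalar seismic moment and divide by the moment accumulation rate. A sketch using the standard IASPEI Mw-M0 convention; the paper's own 3900 ± 400 yr figure presumably uses the event's modeled coseismic moment rather than the catalogue Mw, so the numbers differ somewhat.

```python
def seismic_moment(mw):
    """Scalar seismic moment (N·m) from moment magnitude, via the
    standard relation Mw = (2/3) * (log10(M0) - 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

def recurrence_interval(mw, moment_rate):
    """Years needed to accumulate the moment of an Mw event at the
    given moment accumulation rate (N·m/yr)."""
    return seismic_moment(mw) / moment_rate

# Back-of-envelope with the abstract's rate of 2.7e17 N·m/yr:
# recurrence_interval(7.9, 2.7e17) is roughly 3300 yr, the same order
# as the paper's 3900 ± 400 yr estimate.
```

The ± 400 yr spread follows directly from the ± 0.3 × 10¹⁷ uncertainty in the moment rate, since the interval scales inversely with the rate.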

  14. Estimation of Recurrence Interval of Large Earthquakes on the Central Longmen Shan Fault Zone Based on Seismic Moment Accumulation/Release Model

    PubMed Central

    Zhang, Shimin

    2013-01-01

    The recurrence interval of large earthquakes on an active fault zone is an important parameter in assessing seismic hazard. The 2008 Wenchuan earthquake (Mw 7.9) occurred on the central Longmen Shan fault zone and ruptured the Yingxiu-Beichuan fault (YBF) and the Guanxian-Jiangyou fault (GJF). However, there is a considerable discrepancy between pre-seismic and post-seismic estimates of the recurrence interval of large earthquakes based on slip rates and paleoseismologic results. Post-seismic trenching showed that the central Longmen Shan fault zone probably undergoes events similar to the 2008 quake, suggesting a characteristic earthquake model. In this paper, we use the published seismogenic model of the 2008 earthquake, based on Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) data, and construct a characteristic seismic moment accumulation/release model to estimate the recurrence interval of large earthquakes on the central Longmen Shan fault zone. Our results show that the seismogenic zone accommodates a moment rate of (2.7 ± 0.3) × 10¹⁷ N m/yr, and that a recurrence interval of 3900 ± 400 yrs is necessary to accumulate strain energy equivalent to the 2008 earthquake. This study provides a preferred interval estimate of large earthquakes for seismic hazard analysis in the Longmen Shan region. PMID:23878524

  15. Leveraging geodetic data to reduce losses from earthquakes

    USGS Publications Warehouse

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. 
This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes:
    - The EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories.
    - The USGS consistently provides timely, high quality geodetic data to stakeholders.
    - Significant earthquakes are better characterized by incorporating geodetic data into USGS event response products and by expanded use of geodetic imaging data to assess fault rupture and source parameters.
    - Uncertainties in the NSHM, and in regional earthquake models, are reduced by fully incorporating geodetic data into earthquake probability calculations.
    - Geodetic networks and data are integrated into the operations and earthquake information products of the Advanced National Seismic System (ANSS).
    - Earthquake early warnings are improved by more rapidly assessing ground displacement and the dynamic faulting process for the largest earthquakes using real-time geodetic data.
    - Methodology for probabilistic earthquake forecasting is refined by including geodetic data when calculating evolving moment release during aftershock sequences and by better understanding the implications of transient deformation for earthquake likelihood.
    - A geodesy program that encompasses a balanced mix of activities to sustain mission-critical capabilities, grows new competencies through the continuum of fundamental to applied research, and ensures sufficient resources for these endeavors provides a foundation by which the EHP can be a leader in the application of geodesy to earthquake science.
With this in mind, the following objectives provide a framework to guide EHP efforts:
    - Fully utilize geodetic information to improve key products, such as the NSHM and EEW, and to address new ventures like the USGS Subduction Zone Science Plan.
    - Expand the variety, accuracy, and timeliness of post-earthquake information products, such as PAGER (Prompt Assessment of Global Earthquakes for Response), through incorporation of geodetic observations.
    - Determine if geodetic measurements of transient deformation can significantly improve estimates of earthquake probability.
    - Maintain an observational strategy aligned with the target outcomes of this document that includes continuous monitoring, recording of ephemeral observations, focused data collection for use in research, and application-driven data processing and analysis systems.
    - Collaborate on research, development, and operation of affordable, high-precision seafloor geodetic methods that improve earthquake forecasting and event response.
    - Advance computational techniques and instrumentation to enable use of strategies like repeat-pass imagery and low-cost geodetic sensors for earthquake response, monitoring, and research.
    - Engage stakeholders and collaborate with partner institutions to foster operational and research objectives and to safeguard the continued health of geodetic infrastructure upon which we mutually depend.
Maintaining a vibrant internal research program provides the foundation by which the EHP can remain an effective and trusted source for earthquake science. Exploiting abundant new data sources, evaluating and assimilating the latest science, and pursuing novel avenues of investigation are means to fulfilling the EHP’s core responsibilities and realizing the important scientific advances envisioned by its scientists. 
Central to the success of such a research program is engaging personnel with a breadth of competencies and a willingness and ability to adapt these to the program’s evolving priorities, enabling current staff to expand their skills and responsibilities, and planning holistically to meet shared workforce needs. In parallel, collaboration with external partners to support scientific investigations that complement ongoing internal research enables the EHP to strengthen earthquake information products by incorporating alternative perspectives and approaches and to study topics and geographic regions that cannot be adequately covered internally. With commensurate support from technical staff who possess diverse skills, including engineering, information technology, and proficiency in quantitative analysis combined with basic geophysical knowledge, the EHP can achieve the geodetic outcomes identified in this document.

  16. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  17. Probabilistic Appraisal of Earthquake Hazard Parameters Deduced from a Bayesian Approach in the Northwest Frontier of the Himalayas

    NASA Astrophysics Data System (ADS)

    Yadav, R. B. S.; Tsapanos, T. M.; Bayrak, Yusuf; Koravos, G. Ch.

    2013-03-01

    A straightforward Bayesian statistic is applied in five broad seismogenic source zones of the northwest frontier of the Himalayas to estimate the earthquake hazard parameters (maximum regional magnitude Mmax, β value of the G-R relationship, and seismic activity rate or intensity λ). For this purpose, a reliable earthquake catalogue, homogeneous for Mw ≥ 5.0 and complete for the period 1900 to 2010, was compiled. The Hindukush-Pamir Himalaya zone was further divided into two seismic zones of shallow (h ≤ 70 km) and intermediate depth (h > 70 km) according to the variation of seismicity with depth in the subduction zone. The earthquake hazard parameters estimated by the Bayesian approach are more stable and reliable, with lower standard deviations, than those from other approaches, although the technique is more time consuming. In this study, quantiles of the distributions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50, and 100 years are calculated with confidence limits for probability levels of 50, 70, and 90% in all seismogenic source zones. The zones with estimated Mmax greater than 8.0 are the Sulaiman-Kirthar ranges, the Hindukush-Pamir Himalaya, and the Himalayan Frontal Thrust belt, marking them as the most seismically hazardous regions in the examined area. The lowest Mmax (6.44) was calculated for the Northern-Pakistan and Hazara syntaxis zone, which also has the lowest estimated activity rate (0.0023 events/day) of all the zones. The Himalayan Frontal Thrust belt exhibits the highest expected earthquake magnitude (8.01) in the next 100 years at the 90% probability level, which reveals that this zone is the most vulnerable to the occurrence of a great earthquake. The results obtained in this study are directly useful for probabilistic seismic hazard assessment in the examined region of the Himalaya.
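    The quantile calculations described above rest on converting hazard parameters (activity rate λ and slope β) into exceedance probabilities over a future time window. A hedged sketch of that conversion under an unbounded Gutenberg-Richter model with Poisson occurrence (the parameter values are illustrative, not the paper's Bayesian estimates):

```python
import math

def rate_above(m, lam, beta, m_min):
    """Annual rate of events with magnitude >= m under an unbounded
    Gutenberg-Richter law with slope beta = b*ln(10), for m >= m_min."""
    return lam * math.exp(-beta * (m - m_min))

def prob_exceed(m, t_years, lam, beta, m_min):
    """Poisson probability of at least one event >= m within t years."""
    return 1.0 - math.exp(-rate_above(m, lam, beta, m_min) * t_years)

# Illustrative values only: lam = 2.0 events/yr with Mw >= 5.0 and
# b = 0.9, so beta = 0.9 * ln(10).
beta = 0.9 * math.log(10)
p = prob_exceed(7.0, 50, lam=2.0, beta=beta, m_min=5.0)
print(f"P(at least one M>=7.0 event in 50 yr) ~ {p:.2f}")
```

    The paper's actual quantiles additionally account for magnitude uncertainty (true vs. apparent magnitude) and for the truncation at Mmax, which an unbounded law ignores.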

  18. Seismic hazard and risk assessment for large Romanian dams situated in the Moldavian Platform

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Popescu, Emilia; Otilia Placinta, Anica; Petruta Constantin, Angela; Toma Danila, Dragos; Borleanu, Felix; Emilian Toader, Victorin; Moldoveanu, Traian

    2016-04-01

    Beyond periodic technical inspections and the monitoring and surveillance of dams' related structures and infrastructure, dam safety imposes some specific seismic requirements. The most important is seismic risk assessment, which can be accomplished by rating dams into seismic risk classes using the theory of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam site (values obtained using probabilistic hazard assessment approaches; Moldovan et al., 2008), the structures' vulnerability, and the downstream risk characteristics (human, economic, historic and cultural heritage, etc.) of the areas that might be flooded in the case of a dam failure. Probabilistic seismic hazard (PSH), vulnerability, and risk studies will be carried out for dams situated in the Moldavian Platform, starting from the Izvorul Muntelui Dam, down the Bistrita and then along the Siret River and their tributaries. The most vulnerable dams will be studied in detail, and flooding maps will be drawn to find the most exposed downstream localities, both for risk assessment studies and for warnings. GIS maps that clearly indicate potentially flooded areas are sufficient for these studies, giving information on the number of inhabitants and goods that may be destroyed; the topography included in geospatial servers is adequate to produce them, and no further studies are necessary for downstream risk assessment. The results will consist of local and regional seismic information, dam-specific characteristics and locations, seismic hazard maps and risk classes for all dam sites (more than 30 dams), inundation maps (for the most vulnerable dams in the region), and the localities possibly affected. 
The final goal of the studies in this paper is to provide local emergency services with warnings of a potential dam failure and ensuing flood following a large earthquake, allowing further public training for evacuation. The work is supported by PNII/PCCA 2013 Project DARING 69/2014, financed by UEFISCDI, Romania. Bureau GJ (2003) "Dams and appurtenant facilities", Earthquake Engineering Handbook, CRC Press, WF Chen and C Scawthorn (eds.), Boca Raton, pp. 26.1-26.47. Bureau GJ and Ballentine GD (2002) "A comprehensive seismic vulnerability and loss assessment of the State of Carolina using HAZUS. Part IV: Dam inventory and vulnerability assessment methodology", 7th National Conference on Earthquake Engineering, July 21-25, Boston, Earthquake Engineering Research Institute, Oakland, CA. Moldovan IA, Popescu E, Constantin A (2008) "Probabilistic seismic hazard assessment in Romania: application for crustal seismic active zones", Romanian Journal of Physics, Vol. 53, Nos. 3-4.

  19. Effective Collaboration Between Scientists and Local Governments to Improve Scientific Communication for Public Safety in Dallas and Irving, Texas

    NASA Astrophysics Data System (ADS)

    Blanpied, M. L.; Perry, S. C.; Carriere, J.; DeShon, H. R.; Oden, K.; Vaz, R.; Williams, R. A.; Stump, B. W.; Hayward, C.; Choy, G. L.; Hoover, S. M.; Mueller, C. S.; LaGrassa, N.; Miller, G.; Osburn, M.

    2016-12-01

    Felt earthquakes have occurred in the Dallas-Fort Worth-Irving area since 2008, raising concern about seismic risks and potential links to petroleum industry activities - and leading to a productive, long-standing interaction between earthquake scientists and local government officials. City staff, including emergency managers, formed the Dallas Irving Earthquake Working Group (DIEWG) in early 2015 to share information, learn about their new hazard, and coordinate public messages and response planning. The DIEWG has held regular meetings that included academic and government experts including scientists from Southern Methodist University (SMU) and the U.S. Geological Survey (USGS). SMU apprised DIEWG of monitoring and research results, and responded to media inquiries. USGS provided information about seismic hazard and the likelihood of damaging earthquakes, and worked with FEMA Regions VI & VIII to provide impact planning scenarios for plausible earthquakes of M4.8 and M5.6. USGS briefed DIEWG before the release of an assessment of the likelihood of damage from natural and induced earthquakes, as local officials needed to understand the information and its implications in order to translate for their constituents. DIEWG has now asked USGS to help to develop tabletop response exercises. Through these interactions, local officials and scientists increased understanding of each other's roles, capabilities and limitations. The interactions have also improved DIEWG members' understanding of earthquake risk and impact, supported hazard mitigation planning, influenced infrastructure and building code decisions, and informed conversations with residents and media. Input from DIEWG has improved scientists' translation of complex information for use in planning, and identified persistent misunderstandings about concepts and terminology that are relevant to many earthquake information products. 
A key aspect of this success has been the repeated personal interaction over time.

  20. Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS

    NASA Astrophysics Data System (ADS)

    Sabuncu, A.; Garagon Dogru, A.; Ozener, H.

    2013-12-01

    Natural hazards are natural phenomena occurring in the Earth system, including geological and meteorological events such as earthquakes, floods, landslides, droughts, fires, and tsunamis. Metropolitan cities are vulnerable to natural hazards due to their population densities, industrial facilities, and properties. The urban layout of megacities is complex, since industrial facilities are interspersed with residential areas. The Marmara region, in northwestern Turkey, has suffered from natural hazards (earthquakes, floods, etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as they provide more efficient and reliable analysis and evaluation of data, and thus better solutions for decision making before, during, and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers, and satellite data are used to understand changes before and after natural hazards. GIS is powerful software for combining different types of digital data. A natural hazard database for the Marmara region provides all of these different types of digital data to users. Proper data collection, processing, and analysis are critical to evaluating and identifying hazards. The natural hazard database allows users to monitor, analyze, and query past and recent disasters in the Marmara region. The long-term aim of this study is to develop a geodatabase and identify the natural hazard vulnerabilities of metropolitan cities.

  1. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    NASA Astrophysics Data System (ADS)

    Lee, Kyungbook; Song, Seok Goo

    2017-09-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  2. Earthquake Catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms~7.0, Io=9), the Lechkhumi-Svaneti earthquake of 1350 (Ms~7.0, Io=9), and the Alaverdi earthquake of 1742 (Ms~6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms~6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms~6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation include Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia, ~25 stations; Azerbaijan, ~35 stations; Armenia, ~14 stations). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw ≥ 4.0 have been carefully examined, and to reduce calculation errors we corrected arrivals from the seismic records. 
We improved the locations of the events and recalculated moment magnitudes to obtain a unified magnitude catalogue of the region. The results will serve as the input for the seismic hazard assessment of the region.

  3. Real-time Position Based Population Data Analysis and Visualization Using Heatmap for Hazard Emergency Response

    NASA Astrophysics Data System (ADS)

    Ding, R.; He, T.

    2017-12-01

    With the increased popularity of mobile applications and services, there has been a growing demand for more advanced mobile technologies that utilize real-time Location Based Services (LBS) data to support natural hazard response efforts. Compared to traditional sources such as the census bureau, which often provide only historical and static data, an LBS service can supply more current data to drive a real-time natural hazard response system that more accurately processes and assesses issues such as population density in areas impacted by a hazard. However, manually preparing or preprocessing the data to suit the needs of a particular application would be time-consuming. This research aims to implement a population heatmap visual analytics system based on real-time data for natural disaster emergency management. The system comprises a three-layered architecture: data collection, data processing, and visual analysis layers. Real-time, location-based data meeting certain aggregation conditions are collected from multiple sources across the Internet, then processed and stored in a cloud-based data store. Parallel computing is utilized to provide fast and accurate access to the pre-processed population data based on criteria such as the disaster event, and to generate a location-based population heatmap as well as other types of visual digital outputs using auxiliary analysis tools. At present, a prototype system has been developed that geographically covers the entire region of China and combines the population heat map with data from the Earthquake Catalogs database. Preliminary results indicate that the generation of dynamic population density heatmaps based on the prototype system has effectively supported rapid earthquake emergency rescue and evacuation efforts, as well as helping responders and decision makers to evaluate and assess earthquake damage. 
Correlation analyses revealed that the aggregation and movement of people depended on various factors, including the time of earthquake occurrence and the location of the epicenter. This research hopes to build upon the success of the prototype system in order to improve and extend it to support the analysis of earthquakes and other types of natural hazard events.
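The data-processing layer described above amounts to aggregating point locations into grid cells whose counts drive the heatmap. A minimal sketch with invented coordinates (the prototype's actual data sources, grid resolution, and epicenter are not specified in the abstract):

```python
# Sketch of the heatmap data-processing step: bin (lat, lon) point
# observations into a regular grid and count points per cell.
from collections import Counter

def bin_points(points, lat0, lon0, cell_deg, n_rows, n_cols):
    """Count points per grid cell; returns {(row, col): count}."""
    grid = Counter()
    for lat, lon in points:
        r = int((lat - lat0) / cell_deg)
        c = int((lon - lon0) / cell_deg)
        if 0 <= r < n_rows and 0 <= c < n_cols:
            grid[(r, c)] += 1
    return grid

# Toy positions near a hypothetical epicenter at 31.0 N, 103.4 E.
pts = [(31.01, 103.41), (31.02, 103.42), (31.30, 103.90), (30.99, 103.39)]
heat = bin_points(pts, lat0=30.9, lon0=103.3, cell_deg=0.1,
                  n_rows=10, n_cols=10)
print(heat)
```

A production system would feed such cell counts, streamed and aggregated in parallel, into a map renderer rather than printing them.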

  4. MARSite: Marmara as a Supersite

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.; Necmioglu, O.; Ergintav, S.; Ozel, A.; Erdik, M. O.

    2013-12-01

    The Marmara Region is one of the most seismically active regions in Turkey and also the most densely populated and fastest-developing part of the country. The region has been affected by destructive earthquakes in the past, and seismic hazard in the Marmara Region has become a great concern, especially after the 1999 Izmit and Duzce earthquakes, which cost 18,000 lives. Recent studies indicate that the region has a great potential to produce an M≥7.0 earthquake within the next 30 years. Hence, a realistic assessment of the earthquake hazard in this area, including Istanbul with more than 15 million inhabitants, is a priority. The MARsite project identifies the Marmara region as a 'Supersite', aggregating on-shore, off-shore, and space-based observations, comprehensive geophysical monitoring, and improved hazard and risk assessments in an integrated set of activities. The MARsite Consortium consists of 18 European research institutions with a long record of scientific history and success, and 3 SMEs, from 7 nations of the Euro-Mediterranean area. MARsite aims to harmonize geological, geophysical, geodetic, and geochemical observations to provide a better view of the post-seismic deformation of the 1999 Izmit earthquake (in addition to the post-seismic signature of previous earthquakes), the loading of submarine and inland active fault segments, and transient pre-earthquake signals related to stress loading under the different tectonic regimes in and around the Marmara Sea. These studies are planned to contribute to high-quality rapid source-mechanism solutions and slip models, early warning, and rapid-response studies. The project outputs will also be adapted to improve various phases of the risk management cycle by creating a link between the scientific community and end users. 
In this context, MARsite will develop novel geo-hazard monitoring instruments, including high-resolution displacement meters, novel borehole instrumentation, and sea-bottom gas emission and heat-flow measurement systems, in association with the relevant industrial sectors and SMEs. The data and results of MARsite will be exploited through the integration of data management practices and coordination with ongoing research infrastructures. A dissemination and public outreach strategy will be developed from an analysis of the target users, and a communication plan will be produced to ensure effective dissemination. MARsite will represent a significant European contribution to the Supersite initiative, and thus to the Global Earth Observation System of Systems (GEOSS); it will lead to a better scientific understanding of the geophysical processes, contribute in-situ data to a unifying e-infrastructure, and reduce our vulnerability to geologic hazards.

  5. Seismic hazard assessment of Oregon highway truck routes.

    DOT National Transportation Integrated Search

    2012-06-01

    This research project developed a seismic risk assessment model along the major truck routes in Oregon. The study adopted the federally developed software tools Risks from Earthquake Damage to Roadway Systems (REDARS2) and HAZUS-MH. The model ...

  6. Maximum Magnitude and Probabilities of Induced Earthquakes in California Geothermal Fields: Applications for a Science-Based Decision Framework

    NASA Astrophysics Data System (ADS)

    Weiser, Deborah Anne

    Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.

  7. Management of Combined Natural Risks - A New Approach: Keynote Address

    NASA Astrophysics Data System (ADS)

    Hanisch, Jörg

    A new attempt is made to illustrate and quantify the relationships between individual natural hazards, their combinations, and human vulnerability to natural hazards. In many catastrophic events, combinations of different natural events substantially aggravate their impact. Earthquakes are frequently associated with heavy landsliding (El Salvador, 2001), and heavy rainstorms can trigger fast-running debris flows, not only floods (as during the Mitch disaster in Central America in 1998). This means that natural hazard maps should show the combinations of different hazards and their genetic relationships. To put this into effect, the individual hazards first have to be assessed and presented as hazard zones (ranked 0 to 3). These hazard zones are then overlain using GIS techniques. In this way, for example, an earthquake-prone area that coincides with an area susceptible to landslides (likewise ranked 0 to 3) can show hazard concentrations of up to 6, obtained simply by adding the individual hazard zone values. To derive the corresponding risk zones, vulnerability maps of human settlements and infrastructure are then overlain on the maps of these combined natural hazards.
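    The overlay arithmetic described above can be sketched as a raster operation: per-hazard zone rankings of 0-3 are summed cell by cell into a combined value of up to 6, which is then intersected with vulnerability. The arrays and the multiplicative risk rule below are illustrative assumptions; the keynote leaves the exact hazard-exposure combination rule open:

```python
# Sketch of the combined-hazard overlay: summing two hazard-zone
# rasters (each ranked 0-3) yields combined values up to 6, which
# are then intersected with a vulnerability raster. All 3x3 grids
# here are invented for illustration.
import numpy as np

earthquake = np.array([[3, 2, 1],
                       [2, 2, 0],
                       [1, 0, 0]])
landslide = np.array([[3, 3, 0],
                      [1, 2, 1],
                      [0, 1, 2]])

combined = earthquake + landslide        # combined hazard, range 0..6

vulnerability = np.array([[2, 0, 0],     # 0..3, e.g. settlement density
                          [1, 3, 0],
                          [0, 0, 1]])

risk = combined * vulnerability          # one simple hazard-x-exposure score
print(combined)
print(risk)
```

In a real GIS workflow the same arithmetic runs as map algebra over co-registered raster layers rather than small NumPy arrays.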

  8. Using a microfossil-based approach to constrain megathrust-induced coseismic land displacement in coastal Oregon, USA

    NASA Astrophysics Data System (ADS)

    Hawkes, A. D.; Horton, B. P.

    2007-05-01

    Paleoseismologists infer the amount of coseismic subsidence during plate-boundary earthquakes from stratigraphic changes in microfossils across sharp peat-mud and peat-sand contacts. However, the use of lithostratigraphic-based reconstructions is associated with a number of limitations, and these become particularly significant when examining low amplitude, short period variations that occur during a plate-boundary earthquake. To address this, paleoecologists working in the coastal zone have recently adopted a transfer- function approach to environmental reconstruction. Continuing subduction of the Juan de Fuca plate beneath the North America plate constitutes a major seismic hazard in the Pacific Northwest. The subduction zone interface presently lacks seismicity. The timing of the last great earthquake along the Cascadia subduction zone (1700AD) is now well refined by Japanese records of an orphan tsunami (no causal earthquake was felt in Japan) that was generated from an earthquake off the Pacific Northwest on the evening of January 26th 1700AD. I will apply the transfer function to modern foraminiferal datasets along coastal Oregon to analyze the fossil record and quantitatively determine the amount of vertical land movement associated with the 1700AD earthquake event. To date, we have collected 7 modern transects totaling 132 samples from the intertidal zone to the upland. We have also collected 9 cores recording the 1700AD earthquake. Furthermore, a 4m vibracore was collected and contains between 3 and 5 potential earthquake horizons. The 1700AD earthquake in the vibracore shows a distinct litho- and biostratigraphical change representing an instantaneous episode of subsidence of approximately 1m. However, development and application of the transfer function to such events will provide quantitative constrained estimates of coseismic land movement. 
More accurate measurements are needed to help modelers develop more realistic simulations and thus better assess earthquake and tsunami hazards. This will enable efficient and effective mitigation planning and preparation to minimize the personal and economic costs associated with such hazards.
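The transfer-function approach described above can be sketched as a simple weighted-averaging (WA) calibration, a standard method in quantitative microfossil-based reconstruction. The training set, taxa, and assemblages below are invented for illustration; real applications use large modern foraminiferal datasets with cross-validated error estimates.

```python
# Minimal weighted-averaging (WA) transfer-function sketch, assuming a
# hypothetical training set: rows = modern samples, columns = foram taxa,
# values = relative abundances, paired with known sample elevations.
import numpy as np

def wa_train(abundances, elevations):
    """Taxon optima = abundance-weighted mean elevation of each taxon."""
    a = np.asarray(abundances, float)
    e = np.asarray(elevations, float)
    return (a * e[:, None]).sum(axis=0) / a.sum(axis=0)

def wa_reconstruct(optima, fossil_abundances):
    """Reconstructed elevation = abundance-weighted mean of taxon optima."""
    f = np.asarray(fossil_abundances, float)
    return (f * optima).sum() / f.sum()

# Toy modern training set: 4 samples x 3 taxa; elevations in m above MSL.
modern = [[0.8, 0.2, 0.0],
          [0.5, 0.4, 0.1],
          [0.1, 0.6, 0.3],
          [0.0, 0.3, 0.7]]
elev = [1.2, 0.9, 0.5, 0.1]

optima = wa_train(modern, elev)
pre = wa_reconstruct(optima, [0.7, 0.3, 0.0])   # pre-earthquake assemblage
post = wa_reconstruct(optima, [0.0, 0.4, 0.6])  # post-earthquake assemblage
print(f"coseismic subsidence ~ {pre - post:.2f} m")
```

Weighted averaging works because each taxon's abundance peaks near its preferred elevation in the tidal frame, so the taxon optima act as elevation indicators when applied to fossil assemblages across an earthquake contact.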

  9. Topographic changes and their driving factors after 2008 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Li, C.; Wang, M.; Xie, J.; Liu, K.

    2017-12-01

The Wenchuan Ms 8.0 Earthquake caused topographic change in the stricken areas because of the formation of numerous coseismic landslides. The emergence of new landslides and debris flows and the movement of loose materials under the driving force of heavy rainfall can further shape the local topography. Dynamic topographic changes in mountainous areas stricken by major earthquakes have a strong linkage to the development and occurrence of secondary disasters. However, little attention has been paid to continuously monitoring mountain environment change after such earthquakes. A digital elevation model (DEM) is the primary representation of the terrain surface. In this study, we extracted DEMs for 2013 and 2015 of a typical mountainous area severely impacted by the 2008 Wenchuan earthquake from ZY-3 stereo pair images, validated by field measurements. Combined with elevation datasets from 2002 and 2010, we quantitatively assessed elevation changes in different years and qualitatively analyzed the spatiotemporal variation of the terrain and mass movement across the study area. The results show that the earthquake-stricken area experienced substantial elevation changes caused by seismic forces and subsequent rainfall. Post-earthquake deposits accumulated mainly in river channels, on mountain ridges, and in deep gullies, which increases the risk of other geo-hazards. Heavy rainfall after the earthquake has become the biggest driver of elevation reduction, overwhelming the elevation increase produced during the earthquake itself. Our study provides a better understanding of the subsequent hazards and risks faced by residents and communities stricken by major earthquakes.
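The DEM-differencing step at the core of this kind of analysis is simple to sketch. The grids and the 0.5 m noise floor below are hypothetical; a real workflow would also handle co-registration, resampling, and DEM error propagation before interpreting the differences.

```python
# Toy sketch of DEM differencing for post-earthquake topographic change,
# assuming two co-registered elevation grids on a common projection and
# resolution (hypothetical values; real inputs would be stereo-derived DEMs).
import numpy as np

def dem_change(dem_before, dem_after, noise_floor=0.5):
    """Return the elevation difference, zeroing changes below a noise floor (m)."""
    diff = np.asarray(dem_after, float) - np.asarray(dem_before, float)
    return np.where(np.abs(diff) >= noise_floor, diff, 0.0)

dem_2010 = np.array([[1500.0, 1510.0], [1490.0, 1505.0]])
dem_2013 = np.array([[1498.2, 1510.1], [1493.5, 1505.0]])

diff = dem_change(dem_2010, dem_2013)
erosion = -diff[diff < 0].sum()      # total lowering (m) over eroded cells
deposition = diff[diff > 0].sum()    # total raising (m) over deposition cells
print(diff)
print(f"erosion {erosion:.1f} m, deposition {deposition:.1f} m")
```

Summing negative and positive cells separately distinguishes erosion (e.g. landslide scars, rainfall-driven lowering) from deposition (e.g. channel and gully infill), which is the contrast the abstract describes.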

  10. Distinguishing megathrust from intraplate earthquakes using lacustrine turbidites (Laguna Lo Encañado, Central Chile)

    NASA Astrophysics Data System (ADS)

    Van Daele, Maarten; Araya-Cornejo, Cristian; Pille, Thomas; Meyer, Inka; Kempf, Philipp; Moernaut, Jasper; Cisternas, Marco

    2017-04-01

One of the main challenges in seismically active regions is differentiating paleo-earthquakes resulting from different fault systems, such as the megathrust versus intraplate faults in subduction settings. Such differentiation is, however, key for hazard assessments based on paleoseismic records. Laguna Lo Encañado (33.7°S; 70.3°W; 2492 m a.s.l.) is located in the Central Chilean Andes, 50 km east of Santiago de Chile, a metropolis with about 7,000,000 inhabitants. During the last century the study area experienced 3 large megathrust earthquakes (1906, 1985 and 2010) and 2 intraplate earthquakes (1945 and 1958) (Lomnitz, 1960). While the megathrust earthquakes cause Modified Mercalli Intensities (MMIs) of VI to VII at the lake (Van Daele et al., 2015), the intraplate earthquakes cause peak MMIs up to IX (Sepúlveda et al., 2008). Here we present a turbidite record of Laguna Lo Encañado going back to 1900 AD. While geophysical data (3.5 kHz subbottom seismic profiles and side-scan sonar data) provide bathymetry and an overview of the sedimentary environment, we study 15 short cores in order to understand the depositional processes resulting in the encountered lacustrine turbidites. All of the mentioned earthquakes triggered turbidites in the lake, all of which are linked to slumps in proximal areas and thus result from mass wasting of the subaquatic slopes. However, turbidites linked to the intraplate earthquakes are additionally covered by turbidites of a finer-grained, more clastic nature. We link the latter to post-seismic erosion of onshore landslides, which need higher MMIs to be triggered than subaquatic mass movements (Howarth et al., 2014). While intraplate earthquakes can cause MMIs up to IX and higher, megathrust earthquakes do not cause sufficiently high MMIs at the lake to trigger voluminous onshore landslides. 
Hence, the presence of these post-seismic turbidites makes it possible to distinguish turbidites triggered by intraplate earthquakes from those triggered by megathrust earthquakes. These findings are an important step forward in the interpretation of lacustrine turbidites in subduction settings, and will eventually improve hazard assessments based on such paleoseismic records in the study area and in other subduction zones. References: Howarth et al., 2014. Lake sediments record high intensity shaking that provides insight into the location and rupture length of large earthquakes on the Alpine Fault, New Zealand. Earth and Planetary Science Letters 403, 340-351. Lomnitz, 1960. A study of the Maipo Valley earthquakes of September 4, 1958. Second World Conference on Earthquake Engineering, Tokyo and Kyoto, Japan, pp. 501-520. Sepúlveda et al., 2008. New Findings on the 1958 Las Melosas Earthquake Sequence, Central Chile: Implications for Seismic Hazard Related to Shallow Crustal Earthquakes in Subduction Zones. Journal of Earthquake Engineering 12, 432-455. Van Daele et al., 2015. A comparison of the sedimentary records of the 1960 and 2010 great Chilean earthquakes in 17 lakes: Implications for quantitative lacustrine palaeoseismology. Sedimentology 62, 1466-1496.

  11. Modified Mercalli Intensity for scenario earthquakes in Evansville, Indiana

    USGS Publications Warehouse

    Cramer, Chris; Haase, Jennifer; Boyd, Oliver

    2012-01-01

    Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the fact that Evansville is close to the Wabash Valley and New Madrid seismic zones, there is concern about the hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake. Earthquake-hazard maps provide one way of conveying such estimates of strong ground shaking and will help the region prepare for future earthquakes and reduce earthquake-caused losses.

  12. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to a situation in which the frequent causal relationships between different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe, or MATRIX, project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms, and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examination of the consequences of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  13. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    USGS Publications Warehouse

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. 
This progress report summarizes the methodology and data used to generate these preliminary maps. For more details about many of the topics in this summary, the reader is referred to the Ph.D. theses of Karadeniz (2007) and Chung (2007).

  14. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.
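For context, the calculation that PSHA performs can be sketched in a few lines: integrate a magnitude-frequency law against a ground-motion model to obtain an annual exceedance rate, then convert that rate to a probability with the Poisson assumption. Every number below (the a- and b-values, the attenuation form, sigma, the source distance) is illustrative, not from any published model; this shows the machinery the article is criticizing, not a usable hazard estimate.

```python
# Toy PSHA sketch: annual exceedance rate of peak ground acceleration (PGA)
# at a site from one source zone, then Poisson probability over a design life.
import math

def gr_rate(m, a=4.0, b=1.0):
    """Cumulative annual rate of events with magnitude >= m (Gutenberg-Richter)."""
    return 10 ** (a - b * m)

def p_exceed_given_event(pga_g, m, r_km, sigma=0.6):
    """P(PGA > pga_g | event): lognormal scatter about a crude median model."""
    ln_median = -4.0 + 1.0 * m - 1.5 * math.log(r_km + 10.0)
    z = (math.log(pga_g) - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))   # upper-tail normal probability

def annual_exceedance(pga_g, r_km=30.0):
    """Sum over 0.5-unit magnitude bins: bin rate x P(exceed | event in bin)."""
    lam = 0.0
    for m in [5.25 + 0.5 * i for i in range(5)]:        # bin centers 5.25..7.25
        rate_bin = gr_rate(m - 0.25) - gr_rate(m + 0.25)
        lam += rate_bin * p_exceed_given_event(pga_g, m, r_km)
    return lam

lam = annual_exceedance(0.2)              # annual rate of PGA >= 0.2 g
p50 = 1.0 - math.exp(-lam * 50.0)         # Poisson probability in 50 years
print(f"annual exceedance rate {lam:.2e}; 50-year probability {p50:.2%}")
```

The Poisson step in the last lines is precisely one of the assumptions the authors dispute: it treats earthquakes as memoryless in time, which conflicts with the physics of stress accumulation and release.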

  15. Current microseismicity and generating faults in the Gyeongju area, southeastern Korea

    NASA Astrophysics Data System (ADS)

    Han, Minhui; Kim, Kwang-Hee; Son, Moon; Kang, Su Young

    2017-01-01

A study of microseismicity in a 15 × 20 km² subregion of Gyeongju, southeastern Korea, establishes a direct link between minor earthquakes and known fault structures. The study area has a complex history of tectonic deformation and has experienced large historic earthquakes, with small earthquakes recorded since the beginning of modern instrumental monitoring. From 5 years of continuously recorded local seismic data, 311 previously unidentified microearthquakes can be reliably located using the double-difference algorithm. These newly discovered events occur in linear streaks that can be spatially correlated with active faults, which could pose a serious hazard to nearby communities. At-risk infrastructure includes the largest industrial park in South Korea, nuclear power plants, and disposal facilities for radioactive waste. The current work suggests that the southern segment of the Yeonil Tectonic Line and segments of the Seokup and Waup Basin boundary faults are active. For areas with high rates of microseismic activity, reliably located hypocenters are spatially correlated with mapped faults; in less active areas, earthquake clusters tend to occur at fault intersections. Microearthquakes in stable continental regions are known to exist, but have been largely ignored in assessments of seismic hazard because their magnitudes are well below the detection thresholds of seismic networks. The total number of locatable microearthquakes could be dramatically increased by lowering the triggering thresholds of network detection algorithms. The present work offers an example of how microearthquakes can be reliably detected and located with advanced techniques. This could make it possible to create a new database to identify subsurface fault geometries and modes of fault movement, which could then be considered in the assessments of seismic hazard in regions where major earthquakes are rare.
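The double-difference idea behind the relocations can be illustrated with a toy residual calculation under an assumed homogeneous velocity model; the real algorithm (e.g. HypoDD) inverts many such residuals over many event pairs and stations to shrink relative location errors.

```python
# Sketch of the double-difference residual: for a pair of nearby events
# recorded at the same station, the difference between observed and
# calculated travel-time differences constrains their relative location.
# Homogeneous-velocity toy, not the full inversion.
import math

V = 6.0  # assumed crustal P-wave velocity, km/s

def ttime(ev, st):
    """Straight-ray travel time from event to station (coordinates in km)."""
    return math.dist(ev, st) / V

def double_difference(ev_i, ev_j, st, t_obs_i, t_obs_j):
    """dr = (t_i - t_j)^obs - (t_i - t_j)^calc for one station."""
    return (t_obs_i - t_obs_j) - (ttime(ev_i, st) - ttime(ev_j, st))

# True events 1 km apart; trial locations wrongly place them 2 km apart.
station = (30.0, 0.0)
true_i, true_j = (0.0, 10.0), (1.0, 10.0)
trial_i, trial_j = (0.0, 10.0), (2.0, 10.0)
obs_i, obs_j = ttime(true_i, station), ttime(true_j, station)

dr = double_difference(trial_i, trial_j, station, obs_i, obs_j)
print(f"double-difference residual: {dr*1000:.1f} ms")
```

A nonzero residual flags the mislocated pair; the inversion perturbs relative positions until such residuals vanish, which is why double-difference catalogs resolve the tight linear streaks described above.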

  16. Neo-deterministic seismic hazard scenarios for India—a preventive tool for disaster mitigation

    NASA Astrophysics Data System (ADS)

    Parvez, Imtiyaz A.; Magrin, Andrea; Vaccari, Franco; Ashish; Mir, Ramees R.; Peresan, Antonella; Panza, Giuliano Francesco

    2017-11-01

    Current computational resources and physical knowledge of the seismic wave generation and propagation processes allow for reliable numerical and analytical models of waveform generation and propagation. From the simulation of ground motion, it is easy to extract the desired earthquake hazard parameters. Accordingly, a scenario-based approach to seismic hazard assessment has been developed, namely the neo-deterministic seismic hazard assessment (NDSHA), which allows for a wide range of possible seismic sources to be used in the definition of reliable scenarios by means of realistic waveforms modelling. Such reliable and comprehensive characterization of expected earthquake ground motion is essential to improve building codes, particularly for the protection of critical infrastructures and for land use planning. Parvez et al. (Geophys J Int 155:489-508, 2003) published the first ever neo-deterministic seismic hazard map of India by computing synthetic seismograms with input data set consisting of structural models, seismogenic zones, focal mechanisms and earthquake catalogues. As described in Panza et al. (Adv Geophys 53:93-165, 2012), the NDSHA methodology evolved with respect to the original formulation used by Parvez et al. (Geophys J Int 155:489-508, 2003): the computer codes were improved to better fit the need of producing realistic ground shaking maps and ground shaking scenarios, at different scale levels, exploiting the most significant pertinent progresses in data acquisition and modelling. Accordingly, the present study supplies a revised NDSHA map for India. The seismic hazard, expressed in terms of maximum displacement (Dmax), maximum velocity (Vmax) and design ground acceleration (DGA), has been extracted from the synthetic signals and mapped on a regular grid over the studied territory.
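Extracting parameters such as Dmax, Vmax, and DGA from a synthetic signal reduces to taking peak absolute values of the displacement waveform and its time derivatives. The waveform below is a made-up damped sinusoid standing in for an NDSHA synthetic seismogram; only the extraction step is being illustrated.

```python
# Sketch of NDSHA-style parameter extraction from one synthetic seismogram:
# peak displacement (Dmax), peak velocity (Vmax), and a peak-acceleration
# proxy for design ground acceleration (DGA). Toy waveform, not a synthetic
# computed from a real structural model.
import math

dt = 0.01                                   # sample interval, s
t = [i * dt for i in range(2000)]           # 20 s record
disp = [0.05 * math.exp(-0.3 * x) * math.sin(2 * math.pi * 0.8 * x) for x in t]

def differentiate(series, step):
    """Central differences in the interior, one-sided at the ends."""
    n = len(series)
    out = [0.0] * n
    for i in range(1, n - 1):
        out[i] = (series[i + 1] - series[i - 1]) / (2 * step)
    out[0] = (series[1] - series[0]) / step
    out[-1] = (series[-1] - series[-2]) / step
    return out

vel = differentiate(disp, dt)               # displacement -> velocity
acc = differentiate(vel, dt)                # velocity -> acceleration

dmax = max(abs(x) for x in disp)
vmax = max(abs(x) for x in vel)
dga = max(abs(x) for x in acc) / 9.81       # peak acceleration in g
print(f"Dmax {dmax*100:.1f} cm, Vmax {vmax*100:.1f} cm/s, DGA {dga:.3f} g")
```

In the actual NDSHA workflow these peaks are computed for complete synthetic wavefields at every grid node and then mapped over the territory.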

  17. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small; 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which a unit's expected earthquake performance (and consequently its insurance premium) can be assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP could in the future contribute to the control of construction through differentiation of premiums on the basis of earthquake vulnerability.
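As a back-of-envelope check, the quoted figures are internally consistent: the 0.13% average rate applied to a sum insured near the 90,000 USD cap yields an annual premium of roughly US$90. The sum insured below is an assumption for illustration; the rate and the 2% deductible are taken from the text.

```python
# Back-of-envelope check of the TCIP figures quoted above: a flat premium
# rate applied to the sum insured, with a percentage deductible on claims.
def tcip_premium(sum_insured_usd, rate=0.0013):
    """Annual premium at the quoted average rate of 0.13%."""
    return sum_insured_usd * rate

def claim_payout(loss_usd, sum_insured_usd, deductible=0.02):
    """Payout after a 2% (of sum insured) deductible, capped at the sum insured."""
    payable = loss_usd - deductible * sum_insured_usd
    return max(0.0, min(payable, sum_insured_usd))

sum_insured = 70_000  # USD; hypothetical 100 m2 unit below the 90,000 cap
print(f"annual premium: ${tcip_premium(sum_insured):.0f}")
print(f"payout on $10,000 loss: ${claim_payout(10_000, sum_insured):.0f}")
```

The low deductible is the point the authors criticize: small losses well below 2% of the sum insured generate no payout, but almost any moderate damage does, which keeps the effective rate low relative to the exposure.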

  18. Seismic and tsunami hazard in Puerto Rico and the Virgin Islands

    USGS Publications Warehouse

    Dillon, William P.; Frankel, Arthur D.; Mueller, Charles S.; Rodriguez, Rafael W.; ten Brink, Uri S.

    1999-01-01

Executive Summary. Puerto Rico and the Virgin Islands are located at an active plate boundary between the North American plate and the northeast corner of the Caribbean plate. The region was subject in historical times to large magnitude earthquakes and devastating tsunamis. A major downward tilt of the sea floor north of Puerto Rico and the Virgin Islands, large submarine rockslides, and an unusually large negative gravity anomaly are also indicative of a tectonically active region. Scientists have so far been unable to explain the deformation of this region in a coherent and predictive picture, as has been done in California, and this has hampered their ability to assess seismic and tsunami hazards in the region. The NE corner of the Caribbean is unique among the seismically active regions of the United States in that it is mostly covered by water. This fact presents an additional challenge for seismic and tsunami hazard assessment and mitigation. The workshop, convened in San Juan on March 23-24, 1999, was "historic" in that it brought together for the first time a broad spectrum of scientists, engineers, and public and private sector officials who deal with such diverse questions as tectonic models, probabilistic assessment of seismic hazard, prediction of tsunami runup, strong ground motion, building codes, stability of man-made structures, and the public’s preparedness for natural disasters. It was an opportunity for all the participants to find out how their own activity fit into the broad picture of science and how it aids society in hazard assessment and mitigation. In addition, the workshop was offered as a continuing education course at the Colegio de Ingenieros y Agrimensores de Puerto Rico, which assured a rapid dissemination of the results to the local community. 
A news conference which took place during the workshop alerted the public to the efforts of the USGS, other Federal agencies, the Commonwealth of Puerto Rico, universities, and the private sector. During the first day of the workshop, participants from universities, federal institutions, and consulting firms in Puerto Rico, the Virgin Islands, the continental U.S., the Dominican Republic, and Europe reviewed the present state of knowledge, including a review and discussion of present plate models, recent GPS and seismic reflection data, seismicity, paleoseismology, and tsunamis. The state of earthquake/tsunami studies in Puerto Rico was presented by several faculty members from the University of Puerto Rico at Mayaguez. A preliminary seismic hazard map was presented by the USGS, and previous hazard maps and economic loss assessments were considered. During the second day, the participants divided into working groups and prepared specific recommendations for future activities in the region along the six topics below. Highlights of these recommended activities are:
Marine geology and geophysics – Acquire deep-penetration seismic reflection and refraction data, deploy temporary ocean-bottom seismometer arrays to record earthquakes, collect high-resolution multibeam bathymetry and side-scan sonar data of the region, in particular the near-shore region, and conduct focused high-resolution seismic studies around faults. Determine slip rates of specific offshore faults. Assemble a GIS database for available marine geological and geophysical data.
Paleoseismology and active faults – Field reconnaissance aimed at identifying Quaternary faults and determining their paleoseismic chronology and slip rates, as well as identifying and dating paleoliquefaction features from large earthquakes. Quaternary mapping of marine terraces, fluvial terraces and basins, beach ridges, etc., to establish a framework for understanding neotectonic deformation of the island. Interpretation of aerial photography to identify possible Quaternary faults.
Earthquake seismology – Determine an empirical seismic attenuation function using observations from local seismic networks and recently installed broadband stations. Evaluate and complete existing earthquake catalogs from local networks and regional stations. Transcribe the pre-1991 network data from 9-track tape onto more stable archival media. Calibrate instruments of local networks. Use GPS measurements to constrain deformation rates used in seismic-hazard maps.
Engineering – Prepare liquefaction susceptibility maps for the urban areas. Update and improve databases for types of site conditions. Collect site-effect observations and near-surface geophysical measurements for future local (urban-area) hazard maps. Expand the number of instruments in the strong motion program. Develop fragility curves for Puerto Rico construction types and details, and carry out laboratory testing on selected types of mass-produced construction. Consider tsunami design in shoreline construction projects.
Tsunami hazard – Extract tsunami observations from archives and develop a Caribbean historical tsunami database. Analyze prehistoric tsunami deposits. Collect accurate, up-to-date near-shore topography and bathymetry for accurate inundation models. Prepare tsunami flooding and evacuation maps. Establish a Caribbean Tsunami Warning System for Puerto Rico and the Virgin Islands. Evaluate local, regional, national, and global seismic networks and equipment, and their role in a tsunami warning system.
Societal concerns – Prepare warning messages, protocols, and evacuation routes for earthquake, tsunami, and landslide hazards for Puerto Rico and the U.S. Virgin Islands. Advocate enforcement of existing building codes. Prepare non-technical hazard assessment maps for political and educational uses. Raise the awareness of potentially affected populations through presentations at elementary schools, production of a tsunami video, and distribution of earthquake preparedness manuals in newspaper supplements. Promote partnerships at the state and federal level for long-term earthquake and tsunami hazard mitigation. This partnership should also include the private sector, such as the insurance industry, telecommunication companies, and the engineering community.
The following reports of the various working groups are the cumulative recommendations of the community of scientists, engineers, and public officials who participated in the workshop. The list of participants and the workshop’s agenda are given in the appendix. The reports are: Marine Geology and Geophysics Working Group; Paleoseismology and Active Faults Working Group; Joint Working Group for Earthquake Seismology and Engineering; Tsunami Working Group; Societal Concerns Working Group.

  19. Earthquake and Tsunami planning, outreach and awareness in Humboldt County, California

    NASA Astrophysics Data System (ADS)

    Ozaki, V.; Nicolini, T.; Larkin, D.; Dengler, L.

    2008-12-01

Humboldt County has the longest coastline in California and is one of the most seismically active areas of the state. It is at risk from earthquakes located on and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ), other regional fault systems, and from distant sources elsewhere in the Pacific. In 1995 the California Division of Mines and Geology published the first earthquake scenario to include both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of representatives from government agencies, tribes, service groups, academia and the private sector from the three northern coastal California counties, was formed in 1996 to coordinate and promote earthquake and tsunami hazard awareness and mitigation. The RCTWG and its member agencies have sponsored a variety of projects, including education/outreach products and programs, tsunami hazard mapping, and signage and siren planning, and have sponsored an Earthquake-Tsunami Education Room at the Humboldt County fair for the past eleven years. Three editions of Living on Shaky Ground, an earthquake-tsunami preparedness magazine for California's North Coast, have been published since 1993, and a fourth is due to be published in fall 2008. In 2007, Humboldt County was the first region in the country to participate in a tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD, and the first area in California to conduct a full-scale tsunami evacuation drill. The County has conducted numerous multi-agency, multi-discipline coordinated exercises using the county-wide tsunami response plan. Two Humboldt County communities were recognized as TsunamiReady by the National Weather Service in 2007. Over 300 tsunami hazard zone signs have been posted in Humboldt County since March 2008. 
Six assessment surveys from 1993 to 2006 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county, and additional surveys have tracked public awareness and tourist concerns about tsunami hazard signs. Over the thirteen-year period covered by the surveys, the percentage of respondents with houses secured to foundations increased from 58 to 80 percent, those aware of a local tsunami hazard increased from 51 to 73 percent, and those knowing what the Cascadia subduction zone is increased from 16 to 42 percent.

  20. A software framework for assessing the resilience of drinking water systems to disasters with an example earthquake case study

    DOE PAGES

    Klise, Katherine A.; Bynum, Michael; Moriarty, Dylan; ...

    2017-07-07

Water utilities are vulnerable to a wide variety of human-caused and natural disasters. The Water Network Tool for Resilience (WNTR) is a new open-source Python package designed to help water utilities investigate the resilience of water distribution systems to hazards and evaluate resilience-enhancing actions. In this paper, the WNTR modeling framework is presented and a case study is described that uses WNTR to simulate the effects of an earthquake on a water distribution system. The case study illustrates that the severity of damage is not only a function of system integrity and earthquake magnitude, but also of the available resources and repair strategies used to return the system to normal operating conditions. While earthquakes are particularly concerning since buried water distribution pipelines are highly susceptible to damage, the software framework can be applied to other types of hazards, including power outages and contamination incidents.
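The case-study conclusion, that damage severity also depends on available resources and repair strategy, can be illustrated with a pure-Python toy that integrates leak volume under different repair orders. The actual study uses the WNTR package; the leak rates, crew count, and repair duration below are invented for illustration.

```python
# Toy repair-scheduling sketch: the same set of pipe breaks produces very
# different cumulative leakage depending on the order in which limited
# repair crews fix them. All numbers are hypothetical.
def leak_hours(leak_rates, order, crews=2, repair_hours=8.0):
    """Integrated leakage (rate x hours) until all repairs are finished."""
    queue = [leak_rates[i] for i in order]
    total = 0.0
    while queue:
        total += sum(queue) * repair_hours  # every unrepaired pipe leaks this round
        queue = queue[crews:]               # crews complete the first `crews` repairs
    return total

breaks = [10.0, 1.0, 8.0, 0.5, 6.0]   # pipe leak rates (L/s), hypothetical
worst_first = sorted(range(len(breaks)), key=lambda i: -breaks[i])
as_found = list(range(len(breaks)))

t_worst = leak_hours(breaks, worst_first)
t_found = leak_hours(breaks, as_found)
print(f"worst-first: {t_worst:.0f} rate-hours, as-found: {t_found:.0f} rate-hours")
```

Prioritizing the largest breaks reduces the integrated loss, which is the kind of repair-strategy comparison the WNTR framework is designed to run against full hydraulic simulations rather than this scalar proxy.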

  2. Potential for larger earthquakes in the East San Francisco Bay Area due to the direct connection between the Hayward and Calaveras Faults

    NASA Astrophysics Data System (ADS)

    Chaussard, E.; Bürgmann, R.; Fattahi, H.; Nadeau, R. M.; Taira, T.; Johnson, C. W.; Johanson, I.

    2015-04-01

The Hayward and Calaveras Faults, two strike-slip faults of the San Andreas System located in the East San Francisco Bay Area, are commonly considered independent structures for seismic hazard assessment. We use Interferometric Synthetic Aperture Radar (InSAR) to show that surface creep on the Hayward Fault continues 15 km farther south than previously known, revealing new potential for rupture and damage south of Fremont. The extended trace of the Hayward Fault, also illuminated by shallow repeating micro-earthquakes, documents a surface connection with the Calaveras Fault. At depths greater than 3-5 km, repeating micro-earthquakes located 10 km north of the surface connection highlight the 3-D wedge geometry of the junction. Our new model of the Hayward and Calaveras Faults argues that they should be treated as a single system with potential for earthquake ruptures generating events with magnitudes greater than 7, posing a higher seismic hazard to the East San Francisco Bay Area than previously considered.
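A rough scaling argument shows why a connected Hayward-Calaveras rupture raises the expected magnitude: the Wells and Coppersmith (1994) strike-slip regression relates magnitude to surface rupture length L (km) as M = 5.16 + 1.12 log10(L). The fault lengths below are illustrative round numbers, not measured values from this study.

```python
# Magnitude-vs-rupture-length check for a joint Hayward-Calaveras rupture,
# using the Wells & Coppersmith (1994) strike-slip regression. The lengths
# are illustrative assumptions.
import math

def wc94_strike_slip(L_km):
    """Moment magnitude from surface rupture length, strike-slip regression."""
    return 5.16 + 1.12 * math.log10(L_km)

hayward, calaveras = 70.0, 120.0   # approximate fault lengths, km (assumed)
m_single = wc94_strike_slip(hayward)
m_joint = wc94_strike_slip(hayward + calaveras)
print(f"Hayward alone: M {m_single:.1f}; joint rupture: M {m_joint:.1f}")
```

Because magnitude grows with the logarithm of rupture length, linking the two faults adds a few tenths of a magnitude unit, consistent with the abstract's "greater than 7" for the combined system.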

  3. Earthquake hypocenters and focal mechanisms in central Oklahoma reveal a complex system of reactivated subsurface strike-slip faulting

    USGS Publications Warehouse

    McNamara, Daniel E.; Benz, Harley M.; Herrmann, Robert B.; Bergman, Eric A.; Earle, Paul S.; Holland, Austin F.; Baldwin, Randy W.; Gassner, A.

    2015-01-01

    The sharp increase in seismicity over a broad region of central Oklahoma has raised concern regarding the source of the activity and its potential hazard to local communities and energy industry infrastructure. Since early 2010, numerous organizations have deployed temporary portable seismic stations in central Oklahoma in order to record the evolving seismicity. In this study, we apply a multiple-event relocation method to produce a catalog of 3,639 central Oklahoma earthquakes from late 2009 through 2014. Regional moment tensor (RMT) source parameters were determined for 195 of the largest and best-recorded earthquakes. Combining the RMT results with the relocated seismicity enabled us to determine the length, depth, and style of faulting occurring on reactivated subsurface fault systems. Results show that the majority of earthquakes occur on near-vertical, optimally oriented (NE-SW and NW-SE) strike-slip faults in the shallow crystalline basement. These are the first-order observations required to assess the potential hazards of individual faults in Oklahoma.

  4. Assessment of tsunami hazard for coastal areas of Shandong Province, China

    NASA Astrophysics Data System (ADS)

    Feng, Xingru; Yin, Baoshu

    2017-04-01

    Shandong Province is located on the east coast of China and has a coastline of about 3100 km. Only a few tsunami events are recorded in the history of Shandong Province, but tsunami hazard assessment is still necessary because of the rapid economic development and increasing population of the area. The objective of this study was to evaluate the potential danger posed by tsunamis to Shandong Province. A numerical simulation method was adopted to assess the tsunami hazard for the coastal areas of the province. The Cornell multi-grid coupled tsunami numerical model (COMCOT) was used, and its efficacy was verified by comparison with three historical tsunami events; the simulated maximum tsunami wave height agreed well with the observational data. Based on previous studies and statistical analyses, multiple earthquake scenarios in eight seismic zones were designed, with magnitudes set to the potential maximum values. The tsunamis they induced were then simulated using the COMCOT model to investigate their impact on the coastal areas of Shandong Province. The numerical results showed that the maximum tsunami wave height, caused by the earthquake scenario located in the sea area of the Mariana Islands, could reach up to 1.39 m off the eastern coast of Weihai city. Tsunamis from the seismic zones of the Bohai Sea, Okinawa Trough, and Manila Trench could also reach heights of >1 m in some areas, meaning that earthquakes in these zones should not be ignored. The inundation hazard was distributed primarily in some northern coastal areas near Yantai and southeastern coastal areas of the Shandong Peninsula. When considering both the magnitude and arrival time of tsunamis, it is suggested that greater attention be paid to earthquakes that occur in the Bohai Sea.
In conclusion, the tsunami hazard facing the coastal area of Shandong Province is not very serious; however, disasters could occur if such events coincided with spring tides or other extreme oceanic conditions. The results of this study will be useful for the design of coastal engineering projects and the establishment of a tsunami warning system for Shandong Province.
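    While inundation requires a full shallow-water model such as COMCOT, arrival times for distant sources can be sanity-checked with linear long-wave theory, in which the tsunami speed is c = sqrt(g*h) for water depth h. A minimal sketch (the path segments below are invented for illustration, not taken from the study):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def travel_time_hours(segments):
    # Rough tsunami arrival-time estimate: sum the transit time of each
    # (distance_km, depth_m) segment along the propagation path at the
    # long-wave speed sqrt(g*h). An order-of-magnitude check only, not a
    # substitute for a COMCOT-style simulation.
    seconds = sum(1000.0 * d_km / math.sqrt(G * h) for d_km, h in segments)
    return seconds / 3600.0

# e.g. 2000 km of 4000 m deep ocean followed by 300 km of 50 m shelf
t = travel_time_hours([(2000, 4000.0), (300, 50.0)])
```

    The same arithmetic shows why nearby shallow basins matter: short propagation distances mean short warning times, consistent with the suggestion to pay particular attention to Bohai Sea sources.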

  5. Compiling an earthquake catalogue for the Arabian Plate, Western Asia

    NASA Astrophysics Data System (ADS)

    Deif, Ahmed; Al-Shijbi, Yousuf; El-Hussain, Issa; Ezzelarab, Mohamed; Mohamed, Adel M. E.

    2017-10-01

    The Arabian Plate is surrounded by regions of relatively high seismicity. Accounting for this seismicity is of great importance for seismic hazard and risk assessments, seismic zoning, and land use. In this study, a homogeneous moment-magnitude (Mw) earthquake catalogue for the Arabian Plate is provided. The catalogue spatially covers the entire Arabian Peninsula and neighboring areas, including all earthquake sources that can generate substantial hazard for the Arabian Plate mainland. It extends in time from 19 to 2015, with a total of 13,156 events, of which 497 are historical. Four polygons covering the entire Arabian Plate were delineated, and different data sources, including special studies and local, regional, and international catalogues, were used to compile the earthquake catalogue. Moment magnitudes (Mw) provided by the original sources were given the highest priority among magnitude types and introduced into the catalogue with their references. Earthquakes reported on scales other than Mw were converted to this scale using empirical relationships derived in the current study or in previous studies. The catalogues of the four polygons were merged into two comprehensive earthquake catalogues covering the historical and instrumental periods. Duplicate events were identified and discarded. The resulting catalogue was declustered so as to contain only independent events and was investigated for completeness with time over different magnitude ranges.
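    Magnitude homogenization of this sort can be sketched as follows. The coefficients shown are the global regressions of Scordilis (2006), used here only as stand-ins; the study derives its own region-specific conversion relationships:

```python
def to_mw(magnitude, scale):
    # Convert a reported magnitude to moment magnitude Mw. Native Mw is
    # kept unchanged (highest priority, as in the catalogue); body-wave (mb)
    # and surface-wave (Ms) magnitudes use example global regressions from
    # Scordilis (2006), each valid over a stated magnitude range.
    if scale == "Mw":
        return magnitude
    if scale == "mb" and 3.5 <= magnitude <= 6.2:
        return 0.85 * magnitude + 1.03
    if scale == "Ms":
        if 3.0 <= magnitude <= 6.1:
            return 0.67 * magnitude + 2.07
        if 6.2 <= magnitude <= 8.2:
            return 0.99 * magnitude + 0.08
    raise ValueError(f"no conversion available for {scale} {magnitude}")
```

    Applying such piecewise conversions event by event, while keeping native Mw values untouched, is what yields a homogeneous Mw catalogue suitable for recurrence analysis.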

  6. The Effects of the Passage of Time from the 2011 Tohoku Earthquake on the Public’s Anxiety about a Variety of Hazards

    PubMed Central

    Nakayachi, Kazuya; Nagaya, Kazuhisa

    2016-01-01

    This research investigated whether the Japanese people's anxiety about a variety of hazards, including earthquakes and nuclear accidents, has changed over time since the Tohoku Earthquake in 2011. Data from three nationwide surveys conducted in 2008, 2012, and 2015 were compared to examine changes in societal levels of anxiety toward 51 types of hazards. The same two-phase stratified random sampling method was used to create the list of participants in each survey. The results showed that anxiety about earthquakes and nuclear accidents increased for a time after the Tohoku Earthquake and then decreased over a four-year period with no severe earthquakes or nuclear accidents. It was also revealed that the anxiety level for some hazards other than earthquakes and nuclear accidents had decreased ten months after the earthquake and remained unchanged four years later. Ironically, therefore, a major disaster might decrease public anxiety in general, at least for several years. PMID:27589780

  7. Earthquake Hazard Assessment: Basics of Evaluation

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or, in a few rare cases, centuries). Evidently, all this complicates reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic in the frame of the most popular objectivist viewpoint on probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s, for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management. 
These basics of SHA evaluation are exemplified briefly with a few examples, which are analysed in more detail in a poster of session NH4.7.
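    The Error Diagram evaluation works roughly as follows: sweep an alarm threshold over hazard-scored space-time cells and trace the fraction of alerted volume (tau) against the rate of failures-to-predict (nu); a skill-free random guess lies on the diagonal nu = 1 - tau. A minimal sketch (cell scores, weights, and event counts are invented for illustration):

```python
def molchan_curve(cells):
    # cells: list of (hazard_score, weight, n_events), where the weights
    # measure each cell's share of the space-time volume (e.g. under the
    # Seismic Roulette null hypothesis) and sum to 1. Sorting by declining
    # score and sweeping the alarm threshold yields (tau, nu) points;
    # useful skill shows as points well below the diagonal nu = 1 - tau.
    total = sum(n for _, _, n in cells)
    pts, tau, hits = [(0.0, 1.0)], 0.0, 0
    for _, weight, n in sorted(cells, key=lambda c: -c[0]):
        tau += weight
        hits += n
        pts.append((tau, 1.0 - hits / total))
    return pts

pts = molchan_curve([(0.9, 0.25, 3), (0.5, 0.25, 1),
                     (0.1, 0.25, 0), (0.0, 0.25, 0)])
```

    Comparing the observed (tau, nu) set against the distribution from the same number of random-guess trials is what quantifies the method's effectiveness relative to the cost-benefit function chosen.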

  8. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main strength of this approach is that it is presently the standard for the majority of countries, but it has weak points, particularly regarding the physical description of the earthquake phenomenon. Factors that can significantly influence the expected motion at a site, such as site effects and source characteristics like the duration of strong motion and directivity, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation, and soil parameters. We compare the two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes below 6. We focus on sites prone to liquefaction because of their soil mechanical parameters and water table levels. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions: the looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant to the choice of approach. 
The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billion euros, shows that the geological and geophysical investigations necessary for a reliable deterministic hazard evaluation are largely justified.
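    The Poissonian occurrence assumption underlying PSHA leads to the familiar exceedance-probability arithmetic, sketched below; the 10%-in-50-years level used by many seismic codes corresponds to a return period of about 475 years:

```python
import math

def poisson_exceedance(annual_rate, years):
    # Probability of at least one exceedance in `years`, under the
    # Poissonian occurrence model that underlies standard PSHA.
    return 1.0 - math.exp(-annual_rate * years)

def return_period(prob, years):
    # Mean return period implied by an exceedance probability over `years`.
    return -years / math.log(1.0 - prob)

rp = return_period(0.10, 50)  # the common 10%-in-50-years design level
```

    The deterministic alternative discussed in the abstract sidesteps this rate arithmetic entirely, replacing it with scenario ground motions from a physical source and propagation model.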

  9. RAPID-N: Assessing and mapping the risk of natural-hazard impact at industrial installations

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations can have major consequences due to the potential for release of hazardous materials, fires and explosions. Effective Natech risk reduction requires the identification of areas where this risk is high. However, recent studies have shown that there are hardly any methodologies and tools that would allow authorities to identify these areas. To work towards closing this gap, the European Commission's Joint Research Centre has developed the rapid Natech risk assessment and mapping framework RAPID-N. The tool, which is implemented in an online web-based environment, is unique in that it contains all functionalities required for running a full Natech risk analysis simulation (natural hazard severity estimation, equipment damage probability and severity calculation, and modeling of the consequences of loss-of-containment scenarios) and for visualizing its results. The outputs of RAPID-N are risk summary reports and interactive risk maps that can be used for decision making. Currently, the tool focuses on Natech risk due to earthquakes at industrial installations, but it will be extended in the near future to also analyse and map Natech risk due to floods. RAPID-N is available at http://rapidn.jrc.ec.europa.eu. This presentation will discuss the results of case-study calculations performed for selected flammable and toxic substances to test the capabilities of RAPID-N for both single- and multi-site earthquake Natech risk assessment. For this purpose, an Istanbul earthquake scenario provided by the Turkish government was used. The results of the exercise show that RAPID-N is a valuable decision-support tool that assesses the Natech risk and maps the consequence end-point distances. 
These end-point distances are currently defined by 7 kPa overpressure for vapour cloud explosions, 2nd-degree burns for pool fires (equivalent to a heat radiation of 5 kW/m2 for 40 s), or the ERPG-2 concentration for atmospheric dispersion of toxic substances.

  10. An interview with Karl Steinbrugge

    USGS Publications Warehouse

    Spall, H.

    1985-01-01

    He has served on numerous national and international committees on earthquake hazards and is now a consulting structural engineer specializing in earthquake hazard evaluation. At present, he is chairman of an independent panel of the Federal Emergency Management Agency that is reviewing the National Earthquake Hazards Reduction Program. Henry Spall recently asked Steinbrugge some questions about his long career. 

  11. Long Term RST Analyses of TIR Satellite Radiances in Different Geotectonic Contexts: Results and Implications for a Time-Dependent Assessment of Seismic Hazard (t-DASH)

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Armandi, B.; Coviello, I.; Eleftheriou, A.; Filizzola, C.; Genzano, N.; Lacava, T.; Lisi, M.; Paciello, R.; Pergola, N.; Satriano, V.; Vallianatos, F.

    2014-12-01

    A large body of scientific literature documents the appearance of anomalous space-time patterns of geophysical parameters, measured from days to weeks before earthquake occurrence. Nevertheless, up to now no single measurable parameter and no observational methodology has proven sufficiently reliable and effective for the implementation of an operational earthquake prediction system. In this context, the PRE-EARTHQUAKES EU-FP7 project (www.pre-earthquakes.org) investigated the extent to which the combined use of different observations/parameters, together with the refinement of data analysis methods, can reduce false alarm rates and improve the reliability and precision (in the space-time domain) of predictions. Among the different parameters/methodologies proposed to provide useful information for earthquake prediction, a statistical approach named RST (Robust Satellite Technique) has been used since 2001 to identify space-time fluctuations of the Earth's emitted thermal infrared (TIR) radiation observed from satellite in seismically active regions. In this paper, RST-based long-term analyses of TIR satellite records collected by MSG/SEVIRI over European regions (Italy and Greece) and by GOES/IMAGER over an American region (California) are presented. The enhanced potential of the approach when applied in the framework of a time-Dependent Assessment of Seismic Hazard (t-DASH) system continuously integrating independent observations is also discussed.
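    The core of an RST-style analysis is a per-pixel comparison of the current TIR signal with multi-year reference fields built from homogeneous (same month, same acquisition hour) satellite records. The following is a deliberately simplified stand-in for that anomaly index, computed on a single pixel's time series; the operational RST scheme also removes a spatial average and screens for clouds:

```python
from statistics import mean, stdev

def rst_anomalies(series, threshold=2.0):
    # Flag RST-style TIR anomalies: values exceeding the long-term mean by
    # `threshold` standard deviations of the historical variability. This is
    # a toy, per-pixel simplification of the approach, for illustration only.
    m, s = mean(series), stdev(series)
    return [i for i, v in enumerate(series) if (v - m) / s > threshold]
```

    The long-term MSG/SEVIRI and GOES/IMAGER analyses in the paper serve exactly to establish how often such anomalies occur with no earthquake, which is what controls the false-alarm rate in a t-DASH setting.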

  12. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach

    PubMed Central

    Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.

    2014-01-01

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514

  13. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    PubMed

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  14. 44 CFR 361.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.2 Definitions. Cash Contribution means the State cash... to States under this section. They include specific activities or projects related to earthquake...

  15. 44 CFR 361.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.2 Definitions. Cash Contribution means the State cash... to States under this section. They include specific activities or projects related to earthquake...

  16. 44 CFR 361.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.2 Definitions. Cash Contribution means the State cash... to States under this section. They include specific activities or projects related to earthquake...

  17. 44 CFR 361.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.2 Definitions. Cash Contribution means the State cash... to States under this section. They include specific activities or projects related to earthquake...

  18. San Miguel Volcanic Seismic and Structure in Central America: Insight into the Physical Processes of Volcanoes

    NASA Astrophysics Data System (ADS)

    Patlan, E.; Velasco, A.; Konter, J. G.

    2010-12-01

    The San Miguel volcano lies near the city of San Miguel, El Salvador (13.43°N, 88.26°W). San Miguel, an active stratovolcano, presents a significant natural hazard for the city. In general, the internal state and activity of a volcano remain important components of understanding volcanic hazard. The main technology for addressing volcanic hazards and processes is the analysis of data collected from deployments of seismic sensors that record ground motion. Six UTEP seismic stations were deployed around San Miguel volcano from 2007-2008 to define the magma chamber and assess the seismic and volcanic hazard. We utilize these data to develop images of the earth structure beneath the volcano, to study volcanic processes by identifying different sources, and to investigate the role of earthquakes and faults in controlling those processes. We initially locate events using automated routines and focus on analyzing local events. We then relocate each seismic event by hand-picking P-wave arrivals, and later refine these picks using waveform cross-correlation. Using a double-difference earthquake location algorithm (HypoDD), we identify a set of earthquakes that align vertically beneath the edifice of the volcano, suggesting that we have identified a magma conduit feeding the volcano. We also apply a double-difference earthquake tomography approach (tomoDD) to investigate the volcano's plumbing system. Our preliminary results show the extent of the magma chamber, which also aligns with some horizontal seismicity. Overall, this volcano is very active and presents a significant hazard to the region.
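    The double-difference idea behind HypoDD can be shown in miniature: for a pair of nearby events recorded at a common station, the residual is the observed differential arrival time minus the differential time predicted by the velocity model, and relative locations are adjusted to minimize these residuals over all pairs and stations. A toy version with a homogeneous velocity model (HypoDD itself uses layered 1-D models and iterative weighted least squares; the 5.5 km/s value is an assumption for illustration):

```python
import math

V_P = 5.5  # km/s, assumed uniform P-wave speed for this toy example

def tt(event, station):
    # Straight-ray P travel time in a homogeneous medium; event and
    # station are (x, y, z) coordinates in km.
    return math.dist(event, station) / V_P

def double_difference(ev_i, ev_j, station, obs_i, obs_j):
    # HypoDD-style residual for one event pair at one station:
    # (observed differential time) - (calculated differential time).
    # Because common path effects cancel, minimizing these residuals
    # sharpens the events' positions relative to each other.
    return (obs_i - obs_j) - (tt(ev_i, station) - tt(ev_j, station))
```

    When the observed times match the model exactly, the residual is zero; systematic nonzero residuals drive the relocation, which is how the vertically aligned conduit seismicity emerges from an initially diffuse cloud.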

  19. Transparent Global Seismic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen

    2013-04-01

    Vulnerability to earthquakes is increasing, yet advanced reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, that integrates all above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and to share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and data on their vulnerability, for loss assessment around the globe. 
    Furthermore, for a true integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits of different risk management measures. The following global data, models and methodologies will be available in the platform; some will be released to the public earlier, such as the ISC-GEM global instrumental catalogue (released January 2013).
    Datasets:
    • Global Earthquake History Catalogue [1000-1903]
    • Global Instrumental Catalogue [1900-2009]
    • Global Geodetic Strain Rate Model
    • Global Active Fault Database
    • Tectonic Regionalisation
    • Buildings and Population Database
    • Earthquake Consequences Database
    • Physical Vulnerability Database
    • Socio-Economic Vulnerability and Resilience Indicators
    Models:
    • Seismic Source Models
    • Ground Motion (Attenuation) Models
    • Physical Exposure Models
    • Physical Vulnerability Models
    • Composite Index Models (social vulnerability, resilience, indirect loss)
    The aforementioned models developed under the GEM framework will be combined to produce estimates of hazard and risk at a global scale. Furthermore, building on many ongoing efforts and the knowledge of scientists worldwide, GEM will integrate state-of-the-art data, models, results and open-source tools into a single platform that is to serve as a "clearinghouse" on seismic risk. The platform will continue to increase in value, in particular for use in local contexts, through contributions and collaborations with scientists and organisations worldwide.

  20. Probabilistic Seismic Hazard Maps for Ecuador

    NASA Astrophysics Data System (ADS)

    Mariniere, J.; Beauval, C.; Yepes, H. A.; Laurence, A.; Nocquet, J. M.; Alvarado, A. P.; Baize, S.; Aguilar, J.; Singaucho, J. C.; Jomard, H.

    2017-12-01

    A probabilistic seismic hazard study is presented for Ecuador, a country facing high seismic hazard from both megathrust subduction earthquakes and shallow crustal moderate-to-large earthquakes. Building on the knowledge produced in recent years in historical seismicity, earthquake catalogs, active tectonics, geodynamics, and geodesy, several alternative earthquake recurrence models are developed. An area-source model is first proposed, based on the seismogenic crustal and inslab sources defined in Yepes et al. (2016); a slightly different segmentation is proposed for the subduction interface with respect to Yepes et al. (2016). Three earthquake catalogs are used to account for the numerous uncertainties in the modeling of frequency-magnitude distributions. The resulting hazard maps highlight several source zones enclosing fault systems that exhibit low seismic activity, not representative of the geological and/or geodetic slip rates. Consequently, a fault model is derived, including faults with an earthquake recurrence model inferred from geological and/or geodetic slip-rate estimates. The geodetic slip rates on the set of simplified faults are estimated from a GPS horizontal velocity field (Nocquet et al. 2014); assumptions on the aseismic component of the deformation are required. Combining these alternative earthquake models in a logic tree, and using a set of selected ground-motion prediction equations adapted to Ecuador's different tectonic contexts, a mean hazard map is obtained. Hazard maps corresponding to the 16th and 84th percentiles are also derived, highlighting the zones where uncertainties on the hazard are highest.
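    Combining alternative recurrence models and ground-motion prediction equations in a logic tree amounts, for the mean map, to a weighted average of the branch hazard curves at each site. A minimal sketch (curves and weights are invented; percentile maps such as the 16th/84th come from the weighted distribution of branch values at each site rather than from the mean):

```python
def combine_logic_tree(branch_curves, weights):
    # Weighted mean of hazard curves from logic-tree branches. Each curve
    # lists annual exceedance rates at the same set of intensity levels;
    # branch weights must sum to 1.
    assert abs(sum(weights) - 1.0) < 1e-9
    n = len(branch_curves[0])
    return [sum(w * curve[i] for w, curve in zip(weights, branch_curves))
            for i in range(n)]

# two hypothetical branches, each giving rates at two intensity levels
mean_curve = combine_logic_tree([[1e-2, 1e-3], [2e-2, 4e-3]], [0.5, 0.5])
```

    Keeping the full set of weighted branch values, rather than only the mean, is precisely what allows the percentile maps to expose where epistemic uncertainty on the hazard is highest.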

  1. Space Geodesy and the New Madrid Seismic Zone

    NASA Astrophysics Data System (ADS)

    Smalley, Robert; Ellis, Michael A.

    2008-07-01

    One of the most contentious issues related to earthquake hazards in the United States centers on the midcontinent and the origin, magnitudes, and likely recurrence intervals of the 1811-1812 New Madrid earthquakes that occurred there. The stakeholder groups in the debate (local and state governments, reinsurance companies, American businesses, and the scientific community) are similar to the stakeholder groups in regions more famous for large earthquakes. However, debate about New Madrid seismic hazard has been fiercer because of the lack of two fundamental components of seismic hazard estimation: an explanatory model for large, midplate earthquakes; and sufficient or sufficiently precise data about the causes, effects, and histories of such earthquakes.

  2. Coseismic Stress Changes of the 2016 Mw 7.8 Kaikoura, New Zealand, Earthquake and Its Implication for Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Shan, B.; LIU, C.; Xiong, X.

    2017-12-01

    On 13 November 2016, an earthquake with moment magnitude Mw 7.8 struck North Canterbury, New Zealand, as a result of shallow oblique-reverse faulting close to the boundary between the Pacific and Australian plates in the South Island, collapsing buildings and causing significant economic losses. The distribution of early aftershocks extended about 150 km to the north-northeast of the mainshock, suggesting the potential for earthquake triggering in this complex fault system. Strong aftershocks following major earthquakes present significant challenges for local reconstruction and rehabilitation, and the regions around the mainshock may also suffer earthquakes triggered by the Kaikoura event. It is therefore important to outline the regions with potential aftershocks and high seismic hazard in order to mitigate future disasters. Moreover, this earthquake ruptured at least 13 separate faults, providing an opportunity to test the theory of earthquake stress triggering in a complex fault system. In this study, we calculated the coseismic Coulomb failure stress changes (ΔCFS) caused by the Kaikoura earthquake at the hypocenters of both historical earthquakes and aftershocks of this event with focal mechanisms. Our results show that the percentage of earthquakes with positive ΔCFS is higher among the aftershocks than among the historical earthquakes, indicating that the Kaikoura earthquake effectively influenced the seismicity in this region. The aftershocks of the Mw 7.8 Kaikoura earthquake are mainly located in regions with positive ΔCFS, and the aftershock distribution can be well explained by the coseismic ΔCFS. Furthermore, the earthquake-induced ΔCFS on the surrounding active faults was also examined. The northeastern Alpine Fault, the southwest part of the North Canterbury fault system, parts of the Marlborough fault system, and the southwest ends of the Kapiti-Manawatu faults were significantly stressed by the Kaikoura earthquake. 
The earthquake-induced stress increments would raise the probability of earthquake occurrence on these faults.
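    Once the coseismic stress tensor change is resolved onto a receiver fault geometry, the ΔCFS combination itself is simple. In the usual convention (a sketch; μ' = 0.4 is a commonly assumed effective friction coefficient, and the input stress values below are invented):

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    # Coulomb failure stress change on a receiver fault:
    #   dCFS = d_tau + mu' * d_sigma_n
    # with shear stress positive in the slip direction and normal stress
    # positive for unclamping; mu_eff is the effective friction coefficient.
    # Positive dCFS brings the fault closer to failure.
    return d_shear + mu_eff * d_normal

# e.g. +0.05 MPa of shear loading with 0.02 MPa of clamping
dcfs = coulomb_stress_change(0.05, -0.02)
```

    Changes of order 0.01 MPa (0.1 bar) and above are commonly treated as significant for triggering, which is the scale at which the faults listed above are considered "significantly stressed".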

  3. Earthquake Hazard Assessment: an Independent Review

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".

  4. Earthquake recurrence and risk assessment in circum-Pacific seismic gaps

    USGS Publications Warehouse

    Thatcher, W.

    1989-01-01

    THE development of the concept of seismic gaps, regions of low earthquake activity where large events are expected, has been one of the notable achievements of seismology and plate tectonics. Its application to long-term earthquake hazard assessment continues to be an active field of seismological research. Here I have surveyed well documented case histories of repeated rupture of the same segment of circum-Pacific plate boundary and characterized their general features. I find that variability in fault slip and spatial extent of great earthquakes rupturing the same plate boundary segment is typical rather than exceptional, but sequences of major events fill identified seismic gaps with remarkable order. Earthquakes are concentrated late in the seismic cycle and occur with increasing size and magnitude. Furthermore, earthquake rupture starts near zones of concentrated moment release, suggesting that high-slip regions control the timing of recurrent events. The absence of major earthquakes early in the seismic cycle indicates a more complex behaviour for lower-slip regions, which may explain the observed cycle-to-cycle diversity of gap-filling sequences. © 1989 Nature Publishing Group.

  5. Incorporating natural hazard assessments into municipal master-plans; case-studies from Israel

    NASA Astrophysics Data System (ADS)

    Katz, Oded

    2010-05-01

    The active Dead Sea Rift (DSR) runs along the length of Israel, making the entire state susceptible to earthquake-related hazards. Current building codes generally acknowledge seismic hazards and direct engineers towards earthquake-resistant structures. However, hazard mapping at a scale fit for municipal/governmental planning is subject to local initiative and is currently not mandatory, as seems necessary. In the following, a few cases of seismic hazard evaluation made by the Geological Survey of Israel are presented, emphasizing the reasons for their initiation and the way the results were incorporated (or not). The first case is a qualitative seismic hazard micro-zonation commissioned by the municipality of Jerusalem as part of a new master plan. This work resulted in maps (1:50,000; GIS format) identifying (1) areas prone to amplification of seismic shaking due to site characteristics (outcrops of soft rocks or steep topography) and (2) sites with earthquake-induced landslide (EILS) hazard. The results were validated using reports from the 1927 M=6.2 earthquake that originated along the DSR about 30 km east of Jerusalem. Although the hazard maps were accepted by the municipal authorities, practical use by geotechnical engineers working within the frame of the new master plan was not significant. The main reason is apparently a difference of opinion between the city engineers responsible for implementing the new master plan and the geologists responsible for the hazard evaluation. The second case involves evaluation of EILS hazard for two towns located further north along the DSR, Zefat and Tiberias, both of which were heavily damaged more than once by strong earthquakes in past centuries. The work was carried out as part of a governmental seismic hazard reduction program. The results include maps (1:10,000 scale) of sites with high EILS hazard identified within the city limits. The maps (in GIS format) were sent to the city engineers with reports explaining the methods and results. 
As far as we know, widespread implementation of the maps within municipal master plans never came about, and there was no open discussion between the city engineers and the Geological Survey. The main reasons are apparently (1) the lack, until recently, of mandatory building codes requiring incorporation of EILS hazard; (2) budget priorities; and (3) the failure to involve municipality personnel in planning and executing the EILS hazard evaluation. These cases demonstrate that for seismic hazard data to be incorporated and implemented within municipal master plans, there needs to be (1) active involvement of municipal officials and engineers from the early planning stages of the evaluation campaign, and (2) a priori dedication of funds towards implementation of the evaluation results.

  6. Scenario based tsunami wave height estimation towards hazard evaluation for the Hellenic coastline and examples of extreme inundation zones in South Aegean

    NASA Astrophysics Data System (ADS)

    Melis, Nikolaos S.; Barberopoulou, Aggeliki; Frentzos, Elias; Krassanakis, Vassilios

    2016-04-01

    A scenario-based methodology for tsunami hazard assessment is used, incorporating earthquake sources with the potential to produce extreme tsunamis (measured through their capacity to cause maximum wave height and inundation extent). In the present study we follow a two-phase approach. In the first phase, existing earthquake hazard zoning in the greater Aegean region is used to derive representative maximum expected earthquake magnitude events, with realistic seismotectonic source characteristics and the greatest tsunamigenic potential within each zone. By stacking the scenario-produced maximum wave heights, a global maximum map is constructed for the entire Hellenic coastline, corresponding to all expected extreme offshore earthquake sources. Further evaluation of the produced coastline categories based on the maximum expected wave heights emphasizes the tsunami hazard in selected coastal zones with important functions (e.g. crowded tourist zones, industrial zones, airports, power plants, etc.). Owing to its proximity to the Hellenic Arc, its many urban centres, and its popularity as a tourist destination, Crete and the South Aegean region are given top priority for defining extreme inundation zoning. In the second phase, a set of four large coastal cities (Kalamata, Chania, Heraklion and Rethymno), important for tsunami hazard due, for example, to crowded beaches during the summer season or industrial facilities, is explored with a view towards preparedness and resilience for tsunami hazard in Greece. To simulate tsunamis in the Aegean region (generation, propagation and runup), the MOST - ComMIT NOAA code was used. High-resolution DEMs for bathymetry and topography were joined via an interface developed specifically for the inundation maps in this study, with similar products in mind. For the examples explored in the present study, we used 5 m resolution for the topography and 30 m resolution for the bathymetry. 
Although this study can be considered as preliminary, it can also form the basis to further develop a scenario based inundation model database that can be used as an operational tool, for fast assessing tsunami prone zones during a real tsunami crisis.

  7. Earthquake mechanism and seafloor deformation for tsunami generation

    USGS Publications Warehouse

    Geist, Eric L.; Oglesby, David D.; Beer, Michael; Kougioumtzoglou, Ioannis A.; Patelli, Edoardo; Siu-Kui Au, Ivan

    2014-01-01

    Tsunamis are generated in the ocean by rapidly displacing the entire water column over a significant area. The potential energy resulting from this disturbance is balanced with the kinetic energy of the waves during propagation. Only a handful of submarine geologic phenomena can generate tsunamis: large-magnitude earthquakes, large landslides, and volcanic processes. Asteroid and subaerial landslide impacts can generate tsunami waves from above the water. Earthquakes are by far the most common generator of tsunamis. Generally, earthquakes greater than magnitude (M) 6.5–7 can generate tsunamis if they occur beneath an ocean and if they result in predominantly vertical displacement. One of the greatest uncertainties in both deterministic and probabilistic hazard assessments of tsunamis is computing seafloor deformation for earthquakes of a given magnitude.
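The energy balance mentioned above can be made concrete with the textbook expression for the initial potential energy of the displaced sea surface, E = ½ρg∫η²dA, evaluated on a discrete grid. A hedged sketch, with a hypothetical displacement field and cell size (not from the paper):

```python
# Illustrative sketch (not from the paper) of the initial tsunami potential
# energy, E = 0.5 * rho * g * sum(eta^2) * dA, for a gridded sea-surface
# displacement field eta. Grid values and cell size are hypothetical.

RHO = 1025.0  # seawater density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def tsunami_potential_energy(eta_grid, cell_area_m2):
    """Potential energy (J) of an initial sea-surface displacement field."""
    sum_sq = sum(eta * eta for row in eta_grid for eta in row)
    return 0.5 * RHO * G * sum_sq * cell_area_m2

# 2x2 grid of displacements (m) over 1 km^2 cells: about 7.5e9 J.
eta = [[1.0, 0.5], [0.0, -0.5]]
print(tsunami_potential_energy(eta, 1.0e6))
```

During propagation this potential energy is exchanged with the kinetic energy of the waves, which is why the initial displacement field largely fixes the tsunami's energy budget.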

  8. The contribution of the Global Change Observatory Central Asia to seismic hazard and risk assessment in the Central Asian region

    NASA Astrophysics Data System (ADS)

    Parolai, S.; Bindi, D.; Haberland, C. A.; Pittore, M.; Pilz, M.; Rosenau, M.; Schurr, B.; Wieland, M.; Yuan, X.

    2012-12-01

    Central Asia has one of the world's highest levels of earthquake hazard, owing to its exceptionally high deformation rates. Moreover, vulnerability to natural disasters in general is increasing, due to rising populations and a growing dependence on complex lifelines and technology. Therefore, there is an urgent need to undertake seismic hazard and risk assessment in this region, while at the same time improving upon existing methodologies, including the consideration of temporal variability in the seismic hazard, and in structural and social vulnerability. Over the last few years, the German Research Center for Geosciences (GFZ), in collaboration with local partners, has initiated a number of scientific activities within the framework of the Global Change Observatory Central Asia (GCO-CA). The work is divided into projects with specific concerns: - The installation and maintenance of the Central-Asian Real-time Earthquake MOnitoring Network (CAREMON) and the setup of a permanent wireless mesh network for structural health monitoring in Bishkek. - The TIPAGE and TIPTIMON projects focus on the geodynamics of the Tien-Shan, Pamir and Hindu Kush region, the deepest and most active intra-continental subduction zone in the world. The work covers time scales from millions of years to short-term snapshots based on geophysical measurements of seismotectonic activity and of the physical properties of the crust and upper mantle, as well as their coupling with other surface processes (e.g., landslides). - Existing risk analysis methods assume time-independent earthquake hazard and risk, although temporal changes are likely to occur due to, for example, co- and post-seismic changes in the regional stress field. We therefore aim to develop systematic time-dependent hazard and risk analysis methods in order to undertake the temporal quantification of earthquake activity (PROGRESS). 
- To improve seismic hazard assessment for better loss estimation, detailed site-effect studies are necessary. Temporary seismic networks have been installed in several Central Asian cities (Bishkek and Karakol, Kyrgyzstan; Dushanbe, Tajikistan; Tashkent, Uzbekistan) within the framework of the Earthquake Model Central Asia (EMCA), a regional program of the Global Earthquake Model (GEM). The empirically estimated site effects have already helped to improve real-time risk scenarios for Bishkek and will be applied to other major cities. - A crucial requirement for disaster risk reduction is the analysis of the vulnerability of existing building inventories. Whereas traditional approaches are very time-consuming and costly, and even impossible given the high rate of urbanization in Central Asian capitals, our integrated approach is based on satellite remote sensing and ground-based omni-directional imaging, providing building inventories and thus structural vulnerability over large areas (EMCA, GEM-IDCT). All of the above activities are carried out within the framework of cooperation between GFZ and regional national institutes, in particular the Central Asian Institute for Applied Geosciences. Altogether, this comprehensive, long-term risk analysis and research program will lead to a better understanding of the coupling of endogenous and exogenous processes and the identification of their impact on society.

  9. Increasing seismicity in the U. S. midcontinent: Implications for earthquake hazard

    USGS Publications Warehouse

    Ellsworth, William L.; Llenos, Andrea L.; McGarr, Arthur F.; Michael, Andrew J.; Rubinstein, Justin L.; Mueller, Charles S.; Petersen, Mark D.; Calais, Eric

    2015-01-01

    Earthquake activity in parts of the central United States has increased dramatically in recent years. The space-time distribution of the increased seismicity, as well as numerous published case studies, indicates that the increase is of anthropogenic origin, principally driven by injection of wastewater coproduced with oil and gas from tight formations. Enhanced oil recovery and long-term production also contribute to seismicity at a few locations. Preliminary hazard models indicate that areas experiencing the highest rate of earthquakes in 2014 have a short-term (one-year) hazard comparable to or higher than the hazard in the source region of tectonic earthquakes in the New Madrid and Charleston seismic zones.

  10. Building Loss Estimation for Earthquake Insurance Pricing

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Erdik, M.; Sesetyan, K.; Demircioglu, M. B.; Fahjan, Y.; Siyahi, B.

    2005-12-01

    After the 1999 earthquakes in Turkey, several changes took place in the insurance sector. A compulsory earthquake insurance scheme was introduced by the government, the reinsurance companies increased their rates, and some even suspended operations in the market. Most importantly, the insurance companies realized the importance of portfolio analysis in shaping their future market strategies. This paper describes an earthquake loss assessment methodology that can be used for insurance pricing and portfolio loss estimation, based on our working experience in the insurance market. The basic ingredients are probabilistic and deterministic regional site-dependent earthquake hazard, regional building inventory (and/or portfolio), building vulnerabilities associated with typical construction systems in Turkey, and estimates of building replacement costs for different damage levels. Probable maximum and average annualized losses are estimated as the result of the analysis. There is a two-level earthquake insurance system in Turkey, the effect of which is incorporated in the algorithm: the national compulsory earthquake insurance scheme and the private earthquake insurance system. To buy private insurance, one has to be covered by the national system, which has limited coverage. As a demonstration of the methodology we look at the case of Istanbul and use its building inventory data instead of a portfolio. A state-of-the-art time-dependent earthquake hazard model that portrays the increased earthquake expectancies in Istanbul is used. Intensity- and spectral-displacement-based vulnerability relationships are incorporated in the analysis. In particular, we look at the uncertainty in the loss estimates that arises from the vulnerability relationships, and at the effect of the implemented repair cost ratios.
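The two loss measures named in the abstract can be illustrated with a toy calculation. This is a generic sketch of the standard definitions of average annualized loss (AAL) and a crude probable maximum loss (PML), not the authors' algorithm; the scenario rates and losses are invented:

```python
# Generic sketch of average annualized loss (AAL) and a crude probable
# maximum loss (PML) from a scenario set of (annual rate, loss) pairs.
# Not the authors' algorithm; all numbers are invented.

def average_annualized_loss(scenarios):
    """AAL = sum over scenarios of annual occurrence rate x scenario loss."""
    return sum(rate * loss for rate, loss in scenarios)

def probable_maximum_loss(scenarios, return_period_yr=475.0):
    """Largest scenario loss whose annual rate is at least 1/return period."""
    threshold = 1.0 / return_period_yr
    eligible = [loss for rate, loss in scenarios if rate >= threshold]
    return max(eligible) if eligible else 0.0

# Hypothetical portfolio scenarios: (annual rate, loss in USD).
scenarios = [(0.01, 5.0e6), (0.004, 2.0e7), (0.001, 1.0e8)]
print(average_annualized_loss(scenarios))  # about 2.3e5 USD per year
print(probable_maximum_loss(scenarios))    # 475-yr PML: the 2e7 USD scenario
```

The AAL is what a pure risk premium would be calibrated to, while the PML drives reinsurance and capital requirements.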

  11. The 2012 Ferrara seismic sequence: Regional crustal structure, earthquake sources, and seismic hazard

    NASA Astrophysics Data System (ADS)

    Malagnini, Luca; Herrmann, Robert B.; Munafò, Irene; Buttinelli, Mauro; Anselmi, Mario; Akinci, Aybige; Boschi, E.

    2012-10-01

    Inadequate seismic design codes can be dangerous, particularly when they underestimate the true hazard. In this study we use data from a sequence of moderate-sized earthquakes in northeast Italy to validate and test a regional wave propagation model which, in turn, is used to understand some weaknesses of the current design spectra. Our velocity model, while regionalized and somewhat ad hoc, is consistent with geophysical observations and the local geology. In the 0.02-0.1 Hz band, this model is validated by using it to calculate moment tensor solutions of 20 earthquakes (5.6 ≥ MW ≥ 3.2) in the 2012 Ferrara, Italy, seismic sequence. The seismic spectra observed for the relatively small main shock significantly exceeded the design spectra to be used in the area for critical structures. Observations and synthetics reveal that the ground motions are dominated by long-duration surface waves, which, apparently, the design codes do not adequately anticipate. In light of our results, the present seismic hazard assessment in the entire Pianura Padana, including the city of Milan, needs to be re-evaluated.

  12. Quantifying 10 years of Improvements in Earthquake and Tsunami Monitoring in the Caribbean and Adjacent Regions

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.

    2014-12-01

    The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard of earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a region of high hazard and risk that was poorly monitored. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to its nations, to coastal areas along the Gulf of Mexico, and to the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since then, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and the establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and Other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards recommended for initial earthquake locations are: 1) earthquake detection within 1 minute; 2) a minimum magnitude threshold of M4.5; and 3) an initial hypocenter error of <30 km. In this study, we assess current compliance with these performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-Caribe EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-Caribe EWS seismic stations and select an international network to be contributed from existing real-time broadband national networks in the region. 
Sea level monitoring improvements both offshore and along the coast will also be addressed. With the support of Member States and other countries and organizations it has been possible to significantly expand the sea level network thus reducing the amount of time it now takes to verify tsunamis.

  13. Large Historical Earthquakes and Tsunami Hazards in the Western Mediterranean: Source Characteristics and Modelling

    NASA Astrophysics Data System (ADS)

    Harbi, Assia; Meghraoui, Mustapha; Belabbes, Samir; Maouche, Said

    2010-05-01

    The western Mediterranean region was the site of numerous large earthquakes in the past. Most of these earthquakes are located at the east-west-trending Africa-Eurasia plate boundary and along the coastline of North Africa. The most recent recorded tsunamigenic earthquake occurred in 2003 at Zemmouri-Boumerdes (Mw 6.8) and generated a ~2-m-high tsunami wave. The destructive wave affected the Balearic Islands and Almeria in southern Spain and Carloforte in southern Sardinia (Italy). The earthquake provided a unique opportunity to gather instrumental records of seismic waves and tide gauges in the western Mediterranean. A database that includes a historical catalogue of main events, seismic sources and related fault parameters was prepared in order to assess the tsunami hazard of this region. In addition to the analysis of the 2003 records, we study the 1790 Oran and 1856 Jijel historical tsunamigenic earthquakes (Io = IX and X, respectively), which provide detailed observations of the heights and extent of past tsunamis and of the damage in coastal zones. We performed modelling of wave propagation using the NAMI-DANCE code and tested different fault sources against synthetic tide gauges. We observe that the characteristics of the seismic sources control the size and directivity of tsunami wave propagation on both the northern and southern coasts of the western Mediterranean.

  14. The 1945 Balochistan earthquake and probabilistic tsunami hazard assessment for the Makran subduction zone

    NASA Astrophysics Data System (ADS)

    Höchner, Andreas; Babeyko, Andrey; Zamora, Natalia

    2014-05-01

    Iran and Pakistan are countries quite frequently affected by destructive earthquakes: for instance, the magnitude 6.6 Bam earthquake in Iran in 2003, with about 30,000 casualties, or the magnitude 7.6 Kashmir earthquake in Pakistan in 2005, with about 80,000 casualties. Both events took place inland, but in terms of magnitude, even significantly larger events can be expected to happen offshore, at the Makran subduction zone. This small subduction zone is seismically rather quiescent, but a tsunami caused by a thrust event in 1945 (the Balochistan earthquake) led to about 4,000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Additionally, some recent publications raise the question of the possibility of rare but huge magnitude 9 events at the Makran subduction zone. We first model the historic Balochistan event and its effect in terms of coastal wave heights, and then generate various synthetic earthquake and tsunami catalogs, including the possibility of large events, in order to assess the tsunami hazard along the affected coastal regions. Finally, we show how effective tsunami early warning could be achieved by the use of an array of high-precision real-time GNSS (Global Navigation Satellite System) receivers along the coast.
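Building a synthetic earthquake catalog of the kind described typically starts by drawing magnitudes from a (possibly truncated) Gutenberg-Richter distribution. A hedged sketch using inverse-transform sampling; the b-value, magnitude bounds, and catalog size are illustrative, not the study's parameters:

```python
# Hedged sketch of generating a synthetic magnitude catalog from a doubly
# truncated Gutenberg-Richter distribution via inverse-transform sampling.
# b-value, magnitude bounds, and catalog size are illustrative assumptions.
import math
import random

def sample_gr_magnitudes(n, b=1.0, m_min=6.0, m_max=9.0, seed=42):
    """Draw n magnitudes with CDF F(m) = (1 - exp(-beta*(m - m_min))) / c,
    where beta = b*ln(10) and c normalizes the truncation at m_max."""
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    return [m_min - math.log(1.0 - rng.random() * c) / beta for _ in range(n)]

mags = sample_gr_magnitudes(1000)
print(min(mags) >= 6.0 and max(mags) <= 9.0)  # True: all draws fall in [6, 9)
```

Pairing each sampled magnitude with a rupture location and a tsunami simulation then yields the synthetic tsunami catalog the abstract refers to.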

  15. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    USGS Publications Warehouse

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 per cent (1 s.d.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock, from less than 1 per cent at M = 3 to 6.5 ± 2.5 per cent (1 s.d.) at M = 5. The main shock will most likely occur in the first hour after the foreshock, and the probability of a main shock decreases with elapsed time from the occurrence of the possible foreshock by approximately the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California raises the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.
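The inverse-time decay described above can be sketched numerically. The functional form and the constants below are assumptions, calibrated only to reproduce the quoted ~6% five-day total probability; they are not from the paper:

```python
# Illustrative sketch of a main-shock probability whose hazard rate decays
# roughly as 1/t after a possible foreshock. Constants are assumptions tuned
# only to the ~6% five-day probability quoted in the abstract.
import math

TOTAL_P = 0.06    # quoted probability of a larger event within 5 days
WINDOW_H = 120.0  # 5 days, in hours
T0_H = 1.0        # lump the first hour into one bin

def prob_by_time(t_hours):
    """Cumulative probability that the main shock has occurred by t_hours,
    for a hazard rate falling off as 1/t after the first hour."""
    t = min(max(t_hours, T0_H), WINDOW_H)
    return TOTAL_P * (1.0 + math.log(t)) / (1.0 + math.log(WINDOW_H))

print(round(prob_by_time(1.0), 4))    # 0.0104: ~1% accrues in the first hour
print(round(prob_by_time(120.0), 4))  # 0.06: the full five-day probability
```

Under this toy model the first hour carries more probability than any later single hour, matching the abstract's statement that the main shock is most likely in the first hour.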

  16. Bayesian inference on earthquake size distribution: a case study in Italy

    NASA Astrophysics Data System (ADS)

    Licia, Faenza; Carlo, Meletti; Laura, Sandri

    2010-05-01

    This paper focuses on the statistical distribution of earthquake size using Bayesian inference. The strategy consists in defining an a priori distribution based on instrumental seismicity, modeled as a power law. Using the observed historical data, the power law is then modified to obtain the posterior distribution. The aim of this paper is to define the earthquake size distribution using all available seismic databases (i.e., instrumental and historical catalogs) and a robust statistical technique. We apply this methodology to Italian seismicity, dividing the territory into source zones as done for the seismic hazard assessment, taken here as a reference model. The results suggest that each area has its own peculiar trend: while the power law captures the mean behavior of the earthquake size distribution, the posterior emphasizes different slopes in different areas. Our results are in general agreement with those used in the seismic hazard assessment in Italy. However, there are areas that show a flattening of the curve, meaning a significant departure from power-law behavior and implying local aspects that a power-law distribution is not able to capture.
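One concrete way to carry out such a Bayesian update (a generic sketch, not necessarily the authors' implementation): under a Gutenberg-Richter power law, magnitudes above a completeness threshold are exponentially distributed with rate β = b·ln(10), for which a Gamma prior, here imagined as calibrated on instrumental seismicity, is conjugate, so historical magnitudes update it in closed form. All catalog values below are hypothetical:

```python
# Generic sketch of a conjugate Bayesian update of the Gutenberg-Richter
# b-value: excess magnitudes (m - m_min) are exponential with rate
# beta = b*ln(10), and a Gamma(shape, rate) prior on beta is conjugate.
# Prior strength and the "historical" magnitudes are invented.
import math

def posterior_beta(prior_shape, prior_rate, mags, m_min):
    """Gamma(shape, rate) posterior for beta from exponential excess magnitudes."""
    excess = [m - m_min for m in mags]
    return prior_shape + len(excess), prior_rate + sum(excess)

# Prior centred on b = 1.0 (beta = ln 10), worth 20 pseudo-events, standing in
# for the instrumental catalog; the list plays the role of historical data.
prior_shape, prior_rate = 20.0, 20.0 / math.log(10.0)
historical = [4.2, 4.5, 5.1, 4.0, 6.3, 4.8]  # magnitudes, m_min = 4.0
shape, rate = posterior_beta(prior_shape, prior_rate, historical, 4.0)
b_post = (shape / rate) / math.log(10.0)     # posterior-mean b-value
print(round(b_post, 3))  # 0.831: the data pull b below the prior's 1.0
```

A zone-by-zone flattening of the curve, as the abstract reports, would show up here as historical data systematically dragging the posterior b-value down relative to the power-law prior.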

  17. Probabilistic Seismic Hazard Assessment for a NPP in the Upper Rhine Graben, France

    NASA Astrophysics Data System (ADS)

    Clément, Christophe; Chartier, Thomas; Jomard, Hervé; Baize, Stéphane; Scotti, Oona; Cushing, Edward

    2015-04-01

    The southern part of the Upper Rhine Graben (URG), straddling the border between eastern France and western Germany, presents relatively high seismic activity for an intraplate area. An earthquake of magnitude 5 or greater shakes the URG every 25 years, and in 1356 an earthquake of magnitude greater than 6.5 struck the city of Basel. Several potentially active faults have been identified in the area and documented in the French Active Fault Database (website under construction). These faults are located along the Graben boundaries and also inside the Graben itself, beneath heavily populated areas and critical facilities (including the Fessenheim Nuclear Power Plant), and are prone to produce earthquakes of magnitude 6 and above. Published regional models and preliminary geomorphological investigations provide provisional slip rate assessments for the individual faults (0.1-0.001 mm/a), resulting in recurrence times of 10,000 years or greater for magnitude 6+ earthquakes. Using a fault model, ground motion response spectra are calculated for annual frequencies of exceedance (AFE) ranging from 10⁻⁴ to 10⁻⁸ per year, typical for design basis and probabilistic safety analyses of NPPs. A logic tree is implemented to evaluate uncertainties in the seismic hazard assessment. The choice of ground motion prediction equations (GMPEs) and the range of slip rate uncertainty are the main sources of seismic hazard variability at the NPP site. In fact, the hazard for AFEs lower than 10⁻⁴ is mostly controlled by the potentially active nearby Rhine River fault. Compared with areal source zone models, a fault model localizes the hazard around the active faults and changes the shape of the Uniform Hazard Spectrum at the site. 
Seismic hazard deaggregations are performed to identify the earthquake scenarios (including magnitude, distance, and the number of standard deviations from the median ground motion predicted by the GMPEs) that contribute to the exceedance of spectral acceleration at the different AFE levels. These scenarios are finally examined with respect to the seismicity data available in paleoseismic, historical and instrumental catalogues.
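The order-of-magnitude recurrence times quoted in the abstract follow directly from dividing a characteristic coseismic slip by the long-term fault slip rate. A back-of-the-envelope sketch, assuming ~1 m of slip per event (an assumption typical for M6+ earthquakes, not a value from the study):

```python
# Back-of-the-envelope sketch (values assumed, not the study's model) of how
# the quoted recurrence times follow from the quoted slip rates:
# mean recurrence = characteristic coseismic slip / long-term slip rate.

def recurrence_years(char_slip_m, slip_rate_mm_per_yr):
    """Mean recurrence interval in years for a characteristic-slip fault."""
    return char_slip_m * 1000.0 / slip_rate_mm_per_yr

# Assuming ~1 m of slip per event at the quoted slip-rate bounds:
print(recurrence_years(1.0, 0.1))    # 10000.0 years at 0.1 mm/a
print(recurrence_years(1.0, 0.001))  # 1000000.0 years at 0.001 mm/a
```

This is why the slip-rate range in the logic tree dominates the hazard variability: a factor-of-100 spread in slip rate maps directly onto a factor-of-100 spread in event rates.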

  18. Seismic Hazard Assessment for the Baku City and Absheron Peninsula, Azerbaijan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babayev, Gulam R.

    2006-03-23

    This paper deals with the seismic hazard assessment for Baku and the Absheron Peninsula. The assessment is based on information on the features of earthquake ground motion excitation, seismic wave propagation (attenuation), and site effects. I analyze active faults, seismicity, soil and rock properties, geological cross-sections, borehole data of measured shear-wave velocity, lithology, the amplification factor of each geological unit, geomorphology, topography, and bedrock and surface ground motions. To estimate peak ground acceleration (PGA) at the surface, the PGA at bedrock is multiplied by the amplification parameter of each surface layer. Quaternary soft deposits, representing a high risk due to increased PGA values at the surface, are studied in detail. For a near-zone target earthquake, PGA values are compared to intensity on the MSK-64 scale for the Absheron Peninsula. The amplification factor for the city of Baku is assessed, providing estimates of the level of seismic motion and seismic intensity of the studied area.
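The amplification step described in the abstract is a simple multiplication. A minimal sketch, in which the unit names and amplification factors are hypothetical placeholders rather than the paper's values:

```python
# Minimal sketch of the surface-PGA step described in the abstract: surface
# PGA = bedrock PGA x amplification factor of the overlying geological unit.
# The unit names and factors below are hypothetical placeholders.

AMPLIFICATION = {
    "quaternary_soft_deposits": 2.5,  # soft sediments amplify shaking most
    "tertiary_sediments": 1.5,
    "bedrock_outcrop": 1.0,           # reference site, no amplification
}

def surface_pga(bedrock_pga_g, unit):
    """Surface PGA (g) for a site on the given geological unit."""
    return bedrock_pga_g * AMPLIFICATION[unit]

print(surface_pga(0.12, "quaternary_soft_deposits"))  # ~0.3 g on soft soil
```

Mapping such factors unit by unit is what turns a single bedrock hazard value into the micro-zonation the study describes.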

  19. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174 Section 120.174 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Policies Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake...

  20. Loss modeling for pricing catastrophic bonds.

    DOT National Transportation Integrated Search

    2008-12-01

    In this research, a loss estimation framework is presented that directly relates seismic hazard to seismic response, to damage, and hence to losses. A Performance-Based Earthquake Engineering (PBEE) approach towards assessing the seismic vulnerabili...

  1. Seismic hazard map of the western hemisphere

    USGS Publications Warehouse

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructure, due to insufficient knowledge of the existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many other types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. 
PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of the Americas depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one- to two-story buildings). The largest seismic hazard values in the western hemisphere generally occur in areas that have been, or are likely to be, the sites of the largest plate-boundary earthquakes. Although the largest earthquakes ever recorded are the 1960 Chile and 1964 Alaska subduction zone earthquakes, the largest seismic hazard (PGA) value in the Americas is in southern California (U.S.), along the San Andreas Fault.
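The map's "10% chance of exceedance in 50 years" translates into a return period via the standard Poisson relation P = 1 − exp(−t/T_R); solving for T_R gives the familiar ~475 years. A one-function sketch of this textbook conversion:

```python
# Sketch of the textbook Poisson relation behind "10% chance of exceedance in
# 50 years": P = 1 - exp(-t / T_R), solved for the return period T_R.
import math

def return_period(p_exceed, window_years):
    """Return period implied by an exceedance probability over a time window."""
    return -window_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50.0)))  # 475 years
```

The same relation gives roughly 2475 years for the 2%-in-50-years level used for rarer design ground motions.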

  2. [Comment on “Should Memphis build for California's earthquakes?”] from S.E. Hough

    NASA Astrophysics Data System (ADS)

    Hough, Susan E.

    The recent article by Seth Stein, Joseph Tomasello, and Andrew Newman raised thought-provoking questions about one of the most vexing open issues in hazard assessment in the United States: the hazard posed by ostensibly infrequent, large, mid-continental earthquakes. Many of the technical issues raised by the article are addressed by A. D. Frankel in the accompanying comment. I concur with his assessment, and will only address and/or elaborate on a few additional issues here: (1) Detailed paleoseismic investigations have shown that the New Madrid region experienced sequences of large earthquakes around 900 and 1450 A.D., in addition to the historic events of 1811-1812. With a repeat time on the order of 400-500 years, these cannot be considered infrequent events. Paleoseismic investigations also reveal evidence that the prehistoric "events" were themselves sequences of two to three large earthquakes, with a distribution of liquefaction in the greater New Madrid region similar overall to that produced by the 1811-1812 sequence [Tuttle et al., 2002]. And if, as the evidence suggests, the zone produces characteristic earthquakes, one will not see a commensurate rate of moderate events, as would be the case if seismicity followed the Gutenberg-Richter distribution.
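The closing point admits a quick illustrative calculation (all values assumed, not from the comment): if large events recur every ~500 years and seismicity followed a Gutenberg-Richter relation with b = 1, moderate earthquakes would be roughly a hundred times more frequent than the large ones.

```python
# Illustrative arithmetic for the characteristic-vs-Gutenberg-Richter point
# (all values assumed): calibrate N(m) = 10**(a - b*m) so that M >= 7.5
# events occur once per ~500 years, then read off the implied moderate rate.
import math

def gr_rate(m, a, b=1.0):
    """Annual rate of events with magnitude >= m: N(m) = 10**(a - b*m)."""
    return 10.0 ** (a - b * m)

a = math.log10(1.0 / 500.0) + 7.5  # a-value giving N(7.5) = 1/500 per year
print(round(1.0 / gr_rate(5.5, a)))  # 5: an M >= 5.5 every ~5 years under G-R
```

The absence of anything like that rate of moderate events in the New Madrid zone is exactly the observation used to argue for characteristic-earthquake behavior.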

  3. The ESI scale, an ethical approach to the evaluation of seismic hazards

    NASA Astrophysics Data System (ADS)

    Porfido, Sabina; Nappi, Rosa; De Lucia, Maddalena; Gaudiosi, Germana; Alessio, Giuliana; Guerrieri, Luca

    2015-04-01

    The dissemination of correct information about seismic hazard is an ethical duty of the scientific community worldwide. A proper assessment of an earthquake's severity and impact should not ignore the evaluation of its intensity, taking into account the effects on humans and man-made structures as well as on the natural environment. We illustrate the new macroseismic scale that measures intensity by taking into account the effects of earthquakes on the environment: the ESI 2007 (Environmental Seismic Intensity) scale (Michetti et al., 2007), ratified by INQUA (International Union for Quaternary Research) during the XVII Congress in Cairns (Australia). The ESI scale integrates and completes the traditional macroseismic scales, of which it represents the evolution, allowing the intensity parameter to be assessed even where buildings are absent or where damage-based diagnostic elements saturate. Each degree reflects the corresponding strength of an earthquake and the role of ground effects, evaluating the intensity on the basis of the characteristics and size of primary effects (e.g. surface faulting and tectonic uplift/subsidence) and secondary effects (e.g. ground cracks, slope movements, liquefaction phenomena, hydrological changes, anomalous waves, tsunamis, tree shaking, dust clouds and jumping stones). This approach can be considered "ethical" because it helps to define the real scenario of an earthquake, regardless of a country's socio-economic conditions and level of development. Here lies the value and relevance of macroseismic scales even today, one hundred years after the death of Giuseppe Mercalli, who conceived the homonymous scale for the evaluation of earthquake intensity. For an appropriate mitigation strategy in seismic areas, it is fundamental to consider the role played by seismically induced ground effects, such as active faults (their length and displacement) and secondary effects (the total area affected). 
With these perspectives, two different case studies have been reviewed: the destructive February 4, 1976 Guatemala earthquake (M 7.5) and the historical February 20, 1743 Nardò earthquake (Salento, Southern Italy). The re-analysis of both earthquakes contributes to defining more realistic seismic scenarios in terms of intensity assessment and the consequent regional seismic hazard. References: Michetti A.M., Esposito E., Guerrieri L., Porfido S., Serva L., Tatevossian R., Vittori E., Audemard F., Azuma T., Clague J., Comerci V., Gurpinar A., McCalpin J., Mohammadioun B., Mörner N.A., Ota Y. and Roghozin E., 2007. Intensity Scale ESI 2007. Memorie Descrittive della Carta Geologica d'Italia, Roma, 74, 53 pp.

  4. 2014 Update of the Pacific Northwest portion of the U.S. National Seismic Hazard Maps

    USGS Publications Warehouse

    Frankel, Arthur; Chen, Rui; Petersen, Mark; Moschetti, Morgan P.; Sherrod, Brian

    2015-01-01

    Several aspects of the earthquake characterization were changed for the Pacific Northwest portion of the 2014 update of the national seismic hazard maps, reflecting recent scientific findings. New logic trees were developed for the recurrence parameters of M8-9 earthquakes on the Cascadia subduction zone (CSZ) and for the eastern edge of their rupture zones. These logic trees reflect recent findings of additional M8 CSZ earthquakes using offshore deposits of turbidity flows and onshore tsunami deposits and subsidence. These M8 earthquakes each rupture a portion of the CSZ and occur in the time periods between M9 earthquakes that have an average recurrence interval of about 500 years. The maximum magnitude was increased for deep intraslab earthquakes. An areal source zone to account for the possibility of deep earthquakes under western Oregon was expanded. The western portion of the Tacoma fault was added to the hazard maps.
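    The recurrence logic trees mentioned above weight alternative parameter values; the hazard integral then uses the weighted mixture of branches. A minimal sketch of how weighted branches combine into a mean annual rate (the branch intervals and weights here are invented for illustration and are not those of the 2014 model):

```python
# Hypothetical logic-tree branches for the mean recurrence interval (years) of
# full-margin M9 CSZ ruptures, with subjective weights summing to 1.0.
branches = [
    (400.0, 0.2),   # (recurrence interval in years, branch weight)
    (500.0, 0.6),
    (600.0, 0.2),
]

# The mean annual rate is the weight-averaged rate, not the rate of the mean
# interval; hazard codes combine branches this way.
mean_rate = sum(w / t for t, w in branches)
effective_interval = 1.0 / mean_rate
```

    Averaging rates rather than intervals matters: the effective interval of the mixture (about 492 years here) is not the weighted mean of the branch intervals (500 years).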

  5. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures has unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source- and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a physics-based approach is ground-motion synthesis derived from physics and an understanding of the earthquake process. This is an overview paper, and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. 
Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction. We have developed a methodology for synthesizing physics-based broadband ground motion that incorporates the effects of realistic earthquake rupture along specific faults and the actual geology between the source and site.

  6. Preparation of Synthetic Earthquake Catalogue and Tsunami Hazard Curves in Marmara Sea using Monte Carlo Simulations

    NASA Astrophysics Data System (ADS)

    Bayraktar, Başak; Özer Sözdinler, Ceren; Necmioǧlu, Öcal; Meral Özel, Nurcan

    2017-04-01

    The Marmara Sea and its surroundings form one of the most populated areas in Turkey. Many densely populated cities, such as the megacity Istanbul with a population of more than 14 million, together with a great number of industrial facilities of the largest capacity and potential, refineries, ports and harbors, are located along the coasts of the Marmara Sea. The region is highly seismically active. There has been a wide range of studies in this region regarding the fault mechanisms, seismic activity, earthquakes and triggered tsunamis in the Sea of Marmara. Historical documents reveal that the region has experienced many earthquakes and tsunamis in the past. According to Altinok et al. (2011), 35 tsunami events occurred in the Marmara Sea between 330 BC and 1999. As earthquakes are expected in the Marmara Sea with the future rupture of segments of the North Anatolian Fault (NAF), the region should be investigated for the possibility of tsunamis generated by earthquakes at specific return periods. This study aims to perform a probabilistic tsunami hazard analysis for the Marmara Sea. For this purpose, the possible sources of tsunami scenarios are specified by compiling the earthquake catalogues, historical records and scientific studies conducted in the region. After compiling all these data, a synthetic earthquake and tsunami catalogue is prepared using Monte Carlo simulations. For specific return periods, the possible epicenters, rupture lengths, widths and displacements are determined with Monte Carlo simulations, treating the angles of the fault segments as deterministic. For each earthquake in the synthetic catalogue, the tsunami wave heights will be calculated at specific locations along the Marmara Sea. As a further objective, this study will determine tsunami hazard curves for specific locations in the Marmara Sea, including the tsunami wave heights and their probability of exceedance. 
This work is supported by SATREPS-MarDim Project (Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education in Turkey) and JICA (Japan International Cooperation Agency). The authors would like to acknowledge the project MARsite - New Directions in Seismic Hazard assessment through Focused Earth Observation in the Marmara Supersite (FP7-ENV.2012 6.4-2, Grant 308417 - see NH2.3/GMPV7.4/SM7.7). The authors also would like to acknowledge Prof. Dr. Mustafa Erdik and Prof. Dr. Sinan Akkar for their valuable feedback and guidance throughout this study.
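    The catalogue-building step described in this abstract can be sketched as follows: magnitudes are drawn from a truncated Gutenberg-Richter distribution by inverse-transform sampling, epicenters are placed along a schematic fault segment, and rupture lengths follow a Wells-and-Coppersmith-style scaling. All parameter values below are placeholders, not those of the MarDim study:

```python
import math
import random

def sample_magnitude(rng, mmin=5.0, mmax=7.4, b=1.0):
    """Inverse-transform sample from a truncated Gutenberg-Richter distribution."""
    u = rng.random()
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (mmax - mmin))
    return mmin - math.log(1.0 - u * c) / beta

def rupture_length_km(m):
    """Wells-and-Coppersmith-style magnitude-length scaling (illustrative
    coefficients)."""
    return 10 ** (-2.44 + 0.59 * m)

def synthetic_catalogue(n_events, seed=0):
    rng = random.Random(seed)
    catalogue = []
    for _ in range(n_events):
        m = sample_magnitude(rng)
        # Epicenter placed uniformly along a schematic 150 km fault segment.
        along_strike_km = rng.uniform(0.0, 150.0)
        catalogue.append((m, along_strike_km, rupture_length_km(m)))
    return catalogue

cat = synthetic_catalogue(10_000)
```

    Each synthetic event would then feed a tsunami simulation; repeating over many catalogues yields exceedance statistics for wave height at the coastal locations of interest.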

  7. Prospective testing of neo-deterministic seismic hazard scenarios for the Italian territory

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Vaccari, Franco; Kossobokov, Vladimir; Panza, Giuliano F.

    2013-04-01

    A reliable and comprehensive characterization of expected seismic ground shaking, eventually including the related time information, is essential in order to develop effective mitigation strategies and increase earthquake preparedness. Moreover, any effective tool for SHA must demonstrate its capability in anticipating the ground shaking associated with large earthquake occurrences, a result that can be attained only through a rigorous verification and validation process. So far, the major problems in classical probabilistic methods for seismic hazard assessment, PSHA, have lain in the adequate description of earthquake recurrence, particularly for the largest and most sporadic events, and of the attenuation models, which may be unable to account for the complexity of the medium and of the seismic sources and are often weakly constrained by the available observations. Current computational resources and physical knowledge of seismic wave generation and propagation processes nowadays allow for viable numerical and analytical alternatives to the use of attenuation relations. Accordingly, a scenario-based neo-deterministic approach to seismic hazard assessment, NDSHA, has been proposed, which allows a wide range of possible seismic sources to be considered as the starting point for deriving scenarios by means of full waveform modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement readily applicable to seismic isolation techniques. Based on NDSHA, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time-dependent scenarios of ground shaking through the routine updating of formally defined earthquake predictions. 
The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with algorithms for the realistic modeling of ground motion. Accordingly, a set of deterministic scenarios of ground motion at bedrock, which refers to the time interval when a strong event is likely to occur within the alerted area, can be defined by means of full waveform modeling, both at regional and local scale. CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been updated regularly every two months since 2006. The routine application of the time-dependent NDSHA approach provides information that can be useful in assigning priorities for timely mitigation actions and, at the same time, allows for a rigorous prospective testing and validation of the proposed methodology. As an example, for sites where ground shaking values greater than 0.2 g are estimated at bedrock, further investigations can be performed taking into account the local soil conditions, to assess the performance of relevant structures, such as historical and strategic buildings. The issues related with prospective testing and validation of the time-dependent NDSHA scenarios will be discussed, illustrating the results obtained for the recent strong earthquakes in Italy, including the May 20, 2012 Emilia earthquake.

  8. OCCUPATIONAL SAFETY AND HEALTH IN DISASTER RESTORATION ACTIVITY AFTER SOME MAJOR EARTHQUAKES

    NASA Astrophysics Data System (ADS)

    Toyosawa, Yasuo; Itoh, Kazuya; Kikkawa, Naotaka

    Occupational safety and health in disaster restoration activity following the Great Hanshin Earthquake (1995), Niigata Chuetsu Earthquake (2004), Niigata Chuetsu-oki Earthquake (2007) and Great East Japan Earthquake (2011) were analyzed and characterized in order to raise awareness of the risks and hazards in such work. In this scenario, the predominant type of accident is a "fall", which increases mainly because labourers are working to repair houses and buildings. On the other hand, landslides were prevalent after the Niigata Chuetsu Earthquake, resulting in more accidents occurring during geotechnical works rather than building construction works. In the abnormal conditions that characterize recovery activities, when safety and health measures have a tendency to be neglected, it is important to reinstate adequate measures as soon as possible by carrying out the usual risk assessments.

  9. 41 CFR 128-1.8001 - Background.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...

  10. 41 CFR 128-1.8001 - Background.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...

  11. 41 CFR 128-1.8001 - Background.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...

  12. 41 CFR 128-1.8001 - Background.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...

  13. 41 CFR 128-1.8001 - Background.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...

  14. Setting the Stage for Harmonized Risk Assessment by Seismic Hazard Harmonization in Europe (SHARE)

    NASA Astrophysics Data System (ADS)

    Woessner, Jochen; Giardini, Domenico; SHARE Consortium

    2010-05-01

    Probabilistic seismic hazard assessment (PSHA) is arguably one of the most useful products that seismology can offer to society. PSHA characterizes the best available knowledge on the seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results form the baseline for informed decision making, such as building codes or insurance rates, and provide essential input to each risk assessment application. Several large-scale national and international projects have recently been launched aimed at improving and harmonizing PSHA standards around the globe. SHARE (www.share-eu.org) is the European Commission funded project in the Framework Programme 7 (FP-7) that will create an updated, living seismic hazard model for the Euro-Mediterranean region. SHARE is a regional component of the Global Earthquake Model (GEM, www.globalquakemodel.org), a public/private partnership initiated and approved by the OECD Global Science Forum. GEM aims to be the uniform, independent and open access standard to calculate and communicate earthquake hazard and risk worldwide. SHARE itself will deliver measurable progress in all steps leading to a harmonized assessment of seismic hazard - in the definition of engineering requirements, in the collection of input data, in procedures for hazard assessment, and in engineering applications. SHARE scientists will create a unified framework and computational infrastructure for seismic hazard assessment and produce an integrated European probabilistic seismic hazard assessment (PSHA) model and specific scenario based modeling tools. The results will deliver long-lasting structural impact in areas of societal and economic relevance, they will serve as reference for the Eurocode 8 (EC8) application, and will provide homogeneous input for the correct seismic safety assessment for critical industry, such as the energy infrastructures and the re-insurance sector. 
SHARE will cover the whole European territory, the Maghreb countries in the Southern Mediterranean and Turkey in the Eastern Mediterranean. By strongly including the seismic engineering community, the project maintains a direct connection to the Eurocode 8 applications and the definition of the Nationally Determined Parameters, through the participation of the CEN/TC250/SC8 committee in the definition of the output specification requirements and in the hazard validation. SHARE will thus produce direct outputs for risk assessment. With this contribution, we focus on providing an overview of the goals and current achievement of the project.

  15. 44 CFR 361.4 - Matching contributions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.4 Matching contributions. (a) All State...

  16. 44 CFR 361.4 - Matching contributions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.4 Matching contributions. (a) All State...

  17. 44 CFR 361.4 - Matching contributions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.4 Matching contributions. (a) All State...

  18. 44 CFR 361.4 - Matching contributions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HOMELAND SECURITY PREPAREDNESS NATIONAL EARTHQUAKE HAZARDS REDUCTION ASSISTANCE TO STATE AND LOCAL GOVERNMENTS Earthquake Hazards Reduction Assistance Program § 361.4 Matching contributions. (a) All State...

  19. Study on Frequency content in seismic hazard analysis in West Azarbayjan and East Azarbayjan provinces (Iran)

    NASA Astrophysics Data System (ADS)

    Behzadafshar, K.; Abbaszadeh Shahri, A.; Isfandiari, K.

    2012-12-01

    The Iranian plate is prone to earthquakes, as attested by the occurrence of destructive earthquakes approximately every five years. Given the great earthquakes that have occurred and the large number of potential seismic sources (active faults), some of which are responsible for great earthquakes, the northwest of Iran, located at the junction of the Alborz and Zagros seismotectonic provinces (Mirzaii et al., 1998), is an interesting area for seismologists. Considering the population and the existence of large cities such as Tabriz, Ardabil and Orumiyeh, which play a crucial role in the industry and economy of Iran, the authors decided to focus on seismic hazard assessment in these two provinces, in order to obtain ground acceleration for different frequency contents and to identify critical frequencies in the studied area. Although many studies have been carried out in the northwest of Iran, building code modifications also require frequency-content analysis to assess seismic hazard more precisely, which has been done in the present study. Furthermore, previous studies applied freely downloadable software provided before 2000, whereas an important advantage of this study is the application of professional software written in 2009 and developed by the authors. This software addresses weak points of the earlier packages, such as gridding of potential sources, attention to the seismogenic zone, and direct application of attenuation relationships. The obtained hazard maps illustrate that the maximum accelerations will be experienced along a northwest-to-southeast direction; acceleration increases as frequency is reduced from 100 Hz to 10 Hz and then decreases with further frequency reduction (down to 0.25 Hz). The maximum acceleration occurs at the basement at a frequency content of 10 Hz. Keywords: hazard map, frequency content, seismogenic zone, Iran

  20. Seismic Hazard Analysis for Armenia and its Surrounding Areas

    NASA Astrophysics Data System (ADS)

    Klein, E.; Shen-Tu, B.; Mahdyiar, M.; Karakhanyan, A.; Pagani, M.; Weatherill, G.; Gee, R. C.

    2017-12-01

    The Republic of Armenia is located within the central part of a large, 800 km wide, intracontinental collision zone between the Arabian and Eurasian plates. Active deformation occurs along numerous structures in the form of faulting, folding, and volcanism distributed throughout the entire zone from the Bitlis-Zagros suture belt to the Greater Caucasus Mountains, and between the relatively rigid Black Sea and Caspian Sea blocks, without any single structure that can be claimed as predominant. In recent years, significant work has been done on mapping active faults, compiling and reviewing historic and paleoseismological studies in the region, especially in Armenia; these recent research contributions have greatly improved our understanding of the seismogenic sources and their characteristics. In this study we performed a seismic hazard analysis for Armenia and its surrounding areas using the latest detailed geological and paleoseismological information on active faults, strain rates estimated from kinematic modeling of GPS data and all available historic earthquake data. The seismic source model uses a combination of characteristic earthquake and gridded seismicity models to take advantage of the detailed knowledge of the known faults while acknowledging the distributed deformation and regional tectonic environment of the collision zone. In addition, the fault model considers earthquake ruptures that include single and multi-segment or fault rupture scenarios with earthquakes that can rupture any part of a multiple segment fault zone. The ground motion model uses a set of ground motion prediction equations (GMPE) selected from a pool of GMPEs based on the assessment of each GMPE against the available strong motion data in the region. The hazard is computed in GEM's OpenQuake engine. We will present final hazard results and discuss the uncertainties associated with various input data and their impact on the hazard at various locations.
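    The hazard computation mentioned above (performed here in OpenQuake) rests on the standard PSHA recipe: sum, over sources, the annual rate at which each produces ground motion exceeding a target level, with the GMPE supplying a lognormal exceedance probability, then convert to a probability of exceedance via the Poisson assumption. A single-site sketch with an invented toy GMPE and invented source parameters (not those of this study):

```python
import math

def normal_sf(z):
    """Standard normal survival function via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def gmpe_ln_pga(m, r_km):
    """Toy GMPE: mean ln(PGA in g) and sigma. Invented coefficients, not a
    published model."""
    mean = -4.0 + 0.9 * m - 1.3 * math.log(r_km + 10.0)
    return mean, 0.6

# Invented point sources: (annual rate, magnitude, distance to site in km).
sources = [(1.0 / 150.0, 7.0, 20.0), (1.0 / 40.0, 6.0, 35.0)]

def annual_exceedance_rate(pga_g):
    """Sum over sources of rate * P(ground motion > pga_g | event)."""
    rate = 0.0
    for nu, m, r in sources:
        mean, sigma = gmpe_ln_pga(m, r)
        rate += nu * normal_sf((math.log(pga_g) - mean) / sigma)
    return rate

def poe_50yr(pga_g):
    """Poisson probability of at least one exceedance in a 50-year window."""
    return 1.0 - math.exp(-50.0 * annual_exceedance_rate(pga_g))
```

    Repeating annual_exceedance_rate over a grid of PGA values traces out the hazard curve for the site.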

  1. Optimal Scaling of Aftershock Zones using Ground Motion Forecasts

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.

    2018-02-01

    The spatial distribution of aftershocks following major earthquakes has received significant attention due to the shaking hazard these events pose for structures and populations in the affected region. Forecasting the spatial distribution of aftershock events is an important part of the estimation of future seismic hazard. A simple spatial shape for the zone of activity has often been assumed in the form of an ellipse having semimajor axis to semiminor axis ratio of 2.0. However, since an important application of these calculations is the estimation of ground shaking hazard, an effective criterion for forecasting future aftershock impacts is to use ground motion prediction equations (GMPEs) in addition to the more usual approach of using epicentral or hypocentral locations. Based on these ideas, we present an aftershock model that uses self-similarity and scaling relations to constrain parameters as an option for such hazard assessment. We fit the spatial aspect ratio to previous earthquake sequences in the studied regions, and demonstrate the effect of the fitting on the likelihood of post-disaster ground motion forecasts for eighteen recent large earthquakes. We find that the forecasts in most geographic regions studied benefit from this optimization technique, while some are better suited to the use of the a priori aspect ratio.
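    The aspect-ratio fit described above can be viewed as a one-parameter likelihood problem: model the aftershock epicenters as an elliptical (anisotropic) 2D Gaussian and scan the semimajor-to-semiminor ratio. The following simplified, axis-aligned sketch on synthetic data illustrates the idea; it is not the authors' code, and the GMPE-based scoring they describe is omitted:

```python
import math
import random

def log_likelihood(points, aspect):
    """Log-likelihood of (x, y) points under a zero-mean, axis-aligned 2D
    Gaussian with standard deviations (aspect * s, s); the overall scale s is
    profiled out by its maximum-likelihood estimate."""
    n = len(points)
    q = sum((x / aspect) ** 2 + y ** 2 for x, y in points)
    s2 = q / (2.0 * n)  # ML estimate of s**2 given the aspect ratio
    return -n * (math.log(2.0 * math.pi * aspect * s2) + 1.0)

def best_aspect(points, candidates):
    """Return the candidate aspect ratio maximizing the profiled likelihood."""
    return max(candidates, key=lambda a: log_likelihood(points, a))

# Synthetic aftershock cloud with a true aspect ratio of 2.0 (sigmas 20 and 10).
rng = random.Random(42)
pts = [(rng.gauss(0.0, 20.0), rng.gauss(0.0, 10.0)) for _ in range(500)]
ratios = [1.0 + 0.25 * i for i in range(13)]  # scan 1.0 ... 4.0
fit = best_aspect(pts, ratios)
```

    With real catalogues, the same scan can instead be scored by the likelihood of observed ground motions under a GMPE, which is the criterion the abstract advocates.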

  2. The 1868 Hayward fault, California, earthquake: Implications for earthquake scaling relations on partially creeping faults

    USGS Publications Warehouse

    Hough, Susan E.; Martin, Stacey

    2015-01-01

    The 21 October 1868 Hayward, California, earthquake is among the best-characterized historical earthquakes in California. In contrast to many other moderate-to-large historical events, the causative fault is clearly established. Published magnitude estimates have been fairly consistent, ranging from 6.8 to 7.2, with 95% confidence limits including values as low as 6.5. The magnitude is of particular importance for assessment of seismic hazard associated with the Hayward fault and, more generally, to develop appropriate magnitude–rupture length scaling relations for partially creeping faults. The recent reevaluation of archival accounts by Boatwright and Bundock (2008), together with the growing volume of well-calibrated intensity data from the U.S. Geological Survey “Did You Feel It?” (DYFI) system, provide an opportunity to revisit and refine the magnitude estimate. In this study, we estimate the magnitude using two different methods that use DYFI data as calibration. Both approaches yield preferred magnitude estimates of 6.3–6.6, assuming an average stress drop. A consideration of data limitations associated with settlement patterns increases the range to 6.3–6.7, with a preferred estimate of 6.5. Although magnitude estimates for historical earthquakes are inevitably uncertain, we conclude that, at a minimum, a lower-magnitude estimate represents a credible alternative interpretation of available data. We further discuss implications of our results for probabilistic seismic-hazard assessment from partially creeping faults.

  3. 78 FR 4380 - Information Collection Activity; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    .... Abstract: The Earthquake Hazards Reduction Act of 1977 (42 U.S.C. 7701 et seq.) was enacted to reduce risks to life and property through the National Earthquake Hazards Reduction Program (NEHRP). The Federal... construction methods to make structures earthquake resistant. Executive Order 12699 of January 5, 1990, Seismic...

  4. Assessment and Prediction of Natural Hazards from Satellite Imagery

    PubMed Central

    Gillespie, Thomas W.; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan

    2013-01-01

    Since 2000, there have been a number of spaceborne satellites that have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth’s surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate resolution active sensors that can quantify the vertical and horizontal movement of the earth’s surface. High-resolution passive sensors have been used to successfully assess flood damage while predictive maps of flood vulnerability areas are possible based on physical variables collected from passive and active sensors. Recent moderate resolution sensors are able to provide near real time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and constellations of satellites will be launched in the next five years that contain increased resolution (0.5 m to 1 m pixel resolution for active sensors) and revisit times (daily ≤ 2.5 m resolution images from passive sensors) that will significantly improve our ability to assess and predict natural hazards from space. PMID:25170186

  5. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
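    The empirical model summarized above applies a country-calibrated fatality rate, commonly modeled as a lognormal function of shaking intensity, to the population exposed in each intensity bin. A hedged sketch with invented calibration parameters and a hypothetical exposure table (not the published PAGER coefficients):

```python
import math

def fatality_rate(mmi, theta=12.0, beta=0.25):
    """Lognormal fatality-rate curve Phi(ln(mmi / theta) / beta).
    theta and beta stand in for per-country calibration parameters; the values
    here are illustrative only."""
    z = math.log(mmi / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical exposure: population count in each shaking-intensity (MMI) bin,
# as would come from ShakeMap overlaid on a population grid.
exposure = {6.0: 2_000_000, 7.0: 500_000, 8.0: 100_000, 9.0: 10_000}

estimated_fatalities = sum(
    pop * fatality_rate(mmi) for mmi, pop in exposure.items()
)
```

    Summing the bin-by-bin products gives the expected fatality count; the operational system additionally propagates the uncertainty in these rates, which this sketch omits.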

  6. Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquakes

    NASA Astrophysics Data System (ADS)

    Wdowinski, S.; Peng, Z.; Ferrier, K.; Lin, C. H.; Hsu, Y. J.; Shyu, J. B. H.

    2017-12-01

    Earthquakes, landslides, and tropical cyclones are extreme hazards that pose significant threats to human life and property. Some of the couplings between these hazards are well known. For example, sudden, widespread landsliding can be triggered by large earthquakes and by extreme rainfall events like tropical cyclones. Recent studies have also shown that earthquakes can be triggered by erosional unloading over 100-year timescales. In a NASA-supported project, titled "Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquake", we study triggering relations between these hazard types. The project focuses on such triggering relations in Taiwan, which is subjected to very wet tropical storms, landslides, and earthquakes. One example of such triggering relations is the 2009 Morakot typhoon, the wettest recorded typhoon in Taiwan (2850 mm of rain in 100 hours). The typhoon caused widespread flooding and triggered more than 20,000 landslides, including the devastating Hsiaolin landslide. Six months later, the same area was hit by the 2010 M=6.4 Jiashian earthquake near Kaohsiung city, which added to the infrastructure damage induced by the typhoon and the landslides. Preliminary analysis of temporal relations between main-shock earthquakes and the six wettest typhoons in Taiwan's past 50 years reveals similar temporal relations between M≥5 events and wet typhoons. Future work in the project will include remote sensing analysis of landsliding, seismic and geodetic monitoring of landslides, detection of microseismicity and tremor activities, and mechanical modeling of crustal stress changes due to surface unloading.

  7. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 21. Seismic Source Zones of the Eastern United States and Seismic Zoning of the Atlantic Seaboard and Appalachian Regions.

    DTIC Science & Technology

    1986-08-01

    ...1812 earthquakes, and this produced Reelfoot Lake (Fuller, 1912). ... UPLIFT: Uplift is known to be occurring in two regions in the ... axes, as does the 11 mile (18 km) long Reelfoot Lake, formed during the 1811 and 1812 earthquakes (Fuller, 1912). The trend of the probable fault ... the Reelfoot Lake basin to the northeast has subsided (Fig. 37). Monoclinal structure and shallow faults have been located along the scarp between the...

  8. Reduction of earthquake risk in the united states: Bridging the gap between research and practice

    USGS Publications Warehouse

    Hays, W.W.

    1998-01-01

    Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho, the New Madrid and Wabash Valley seismic zones in the central United States, the southeastern and northeastern United States, Puerto Rico, Virgin Islands, Guam, and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. © 1998 IEEE.

  9. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    NASA Astrophysics Data System (ADS)

    D'Alessandro, A.; Luzio, D.; D'Anna, G.

    2014-09-01

    In this paper, we introduce a project for the realization of the first European real-time urban seismic network based on Micro Electro-Mechanical Systems (MEMS) technology. MEMS accelerometers are a highly enabling technology; nowadays, the sensitivity and dynamic range of these sensors allow the recording of earthquakes of moderate magnitude even at distances of several tens of kilometers. Moreover, thanks to their low cost and small size, MEMS accelerometers can be easily installed in urban areas to achieve a seismic network with a high density of observation points. The network is being implemented in the Acireale Municipality (Sicily, Italy), an area with among the highest seismic hazard, vulnerability, and exposure in Italy. The main objective of the urban network will be an effective system for post-earthquake rapid disaster assessment. The recorded earthquakes, including those of moderate magnitude, will also be used for seismic microzonation of the area covered by the network, and the system will further serve as a site-specific earthquake early warning system.
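Event detection on dense networks of low-cost accelerometers is commonly done with a short-term/long-term average (STA/LTA) trigger; the abstract does not specify the detection scheme used, so the following is only a generic sketch with illustrative window lengths and threshold:

```python
def sta_lta(signal, sta_len, lta_len):
    """Ratio of short-term to long-term average of |signal| at each sample."""
    ratios = []
    for i in range(lta_len, len(signal)):
        sta = sum(abs(x) for x in signal[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in signal[i - lta_len:i]) / lta_len
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def triggered(signal, sta_len=5, lta_len=50, threshold=4.0):
    """Declare a detection when the STA/LTA ratio exceeds the threshold."""
    return any(r > threshold for r in sta_lta(signal, sta_len, lta_len))

# Quiet background noise versus noise followed by a high-amplitude arrival
noise = [0.01 * (-1) ** i for i in range(200)]
event = noise[:150] + [1.0 * (-1) ** i for i in range(50)]
print(triggered(noise), triggered(event))  # prints: False True
```

In practice the windows and threshold are tuned to each station's noise level; coincidence logic across nearby stations then suppresses false triggers.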

  10. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro; Alexander, Nicholas A.; Kongko, Widjo; Muhari, Abdul

    2017-12-01

    This study develops tsunami evacuation plans in Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans in Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis causing the maximum tsunami inundation height and depth of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan - including horizontal evacuation area maps, assessment of temporary shelters considering the impact due to ground shaking and tsunami, and integrated horizontal-vertical evacuation time maps - has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from the stochastic tsunami simulation for future tsunamigenic events.
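The stochastic generation of source models can be sketched as Monte Carlo sampling of rupture dimensions from a scaling relation with lognormal scatter, then backing out the mean slip from the seismic moment; the scaling coefficients, aspect ratio, and rigidity below are illustrative assumptions, not the study's calibrated probabilistic models:

```python
import random

MU = 3.0e10  # assumed crustal rigidity, Pa

def sample_source(mw, rng):
    """Sample rupture length/width (km) from a generic scaling relation
    with lognormal scatter, then back out the mean slip from M0."""
    # Illustrative scaling coefficients, not a specific published relation
    log_len = -2.4 + 0.59 * mw + rng.gauss(0.0, 0.15)
    length_km = 10 ** log_len
    width_km = length_km / 2.0                     # assumed aspect ratio
    m0 = 10 ** (1.5 * mw + 9.1)                    # seismic moment, N m
    area_m2 = length_km * width_km * 1.0e6
    mean_slip = m0 / (MU * area_m2)                # metres
    return length_km, width_km, mean_slip

rng = random.Random(42)
models = [sample_source(mw, rng)
          for mw in (8.5, 8.75, 9.0) for _ in range(100)]
print(len(models))  # 300 source models, as in the study
```

Each sampled geometry would then receive a stochastic heterogeneous slip field before being fed to the tsunami propagation solver.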

  11. The Rurrand Fault, Germany: A Holocene surface rupture and new slip rate estimates

    NASA Astrophysics Data System (ADS)

    Grützner, Christoph; Fischer, Peter; Reicherter, Klaus

    2016-04-01

    Very low deformation rates in continental interiors are a challenge for research on active tectonics and seismic hazard. Faults tend to have very long earthquake recurrence intervals, and morphological evidence of surface faulting is often obliterated by erosion and sedimentation. The Lower Rhine Graben in Central Europe is characterized by slow active faults with individual slip rates well below 0.1 mm/a. As a consequence, most geodetic techniques fail to record tectonic motions and the morphological expression of the faults is subtle. Although damaging events are known from this region, e.g. the 1755/56 Düren earthquake series, there is no record of surface-rupturing events in instrumental and historical catalogues. Owing to their short temporal coverage with respect to the fault recurrence intervals, these records probably fail to capture the maximum possible magnitudes. In this study we used morphological evidence from a 1 m airborne LiDAR survey, near-surface geophysics, and paleoseismological trenching to identify surface-rupturing earthquakes at the Rurrand Fault between Cologne and Aachen in western Germany. The LiDAR data revealed a young fault strand, parallel to the already known main fault, with the subtle morphological expression of recent surface faulting. In the paleoseismological trenches we found evidence for two surface-rupturing earthquakes. The most recent event occurred in the Holocene, and a previous earthquake probably happened within the last 150 ka. Geophysical data allowed us to estimate a minimum slip rate of 0.03 mm/a from an offset gravel horizon. We estimate paleomagnitudes of Mw 5.9-6.8 based on the observed offsets in the trench (<0.5 m per event) and fault scaling relationships. Our data imply that the Rurrand Fault did not creep during the last 150 ka, but rather failed in large earthquakes. These events were much stronger than those known from historical sources.
We are able to show that the Rurrand Fault did not rupture the surface during the Düren 1755/56 seismic crisis and conclude that these events likely occurred on another nearby fault system or did not rupture the surface at all. The very long recurrence interval of 25-65 ka for surface rupturing events illustrates the problems of assessing earthquake hazard in such slowly deforming regions. We emphasize that geological data must be included in seismic hazard and surface rupture hazard assessments in order to obtain a complete picture of a region's seismic potential.
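The slip-rate and recurrence estimates reduce to simple ratios of offset, age, and per-event displacement; a sketch using values of the order reported above (the 4.5 m cumulative offset used to reproduce the 0.03 mm/a figure is an assumption for illustration):

```python
def slip_rate_mm_per_yr(offset_m, age_yr):
    """Average slip rate from a dated offset marker."""
    return offset_m * 1000.0 / age_yr

def recurrence_yr(per_event_slip_m, rate_mm_per_yr):
    """Mean recurrence interval if all slip is released in similar events."""
    return per_event_slip_m * 1000.0 / rate_mm_per_yr

# Order-of-magnitude values for the Rurrand Fault: assumed ~4.5 m offset
# accumulated over ~150 ka; <0.5 m displacement per event
rate = slip_rate_mm_per_yr(4.5, 150_000)          # 0.03 mm/a
print(rate, recurrence_yr(0.5, rate))
```

With 0.5 m per event this simple ratio gives a recurrence of roughly 17 ka, the same order as the 25-65 ka bracketed independently by the two trenched events in ~150 ka.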

  12. DSOD Procedures for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Howard, J. K.; Fraser, W. A.

    2005-12-01

    DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios of nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to the 2000 IBC's for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. 
Minimum shaking levels are provided for sites far from active faulting. Our procedures and standards are presented at the DSOD website http://damsafety.water.ca.gov/. We review our methods and tools periodically under the guidance of our Consulting Board for Earthquake Analysis (and expect to make changes pending NGA completion), mindful that frequent procedural changes can interrupt design evaluations.
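The return-period check described above amounts to interpolating a hazard curve at the design PGA and inverting the annual exceedance rate; a sketch with a hypothetical hazard curve (the points below are not DSOD or USGS data):

```python
import math

# Hypothetical hazard curve: PGA (g) vs annual frequency of exceedance
pga = [0.1, 0.2, 0.4, 0.6, 0.8]
afe = [2e-2, 5e-3, 8e-4, 2e-4, 6e-5]

def return_period(design_pga):
    """Log-log interpolate the annual exceedance rate, return 1/rate."""
    for (x0, x1), (y0, y1) in zip(zip(pga, pga[1:]), zip(afe, afe[1:])):
        if x0 <= design_pga <= x1:
            t = (math.log(design_pga) - math.log(x0)) / (math.log(x1) - math.log(x0))
            rate = math.exp(math.log(y0) + t * (math.log(y1) - math.log(y0)))
            return 1.0 / rate
    raise ValueError("design PGA outside curve range")

print(round(return_period(0.4)))  # 1250 a: passes a ~1000-year check
```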

  13. Forecasting probabilistic seismic shaking for greater Tokyo from 400 years of intensity observations

    USGS Publications Warehouse

    Bozkurt, S.B.; Stein, R.S.; Toda, S.

    2007-01-01

    The long recorded history of earthquakes in Japan affords an opportunity to forecast seismic shaking exclusively from past shaking. We calculate the time-averaged (Poisson) probability of severe shaking by using more than 10,000 intensity observations recorded since AD 1600 in a 350 km-wide box centered on Tokyo. Unlike other hazard-assessment methods, source and site effects are included without modeling, and we do not need to know the size or location of any earthquake nor the location and slip rate of any fault. The two key assumptions are that the slope of the observed frequency-intensity relation at every site is the same, and that the 400-year record is long enough to encompass the full range of seismic behavior. Tests we conduct here suggest that both assumptions are sound. The resulting 30-year probability of IJMA ≥ 6 shaking (~PGA ≥ 0.4 g or MMI ≥ IX) is 30%-40% in Tokyo, Kawasaki, and Yokohama, and 10%-15% in Chiba and Tsukuba. This result means that there is a 30% chance that 4 million people will be subjected to IJMA ≥ 6 shaking during an average 30-year period. We also produce exceedance maps of PGA for building-code regulations, and calculate short-term hazard associated with a hypothetical catastrophe bond. Our results resemble an independent assessment developed from conventional seismic hazard analysis for greater Tokyo. © 2007, Earthquake Engineering Research Institute.
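The quoted 30-year probabilities follow from the time-averaged (Poisson) relation P = 1 - exp(-λT); a sketch that also inverts the relation to recover the annual rate and return period implied by a quoted probability:

```python
from math import exp, log

def poisson_prob(annual_rate, window_yr):
    """Probability of at least one exceedance in the window."""
    return 1.0 - exp(-annual_rate * window_yr)

def annual_rate_from_prob(prob, window_yr):
    """Invert the relation, e.g. to recover lambda from a quoted P30."""
    return -log(1.0 - prob) / window_yr

p30 = 0.35  # mid-range of the 30%-40% quoted for Tokyo
lam = annual_rate_from_prob(p30, 30.0)
print(round(lam, 4), round(1.0 / lam))  # rate per year, return period (a)
```

A 35% chance in 30 years thus corresponds to roughly a 70-year return period for that shaking level.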

  14. Proposal for a model to assess the effect of seismic activity on the triggering of debris flows

    NASA Astrophysics Data System (ADS)

    Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Luna, Byron Quan; Nadim, Farrokh

    2013-04-01

    Landslides triggered by earthquakes are a serious threat to many communities around the world and in some cases are known to have caused 25-50% of earthquake fatalities. Seismic shaking can contribute to the triggering of debris flows either during the seismic event or indirectly by increasing the susceptibility of the slope to debris flow during intense rainfall in a period after the seismic event. The paper proposes a model to quantify both these effects. The model is based on an infinite slope formulation where precipitation and earthquakes influence the slope stability as follows: (1) During the shaking, the factor of safety is reduced due to cyclic pore pressure build-up, where the cyclic pore pressure is modelled as a function of earthquake duration and intensity (measured as number of equivalent shear stress cycles and cyclic shear stress magnitude) and in-situ soil conditions (measured as average normalised shear stress). The model is calibrated using cyclic triaxial and direct simple shear (DSS) test data on clay and sand. (2) After the shaking, the factor of safety is modified using a combined empirical and analytical model that links observed earthquake-induced changes in rainfall thresholds for triggering of debris flow to an equivalent reduction in soil shear strength. The empirical part uses data from past earthquakes to propose a conceptual model linking a site-specific reduction factor for the rainfall intensity threshold (needed to trigger debris flows) to earthquake magnitude, distance from the epicentre and time period after the earthquake. The analytical part is a hydrological model for transient rainfall infiltration into an infinite slope in order to translate the change in rainfall intensity threshold into an equivalent reduction in soil shear strength. This is generalised into a functional form giving a site-specific shear strength reduction factor as a function of earthquake history and soil conditions. 
The model is suitable for hazard and risk assessment at local and regional scales for earthquake- and rainfall-induced landslides. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No 265138, New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
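The first effect can be illustrated with a generic infinite-slope factor of safety in which cyclic pore-pressure build-up is condensed into a pore-pressure ratio ru; the soil parameters and ru value below are illustrative, not the paper's calibrated model:

```python
from math import sin, cos, tan, radians

def factor_of_safety(c_kpa, phi_deg, gamma, z, beta_deg, ru):
    """Infinite-slope FS with pore-pressure ratio ru = u / (gamma*z).
    gamma in kN/m^3, z in m, cohesion in kPa."""
    b = radians(beta_deg)
    tau = gamma * z * sin(b) * cos(b)                  # driving shear stress
    sigma_n_eff = gamma * z * cos(b) ** 2 - ru * gamma * z
    return (c_kpa + sigma_n_eff * tan(radians(phi_deg))) / tau

fs_static = factor_of_safety(5.0, 32.0, 19.0, 2.0, 30.0, ru=0.0)
fs_shaking = factor_of_safety(5.0, 32.0, 19.0, 2.0, 30.0, ru=0.35)
print(round(fs_static, 2), round(fs_shaking, 2))  # prints: 1.39 0.88
```

The same slope that is stable under static conditions fails once shaking-induced excess pore pressure reaches about a third of the overburden stress.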

  15. Classification of Earthquake-triggered Landslide Events - Review of Classical and Particular Cases

    NASA Astrophysics Data System (ADS)

    Braun, A.; Havenith, H. B.; Schlögel, R.

    2016-12-01

    Seismically induced landslides often contribute to a significant degree to the losses related to earthquakes. The identification of the possible extent of landslide-affected areas can help to target emergency measures when an earthquake occurs or improve the resilience of inhabited areas and critical infrastructure in zones of high seismic hazard. Moreover, landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes in paleoseismic studies, allowing us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. Inspired by classical reviews of earthquake-induced landslides, e.g. by Keefer or Jibson, we present here a review of factors contributing to earthquake-triggered slope failures based on an `event-by-event' classification approach. The objective of this analysis is to enable the short-term prediction of earthquake-triggered landslide event sizes, in terms of numbers and size of the affected area, right after an earthquake occurs. Five main factors, `Intensity', `Fault', `Topographic energy', `Climatic conditions' and `Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970) the combination and relative weight of the factors was calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be crosschecked.
We present cases where our prediction model performs well and discuss particular cases where it does not, e.g. far-distant, delayed, or ancient earthquake-induced landslides.
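The calibrated factor combination can be sketched as a weighted sum of normalized factor scores; the weights and scores below are placeholders for illustration, not the calibration obtained by the authors:

```python
# Factor scores normalized to [0, 1]; weights are illustrative placeholders
WEIGHTS = {"intensity": 0.35, "fault": 0.15, "topographic_energy": 0.2,
           "climate": 0.1, "surface_geology": 0.2}

def landslide_event_size(scores):
    """Weighted-sum index in [0, 1]; larger means a bigger expected event."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical scoring of a large, mountainous, shallow thrust event
wenchuan_like = {"intensity": 0.9, "fault": 0.8, "topographic_energy": 0.9,
                 "climate": 0.6, "surface_geology": 0.7}
print(round(landslide_event_size(wenchuan_like), 3))
```

Calibration then consists of adjusting the weights until the index best ranks the documented events by landslide number and affected area.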

  16. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    NASA Astrophysics Data System (ADS)

    Nyst, M.

    2012-12-01

    The recent Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch, New Zealand earthquakes and the following aftershock activity completely changed the existing view on earthquake hazard of the Christchurch area. Not only have several faults been added to the New Zealand fault database, the main shocks were also followed by significant increases in seismicity due to high, still ongoing aftershock activity throughout the Christchurch region. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to look at their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return-period loss level, and then reserve the amount of money needed to account for that return-period loss level, their so-called capacity. This component of risk management is not too sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because the secondary uncertainties are taken into account when calculating the exceedance probability, even the longer return-period losses can still experience significant impact from the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting the annual premiums.
By annualizing the expected losses due to events of varying severities and recurrence intervals, annual premium rates can be set with some longer-term risk planning in mind. However, this metric is particularly sensitive to high-frequency, moderate-magnitude events. Inclusion of earthquake aftershock sequence characteristics into the stochastic event set may have a strong impact on the AAL, depending on the time window of aftershocks that is taken into account. We will present our model of the aftershock-derived, time-dependent hazard for the region of the two earthquakes and provide a detailed view of regional, short-term hazard. Dealing with this short-term hazard poses a challenge to the earthquake insurance business. In this presentation we will look at these short-term hazard changes from a risk perspective and quantify the impact on earthquake risk in terms of the main risk metrics used in the industry.
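Both metrics follow directly from a stochastic event set of (annual rate, loss) pairs; a minimal sketch with a hypothetical event set, assuming Poisson event occurrence for the exceedance probability:

```python
from math import exp

# Hypothetical stochastic event set: (annual rate, loss in $M) per event
events = [(0.20, 1.0), (0.05, 10.0), (0.01, 80.0), (0.002, 400.0)]

def aal(event_set):
    """Average annual loss: sum of rate x loss over the event set."""
    return sum(rate * loss for rate, loss in event_set)

def exceedance_prob(event_set, threshold, years=1.0):
    """Probability that some event with loss >= threshold occurs,
    assuming independent Poisson occurrence of events."""
    rate = sum(r for r, loss in event_set if loss >= threshold)
    return 1.0 - exp(-rate * years)

print(round(aal(events), 2))                    # -> 2.3 ($M/yr)
print(round(exceedance_prob(events, 80.0), 4))
```

Adding short-lived, high-rate aftershock events to the set inflates the AAL far more than it moves the long-return-period tail of the exceedance curve, which is the sensitivity discussed above.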

  17. Integrating LiDAR Data into Earth Science Education

    NASA Astrophysics Data System (ADS)

    Robinson, S. E.; Arrowsmith, R.; de Groot, R. M.; Crosby, C. J.; Whitesides, A. S.; Colunga, J.

    2010-12-01

    The use of high-resolution topography derived from Light Detection and Ranging (LiDAR) in the study of active tectonics is widespread and has become an indispensable tool to better understand earthquake hazards. For this reason, and because of the spectacular representation of the phenomena these data provide, it is appropriate to integrate them into the Earth science education curriculum. A collaboration between Arizona State University, the OpenTopography Facility, and the Southern California Earthquake Center is developing three Earth science education products to inform students and other audiences about LiDAR and its application to active tectonics research. First, a 10-minute introductory video titled LiDAR: Illuminating Earthquakes was produced and is freely available online through the OpenTopography portal and SCEC. The second product is an update and enhancement of the Wallace Creek Interpretive Trail website (www.scec.org/wallacecreek). LiDAR topography data products have been added, along with a virtual tour of the offset channels at Wallace Creek using the B4 LiDAR data within the Google Earth environment. The virtual tour of Wallace Creek is designed as a lab activity for introductory undergraduate geology courses to increase understanding of earthquake hazards through exploration of the dramatic offset created by the San Andreas Fault (SAF) at Wallace Creek and of Global Positioning System-derived displacements spanning the SAF there. This activity is currently being tested in courses at Arizona State University. The goal of the assessment is to measure student understanding of plate tectonics and earthquakes after completing the activity. Integrating high-resolution LiDAR topography into the Earth science education curriculum promotes understanding of plate tectonics, faults, and other topics related to earthquake hazards.

  18. Investigating Environmental Tectonics in Northern Alpine Foreland of Europe

    NASA Astrophysics Data System (ADS)

    Cloetingh, Sierd; Ziegler, Peter; Cornu, Tristan; Ustaszewski, K.; Schmid, S.; Dezes, P.; Hinsch, R.; Decker, K.; Lopes Cardozo, G.; Granet, M.; Bertrand, G.; Behrmann, J.; Michon, L.; Pagnier, H.; van Wees, J. D.; Rozsa, S.; Heck, B.; Verdun, J.; Kahle, H. G.; Fracassi, U.; Winter, T.; Burov, E.

    Until now, research on neotectonics and related seismicity has mostly focused on active plate boundaries characterized by a generally high level of earthquake activity. Current seismic hazard estimates for intraplate areas are commonly based on probabilistic analyses of historical and instrumental earthquake data. The accuracy of these hazard estimates is limited by the nature of the data (e.g., ambiguous historical sources), and by the restriction of available earthquake catalogues to time scales of only a few hundred years. Both are geologically insignificant and unsuitable for describing the tectonic processes causing earthquakes. This is especially relevant to intraplate regions, where faults show low slip rates resulting in long average recurrence times for large earthquakes (10³ to 10⁶ yrs), such as the devastating Basel earthquake of 1356, with an estimated magnitude of 6.5. The Alpine orogen and the intraplate sedimentary basins and rifts of its northern foreland are associated with a much higher level of neotectonic activity than hitherto assumed. Seismicity and stress indicator data, combined with geodetic and geomorphologic observations, demonstrate that the Northern Alpine foreland is being actively deformed [Cloetingh, 2000; Ziegler et al., 2002; Behrmann et al., 2003]. This has major implications for the assessment of its natural hazards and environmental degradation. The northwest European lithosphere has undergone a polyphase evolution in which the interplay between upper mantle thermal perturbations [Goes et al., 2000; Ritter et al., 2001] and stress-induced intraplate deformation [Muller et al., 1992; Ziegler et al., 2002] played an important role. A number of recent findings point to an important role of lithospheric folding in the thermally weakened lithosphere of the northwestern European foreland [Cloetingh et al., 1999].

  19. Update earthquake risk assessment in Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years on, one of the most painful events etched in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families who lost their homes). Nowadays, the most frequent and important question that arises is "what if this earthquake were repeated today?" In this study, we simulate the ground motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the assessment clearly indicates that the losses and damage could be two to three times greater in Cairo today than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, while three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at low seismic risk. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk: deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. 
Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety and collapse prevention in future earthquakes, a five-step road map has been proposed.

  20. Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data

    NASA Astrophysics Data System (ADS)

    Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.

    2017-12-01

    While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated 100 million RM (~US$23 million) of damage to buildings, roads, and infrastructure from shaking, as well as flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic nonlinear earthquake location program to locate the earthquakes and then refine their relative locations using a double-difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. 
Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, aid the delivery of outreach activities regarding seismic hazard within local communities, and improve understanding of the seismo-tectonic processes taking place in Sabah.

  1. Coupling of Sentinel-1, Sentinel-2 and ALOS-2 to assess coseismic deformation and earthquake-induced landslides following 26 June, 2016 earthquake in Kyrgyzstan

    NASA Astrophysics Data System (ADS)

    Vajedian, Sanaz; Motagh, Mahdi; Wetzel, Hans-Ulrich; Teshebaeva, Kanayim

    2017-04-01

    The active deformation in Kyrgyzstan results from the collision between the Indian and Asian tectonic plates at a rate of 29 ± 1 mm/yr. This collision is accommodated by deformation on prominent faults, which can rupture coseismically and trigger other hazards like landslides. Many earthquakes and earthquake-induced landslides in Kyrgyzstan occur in mountainous areas, where limited accessibility makes ground-based measurements for the assessment of their impact a challenging task. In this context, remote sensing measurements are extraordinarily useful, as they improve our knowledge of the coseismic rupture process and provide information on other types of hazards that are triggered during and/or after earthquakes. This investigation uses L-band ALOS/PALSAR, C-band Sentinel-1, and Sentinel-2 data to evaluate the fault slip model and coseismically induced landslides related to the 26 June 2016 Sary-Tash earthquake, southwest Kyrgyzstan. First, we implement three methods to measure coseismic surface motion using radar data, including Interferometric SAR (InSAR) analysis, SAR offset tracking, and multiple aperture InSAR (MAI), followed by a Genetic Algorithm (GA) inversion of the final displacement field to infer the combination of orientation, location, and slip on a rectangular uniform-slip fault plane. Slip distribution analysis is done by applying Tikhonov regularization to solve the constrained least-squares problem with a Laplacian smoothing approach. The estimated coseismic slip model suggests a nearly W-E thrust fault ruptured during the event, with the main rupture occurring at a depth between 11 and 14 km. Second, the local phase shifts related to landslides are inferred by detailed analysis of pre-seismic, coseismic, and post-seismic C-band and L-band interferograms, and the results are compared with interpretations derived from Sentinel-2 data acquired before and after the earthquake.
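The regularized least-squares step with Laplacian (Tikhonov) smoothing can be sketched as solving the normal equations (GᵀG + α²LᵀL)m = Gᵀd; the toy Green's functions, data, and smoothing operator below are hypothetical, standing in for the real fault discretization:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def tikhonov_slip(G, d, L, alpha):
    """Solve min ||G m - d||^2 + alpha^2 ||L m||^2 via normal equations."""
    Gt, Lt = transpose(G), transpose(L)
    A = [[g + alpha ** 2 * l for g, l in zip(gr, lr)]
         for gr, lr in zip(matmul(Gt, G), matmul(Lt, L))]
    b = [sum(Gt[i][k] * d[k] for k in range(len(d))) for i in range(len(A))]
    return solve(A, b)

# Toy 3-patch fault: Green's functions G, observed displacements d,
# first-difference smoothing operator L (all values hypothetical)
G = [[1.0, 0.5, 0.2], [0.5, 1.0, 0.5], [0.2, 0.5, 1.0]]
d = [1.2, 1.8, 1.1]
L = [[-1.0, 1.0, 0.0], [0.0, -1.0, 1.0]]
slip = tikhonov_slip(G, d, L, alpha=0.5)
print([round(s, 3) for s in slip])
```

Increasing alpha trades data misfit for a smoother slip distribution along the fault; the study additionally imposes positivity constraints not sketched here.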

  2. Submarine slope instability offshore western Calabria, Italy: possible triggering of tsunamigenic landslides by seismic load

    NASA Astrophysics Data System (ADS)

    Ausilia Paparo, Maria; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Gallotti, Glauco; Tinti, Stefano

    2017-04-01

The Eastern Tyrrhenian margin offshore western Calabria (Italy) has experienced several mass movements involving varying volumes and shapes, as revealed by geological surveys identifying slide scars and massive deposits. The hypothesis that at least some of these mass movements were tsunamigenic is entirely reasonable. In this study, we focus on the continental edge offshore the Santa Eufemia Gulf and the Paola Basin, because the area has experienced several strong earthquakes (Mw up to 7), some of them in the last centuries (for example, the 1905 earthquake and the late shocks of the 1783 sequence). Our aim is to study seismic load as the trigger mechanism of mass failures: not all earthquakes generate tsunamis, but the conjunction of factors such as seafloor shaking and elevated pore-water pressure can temporarily reduce soil shear strength, inducing failures and tsunamigenic submarine landslides. We selected several sections of the Calabrian margin with different gradients and studied their slope stability using the Minimum Lithostatic Deviation (MLD) method. We applied typical Peak Ground Accelerations (PGAs) obtained from local historical earthquakes by means of regression laws, determining the potentially unstable sectors as well as the volumes of material that can be set in motion. This in turn can be used as input for future tsunami modelling and hazard assessment. This work is a contribution to assessing local hazard and risk on the western Calabrian coast, where earthquakes can trigger tsunamigenic submarine mass movements: the impact and effects of such phenomena could be disastrous for coastal infrastructure and populations without proper mitigation measures. This work was carried out in the frame of the EU project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe (Grant 603839, 7th FP, ENV.2013.6.4-3).
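The MLD method itself is beyond a short example, but the underlying idea of seismic load degrading slope stability can be illustrated with a simpler pseudo-static, infinite-slope factor of safety; all parameter values below are illustrative assumptions, not data from the study:

```python
import math

# Illustrative pseudo-static, infinite-slope factor of safety (NOT the MLD
# method): seismic shaking (k_h ~ PGA/g) and pore pressure (ru) both lower
# the factor of safety. All parameter values are assumptions.
def pseudo_static_fs(c, phi_deg, gamma, z, beta_deg, k_h, ru=0.0):
    """c: cohesion (kPa); phi_deg: friction angle (deg); gamma: unit weight
    (kN/m^3); z: slip-surface depth (m); beta_deg: slope angle (deg);
    k_h: horizontal seismic coefficient; ru: pore-pressure ratio."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    w = gamma * z  # column weight per unit plan area (kPa)
    # shear stress on the slip plane: gravity plus horizontal inertia
    tau = w * math.sin(beta) * math.cos(beta) + k_h * w * math.cos(beta) ** 2
    # effective normal stress, reduced by inertia and pore pressure
    sigma_n = (w * math.cos(beta) ** 2
               - k_h * w * math.sin(beta) * math.cos(beta)) * (1.0 - ru)
    return (c + sigma_n * math.tan(phi)) / tau

# The same slope, without and with seismic load (illustrative values)
print(round(pseudo_static_fs(10.0, 30.0, 19.0, 3.0, 25.0, 0.0), 2))
print(round(pseudo_static_fs(10.0, 30.0, 19.0, 3.0, 25.0, 0.2), 2))
```

A factor of safety dropping toward 1 under shaking is the simplest quantitative expression of the triggering mechanism the abstract describes.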

  3. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

Correct assessment of probabilistic seismic hazard and risk maps is the first step for advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach developed in the frame of the EMME (Earthquake Model of the Middle East region) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground-motion data. Because of this, an initial hybrid empirical ground-motion model was developed for PGA and SA at selected periods, and its coefficients were applied in the ground-motion models used in the probabilistic seismic hazard assessment. The resulting seismic hazard maps show that there were gaps in previous seismic hazard assessments and that the present normative seismic hazard map needs careful recalculation.

  4. Documentation for Initial Seismic Hazard Maps for Haiti

    USGS Publications Warehouse

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2010-01-01

    In response to the urgent need for earthquake-hazard information after the tragic disaster caused by the moment magnitude (M) 7.0 January 12, 2010, earthquake, we have constructed initial probabilistic seismic hazard maps for Haiti. These maps are based on the current information we have on fault slip rates and historical and instrumental seismicity. These initial maps will be revised and improved as more data become available. In the short term, more extensive logic trees will be developed to better capture the uncertainty in key parameters. In the longer term, we will incorporate new information on fault parameters and previous large earthquakes obtained from geologic fieldwork. These seismic hazard maps are important for the management of the current crisis and the development of building codes and standards for the rebuilding effort. The boundary between the Caribbean and North American Plates in the Hispaniola region is a complex zone of deformation. The highly oblique ~20 mm/yr convergence between the two plates (DeMets and others, 2000) is partitioned between subduction zones off of the northern and southeastern coasts of Hispaniola and strike-slip faults that transect the northern and southern portions of the island. There are also thrust faults within the island that reflect the compressional component of motion caused by the geometry of the plate boundary. We follow the general methodology developed for the 1996 U.S. national seismic hazard maps and also as implemented in the 2002 and 2008 updates. This procedure consists of adding the seismic hazard calculated from crustal faults, subduction zones, and spatially smoothed seismicity for shallow earthquakes and Wadati-Benioff-zone earthquakes. Each one of these source classes will be described below. The lack of information on faults in Haiti requires many assumptions to be made. These assumptions will need to be revisited and reevaluated as more fieldwork and research are accomplished. 
We made two sets of maps using different assumptions about site conditions. One set of maps is for a firm-rock site condition (30-m averaged shear-wave velocity, Vs30, of 760 m/s). We also developed hazard maps that contain site amplification based on a grid of Vs30 values estimated from topographic slope. These maps take into account amplification from soils. We stress that these new maps are designed to quantify the hazard for Haiti; they do not consider all the sources of earthquake hazard that affect the Dominican Republic and therefore should not be considered as complete hazard maps for eastern Hispaniola. For example, we have not included hazard from earthquakes in the Mona Passage nor from large earthquakes on the subduction zone interface north of Puerto Rico. Furthermore, they do not capture all the earthquake hazards for eastern Cuba.
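The procedure of adding hazard from independent source classes (crustal faults, subduction zones, smoothed seismicity) can be sketched under the usual Poisson assumption; the annual exceedance rates below are made-up illustrations, not values from the Haiti maps:

```python
import math

# Sketch: under the Poisson assumption, annual exceedance rates from
# independent source classes simply add. All rates below are made up.
rates = {"crustal_faults": 0.004,        # P(PGA > target) per year, by class
         "subduction_zones": 0.002,
         "smoothed_seismicity": 0.001}

total_rate = sum(rates.values())             # combined annual exceedance rate
T = 50.0                                     # exposure time (years)
p_exceed = 1.0 - math.exp(-total_rate * T)   # probability of exceedance in T
print(round(p_exceed, 3))  # → 0.295
```

This additivity of rates is what lets the hazard from each source class be computed separately and then combined into a single map.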

  5. Topographic changes and their driving factors after 2008 Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Li, Congrong; Wang, Ming; Liu, Kai; Xie, Jun

    2018-06-01

The 2008 Wenchuan earthquake caused topographic change in the stricken areas through the occurrence of numerous coseismic landslides. The emergence of new landslides and debris flows, and the movement of loose materials under the driving force of heavy rainfall, could further shape the local topography. To date, little attention has been paid to continuously monitoring and assessing topographic changes after the major earthquake. In this research, we obtained an elevation dataset (2002, 2010, 2013 and 2015) based on digital elevation model (DEM) data and a DEM extracted from ZY-3 stereo paired images, validated by field measurement. We quantitatively assessed elevation changes in different years and qualitatively analyzed the spatiotemporal variation of terrain and mass movement across the study area. The results show that the earthquake-affected area experienced substantial elevation changes caused by seismic forces and subsequent rainfall. Heavy rainfall after the earthquake has become the biggest driver of elevation reduction, overwhelming the elevation increase caused by the major earthquake. Increased post-earthquake erosion intensity has caused large amounts of loose material to accumulate in river channels and gullies and on upper-middle mountain slopes, which increases the risk of flooding and geo-hazards in the area.
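DEM differencing of the kind described above can be sketched on synthetic grids; the grid size, cell spacing and elevation changes below are hypothetical, not the study's data:

```python
import numpy as np

# Sketch of DEM differencing on synthetic grids: a 5 m erosion patch and a
# 2 m deposition patch on 30 m cells. All values are hypothetical.
dem_pre = np.full((100, 100), 1500.0)     # pre-event elevations (m)
dem_post = dem_pre.copy()
dem_post[40:60, 40:60] -= 5.0             # landslide source area (erosion)
dem_post[70:80, 70:80] += 2.0             # deposit in a channel

dh = dem_post - dem_pre                   # elevation-change map (m)
cell_area = 30.0 * 30.0                   # cell footprint (m^2)
eroded = -dh[dh < 0].sum() * cell_area    # eroded volume (m^3)
deposited = dh[dh > 0].sum() * cell_area  # deposited volume (m^3)
print(eroded, deposited)  # → 1800000.0 180000.0
```

In practice the two DEMs must first be co-registered and resampled to a common grid, and vertical uncertainty propagated into the volume estimates.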

  6. New Directions in Seismic Hazard Assessment Through Focused Earth Observation in the MARmara SuperSITE - Project Achievements

    NASA Astrophysics Data System (ADS)

    Meral OZel, Nurcan; Necmioǧlu, Öcal; Ergintav, Semih; Ozel, Oǧuz; Favali, Paolo; Bigarre, Pascal; Çakır, Ziyadin; Ozeren, Sinan; Geli, Louis; Douglas, John; Aochi, Hideo; Bossu, Remy; Zülfikar, Can; Şeşetyan, Karin; Erdik, Mustafa

    2016-04-01

The MARsite project, which started in November 2012 and was funded by the EC/FP7-ENV.2012 6.4-2 programme (Grant 308417), identifies the Marmara region as a 'Supersite' within European initiatives to aggregate on-shore, off-shore and space-based observations, comprehensive geophysical monitoring, and improved hazard and risk assessments encompassed in an integrated set of activities. MARsite aimed to harmonize geological, geophysical, geodetic and geochemical observations to provide a better view of the post-seismic deformation of the 1999 Izmit earthquake (in addition to the post-seismic signature of previous earthquakes), the loading of submarine and inland active fault segments, and transient pre-earthquake signals related to stress loading with different tectonic properties in and around the Marmara Sea. This presentation provides an overview of the achievements of MARsite, which coordinated research groups ranging from seismology to gas geochemistry in a comprehensive monitoring activity in the Marmara region, based on the collection of multidisciplinary data to be shared, interpreted and merged into consistent theoretical and practical models suitable for implementing good practices that move the necessary information to the end users in charge of seismic risk management in the region. In addition, MARsite covered processes involved in earthquake generation and the physics of short-term seismic transients, 4D deformation to understand earthquake-cycle processes, fluid activity and sub-seafloor seismicity monitoring using existing autonomous instrumentation, early warning and the development of real-time shake and loss information, real- and quasi-real-time earthquake and tsunami hazard monitoring, and earthquake-induced landslide hazard.
In particular, we report achievements and progress in the design and building of a multi-parameter borehole system consisting of a stable, very-broad-band (VBB) borehole seismic sensor with a wide dynamic range, with an incorporated 3-D strain meter, tilt meter, and temperature and local hydrostatic pressure measuring devices. Progress has been made on photogeological analysis of DInSAR temporal series and of multispectral/hyperspectral satellite image data; on an extensive geophysical field survey of one of the region's most important landslides, which yielded a refined engineering-geological model; and on numerical dynamic modelling of this landslide and the installation of a real-time monitoring system in the field. We improved the existing earthquake early warning and strong-motion networks, which are now largely integrated. The early warning signals extend to critical infrastructure of the Marmara region, such as the natural gas distribution network (IGDAS) and the MARMARAY rail line. The project reached the following goals: intensive monitoring infrastructure has been installed; data sharing among partners and researchers, even outside the MARsite project, has been successfully realized; and more than 70 articles, reports and presentations have already been issued or published by the 18 partner institutions.

  7. Characterization of the Cottonwood Grove and Ridgely faults near Reelfoot Lake, Tennessee, from high-resolution seismic reflection data

    USGS Publications Warehouse

    Stephenson, William J.; Shedlock, Kaye M.; Odum, Jack K.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  8. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.

  9. Rupture propagation behavior and the largest possible earthquake induced by fluid injection into deep reservoirs

    NASA Astrophysics Data System (ADS)

    Gischig, Valentin S.

    2015-09-01

    Earthquakes caused by fluid injection into deep underground reservoirs constitute an increasingly recognized risk to populations and infrastructure. Quantitative assessment of induced seismic hazard, however, requires estimating the maximum possible magnitude earthquake that may be induced during fluid injection. Here I seek constraints on an upper limit for the largest possible earthquake using source-physics simulations that consider rate-and-state friction and hydromechanical interaction along a straight homogeneous fault. Depending on the orientation of the pressurized fault in the ambient stress field, different rupture behaviors can occur: (1) uncontrolled rupture-front propagation beyond the pressure front or (2) rupture-front propagation arresting at the pressure front. In the first case, fault properties determine the earthquake magnitude, and the upper magnitude limit may be similar to natural earthquakes. In the second case, the maximum magnitude can be controlled by carefully designing and monitoring injection and thus restricting the pressurized fault area.

  10. Probabilistic tsunami hazard assessment in Greece for seismic sources along the segmented Hellenic Arc

    NASA Astrophysics Data System (ADS)

    Novikova, Tatyana; Babeyko, Andrey; Papadopoulos, Gerassimos

    2017-04-01

Greece and adjacent coastal areas are characterized by high population exposure to tsunami hazard. The Hellenic Arc is the most active geotectonic structure for the generation of earthquakes and tsunamis. We performed probabilistic tsunami hazard assessment for selected locations along the Greek coastline, namely the forecasting points officially used in tsunami warning operations by the Hellenic National Tsunami Warning Center and the NEAMTWS/IOC/UNESCO. In our analysis we considered seismic sources of tsunami generation along the western, central and eastern segments of the Hellenic Arc. We first created a synthetic catalog spanning 10,000 years for all significant earthquakes with magnitudes in the range from 6.0 to 8.5, with the real events included in this catalog. For each event in the synthetic catalog, a tsunami was generated and propagated using a Boussinesq model. The probability of occurrence of each event was determined from a Gutenberg-Richter magnitude-frequency distribution. The results of our study are expressed as hazard curves and hazard maps. The hazard curves were obtained for the selected sites and present the annual probability of exceedance as a function of peak coastal tsunami amplitude. The hazard maps represent the distribution of peak coastal tsunami amplitudes corresponding to a fixed annual probability. In these forms our results can easily be compared with those obtained in other studies and further employed for the development of tsunami risk management plans. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no. 603839, 2013-10-30.
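Drawing a synthetic catalog whose magnitudes follow a truncated Gutenberg-Richter distribution can be sketched by inverse-CDF sampling; the b-value below is an assumed illustration, not the one used in the study:

```python
import numpy as np

# Sketch: inverse-CDF sampling of a doubly truncated Gutenberg-Richter
# magnitude distribution for a synthetic catalog. The b-value is assumed.
rng = np.random.default_rng(42)

b = 1.0                       # assumed Gutenberg-Richter b-value
m_min, m_max = 6.0, 8.5       # magnitude range considered
n_events = 10_000

beta = b * np.log(10.0)
u = rng.random(n_events)
# Invert F(m) = (1 - exp(-beta (m - m_min))) / (1 - exp(-beta (m_max - m_min)))
mags = m_min - np.log1p(-u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

print(mags.min() >= m_min, mags.max() <= m_max)  # → True True
```

Small magnitudes dominate the sample, as the Gutenberg-Richter law requires, while the upper truncation at 8.5 bounds the largest events considered.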

  11. USGS approach to real-time estimation of earthquake-triggered ground failure - Results of 2015 workshop

    USGS Publications Warehouse

    Allstadt, Kate E.; Thompson, Eric M.; Wald, David J.; Hamburger, Michael W.; Godt, Jonathan W.; Knudsen, Keith L.; Jibson, Randall W.; Jessee, M. Anna; Zhu, Jing; Hearne, Michael; Baise, Laurie G.; Tanyas, Hakan; Marano, Kristin D.

    2016-03-30

    The U.S. Geological Survey (USGS) Earthquake Hazards and Landslide Hazards Programs are developing plans to add quantitative hazard assessments of earthquake-triggered landsliding and liquefaction to existing real-time earthquake products (ShakeMap, ShakeCast, PAGER) using open and readily available methodologies and products. To date, prototype global statistical models have been developed and are being refined, improved, and tested. These models are a good foundation, but much work remains to achieve robust and defensible models that meet the needs of end users. In order to establish an implementation plan and identify research priorities, the USGS convened a workshop in Golden, Colorado, in October 2015. This document summarizes current (as of early 2016) capabilities, research and operational priorities, and plans for further studies that were established at this workshop. Specific priorities established during the meeting include (1) developing a suite of alternative models; (2) making use of higher resolution and higher quality data where possible; (3) incorporating newer global and regional datasets and inventories; (4) reducing barriers to accessing inventory datasets; (5) developing methods for using inconsistent or incomplete datasets in aggregate; (6) developing standardized model testing and evaluation methods; (7) improving ShakeMap shaking estimates, particularly as relevant to ground failure, such as including topographic amplification and accounting for spatial variability; and (8) developing vulnerability functions for loss estimates.

  12. Seismic hazard analysis for Jayapura city, Papua

    NASA Astrophysics Data System (ADS)

    Robiana, R.; Cipta, A.

    2015-04-01

Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model, from the New Guinea Trench subduction zone (North Papuan Thrust); fault models, derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using a geomorphological approach are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  13. 2018 one‐year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Rukstales, Kenneth S.; McNamara, Daniel E.; Williams, Robert A.; Shumway, Allison; Powers, Peter; Earle, Paul; Llenos, Andrea L.; Michael, Andrew J.; Rubinstein, Justin L.; Norbeck, Jack; Cochran, Elizabeth S.

    2018-01-01

This article describes the U.S. Geological Survey (USGS) 2018 one‐year probabilistic seismic hazard forecast for the central and eastern United States from induced and natural earthquakes. For consistency, the updated 2018 forecast is developed using the same probabilistic seismicity‐based methodology as applied in the two previous forecasts. Rates of earthquakes of M ≥ 3.0 across the United States grew rapidly between 2008 and 2015 but have steadily declined over the past 3 years, especially in areas of Oklahoma and southern Kansas where fluid injection has decreased. The seismicity pattern in 2017 was complex, with earthquakes more spatially dispersed than in the previous years. Some areas of west‐central Oklahoma experienced increased activity rates where industrial activity increased. Earthquake rates in Oklahoma (429 earthquakes of M ≥ 3 and 4 of M ≥ 4), the Raton basin (Colorado/New Mexico border, six earthquakes of M ≥ 3), and the New Madrid seismic zone (11 earthquakes of M ≥ 3) continue to be higher than historical levels. Almost all of these earthquakes occurred within the highest hazard regions of the 2017 forecast. Even though rates declined over the past 3 years, the short‐term hazard for damaging ground shaking across much of Oklahoma remains at high levels due to continuing high rates of smaller earthquakes that are still hundreds of times higher than at any time in the state's history. Fine details and variability between the 2016-2018 forecasts are obscured by significant uncertainties in the input model. These short‐term hazard levels are similar to those of active regions in California. During 2017, M ≥ 3 earthquakes also occurred in or near Ohio, West Virginia, Missouri, Kentucky, Tennessee, Arkansas, Illinois, Oklahoma, Kansas, Colorado, New Mexico, Utah, and Wyoming.

  14. Landslides in everyday life: An interdisciplinary approach to understanding vulnerability in the Himalayas

    NASA Astrophysics Data System (ADS)

    Sudmeier-Rieux, K.; Breguet, A.; Dubois, J.; Jaboyedoff, M.

    2009-04-01

Several thousand landslides were triggered by the Kashmir earthquake, scarring the hillsides with cracks. Monsoon rains continue to trigger landslides, which have increased the exposure of populations through lost agricultural land, blocked roads and annual landslide fatalities. The great majority of these landslides are shallow and relatively small, but they greatly impact the population. In this region, landslides were a factor before the earthquake, mainly due to road construction and gravel excavation, but the several thousand landslides triggered by the earthquake have completely overwhelmed the local population and authorities. In Eastern Nepal, the last large earthquake to hit the region occurred in 1988, also triggering numerous landslides and cracks. Here, landslides can be considered a more common phenomenon, yet coping capacities amount to local observation of landslide movement and subsequent abandonment of houses and land as they become too dangerous. We present a comparative case study from Kashmir, Pakistan and Eastern Nepal, highlighting an interdisciplinary approach to understanding the complex interactions between land use, landslides and vulnerability. Our approach sets out to understand the underlying causes of the massive landslides triggered by the 2005 earthquake in Kashmir, Pakistan, as well as the increasing number of landslides in Nepal. By approaching the issue of landslides from multiple angles (risk perceptions, land use, local coping capacities, geological assessment, risk mapping) and with multiple research techniques (remote sensing, GIS, geological assessment, participatory mapping, focus groups), we are better able to create a more complete picture of the "hazardscape". We find that by combining participatory social science research with hazard mapping, we obtain a more complete understanding of underlying causes, coping strategies and possible mitigation options, placing natural hazards in the context of everyday life. 
This method is relatively simple, low cost and useful to local authorities or development agencies in planning and managing development projects that include a hazard management aspect. We discuss some of our successes, some obstacles, and ideas for future research.

  15. Earthquake Hazard Assessment Based on Geological Data: An approach from Crystalline Terrain of Peninsular India

    NASA Astrophysics Data System (ADS)

    John, B.

    2009-04-01

Peninsular India was long considered seismically stable, but the recent earthquake sequence of Latur (1993), Jabalpur (1997) and Bhuj (2001) suggests that this region is among the active Stable Continental Regions (SCRs) of the world, where recurrence intervals are of the order of tens of thousands of years. In such areas, earthquakes may happen at unexpected locations, devoid of any previous seismicity or dramatic geomorphic features, and even moderate earthquakes will lead to heavy loss of life and property in the present scenario. It is therefore imperative to map suspect areas to identify active faults and evaluate their activity, a vital input to seismic hazard assessment of SCR areas. The region around Wadakkanchery, Kerala, South India has been experiencing micro-seismic activity since 1989. Subsequent studies by the author identified a 30 km long, WNW-ESE trending reverse fault, dipping south (45°), that has influenced the drainage system of the area. Macroscopic and microscopic studies of the fault rocks from the exposures near Desamangalam show an episodic nature of faulting. Dislocations of pegmatitic veins across the fault indicate a cumulative dip displacement of 2.1 m in the reverse direction. A minimum of four episodes of faulting were identified on this fault, based on the cross-cutting relations of different structural elements and on the mineralogic changes of different generations of gouge zones. This suggests an average displacement of about 52 cm per event. A cyclic nature of faulting is identified in this fault zone, in which the inter-seismic period is characterized by gouge induration and fracture sealing aided by the prevailing fluids. 
Available empirical relations connecting magnitude with displacement and rupture length show that each event might have produced an earthquake of magnitude ≥ 6.0, which could be damaging for an area like peninsular India. Electron Spin Resonance dating of fault gouge indicates a major event around 430 ka. In the present stress regime this fault can be considered seismically active, because its orientation is favorable for reactivation.
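As one illustration of such empirical relations, the Wells and Coppersmith (1994) all-slip-type regressions applied to the quoted fault length (30 km) and average displacement (~52 cm) give magnitudes above 6.0; the abstract does not state which specific relations the author used, so these are assumed here:

```python
import math

# Wells & Coppersmith (1994) all-slip-type regressions (illustrative check;
# the abstract does not name the specific relations actually used).
def mw_from_rupture_length(srl_km):
    return 5.08 + 1.16 * math.log10(srl_km)      # M vs surface rupture length

def mw_from_avg_displacement(ad_m):
    return 6.93 + 0.82 * math.log10(ad_m)        # M vs average displacement

print(round(mw_from_rupture_length(30.0), 2))    # 30 km fault trace
print(round(mw_from_avg_displacement(0.52), 2))  # ~52 cm average slip
```

Both estimates come out near M 6.7-6.8, consistent with the abstract's statement that each event might have produced an earthquake of magnitude ≥ 6.0.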

  16. Initiatives to Reduce Earthquake Risk of Developing Countries

    NASA Astrophysics Data System (ADS)

    Tucker, B. E.

    2008-12-01

    The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. 
One would oversee the design and construction of an earthquake- and tsunami-resistant structure in Sumatra to house a tsunami museum, a community training center, and offices of a local NGO that is preparing Padang for the next tsunami. This facility would be designed and built by a team of US and Indonesian academics, architects, engineers and students. Another initiative would launch a collaborative research program on school earthquake safety with the scientists and engineers from the US and the ten Islamic countries that comprise the Economic Cooperation Organization. Finally, GHI hopes to develop internet and satellite communication techniques that will allow earthquake risk managers in the US to interact with masons, government officials, engineers and architects in remote communities of vulnerable developing countries, closing the science and engineering divide.

  17. Seismic hazard assessment of the Kivu rift segment based on a new sismo-tectonic zonation model (Western Branch of the East African Rift system)

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Delvaux, Damien

    2015-04-01

    In the frame of the Belgian GeoRisCA multi-risk assessment project focused on the Kivu and Northern Tanganyika Region, a seismic hazard map has been produced for this area. It is based on a on a recently re-compiled catalogue using various local and global earthquake catalogues. The use of macroseismic epicenters determined from felt earthquakes allowed to extend the time-range back to the beginning of the 20th century, thus spanning about 100 years. The magnitudes have been homogenized to Mw and the coherence of the catalogue has been checked and validated. The seismo-tectonic zonation includes 10 seismic source areas that have been defined on the basis of the regional geological structure, neotectonic fault systems, basin architecture and distribution of earthquake epicenters. The seismic catalogue was filtered by removing obvious aftershocks and Gutenberg-Richter Laws were determined for each zone. On the basis of this seismo-tectonic information and existing attenuation laws that had been established by Twesigomwe (1997) and Mavonga et al. (2007) for this area, seismic hazard has been computed with the Crisis 2012 (Ordaz et al., 2012) software. The outputs of this assessment clearly show higher PGA values (for 475 years return period) along the Rift than the previous estimates by Twesigomwe (1997) and Mavonga (2007) while the same attenuation laws had been used. The main reason for these higher PGA values is likely to be related to the more detailed zonation of the Rift structure marked by a strong gradient of the seismicity from outside the rift zone to the inside. Mavonga, T. (2007). An estimate of the attenuation relationship for the strong ground motion in the Kivu Province, Western Rift Valley of Africa. Physics of the Earth and Planetary Interiors 62, 13-21. Ordaz M, Martinelli F, Aguilar A, Arboleda J, Meletti C, D'Amico V. (2012). CRISIS 2012, Program for computing seismic hazard. Instituto de Ingeniería, Universidad Nacional Autónoma de México. 
Twesigomwe, E. (1997). Probabilistic seismic hazard assessment of Uganda, Ph.D. Thesis, Dept. of Physics, Makerere University, Uganda.
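The per-zone Gutenberg-Richter fitting described above can be sketched with a least-squares fit of log10 N against magnitude. This is a minimal illustration, not the project's actual code; the magnitudes below are hypothetical.

```python
import numpy as np

# Hypothetical declustered magnitudes (Mw) for one seismic source zone
mags = np.array([4.5, 4.6, 4.8, 5.0, 5.0, 5.2, 5.5, 5.7, 6.0, 6.3])

# Cumulative counts N(>=M) on a grid of magnitude thresholds
m_bins = np.arange(4.5, 6.4, 0.1)
counts = np.array([(mags >= m).sum() for m in m_bins])

# Least-squares fit of the Gutenberg-Richter law: log10 N = a - b*M
mask = counts > 0
slope, intercept = np.polyfit(m_bins[mask], np.log10(counts[mask]), 1)
a, b = intercept, -slope
```

In a real assessment the counts would be normalized to annual rates and the fit repeated for each of the 10 source zones.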

  18. Ethics in disaster management

    NASA Astrophysics Data System (ADS)

    Parkash, S.

    2012-04-01

    Ethics are, at base, the minimum moral standards one must follow to practise any profession honestly. Geoscientists have significant roles to play, particularly in the field of geohazards, in informing society about the possibilities of natural hazards such as landslides, avalanches, floods, volcanic eruptions and earthquakes. They can not only assess these hazards but also estimate the potential consequences should they occur at a given place and time. However, the credibility of geoscientists in the eyes of society and government has sometimes been lost through unethical practices aimed at short-term gain, or through improper understanding of the geological phenomena. Hazards that cannot be predicted with existing capabilities have nonetheless been "forecast" by some geoscientists to draw social or media attention, bringing down the reputation of the profession. One must be fair enough to accept the limitations of the profession in informing about natural hazards that are not yet fully understood by its practitioners. Predictions related to earthquakes, in particular, have drawn the attention of society and the media in the developing world, where common people hold differing perceptions. Most often, popular myths crowd out scientific facts among the public and lead to rumours about natural hazards. The paper cites some cases of rumours about natural disasters, particularly earthquakes, and the responses of society, the media and government. It emphasizes the ethical responsibility of geoscientists to inform the public about the factual situation regarding geohazards, so as to avert the panic caused by rumours from non-specialists or hyperactive pseudo-experts. 
The paper points out the recent rumours about lake outbursts, flash floods and volcanic activity after a moderate earthquake (M6.8, 18 September 2011) in Sikkim State, India, and the actions taken by the geoscientific community to correctly inform the public about the real situation.

  19. Earthquake and tsunami hazard in West Sumatra: integrating science, outreach, and local stakeholder needs

    NASA Astrophysics Data System (ADS)

    McCaughey, J.; Lubis, A. M.; Huang, Z.; Yao, Y.; Hill, E. M.; Eriksson, S.; Sieh, K.

    2012-04-01

    The Earth Observatory of Singapore (EOS) is building partnerships with local-to-provincial government agencies, NGOs, and educators in West Sumatra to inform their policymaking, disaster-risk-reduction, and education efforts. Geodetic and paleoseismic studies show that an earthquake as large as M 8.8 is likely sometime in the coming decades on the Mentawai patch of the Sunda megathrust. This earthquake and its tsunami would be devastating for the Mentawai Islands and neighboring areas of the western Sumatra coast. The low-lying coastal Sumatran city of Padang (pop. ~800,000) has been the object of many research and outreach efforts, especially since 2004. Padang experienced deadly earthquakes in 2007 and 2009 that, though tragedies in their own right, also served as wake-up calls for a larger earthquake to come. However, there remain significant barriers to linking science to policy: extant hazard information is sometimes contradictory or confusing for non-scientists, while turnover of agency leadership and staff means that, in the words of one local advocate, "we keep having to start from zero." Both better hazard knowledge and major infrastructure changes are necessary for risk reduction in Padang. In contrast, the small, isolated villages of the outlying Mentawai Islands have received relatively few outreach efforts, yet many villages have the potential for timely evacuation with existing infrastructure; there, knowledge alone can go far toward risk reduction. The tragic October 2010 Mentawai tsunami has inspired further disaster-risk-reduction work by local stakeholders. In both locations, we are engaging policymakers and local NGOs, providing science to help inform their work. Through outreach contacts, the Mentawai government requested that we produce the first-ever tsunami hazard map for their islands; this aligns well with scientific interests at EOS. 
We will work with the Mentawai government on the presentation and explanation of the hazard map, as well as assessment of its impact at the district and village levels. We are also providing science and teaching examples for an NGO-led program to integrate disaster-risk reduction into the Mentawai primary-school curriculum. We are working with our partners to develop a participatory monitoring scheme. Indicators will include the degree to which policy is informed by science, whether communities develop and publicise evacuation routes based on hazard mapping, whether and how frequently communities practice evacuation simulations, and whether hazard information is incorporated into school curricula.

  20. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis they generate. Even though such events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become standard practice in the evaluation and mitigation of seismic hazard to populations, particularly with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective earthquake-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunamis, and there are great advantages in implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast, generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). 
This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties, and present the overall hazard in a meaningful and consistent way.
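The Green's function summation described above is, at its core, a slip-weighted linear superposition of pre-computed unit-slip waveforms. A minimal sketch, with toy waveforms standing in for the stored subfault Green's functions:

```python
import numpy as np

# Hypothetical pre-computed unit-slip tsunami waveforms (Green's functions)
# at one coastal point for 3 subfaults, sampled at 1-minute intervals.
t = np.arange(0, 120)  # minutes after origin time
greens = np.vstack([
    np.sin(2 * np.pi * (t - lag) / 40.0) * (t >= lag)  # toy waveforms
    for lag in (10, 20, 30)
])

# A rupture scenario is just a slip vector; its tsunami waveform at the
# coastal point is the slip-weighted sum of the subfault waveforms.
slip = np.array([2.0, 1.0, 0.5])  # metres of slip per subfault
waveform = slip @ greens          # linear superposition

peak_height = waveform.max()
```

Because the expensive wave propagation is done once per subfault, evaluating thousands of scenarios reduces to cheap matrix-vector products, which is what makes the probabilistic calculation tractable.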

  1. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

    In December 2004, a multiple-hazard event devastated the Tamil Nadu province of India. The Sumatra-Andaman earthquake, with a magnitude of Mw = 9.1-9.3, caused the Indian Ocean tsunami, with wave heights up to 30 m and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, 206 of them in its capital Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour. Had a suitable early warning system, a proper means of communicating the warning, and shelters for the population existed, several thousand human lives could have been saved, even though the destruction of infrastructure would not have been prevented. India has over forty years of experience in the construction of cyclone shelters. With additional effort and investment, these shelters could be adapted to other types of hazards such as tsunamis and flooding, and new multi-hazard cyclone shelters (MPCS) could be constructed. It would therefore be possible to mitigate one hazard, such as cyclones, by building a network of shelters while at the same time adapting those shelters, with some additional investment, to also deal with, for example, tsunamis. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regard to multiple-hazard mitigation measures, and how behavioral and cognitive biases influenced national decision-makers' perceptions of the probabilities of multiple hazards and their choices for mitigation. Our methodology was based on the analysis of existing reports from national and international organizations as well as the available scientific literature on behavioral economics and natural hazards. 
The results identified several biases in the national decision-making process at the time the construction of cyclone shelters was undertaken. The availability heuristic caused a tsunami following an earthquake to be perceived as improbable, as the last large comparable event had happened over a hundred years earlier. Another bias led to decisions being taken on the basis of experience rather than statistical evidence: experience showed that the so-called "Ring of Fire" generates undersea earthquakes and tsunamis in the Pacific Ocean, and this knowledge led decision-makers to neglect numerical estimates of the probability of a large undersea earthquake in the Indian Ocean, even though seismologists were warning of exactly that possibility. The bounded-rationality bias led to misperception of signals from the early warning center in the Pacific Ocean. The resulting limited concern produced risk mitigation measures that considered cyclone risks but paid much less attention to tsunami. Under loss aversion, the decision-makers perceived the losses connected with the necessary additional investment as greater than the benefits of mitigating a less probable hazard.

  2. Resilience to Interacting multi-natural hazards

    NASA Astrophysics Data System (ADS)

    Zhuo, Lu; Han, Dawei

    2016-04-01

    Conventional hazard assessments tend to focus on individual hazards in isolation. However, many parts of the world are affected by multiple natural hazards with the potential for interacting relationships. The understanding of such interactions, their impacts and the related uncertainties is an important and topical area of research. Interacting multi-hazards may appear in different forms, including 1) CASCADING HAZARDS (a primary hazard triggering one or more secondary hazards, such as an earthquake triggering landslides which may block river channels, creating dammed lakes and ensuing floods), 2) CONCURRING HAZARDS (two or more primary hazards coinciding to trigger or exacerbate secondary hazards, such as an earthquake and a rainfall event simultaneously creating landslides), and 3) ALTERING HAZARDS (a primary hazard increasing the probability of a secondary hazard occurring, such as a major earthquake disturbing soil and rock materials by violent ground shaking, altering the regional patterns of landslides and debris flows in subsequent years). All three types of interacting multi-hazards may occur in hazard-prone regions, so research on hazard resilience should cover all of them. In the past decades, great progress has been made in tackling disaster risk around the world. However, many challenging issues remain, and the disasters of recent years have clearly demonstrated the inadequate resilience of our highly interconnected and interdependent systems. We have identified the following weaknesses and knowledge gaps in current disaster risk management: 1) although our understanding of individual hazards has greatly improved, there is a lack of sound knowledge about the mechanisms and processes of interacting multi-hazards, so the resultant multi-hazard risk is often significantly underestimated, with severe consequences. 
The spatial and temporal changes in hazards and vulnerability during successive hazards are also poorly understood; 2) hazard monitoring, forecasting and early warning systems have not fully utilised the domain knowledge of physical processes and the statistical information in the observations; 3) uncertainties have not been well recognised in current risk management practice, and ignoring them could pose a major threat to society and lead to poorly considered, inefficient or unsustainable options; 4) there is increasing recognition that so-called 'natural' disasters are not just the consequences of nature-related processes alone, but are attributable to various social, economic, historical, political and cultural causes. Despite this recognition, current hazard and risk assessments are fragmented, with a weakness in holistically combining quantitative and qualitative information from a variety of sources; 5) successful disaster risk management must be relevant and useful to all stakeholders involved, and efforts should enable the common purpose, collective learning and entrepreneurial collaborations that underpin effective and efficient resilience. There is therefore an urgent need for a systems-thinking framework and decision support tools for adequate scenario assessment and resilience development from a harmonised and transdisciplinary perspective. The aforementioned issues should be tackled through a joint effort by a multidisciplinary team spanning social science, natural science, engineering and systems.

  3. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  4. Seismic Hazard Maps for Seattle, Washington, Incorporating 3D Sedimentary Basin Effects, Nonlinear Site Response, and Rupture Directivity

    USGS Publications Warehouse

    Frankel, Arthur D.; Stephenson, William J.; Carver, David L.; Williams, Robert A.; Odum, Jack K.; Rhea, Susan

    2007-01-01

    This report presents probabilistic seismic hazard maps for Seattle, Washington, based on over 500 3D simulations of ground motions from scenario earthquakes. These maps include 3D sedimentary basin effects and rupture directivity. Nonlinear site response for soft-soil sites of fill and alluvium was also applied in the maps. The report describes the methodology for incorporating source and site dependent amplification factors into a probabilistic seismic hazard calculation. 3D simulations were conducted for the various earthquake sources that can affect Seattle: Seattle fault zone, Cascadia subduction zone, South Whidbey Island fault, and background shallow and deep earthquakes. The maps presented in this document used essentially the same set of faults and distributed-earthquake sources as in the 2002 national seismic hazard maps. The 3D velocity model utilized in the simulations was validated by modeling the amplitudes and waveforms of observed seismograms from five earthquakes in the region, including the 2001 M6.8 Nisqually earthquake. The probabilistic seismic hazard maps presented here depict 1 Hz response spectral accelerations with 10%, 5%, and 2% probabilities of exceedance in 50 years. The maps are based on determinations of seismic hazard for 7236 sites with a spacing of 280 m. The maps show that the most hazardous locations for this frequency band (around 1 Hz) are soft-soil sites (fill and alluvium) within the Seattle basin and along the inferred trace of the frontal fault of the Seattle fault zone. The next highest hazard is typically found for soft-soil sites in the Duwamish Valley south of the Seattle basin. In general, stiff-soil sites in the Seattle basin exhibit higher hazard than stiff-soil sites outside the basin. Sites with shallow bedrock outside the Seattle basin have the lowest estimated hazard for this frequency band.
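The exceedance probabilities quoted above (e.g. 2% in 50 years) relate to annual rates through the standard Poisson assumption used in probabilistic hazard maps. A minimal sketch of that conversion, with illustrative numbers:

```python
import math

def prob_exceedance(annual_rate, years):
    # Poisson assumption: P(at least one exceedance in `years` years)
    return 1.0 - math.exp(-annual_rate * years)

# The 2%-in-50-years hazard level corresponds to a mean return period
# of about 2475 years.
rate = 1.0 / 2475.0
p = prob_exceedance(rate, 50)   # ~0.02
```

The same relation, inverted, is how a map's target probability of exceedance fixes the ground motion level reported at each of the 7236 sites.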

  5. Development of the Global Earthquake Model’s neotectonic fault database

    USGS Publications Warehouse

    Christophersen, Annemarie; Litchfield, Nicola; Berryman, Kelvin; Thomas, Richard; Basili, Roberto; Wallace, Laura; Ries, William; Hayes, Gavin P.; Haller, Kathleen M.; Yoshioka, Toshikazu; Koehler, Richard D.; Clark, Dan; Wolfson-Schwehr, Monica; Boettcher, Margaret S.; Villamor, Pilar; Horspool, Nick; Ornthammarath, Teraphan; Zuñiga, Ramon; Langridge, Robert M.; Stirling, Mark W.; Goded, Tatiana; Costa, Carlos; Yeats, Robert

    2015-01-01

    The Global Earthquake Model (GEM) aims to develop uniform, openly available standards, datasets and tools for worldwide seismic risk assessment through global collaboration, transparent communication and adoption of state-of-the-art science. GEM Faulted Earth (GFE) is one of GEM's global hazard module projects. This paper describes GFE's development of a modern neotectonic fault database and a unique graphical interface for the compilation of new fault data. A key design principle is that of an electronic field notebook for capturing the observations a geologist would make about a fault. The database is designed to accommodate abundant as well as sparse fault observations. It features two layers, one for capturing neotectonic fault and fold observations, and the other to calculate potential earthquake fault sources from the observations. In order to test the flexibility of the database structure and to start a global compilation, five preexisting databases have been uploaded to the first layer and two to the second. In addition, the GFE project has characterised the world's approximately 55,000 km of subduction interfaces in a globally consistent manner as a basis for generating earthquake event sets for inclusion in earthquake hazard and risk modelling. Following the subduction interface fault schema and including the trace attributes of the GFE database schema, the 2500-km-long frontal thrust fault system of the Himalaya has also been characterised. We propose that the database structure be used widely, so that neotectonic fault data can make a more complete and beneficial contribution to seismic hazard and risk characterisation globally.

  6. Initial source and site characterization studies for the U.C. Santa Barbara campus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archuleta, R.; Nicholson, C.; Steidl, J.

    1997-12-01

    The University of California Campus-Laboratory Collaboration (CLC) project is an integrated 3-year effort involving Lawrence Livermore National Laboratory (LLNL) and four UC campuses - Los Angeles (UCLA), Riverside (UCR), Santa Barbara (UCSB), and San Diego (UCSD) - plus additional collaborators at San Diego State University (SDSU), at Los Alamos National Laboratory, and in industry. The primary purpose of the project is to estimate potential ground motions from large earthquakes and to predict site-specific ground motions for one critical structure on each campus. The project thus combines the disciplines of geology, seismology, geodesy, soil dynamics, and earthquake engineering into a fully integrated approach. Once completed, the CLC project will provide a template to evaluate other buildings at each of the four UC campuses, as well as a methodology for evaluating seismic hazards at other critical sites in California, including other UC locations at risk from large earthquakes. Another important objective of the CLC project is the education of students and other professionals in the application of this integrated, multidisciplinary, state-of-the-art approach to the assessment of earthquake hazard. For each campus targeted by the CLC project, the seismic hazard study will consist of four phases: Phase I - initial source and site characterization; Phase II - drilling, logging, seismic monitoring, and laboratory dynamic soil testing; Phase III - modeling of predicted site-specific earthquake ground motions; and Phase IV - calculations of 3D building response. This report covers Phase I for the UCSB campus and includes results through March 1997.

  7. Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3

    NASA Astrophysics Data System (ADS)

    Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.

    2017-12-01

    Measuring and predicting ground motion parameters, including seismic intensities, for earthquakes is crucial and the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it into the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN Web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED, and Fourier, power and response spectra. GMPEs are configurable; through C++ and Python plug-ins, standard and customized GMPEs, including the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong motion seismology.
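Two of the strong motion parameters listed above, PGA and Arias intensity, can be computed directly from an acceleration record. This is a generic illustration (not SIGMA's implementation) using a synthetic toy accelerogram:

```python
import numpy as np

g = 9.81   # m/s^2
dt = 0.01  # sample interval (s)

# Toy accelerogram: a decaying 2 Hz oscillation with 0.3 g peak envelope
t = np.arange(0, 20, dt)
acc = 0.3 * g * np.exp(-0.2 * t) * np.sin(2 * np.pi * 2.0 * t)

# Peak ground acceleration (m/s^2)
pga = np.abs(acc).max()

# Arias intensity: Ia = pi/(2g) * integral of a(t)^2 dt  (m/s)
arias = np.pi / (2 * g) * (acc ** 2).sum() * dt
```

On real records one would first remove the instrument response and baseline, which is part of what an interactive processing tool like the one described handles.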

  8. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, have shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to PSHA. The advanced approach considered in this study, the NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events, and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the ability to efficiently compute synthetic seismograms in complex, laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined at both the national and local scale, the latter accounting for the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. 
At the scenario scale, quick parametric studies can easily be performed to understand the influence of model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to dedicated HPC clusters and Cloud computing. In this way, scientists can deal efficiently with the variety and complexity of potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed to facilitate access to the NDSHA methodology and its outputs by end-users interested in reliable territorial planning and in the design and construction of buildings and infrastructure in seismic areas. At the same time, the web application is shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate through structural models, and build up ground shaking scenarios. We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large-scale hazard maps of ground shaking at bedrock to the definition of local-scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.
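A simplified reading of the neo-deterministic logic above is that the map value at each site is the envelope of the ground motion computed over all credible scenario events, rather than a probability-weighted combination as in PSHA. A toy sketch of that envelope step, with random values standing in for the synthetic-seismogram results:

```python
import numpy as np

rng = np.random.default_rng(0)

n_scenarios, n_sites = 25, 100
# Hypothetical peak ground accelerations (in g) derived from synthetic
# seismograms: one row per credible scenario earthquake, one column per site.
pga = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=(n_scenarios, n_sites))

# Envelope over scenarios: the deterministic map value at each site.
hazard_map = pga.max(axis=0)
```

The actual NDSHA workflow computes each row by full waveform modelling through a structural model; only the final envelope step is represented here.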

  9. Seismic hazard of the Kivu rift (western branch, East African Rift system): new neotectonic map and seismotectonic zonation model

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Mulumba, Jean-Luc; Sebagenzi Mwene Ntabwoba, Stanislas; Fiama Bondo, Silvanos; Kervyn, François; Havenith, Hans-Balder

    2017-04-01

    The first detailed probabilistic seismic hazard assessment has been performed for the Kivu and northern Tanganyika rift region in Central Africa. This region, which forms the central part of the Western Rift Branch, is one of the most seismically active parts of the East African rift system. It had already been included in large-scale seismic hazard assessments, but here we defined a finer zonation model with 7 different zones representing the lateral variation of the geological and geophysical setting across the region. In order to build the new zonation model, we compiled homogeneous cross-border geological, neotectonic and seismotectonic maps over the central part of East D.R. Congo, SW Uganda, Rwanda, Burundi and NW Tanzania, and defined a new neotectonic scheme. The seismic hazard assessment is based on a new earthquake catalogue, compiled from various local and global earthquake catalogues. The use of macroseismic epicenters determined from felt earthquakes made it possible to extend the time range back to the beginning of the 20th century, spanning 126 years with 1068 events. The magnitudes have been homogenized to Mw and aftershocks removed. From this initial catalogue, a catalogue of 359 events from 1956 to 2015 with M > 4.4 was extracted for the seismic hazard assessment. The seismotectonic zonation includes 7 seismic source areas defined on the basis of the regional geological structure, neotectonic fault systems, basin architecture, and the distribution of thermal springs and earthquake epicenters. The Gutenberg-Richter seismic hazard parameters were determined using both a least-squares linear fit and the maximum likelihood method (Kijko & Smit AUE program). Seismic hazard maps have been computed with the Crisis 2012 software using 3 different attenuation laws. We obtained higher PGA values (475-year return period) for the Kivu rift region than the previous estimates (Delvaux et al., 2016). 
They vary laterally as a function of the tectonic setting, with the lowest values in the volcanically active Virunga - Rutshuru zone, the highest in the currently non-volcanic parts of Lake Kivu, the Rusizi valley and the North Tanganyika rift zone, and intermediate values in the regions flanking the axial rift zone. These are to be considered preliminary values, as there are a number of important uncertainties, such as the heterogeneity and relatively short duration of the instrumental seismic catalogue used (60 years), the absence of locally derived attenuation laws (and thus the choice of attenuation laws used), and the seismic zonation scheme. Delvaux, D. et al., 2016. Journal of African Earth Sciences, doi: 10.1016/j.jafrearsci.2016.10.004.
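The maximum likelihood b-value estimation mentioned above is classically done with the Aki-Utsu estimator (which the Kijko & Smit AUE approach extends to heterogeneous catalogues). A minimal sketch of the basic estimator, with hypothetical magnitudes:

```python
import numpy as np

# Hypothetical magnitudes above the completeness threshold for one zone
mags = np.array([4.5, 4.6, 4.7, 4.9, 5.1, 5.3, 5.6, 6.0])
m_min = 4.4  # completeness magnitude of the filtered catalogue
dm = 0.1     # magnitude binning width

# Aki-Utsu maximum likelihood estimate of the Gutenberg-Richter b-value,
# with the standard half-bin correction for binned magnitudes.
b = np.log10(np.e) / (mags.mean() - (m_min - dm / 2))
```

Unlike a least-squares fit to cumulative counts, this estimator weights every event equally, which is one reason the two methods are often compared, as in the study above.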

  10. Ground motion models used in the 2014 U.S. National Seismic Hazard Maps

    USGS Publications Warehouse

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.

    2015-01-01

    The National Seismic Hazard Maps (NSHMs) are an important component of seismic design regulations in the United States. This paper compares hazard using the new suite of ground motion models (GMMs) relative to hazard using the suite of GMMs applied in the previous version of the maps. The new source characterization models are used for both cases. A previous paper (Rezaeian et al. 2014) discussed the five NGA-West2 GMMs used for shallow crustal earthquakes in the Western United States (WUS), which are also summarized here. Our focus in this paper is on GMMs for earthquakes in stable continental regions in the Central and Eastern United States (CEUS), as well as subduction interface and deep intraslab earthquakes. We consider building code hazard levels for peak ground acceleration (PGA), 0.2-s, and 1.0-s spectral accelerations (SAs) on uniform firm-rock site conditions. The GMM modifications in the updated version of the maps created changes in hazard within 5% to 20% in WUS; decreases within 5% to 20% in CEUS; changes within 5% to 15% for subduction interface earthquakes; and changes involving decreases of up to 50% and increases of up to 30% for deep intraslab earthquakes for most U.S. sites. These modifications were combined with changes resulting from modifications in the source characterization models to obtain the new hazard maps.

  11. Assessment and mitigation of liquefaction hazards to bridge approach embankments in Oregon : final report.

    DOT National Transportation Integrated Search

    2002-11-01

    The seismic performance of bridge structures and appurtenant components (i.e., approach spans, abutments and foundations) has been well documented following recent earthquakes worldwide. This experience demonstrates that bridges are highly vulnerable...

  12. Prioritizing earthquake and tsunami alerting efforts

    NASA Astrophysics Data System (ADS)

    Allen, R. M.; Allen, S.; Aranha, M. A.; Chung, A. I.; Hellweg, M.; Henson, I. H.; Melgar, D.; Neuhauser, D. S.; Nof, R. N.; Strauss, J. A.

    2015-12-01

The timeline of hazards associated with earthquakes ranges from seconds for the strong shaking at the epicenter, to minutes for strong shaking at more distant locations in big quakes, to tens of minutes for a local tsunami. Earthquake and tsunami warning systems must therefore include very fast initial alerts, while also taking advantage of the time available in bigger and tsunami-generating quakes. At the UC Berkeley Seismological Laboratory we are developing a suite of algorithms to provide the fullest possible information about earthquake shaking and tsunami inundation from seconds to minutes after a quake. The E-larmS algorithm uses the P-wave to rapidly detect an earthquake and issue a warning. It is currently issuing alerts to test users in as little as 3 sec after the origin time. Development of a new waveform detector may lead to even faster alerts. G-larmS uses permanent deformation estimates from GNSS stations to estimate the geometry and extent of the rupture underway, providing more accurate ground shaking estimates in big (M>~7) earthquakes. It performed well in the M6.0 2014 Napa earthquake. T-larmS is a new algorithm designed to extend alert capabilities to tsunami inundation. Rapid estimates of source characteristics for subduction zone events can be used to warn not only of the shaking hazard but also of the local tsunami inundation hazard. These algorithms are being developed, implemented and tested with a focus on the western US, but are also now being tested in other parts of the world, including Israel, Turkey, Korea and Chile. Beta users in the Bay Area are receiving the alerts and beginning to implement automated actions. They also provide feedback on user needs, which has led to the development of the MyEEW smartphone app, which allows beta users to receive the alerts on their cell phones. All these efforts feed into our ongoing assessment of directions and priorities for future development and implementation efforts.
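Rapid P-wave detection of the kind E-larmS performs is classically built on a short-term-average over long-term-average (STA/LTA) trigger. The following is a minimal sketch of that idea; the window lengths and threshold are illustrative assumptions, not E-larmS parameters:

```python
import numpy as np

def sta_lta_trigger(trace, dt, sta_win=0.5, lta_win=5.0, threshold=4.0):
    """Return the index of the first sample where STA/LTA exceeds threshold,
    or None.  trace: 1-D ground-motion array; dt: sample interval (s)."""
    x = np.abs(trace)
    n_sta = max(1, int(sta_win / dt))
    n_lta = max(1, int(lta_win / dt))
    # causal moving averages via cumulative sums
    csum = np.cumsum(np.insert(x, 0, 0.0))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta   # short-term average
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta   # long-term average
    # align the two series so both averages end at the same sample
    m = min(len(sta), len(lta))
    ratio = sta[-m:] / np.maximum(lta[-m:], 1e-12)
    hits = np.where(ratio > threshold)[0]
    return None if len(hits) == 0 else int(hits[0]) + (len(x) - m)
```

A sudden amplitude jump raises the short-term average well before the long-term average responds, so the ratio spikes within a few samples of the P arrival.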

  13. 3-D Simulation of Earthquakes on the Cascadia Megathrust: Key Parameters and Constraints from Offshore Structure and Seismicity

    NASA Astrophysics Data System (ADS)

    Wirth, E. A.; Frankel, A. D.; Vidale, J. E.; Stone, I.; Nasser, M.; Stephenson, W. J.

    2017-12-01

    The Cascadia subduction zone has a long history of M8 to M9 earthquakes, inferred from coastal subsidence, tsunami records, and submarine landslides. These megathrust earthquakes occur mostly offshore, and an improved characterization of the megathrust is critical for accurate seismic hazard assessment in the Pacific Northwest. We run numerical simulations of 50 magnitude 9 earthquake rupture scenarios on the Cascadia megathrust, using a 3-D velocity model based on geologic constraints and regional seismicity, as well as active and passive source seismic studies. We identify key parameters that control the intensity of ground shaking and resulting seismic hazard. Variations in the down-dip limit of rupture (e.g., extending rupture to the top of the non-volcanic tremor zone, compared to a completely offshore rupture) result in a 2-3x difference in peak ground acceleration (PGA) for the inland city of Seattle, Washington. Comparisons of our simulations to paleoseismic data suggest that rupture extending to the 1 cm/yr locking contour (i.e., mostly offshore) provides the best fit to estimates of coastal subsidence during previous Cascadia earthquakes, but further constraints on the down-dip limit from microseismicity, offshore geodetics, and paleoseismic evidence are needed. Similarly, our simulations demonstrate that coastal communities experience a four-fold increase in PGA depending upon their proximity to strong-motion-generating areas (i.e., high strength asperities) on the deeper portions of the megathrust. An improved understanding of the structure and rheology of the plate interface and accretionary wedge, and better detection of offshore seismicity, may allow us to forecast locations of these asperities during a future Cascadia earthquake. In addition to these parameters, the seismic velocity and attenuation structure offshore also strongly affects the resulting ground shaking. 
This work outlines the range of plausible ground motions from an M9 Cascadia earthquake, and highlights the importance of offshore studies for constraining critical parameters and seismic hazard in the Pacific Northwest.

  14. Key recovery factors for the August 24, 2014, South Napa Earthquake

    USGS Publications Warehouse

    Hudnut, Kenneth W.; Brocher, Thomas M.; Prentice, Carol S.; Boatwright, John; Brooks, Benjamin A.; Aagaard, Brad T.; Blair, James Luke; Fletcher, Jon Peter B.; Erdem, Jemile; Wicks, Chuck; Murray, Jessica R.; Pollitz, Fred F.; Langbein, John O.; Svarc, Jerry L.; Schwartz, David P.; Ponti, Daniel J.; Hecker, Suzanne; DeLong, Stephen B.; Rosa, Carla M.; Jones, Brenda; Lamb, Rynn M.; Rosinski, Anne M.; McCrink, Timothy P.; Dawson, Timothy E.; Seitz, Gordon G.; Glennie, Craig; Hauser, Darren; Ericksen, Todd; Mardock, Dan; Hoirup, Don F.; Bray, Jonathan D.; Rubin, Ron S.

    2014-01-01

    Through discussions between the Federal Emergency Management Agency (FEMA) and the U.S. Geological Survey (USGS) following the South Napa earthquake, it was determined that several key decision points would be faced by FEMA for which additional information should be sought and provided by USGS and its partners. This report addresses the four tasks that were agreed to. These tasks are (1) assessment of ongoing fault movement (called afterslip) especially in the Browns Valley residential neighborhood, (2) assessment of the shaking pattern in the downtown area of the City of Napa, (3) improvement of information on the fault hazards posed by the West Napa Fault System (record of past earthquakes and slip rate, for example), and (4) imagery acquisition and data processing to provide overall geospatial information support to FEMA.

  15. Robust Satellite Techniques to support the short-term assessment of the seismic hazard in Japan: an analysis on 11 years (2005-2015) of MTSAT TIR observations

    NASA Astrophysics Data System (ADS)

    Genzano, Nicola; Filizzola, Carolina; Hattori, Katsumi; Lisi, Mariano; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio

    2017-04-01

In order to increase the reliability and precision of short-term seismic hazard assessment (and possibly earthquake forecasting), the integration of different kinds of observations (chemical, physical, biological, etc.) in a multi-parametric approach could be a useful strategy. Among the different observational methodologies, fluctuations of the Earth's thermally emitted radiation, measured by satellite sensors operating in the thermal infrared (TIR) spectral range, have been proposed since the eighties as a potential earthquake precursor. Since 2001, the general change detection approach Robust Satellite Techniques (RST), used in combination with the RETIRA (Robust Estimator of TIR Anomalies) index, has shown a good ability to discriminate anomalous TIR signals possibly associated with seismic activity from the normal variability of the TIR signal due to other causes (e.g. meteorological). In this paper, the RST data analysis approach has been applied to TIR satellite records collected over Japan by the geostationary satellite sensor MTSAT (Multifunctional Transport SATellites) in the period June 2005 - December 2015, in order to evaluate its possible contribution to an improved multi-parametric system for a time-Dependent Assessment of Seismic Hazard (t-DASH). For the first time, thermal anomalies have been identified by comparing the daily TIR radiation at each location in the considered satellite images with its historical expected value and variation range (i.e. the RST reference fields), computed using a 30-day moving window (i.e. 15 days before and 15 days after the considered day of the year) instead of a fixed monthly window.
Preliminary results of a correlation analysis between the appearance of Significant Sequences of TIR Anomalies (SSTAs) and the time, location and magnitude of earthquakes (M≥5), performed by applying predefined space-time and magnitude constraints, show that 80% of SSTAs were in an apparent space-time relation with earthquakes, with a false alarm rate of 20%.
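The anomaly-detection step described above can be sketched as a pixel-wise normalized index: the current TIR value is compared against the mean and variability of a 31-day window centered on the same day of the year across the historical record. This is a simplified illustration of the RETIRA idea, not the operational RST code:

```python
import numpy as np

def tir_anomaly_index(history, current, day, half_win=15):
    """RETIRA-style normalized index (a sketch, not the operational code).
    history: array (years, 365, ny, nx) of daily TIR images;
    current: (ny, nx) image for day-of-year `day` (0-based)."""
    days = [(day + d) % 365 for d in range(-half_win, half_win + 1)]
    ref = history[:, days, :, :]                     # all years, 31-day window
    mu = ref.mean(axis=(0, 1))                       # pixel-wise expected value
    sigma = ref.std(axis=(0, 1))                     # pixel-wise variability
    return (current - mu) / np.maximum(sigma, 1e-6)  # anomaly in sigma units

# pixels where the index exceeds a chosen threshold (e.g. 2 sigma) would be
# flagged as candidate TIR anomalies for further space-time sequence analysis
```

The moving window, rather than a fixed calendar month, keeps the reference fields centered on the day being tested, which is the refinement the abstract highlights.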

  16. Local amplification of seismic waves from the Denali earthquake and damaging seiches in Lake Union, Seattle, Washington

    USGS Publications Warehouse

    Barberopoulou, A.; Qamar, A.; Pratt, T.L.; Creager, K.C.; Steele, W.P.

    2004-01-01

The Mw7.9 Denali, Alaska earthquake of 3 November, 2002, caused minor damage to at least 20 houseboats in Seattle, Washington, by initiating water waves in Lake Union. These water waves were likely initiated by the large-amplitude seismic surface waves from this earthquake. Maps of spectral amplification recorded during the Denali earthquake on the Pacific Northwest Seismic Network (PNSN) strong-motion instruments show substantially increased shear and surface wave amplitudes coincident with the Seattle sedimentary basin. Because Lake Union is situated on the Seattle basin, the size of the water waves may have been increased by local amplification of the seismic waves by the basin. Complete hazard assessments require understanding the causes of these water waves during future earthquakes. Copyright 2004 by the American Geophysical Union.

  17. Ground Motion Prediction Models for Caucasus Region

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Godoladze, Tea; Tvaradze, Nino; Tumanova, Nino

    2016-04-01

Ground motion prediction models (GMPMs) relate ground motion intensity measures to variables describing earthquake source, path, and site effects. Estimation of expected ground motion is fundamental to earthquake hazard assessment. The most commonly used parameters for attenuation relations are peak ground acceleration and spectral acceleration, because these parameters give useful information for seismic hazard assessment. Development of the Georgian Digital Seismic Network began in 2003. In this study, new GMP models are obtained based on new data from the Georgian seismic network and from neighboring countries. The models are estimated by classical statistical regression analysis. Site ground conditions are additionally considered, because the same earthquake recorded at the same distance may cause different damage depending on ground conditions. Empirical ground-motion prediction models (GMPMs) require adjustment to make them appropriate for site-specific scenarios; however, the process of making such adjustments remains a challenge. This work presents a holistic framework for the development of a peak ground acceleration (PGA) or spectral acceleration (SA) GMPE that is easily adjustable to different seismological conditions and does not suffer from the practical problems associated with adjustments in the response spectral domain.
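A classical regression of this kind can be sketched as an ordinary least-squares fit of a simple attenuation form with a site indicator; the functional form and coefficients below are illustrative assumptions, not the model actually derived for the Caucasus:

```python
import numpy as np

def fit_gmpe(mag, dist, site, ln_pga):
    """Least-squares fit of ln(PGA) = a + b*M + c*ln(R) + e*S.
    mag: magnitudes; dist: distances (km); site: 0/1 indicator
    (e.g. rock vs soil); ln_pga: observed log peak ground accelerations.
    A simple illustrative form, not the study's functional form."""
    X = np.column_stack([np.ones_like(mag), mag, np.log(dist), site])
    coef, *_ = np.linalg.lstsq(X, ln_pga, rcond=None)
    return coef  # [a, b, c, e]
```

With noise-free synthetic data the fit recovers the generating coefficients exactly; with real records the residual scatter becomes the model's aleatory variability.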

  18. Development of regional liquefaction-induced deformation hazard maps

    USGS Publications Warehouse

    Rosinski, A.; Knudsen, K.-L.; Wu, J.; Seed, R.B.; Real, C.R.; ,

    2004-01-01

This paper describes part of a project to assess the feasibility of producing regional (1:24,000-scale) liquefaction hazard maps based on potential liquefaction-induced deformation. The study area is the central Santa Clara Valley, at the south end of San Francisco Bay in central California. The information collected and used includes: a) detailed Quaternary geological mapping, b) over 650 geotechnical borings, c) probabilistic earthquake shaking information, and d) ground-water levels. Predictions of strain can be made using either empirical formulations or numerical simulations. In this project, lateral spread displacements are estimated and new empirical relations to estimate future volumetric and shear strain are used. Geotechnical boring data are used to: (a) develop isopach maps showing the thickness of sediment that is likely to liquefy and deform under earthquake shaking; and (b) assess the variability in engineering properties within and between geologic map units. Preliminary results reveal that late Holocene deposits are likely to experience the greatest liquefaction-induced strains, while Holocene and late Pleistocene deposits are likely to experience significantly less horizontal and vertical strain in future earthquakes. Development of maps based on these analyses is feasible.

  19. Documentation for the Southeast Asia seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark; Harmsen, Stephen; Mueller, Charles; Haller, Kathleen; Dewey, James; Luco, Nicolas; Crone, Anthony; Lidke, David; Rukstales, Kenneth

    2007-01-01

    The U.S. Geological Survey (USGS) Southeast Asia Seismic Hazard Project originated in response to the 26 December 2004 Sumatra earthquake (M9.2) and the resulting tsunami that caused significant casualties and economic losses in Indonesia, Thailand, Malaysia, India, Sri Lanka, and the Maldives. During the course of this project, several great earthquakes ruptured subduction zones along the southern coast of Indonesia (fig. 1) causing additional structural damage and casualties in nearby communities. Future structural damage and societal losses from large earthquakes can be mitigated by providing an advance warning of tsunamis and introducing seismic hazard provisions in building codes that allow buildings and structures to withstand strong ground shaking associated with anticipated earthquakes. The Southeast Asia Seismic Hazard Project was funded through a United States Agency for International Development (USAID)—Indian Ocean Tsunami Warning System to develop seismic hazard maps that would assist engineers in designing buildings that will resist earthquake strong ground shaking. An important objective of this project was to discuss regional hazard issues with building code officials, scientists, and engineers in Thailand, Malaysia, and Indonesia. The code communities have been receptive to these discussions and are considering updating the Thailand and Indonesia building codes to incorporate new information (for example, see notes from Professor Panitan Lukkunaprasit, Chulalongkorn University in Appendix A).

  20. Earthquakes in the Central United States, 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Volpi, Christina M.

    2010-01-01

This publication is an update of an earlier report, U.S. Geological Survey (USGS) Geologic Investigation I-2812 by Wheeler and others (2003), titled "Earthquakes in the Central United States, 1699-2002." Like the original poster, the center of the updated poster is a map showing the pattern of earthquake locations in the most seismically active part of the central United States. Arrayed around the map are short explanatory texts and graphics, which describe the distribution of historical earthquakes and the effects of the most notable of them. The updated poster contains additional, post-2002 earthquake data: 38 earthquakes covering the time interval from January 2003 to June 2010, including the Mount Carmel, Illinois, earthquake of 2008. The USGS Preliminary Determination of Epicenters (PDE) was the source of these additional data. Like the I-2812 poster, this poster was prepared for a nontechnical audience and designed to inform the general public of the widespread occurrence of felt and damaging earthquakes in the Central United States. Accordingly, the poster should not be used to assess earthquake hazard in small areas or at individual locations.

  1. Central US earthquake catalog for hazard maps of Memphis, Tennessee

    USGS Publications Warehouse

    Wheeler, R.L.; Mueller, C.S.

    2001-01-01

An updated version of the catalog that was used for the current national probabilistic seismic-hazard maps would suffice for production of large-scale hazard maps of the Memphis urban area. Deaggregation maps provide guidance as to the area that a catalog for calculating Memphis hazard should cover. For the future, the Nuttli and local network catalogs could be examined for earthquakes not presently included in the catalog. Additional work on aftershock removal might reduce hazard uncertainty. Graphs of decadal and annual earthquake rates suggest completeness at and above magnitude 3 for the last three or four decades. Any additional work on completeness should consider the effects of rapid, local population changes during the Nation's westward expansion. © 2001 Elsevier Science B.V. All rights reserved.

  2. Physically-Based Probabilistic Seismic Hazard Analysis Using Broad-Band Ground Motion Simulation: a Case Study for Prince Islands Fault, Marmara Sea

    NASA Astrophysics Data System (ADS)

    Mert, A.

    2016-12-01

The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all earthquakes of considerable magnitude. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGFs). For frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGFs), i.e. synthetic seismograms calculated by an explicit 2D/3D elastic finite-difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically-based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for earthquakes of all magnitudes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazard levels for all studied sites in the Marmara region.
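A common way to realize such a hybrid broadband scheme is to low-pass the low-frequency (SGF) synthetic and high-pass the high-frequency (EGF) synthetic with complementary weights around the crossover frequency, then sum them. The cosine-taper matching filter below is an illustrative assumption, not necessarily the one used in this study:

```python
import numpy as np

def hybrid_broadband(low_syn, high_syn, dt, f_cross=0.5):
    """Merge a low-frequency (SGF) and a high-frequency (EGF) seismogram at
    a crossover frequency using complementary FFT-domain tapers."""
    n = len(low_syn)
    freqs = np.fft.rfftfreq(n, dt)
    # smooth transition band of +/- 20% around the crossover frequency
    f1, f2 = 0.8 * f_cross, 1.2 * f_cross
    w_low = np.ones_like(freqs)
    band = (freqs >= f1) & (freqs <= f2)
    w_low[band] = 0.5 * (1 + np.cos(np.pi * (freqs[band] - f1) / (f2 - f1)))
    w_low[freqs > f2] = 0.0
    w_high = 1.0 - w_low                 # complementary weights sum to one
    return (np.fft.irfft(np.fft.rfft(low_syn) * w_low, n)
            + np.fft.irfft(np.fft.rfft(high_syn) * w_high, n))
```

Because the weights are complementary, feeding the same trace into both inputs returns that trace unchanged, a useful sanity check on the matching filter.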

  3. Simulation of tsunamis from great earthquakes on the cascadia subduction zone.

    PubMed

    Ng, M K; Leblond, P H; Murty, T S

    1990-11-30

Large earthquakes occur episodically in the Cascadia subduction zone. A numerical model has been used to simulate and assess the hazards of a tsunami generated by a hypothetical earthquake of magnitude 8.5 associated with rupture of the northern sections of the subduction zone. Wave amplitudes on the outer coast are closely related to the magnitude of sea-bottom displacement (5.0 meters). Some amplification, up to a factor of 3, may occur in some coastal embayments. Wave amplitudes in the protected waters of Puget Sound and the Strait of Georgia are predicted to be only about one fifth of those estimated on the outer coast.

  4. Proceedings of Conference V: communicating earthquake hazard reduction information: convened under auspices of National Earthquake Hazards Reduction Program 22-24 May, 1978

    USGS Publications Warehouse

    Hays, Walter W.

    1978-01-01

    (11) achieving landslide hazard reduction. The objective was to identify the most significant lessons learned during the course of each experience and to develop recommendations for improving communication that might be incorporated in the search program of the USGS.

  5. Cascadia Earthquake and Tsunami Scenario for California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L.

    2006-12-01

    In 1995 the California Division of Mines and Geology (now the California Geological Survey) released a planning scenario for an earthquake on the southern portion of the Cascadia subduction zone (CSZ). This scenario was the 8th and last of the Earthquake Planning Scenarios published by CDMG. It was the largest magnitude CDMG scenario, an 8.4 earthquake rupturing the southern 200 km of the CSZ, and it was the only scenario to include tsunami impacts. This scenario event has not occurred in historic times and depicts impacts far more severe than any recent earthquake. The local tsunami hazard is new; there is no written record of significant local tsunami impact in the region. The north coast scenario received considerable attention in Humboldt and Del Norte Counties and contributed to a number of mitigation efforts. The Redwood Coast Tsunami Work Group (RCTWG), an organization of scientists, emergency managers, government agencies, and businesses from Humboldt, Mendocino, and Del Norte Counties, was formed in 1996 to assist local jurisdictions in understanding the implications of the scenario and to promote a coordinated, consistent mitigation program. The group has produced print and video materials and promoted response and evacuation planning. Since 1997 the RCTWG has sponsored an Earthquake Tsunami Education Room at county fairs featuring preparedness information, hands-on exhibits and regional tsunami hazard maps. Since the development of the TsunamiReady Program in 2001, the RCTWG facilitates community TsunamiReady certification. To assess the effectiveness of mitigation efforts, five telephone surveys between 1993 and 2001 were conducted by the Humboldt Earthquake Education Center. A sixth survey is planned for this fall. Each survey includes between 400 and 600 respondents. 
Over the nine-year period covered by the surveys, the percentage of respondents with houses secured to their foundations increased from 58 to 80 percent, those aware of a local tsunami hazard increased from 51 to 73 percent, and those knowing what the Cascadia subduction zone is increased from 16 to 42 percent. It is not surprising that the earlier surveys showed increases, as several strong earthquakes occurred in the area between 1992 and 1995 and there was considerable media attention. But the 2001 survey, seven years after the last widely felt event, still shows significant increases in almost all preparedness indicators. The 1995 CDMG scenario was not the sole reason for the increased interest in earthquake and tsunami hazards in the area, but the scenario gave government recognition to an event that had previously been considered seriously only in the scientific community, and it has acted as a catalyst for mitigation and planning efforts.

  6. The International Platform on Earthquake Early Warning Systems (IP-EEWS)

    NASA Astrophysics Data System (ADS)

    Torres, Jair; Fanchiotti, Margherita

    2017-04-01

The Sendai Framework for Disaster Risk Reduction 2015-2030 recognizes the need to "substantially increase the availability of and access to multi-hazard early warning systems and disaster risk information and assessments to the people by 2030" as one of its global targets (target "g"). While considerable progress has been made in recent decades, early warning systems (EWSs) remain less developed for geo-hazards, and significant challenges remain in advancing EWSs for specific hazards, particularly for the fastest-onset hazards such as earthquakes. An earthquake early warning system (EEWS) helps disseminate timely information about potentially catastrophic earthquake hazards to the public, emergency managers and the private sector, providing enough time to implement automated emergency measures. At the same time, these systems help to considerably reduce the CO2 emissions produced by the catastrophic impacts and subsequent effects of earthquakes, such as those generated by fires, collapses, and pollution (among others), as well as those produced in the recovery and reconstruction processes. In recent years, EEWSs have been developed independently in a few countries: EEWSs are operational in Japan and Mexico, while systems in California (USA), Turkey, Italy, Canada, South Korea and China (including Taiwan) are in the development stages or under restricted application. Many other countries in the Indian Subcontinent, Southeast Asia, Central Asia, the Middle East, Eastern Africa and Southeast Africa, as well as Central America, South America and the Caribbean, are located in some of the most seismically active regions in the world, or present moderate seismicity but high vulnerability, and would strongly benefit from the development of EEWSs.
Given that, in many instances, the development of an EEWS still requires further testing, increased density coverage in seismic observation stations, regional coordination, and further scientific understanding, there is a strong need to enhance the technical and operational capacities required for these systems and to further understand the implications for policy. In an effort to address this gap, in December 2015, UNESCO launched the International Platform on Earthquake Early Warning Systems (IP-EEWS). The main objective of the Platform is to assess the current state of the art in the development and operationalisation of EEWS globally, foster dialogue and international cooperation for capacity building around these systems, and therefore promote and strengthen EEWS in earthquake-prone countries worldwide. This paper will discuss the opportunities and challenges for the development and operationalisation of these systems, as well as the specific aim, objectives and expected contributions of this newly established Platform. The following ten countries are already represented in IP-EEWS through leading scientific experts in top institutions: USA (University of California Berkeley), Japan (Meteorological Research Institute), Mexico (Centro de Instrumentacion y Registro Sismico), Italy (University of Naples Federico II), Switzerland (ETH - Swiss Federal Institute of Technology Zurich), Spain (Universidad Complutense de Madrid), Turkey (Kandili Observatory and Earthquake Research Institute, Boǧaziçi University), Germany (GFZ - German Research Centre for Geosciences), China (University of Beijing), and Romania (National Institute for Earth Physics). More countries are expected to join the initiative.

  7. Recent Mega-Thrust Tsunamigenic Earthquakes and PTHA

    NASA Astrophysics Data System (ADS)

    Lorito, S.

    2013-05-01

The occurrence of several mega-thrust tsunamigenic earthquakes in the last decade, including but not limited to the 2004 Sumatra-Andaman, the 2010 Maule, and the 2011 Tohoku earthquakes, has been a dramatic reminder of the limitations in our capability of assessing earthquake and tsunami hazard and risk. However, increasingly high-quality geophysical observational networks have allowed the retrieval of more accurate models of the rupture process of mega-thrust earthquakes than ever before, paving the way for improved future hazard assessments. Probabilistic Tsunami Hazard Analysis (PTHA) methodology, in particular, is less mature than its seismic counterpart, PSHA. Recent worldwide research efforts of the tsunami science community have started to fill this gap and to define best practices that are being progressively employed in PTHA for different regions and coasts at threat. In the first part of my talk, I will briefly review some rupture models of recent mega-thrust earthquakes, and highlight some of their surprising features that likely result in bigger error bars associated with PTHA results. More specifically, recent events of unexpected size at a given location, and with unexpected rupture process features, posed first-order open questions which prevent the definition of a heterogeneous rupture probability along a subduction zone, despite several recent promising results on the subduction zone seismic cycle. In the second part of the talk, I will dig a bit more into a specific ongoing effort to improve PTHA methods, in particular as regards the determination of epistemic and aleatory uncertainties, and the computational feasibility of PTHA when considering the full assumed source variability. Usually only logic trees are made explicit in PTHA studies, accounting for different possible assumptions on the source zone properties and behavior.
The selection of the earthquakes to be actually modelled is then in general made on a qualitative basis or remains implicit, although different methods such as event trees have been used in other applications. I will define a quite general PTHA framework based on the mixed use of logic and event trees. I will first discuss a particular class of epistemic uncertainties, i.e. those related to the parametric fault characterization in terms of geometry, kinematics, and assessment of activity rates. A systematic classification into six justification levels of the epistemic uncertainty related to the existence and behaviour of fault sources will be presented. Then, a particular branch of the logic tree is chosen in order to discuss just the aleatory variability of earthquake parameters, represented with an event tree. Even so, PTHA based on numerical scenarios is too demanding a computational task, particularly when probabilistic inundation maps are needed. To reduce the computational burden without under-representing the source variability, the event tree is first constructed by densely (over-)sampling the earthquake parameter space, and the earthquakes are then filtered based on their associated tsunami impact offshore, before inundation maps are calculated. I'll describe this approach by means of a case study in the Mediterranean Sea, namely the PTHA for some locations on the Eastern Sicily coast and the Southern Crete coast due to potential subduction earthquakes occurring on the Hellenic Arc.
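The over-sample-then-filter strategy can be sketched as follows: densely enumerate the source-parameter space, score each scenario with a cheap offshore-impact proxy, and pass only the strongest fraction on to expensive inundation modelling. The proxy formula and parameter names below are purely illustrative, not a validated tsunami metric:

```python
import itertools

def filter_scenarios(magnitudes, depths, distances, keep_fraction=0.1):
    """Densely enumerate source parameters, score each scenario with a
    cheap offshore-impact proxy, and keep the top fraction for full
    inundation modelling."""
    scenarios = list(itertools.product(magnitudes, depths, distances))

    def proxy(s):
        # illustrative proxy: amplitude grows with magnitude, falls with
        # source depth and distance to the coast
        m, depth, dist = s
        return 10 ** (0.5 * m) / (depth * dist)

    ranked = sorted(scenarios, key=proxy, reverse=True)
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:n_keep]   # only these go to expensive inundation runs
```

The point of the design is that the cheap proxy is evaluated for every branch of the (over-sampled) event tree, while the costly hydrodynamic runs are reserved for the scenarios that actually matter offshore.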

  8. The finite, kinematic rupture properties of great-sized earthquakes since 1990

    USGS Publications Warehouse

    Hayes, Gavin

    2017-01-01

Here, I present a database of >160 finite fault models for all earthquakes of M 7.5 and above since 1990, created using a consistent modeling approach. The use of a common approach facilitates easier comparisons between models, and reduces uncertainties that arise when comparing models generated by different authors, data sets and modeling techniques. I use this database to verify published scaling relationships, and for the first time show a clear and intriguing relationship between maximum potency (the product of slip and area) and average potency for a given earthquake. This relationship implies that earthquakes do not reach the potential size given by the tectonic load of a fault (sometimes called “moment deficit,” calculated via a plate rate over time since the last earthquake, multiplied by geodetic fault coupling). Instead, average potency (or slip) scales with but is less than maximum potency (dictated by tectonic loading). Importantly, this relationship facilitates a more accurate assessment of maximum earthquake size for a given fault segment, and thus has implications for long-term hazard assessments. The relationship also suggests earthquake cycles may not completely reset after a large earthquake, and thus repeat rates of such events may appear shorter than is expected from tectonic loading. This in turn may help explain the phenomenon of “earthquake super-cycles” observed in some global subduction zones.

  9. The finite, kinematic rupture properties of great-sized earthquakes since 1990

    NASA Astrophysics Data System (ADS)

    Hayes, Gavin P.

    2017-06-01

    Here, I present a database of >160 finite fault models for all earthquakes of M 7.5 and above since 1990, created using a consistent modeling approach. The use of a common approach facilitates easier comparisons between models, and reduces uncertainties that arise when comparing models generated by different authors, data sets and modeling techniques. I use this database to verify published scaling relationships, and for the first time show a clear and intriguing relationship between maximum potency (the product of slip and area) and average potency for a given earthquake. This relationship implies that earthquakes do not reach the potential size given by the tectonic load of a fault (sometimes called “moment deficit,” calculated via a plate rate over time since the last earthquake, multiplied by geodetic fault coupling). Instead, average potency (or slip) scales with but is less than maximum potency (dictated by tectonic loading). Importantly, this relationship facilitates a more accurate assessment of maximum earthquake size for a given fault segment, and thus has implications for long-term hazard assessments. The relationship also suggests earthquake cycles may not completely reset after a large earthquake, and thus repeat rates of such events may appear shorter than is expected from tectonic loading. This in turn may help explain the phenomenon of “earthquake super-cycles” observed in some global subduction zones.

  10. Estimation of Maximum Ground Motions in the Form of ShakeMaps and Assessment of Potential Human Fatalities from Scenario Earthquakes on the Chishan Active Fault in southern Taiwan

    NASA Astrophysics Data System (ADS)

    Liu, Kun Sung; Huang, Hsiang Chi; Shen, Jia Rong

    2017-04-01

    Historically, there were many damaging earthquakes in southern Taiwan during the last century, some of which resulted in heavy loss of human life. Accordingly, assessment of potential seismic hazards has become increasingly important in southern Taiwan, including Kaohsiung, Tainan, and the northern Pingtung areas, since the Central Geological Survey upgraded the Chishan active fault from a suspected fault to Category I in 2010. In this study, we first estimate the maximum seismic ground motions in terms of PGA, PGV, and MMI by incorporating a site-effect term in attenuation relationships, aiming to delineate high seismic hazard areas in southern Taiwan. Furthermore, we assess potential death tolls due to large future earthquakes occurring on the Chishan active fault. From the maximum PGA ShakeMap for an Mw 7.2 scenario earthquake on the Chishan active fault, we can see that areas with high PGA, above 400 gals, are located in the northeastern, central, and northern parts of southwestern Kaohsiung as well as the southern part of central Tainan. In addition, districts in Tainan City located at similar distances from the Chishan fault have relatively greater PGA and PGV than those in Kaohsiung City and Pingtung County, mainly because of the large site response factors in Tainan. On the other hand, seismic hazard in terms of PGA and PGV is not particularly high in the areas nearest the Chishan fault, chiefly because those areas have low site response factors. Finally, the estimated fatalities in Kaohsiung City, at 5,230, 4,285, and 2,786 for Mw 7.2, 7.0, and 6.8, respectively, are higher than those estimated for Tainan City and Pingtung County. 
The main reason is that population densities above 10,000 persons per km2 are present in the Fongshan, Zuoying, Sanmin, Cianjin, Sinsing, Yancheng, and Lingya Districts, and between 5,000 and 10,000 persons per km2 in the Nanzih and Gushan Districts of Kaohsiung City. Another point deserving special attention is that Kaohsiung City has more than 540 thousand households whose residences are over 50 years old, including bungalows and 2-3 story houses, many of which are still in use. Even more worrisome is that in Kaohsiung many of these old structures are used as shops in the city center, where the population is highly concentrated; in an earthquake, the consequences would be severe. In light of the results of this study, we urge both the municipal and central governments to take effective seismic hazard mitigation measures in the highly urbanized areas with large numbers of old buildings in southern Taiwan.
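    A generic attenuation relationship with a site-effect term, of the broad kind this abstract describes, can be sketched as follows. The functional form and every coefficient are placeholders chosen for illustration, not the authors' regression.

```python
import math

# Illustrative (not the study's) attenuation form with a site-effect term:
#   ln(PGA) = a + b*Mw - c*ln(R + d) + site_term
def pga_gal(mw, r_km, site_factor=0.0, a=0.5, b=1.28, c=1.75, d=10.0):
    """Peak ground acceleration (gal) from magnitude, distance, site term.
    All coefficients are invented for demonstration purposes."""
    return math.exp(a + b * mw - c * math.log(r_km + d) + site_factor)

# A soft-soil site amplifies motion relative to a rock site at the same
# distance from the same Mw 7.2 scenario rupture:
rock = pga_gal(7.2, 15.0)
soft = pga_gal(7.2, 15.0, site_factor=0.5)
```

    The site term enters additively in log space, so it acts as a multiplicative amplification of PGA, which is why districts with large site response factors (Tainan) can shake harder than closer districts on stiffer ground.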

  11. Constant strain accumulation rate between major earthquakes on the North Anatolian Fault.

    PubMed

    Hussain, Ekbal; Wright, Tim J; Walters, Richard J; Bekaert, David P S; Lloyd, Ryan; Hooper, Andrew

    2018-04-11

    Earthquakes are caused by the release of tectonic strain accumulated between events. Recent advances in satellite geodesy mean we can now measure this interseismic strain accumulation with a high degree of accuracy. But it remains unclear how to interpret short-term geodetic observations, measured over decades, when estimating the seismic hazard of faults accumulating strain over centuries. Here, we show that strain accumulation rates calculated from geodetic measurements around a major transform fault are constant for its entire 250-year interseismic period, except in the ~10 years following an earthquake. The shear strain rate history requires a weak fault zone embedded within a strong lower crust with viscosity greater than ~10^20 Pa s. The results support the notion that short-term geodetic observations can directly contribute to long-term seismic hazard assessment and suggest that lower-crustal viscosities derived from postseismic studies are not representative of the lower crust at all spatial and temporal scales.
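    Interseismic geodetic profiles of the kind this study uses are commonly fit with the classic elastic screw-dislocation (arctan) model of Savage and Burford: fault-parallel velocity v(x) = (V/pi)*atan(x/D), with deep slip rate V and locking depth D. A minimal sketch with illustrative values (not the paper's fitted parameters):

```python
import math

def interseismic_velocity(x_km, slip_rate_mm_yr=24.0, locking_depth_km=15.0):
    """Surface fault-parallel velocity (mm/yr) at distance x from the fault,
    from the screw-dislocation model. Parameter values are illustrative."""
    return (slip_rate_mm_yr / math.pi) * math.atan(x_km / locking_depth_km)

near = interseismic_velocity(1.0)    # small velocity close to the locked fault
far = interseismic_velocity(300.0)   # approaches V/2 far from the fault
```

    The strain rate is the spatial gradient of this profile, which is sharpest over the fault; a constant interseismic strain rate corresponds to V and D staying fixed through the cycle.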

  12. Accelerating slip rates on the puente hills blind thrust fault system beneath metropolitan Los Angeles, California, USA

    USGS Publications Warehouse

    Bergen, Kristian J.; Shaw, John H.; Leon, Lorraine A.; Dolan, James F.; Pratt, Thomas L.; Ponti, Daniel J.; Morrow, Eric; Barrera, Wendy; Rhodes, Edward J.; Murari, Madhav K.; Owen, Lewis A.

    2017-01-01

    Slip rates represent the average displacement across a fault over time and are essential to estimating earthquake recurrence for probabilistic seismic hazard assessments. We demonstrate that the slip rate on the western segment of the Puente Hills blind thrust fault system, which is beneath downtown Los Angeles, California (USA), has accelerated from ~0.22 mm/yr in the late Pleistocene to ~1.33 mm/yr in the Holocene. Our analysis is based on syntectonic strata derived from the Los Angeles River, which has continuously buried a fold scarp above the blind thrust. Slip on the fault beneath our field site began during the late-middle Pleistocene and progressively increased into the Holocene. This increase in rate implies that the magnitudes and/or the frequency of earthquakes on this fault segment have increased over time. This challenges the characteristic earthquake model and presents an evolving and potentially increasing seismic hazard to metropolitan Los Angeles.
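    Under the characteristic earthquake model that this study challenges, a slip rate maps directly onto a mean recurrence interval. The sketch below uses the abstract's two slip rates with an invented 1 m characteristic slip per event, purely to show how the rate change shortens implied recurrence:

```python
def recurrence_years(slip_per_event_m, slip_rate_mm_per_yr):
    """Mean recurrence interval implied by a characteristic slip per event."""
    return slip_per_event_m * 1000.0 / slip_rate_mm_per_yr

# Late Pleistocene (~0.22 mm/yr) vs. Holocene (~1.33 mm/yr) rates from the
# abstract; the 1 m slip per event is a hypothetical placeholder.
late_pleistocene = recurrence_years(1.0, 0.22)  # several thousand years
holocene = recurrence_years(1.0, 1.33)          # under a thousand years
```

    A six-fold acceleration in slip rate implies a corresponding six-fold shortening of recurrence (or larger events at the same frequency), which is the hazard implication the authors emphasize.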

  13. The 2014 update to the National Seismic Hazard Model in California

    USGS Publications Warehouse

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  14. 44 CFR 362.3 - Criteria for determining acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... gift of services is offered to the Administrator for the benefit of the National Earthquake Hazards... objectives of the National Earthquake Hazards Reduction Program, as defined in 42 U.S.C. 7702. (b) All gifts...

  15. 44 CFR 362.3 - Criteria for determining acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... gift of services is offered to the Administrator for the benefit of the National Earthquake Hazards... objectives of the National Earthquake Hazards Reduction Program, as defined in 42 U.S.C. 7702. (b) All gifts...

  16. 44 CFR 362.3 - Criteria for determining acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... gift of services is offered to the Administrator for the benefit of the National Earthquake Hazards... objectives of the National Earthquake Hazards Reduction Program, as defined in 42 U.S.C. 7702. (b) All gifts...

  17. 44 CFR 362.3 - Criteria for determining acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... gift of services is offered to the Administrator for the benefit of the National Earthquake Hazards... objectives of the National Earthquake Hazards Reduction Program, as defined in 42 U.S.C. 7702. (b) All gifts...

  18. 44 CFR 362.3 - Criteria for determining acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... gift of services is offered to the Administrator for the benefit of the National Earthquake Hazards... objectives of the National Earthquake Hazards Reduction Program, as defined in 42 U.S.C. 7702. (b) All gifts...

  19. Improving the RST Approach for Earthquake Prone Areas Monitoring: Results of Correlation Analysis among Significant Sequences of TIR Anomalies and Earthquakes (M>4) occurred in Italy during 2004-2014

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Coviello, I.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2015-12-01

    Looking toward the assessment of a multi-parametric system for dynamically updating seismic hazard estimates and short-term (days to weeks) earthquake forecasting, a preliminary step is to identify those parameters (chemical, physical, biological, etc.) whose anomalous variations can be, to some extent, associated with the complex process of preparation of a big earthquake. Among the different parameters, fluctuations of Earth's thermally emitted radiation, as measured by sensors on board satellite systems operating in the Thermal Infra-Red (TIR) spectral range, have long been proposed as potential earthquake precursors. Since 2001, a general approach called Robust Satellite Techniques (RST) has been used to discriminate anomalous thermal signals, possibly associated with seismic activity, from normal fluctuations of Earth's thermal emission related to other causes (e.g. meteorological) independent of earthquake occurrence. Thanks to its full exportability to different satellite packages, RST has been implemented on TIR images acquired by polar (e.g. NOAA-AVHRR, EOS-MODIS) and geostationary (e.g. MSG-SEVIRI, NOAA-GOES/W, GMS-5/VISSR) satellite sensors, in order to verify the presence (or absence) of TIR anomalies in the presence (absence) of earthquakes (with M>4) in different seismogenic areas around the world (e.g. Italy, Turkey, Greece, California, Taiwan, etc.). In this paper, a refined RST data analysis approach and the RETIRA (Robust Estimator of TIR Anomalies) index were used to identify Significant Sequences of TIR Anomalies (SSTAs) during eleven years (from May 2004 to December 2014) of TIR satellite records, collected over Italy by the geostationary satellite sensor MSG-SEVIRI. 
On the basis of specific validation rules (mainly based on physical models and on results obtained by applying the RST approach to several earthquakes around the world), the level of space-time correlation between SSTAs and the occurrence of earthquakes (with M≥4) has been evaluated. The results achieved will be discussed, also in the framework of a multi-parametric approach to time-Dependent Assessment of Seismic Hazard (t-DASH).

  20. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part A, Prehistoric earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax, the maximum earthquake magnitude thought to be possible within a specified geographic region. This report is Part A of an Open-File Report that describes the construction of a global catalog of moderate to large earthquakes, from which one can estimate Mmax for most of the Central and Eastern United States and adjacent Canada. The catalog and Mmax estimates derived from it were used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. This Part A discusses prehistoric earthquakes that occurred in eastern North America, northwestern Europe, and Australia, whereas a separate Part B deals with historical events.

  1. Publication: Evansville hazard maps

    USGS Publications Warehouse

    ,

    2012-01-01

    The Evansville (Indiana) Area Earthquake Hazards Mapping Project was completed in February 2012. It was a collaborative effort among the U.S. Geological Survey and regional partners Purdue University; the Center for Earthquake Research and Information at the University of Memphis; the state geologic surveys of Kentucky, Illinois, and Indiana; the Southwest Indiana Disaster Resistant Community Corporation; and the Central U.S. Earthquake Consortium state geologists.

  2. The Development of an Earthquake Preparedness Plan for a Child Care Center in a Geologically Hazardous Region.

    ERIC Educational Resources Information Center

    Wokurka, Linda

    The director of a child care center at a community college in California developed an earthquake preparedness plan for the center which met state and local requirements for earthquake preparedness at schools. The plan consisted of: (1) the identification and reduction of nonstructural hazards in classrooms, office, and staff rooms; (2) storage of…

  3. NEHRP turns 40

    USGS Publications Warehouse

    Leith, William S.

    2017-01-01

    This year, the National Earthquake Hazards Reduction Program (NEHRP) turns 40, four decades since the Earthquake Hazards Reduction Act of 1977 was enacted establishing the Program, spurring numerous federal, state, and community actions to reduce earthquake losses in the U.S.A. and its territories and setting a standard for earthquake loss‐reduction projects internationally. Four agencies are partners in NEHRP: the Federal Emergency Management Agency (FEMA), the National Institute of Standards and Technology (NIST, the lead agency), the National Science Foundation (NSF), and the U.S. Geological Survey (USGS).

  4. Quantitative risk assessment of landslides triggered by earthquakes and rainfall based on direct costs of urban buildings

    NASA Astrophysics Data System (ADS)

    Vega, Johnny Alexander; Hidalgo, Cesar Augusto

    2016-11-01

    This paper outlines a framework for assessing the risk that landslides triggered by earthquakes and rainfall pose to urban buildings in the city of Medellín, Colombia, applying a model that uses a geographic information system (GIS). We applied a computer model that includes topographic, geological, geotechnical, and hydrological features of the study area to assess landslide hazards using Newmark's pseudo-static method, together with a probabilistic approach based on the first-order second-moment (FOSM) method. The physical vulnerability assessment of buildings was conducted using structural fragility indexes, as well as the definition of building damage levels via decision trees and Medellín's cadastral inventory data. The probability of occurrence of a landslide was calculated assuming that an earthquake produces a horizontal ground acceleration (Ah) and considering the uncertainty of the geotechnical parameters and the soil saturation conditions of the ground. The probability of occurrence was multiplied by the structural fragility index values and by the replacement value of structures. The model implemented aims to quantify the risk caused by this kind of disaster in an area of the city of Medellín based on different values of Ah, and to analyze the damage costs of this disaster to buildings under different scenarios and structural conditions. Currently, 62% of the "Valle de Aburrá", where the study area is located, is under very low landslide hazard conditions and 38% under low conditions. If all buildings in the study area fulfilled the requirements of the Colombian building code, the costs of a landslide would be reduced by 63% compared with the current condition. An earthquake with a return period of 475 years was used in this analysis, in accordance with the 2002 seismic microzonation study.
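    The pseudo-static stability check at the core of Newmark-style analyses can be sketched for a simple sliding block on an inclined plane: the horizontal seismic coefficient kh adds to the driving force and reduces the normal force. All soil and slope parameters below are illustrative, not the study's values.

```python
import math

def pseudo_static_fs(c_kpa, phi_deg, gamma_kn_m3, thickness_m, slope_deg, kh):
    """Factor of safety for a block on an inclined plane with a horizontal
    seismic coefficient kh (illustrative block-on-plane simplification)."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    w = gamma_kn_m3 * thickness_m               # weight per unit plan area, kPa
    driving = w * (math.sin(beta) + kh * math.cos(beta))
    normal = w * (math.cos(beta) - kh * math.sin(beta))
    resisting = c_kpa + normal * math.tan(phi)  # Mohr-Coulomb resistance
    return resisting / driving

fs_static = pseudo_static_fs(10.0, 30.0, 19.0, 3.0, 35.0, kh=0.0)
fs_seismic = pseudo_static_fs(10.0, 30.0, 19.0, 3.0, 35.0, kh=0.15)
```

    In the probabilistic (FOSM) setting, the geotechnical inputs c and phi would be treated as random variables and the probability FS < 1 computed from their first two moments.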

  5. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    USGS Publications Warehouse

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    The ground motion hazard for Sumatra and the Malaysian peninsula is calculated in a probabilistic framework, using procedures developed for the US National Seismic Hazard Maps. We constructed regional earthquake source models and used standard published and modified attenuation equations to calculate peak ground acceleration at 2% and 10% probability of exceedance in 50 years for rock site conditions. We developed or modified earthquake catalogs and declustered these catalogs to include only independent earthquakes. The resulting catalogs were used to define four source zones that characterize earthquakes in four tectonic environments: subduction zone interface earthquakes, subduction zone deep intraslab earthquakes, strike-slip transform earthquakes, and intraplate earthquakes. The recurrence rates and sizes of historical earthquakes on known faults and across zones were also determined from this modified catalog. In addition to the source zones, our seismic source model considers two major faults that are known historically to generate large earthquakes: the Sumatran subduction zone and the Sumatran transform fault. Several published studies were used to describe earthquakes along these faults during historical and pre-historical time, as well as to identify segmentation models of faults. Peak horizontal ground accelerations were calculated using ground motion prediction relations that were developed from seismic data obtained from the crustal interplate environment, crustal intraplate environment, along the subduction zone interface, and from deep intraslab earthquakes. Most of these relations, however, have not been developed for large distances that are needed for calculating the hazard across the Malaysian peninsula, and none were developed for earthquake ground motions generated in an interplate tectonic environment that are propagated into an intraplate tectonic environment. 
For the interplate and intraplate crustal earthquakes, we have applied ground-motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
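    The "2% and 10% probability of exceedance in 50 years" hazard levels quoted above follow from the standard Poisson occurrence assumption, P = 1 - exp(-rate * T). A minimal sketch of the conversion to annual rates and return periods:

```python
import math

def annual_rate(p_exceed, t_years):
    """Annual exceedance rate implied by probability p_exceed over t_years,
    assuming Poisson occurrence: P = 1 - exp(-rate * T)."""
    return -math.log(1.0 - p_exceed) / t_years

rp_2in50 = 1.0 / annual_rate(0.02, 50.0)    # ~2475-year return period
rp_10in50 = 1.0 / annual_rate(0.10, 50.0)   # ~475-year return period
```

    These are the familiar ~2475-year and ~475-year return periods underlying the two hazard maps.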

  6. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential to threaten life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important tourist sites in Turkey. At the same time, the region was added to the World Heritage List by UNESCO in 1985 for its natural, historical, and cultural values. The region is adversely affected by several environmental conditions, which have been the subject of many previous studies, but there are few studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are the Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City, and the Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV, and Zone V. This map shows peak ground acceleration for a 10 percent probability of exceedance in 50 years on bedrock. In this connection, the seismic hazard of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes is prepared and used in this study. The seismic sources have been identified for seismic hazard assessment based on geological, seismological, and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for the different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. 
Keywords: Seismic Hazard Assessment, Probabilistic Approach, Deterministic Approach, Historical Heritage, Cappadocia.

  7. Implications of the 26 December 2004 Sumatra-Andaman earthquake on tsunami forecast and assessment models for great subduction-zone earthquakes

    USGS Publications Warehouse

    Geist, Eric L.; Titov, Vasily V.; Arcas, Diego; Pollitz, Fred F.; Bilek, Susan L.

    2007-01-01

    Results from different tsunami forecasting and hazard assessment models are compared with observed tsunami wave heights from the 26 December 2004 Indian Ocean tsunami. Forecast models are based on initial earthquake information and are used to estimate tsunami wave heights during propagation. An empirical forecast relationship based only on seismic moment provides a close estimate to the observed mean regional and maximum local tsunami runup heights for the 2004 Indian Ocean tsunami but underestimates mean regional tsunami heights at azimuths in line with the tsunami beaming pattern (e.g., Sri Lanka, Thailand). Standard forecast models developed from subfault discretization of earthquake rupture, in which deep-ocean sea level observations are used to constrain slip, are also tested. Forecast models of this type use tsunami time-series measurements at points in the deep ocean. As a proxy for the 2004 Indian Ocean tsunami, a transect of deep-ocean tsunami amplitudes recorded by satellite altimetry is used to constrain slip along four subfaults of the M >9 Sumatra–Andaman earthquake. This proxy model performs well in comparison to observed tsunami wave heights, travel times, and inundation patterns at Banda Aceh. Hypothetical tsunami hazard assessment models based on end-member estimates for average slip and rupture length (Mw 9.0–9.3) are compared with tsunami observations. Using average slip (low end member) and rupture length (high end member) (Mw 9.14) consistent with many seismic, geodetic, and tsunami inversions adequately estimates tsunami runup in most regions, except the extreme runup in the western Aceh province. The high slip that occurred in the southern part of the rupture zone linked to runup in this location is a larger fluctuation than expected from standard stochastic slip models. 
In addition, excess moment release (∼9%) deduced from geodetic studies in comparison to seismic moment estimates may generate additional tsunami energy, if the exponential time constant of slip is less than approximately 1 hr. Overall, there is significant variation in assessed runup heights caused by quantifiable uncertainty in both first-order source parameters (e.g., rupture length, slip-length scaling) and spatiotemporal complexity of earthquake rupture.

  8. Improving vulnerability models: lessons learned from a comparison between flood and earthquake assessments

    NASA Astrophysics Data System (ADS)

    de Ruiter, Marleen; Ward, Philip; Daniell, James; Aerts, Jeroen

    2017-04-01

    In a cross-discipline study, an extensive literature review has been conducted to increase the understanding of vulnerability indicators used in both earthquake and flood vulnerability assessments, and to provide insights into potential improvements of both. It identifies and compares indicators used to quantitatively assess earthquake and flood vulnerability, and discusses their respective differences and similarities. Indicators have been categorized into physical and social categories and, where possible, further subdivided into measurable and comparable indicators. Physical vulnerability indicators have been differentiated according to exposed assets such as buildings and infrastructure. Social indicators are grouped into subcategories such as demographics, economics, and awareness. Next, two different types of vulnerability model that use these indicators are described: index-based and curve-based vulnerability models. A selection of these models (e.g. HAZUS) is described and compared on several characteristics, such as temporal and spatial aspects. It appears that earthquake vulnerability methods are traditionally strongly oriented towards physical attributes at the object scale, used in vulnerability curve models, whereas flood vulnerability studies focus more on indicators applied at aggregated land-use scales. Flood risk studies could be improved using approaches from earthquake studies, such as incorporating more detailed lifeline and building indicators and developing object-based vulnerability-curve assessments of physical vulnerability, for example by defining flood vulnerability curves based on building material. Related to this is the incorporation of time-of-day-based building occupation patterns (at 2 a.m. most people will be at home, while at 2 p.m. most people will be in the office). 
Earthquake assessments could learn from flood studies when it comes to the refined selection of social vulnerability indicators. Based on the lessons obtained in this study, we recommend future studies to further explore cross-hazard studies.
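    An index-based vulnerability model of the kind compared in this review is, at its simplest, a weighted sum of normalized indicators. The indicator names and weights below are invented for illustration only:

```python
def vulnerability_index(indicators, weights):
    """Weighted sum of normalized (0-1) indicators; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * indicators[k] for k in weights)

# Hypothetical indicator scores and weights (all between 0 and 1):
indicators = {"building_quality": 0.8, "pop_density": 0.6, "awareness": 0.3}
weights = {"building_quality": 0.5, "pop_density": 0.3, "awareness": 0.2}
vi = vulnerability_index(indicators, weights)
```

    Curve-based models replace this single score with a damage-vs-intensity function per asset class, which is the object-scale refinement the review suggests flood studies could adopt.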

  9. A preliminary regional assessment of earthquake-induced landslide susceptibility for Vrancea Seismic Region

    NASA Astrophysics Data System (ADS)

    Micu, Mihai; Balteanu, Dan; Ionescu, Constantin; Havenith, Hans; Radulian, Mircea; van Westen, Cees; Damen, Michiel; Jurchescu, Marta

    2015-04-01

    In seismically active regions, earthquakes may trigger landslides, enhancing short- to long-term slope denudation and sediment delivery and conditioning the general landscape evolution. Co-seismic slope failures generally present a low-frequency, high-magnitude pattern that landslide hazard assessment should address accordingly, in contrast with the generally more frequent precipitation-triggered landslides. The Vrancea Seismic Region, corresponding to the curvature sector of the Eastern Romanian Carpathians, is the most active sub-crustal (focal depth > 50 km) earthquake province of Europe. It represents the main seismic energy source throughout Romania, with significant transboundary effects recorded as far away as Ukraine and Bulgaria. During the last 300 years, the region has experienced 14 earthquakes with M>7, among them seven events with magnitudes above 7.5 and three between 7.7 and 7.9. Apart from the direct damage, the Vrancea earthquakes are also responsible for numerous other geohazards, such as ground fracturing, groundwater level disturbances, and possible deep-seated landslide occurrences (rock slumps, rock-block slides, rock falls, rock avalanches). The older deep-seated landslides assumed to have been triggered by earthquakes usually affect the entire slope profile. They often formed landslide dams, strongly influencing river morphology and representing potential threats (through flash floods) in case of lake outburst. Despite the large potential of this research issue, the correlation between the region's seismotectonic context and landslide predisposing factors has not yet been fully understood. Presently, the geohazard databases of Vrancea lack the information needed to outline the seismic influence on the triggering of slope failures in this region. 
We only know that the morphology of numerous large, deep-seated, dormant landslides (which could be reactivated in the future), with head scarps near mountain tops and close to faults, is similar to that of large mass movements for which a seismic origin has been proved (such as in the Tien Shan, Pamir, Longmenshan, etc.). Thus, correlations between landslide occurrence and combined seismotectonic and climatic factors are needed to support a regional multi-hazard risk assessment. The purpose of this paper is to harmonize for the first time at a regional scale the landslide predisposing factors and seismotectonic triggers, and to present a first qualitative insight into earthquake-induced landslide susceptibility for the Vrancea Seismic Region in terms of a GIS-based analysis of Newmark displacement (ND). In this way, it aims at better defining the spatial and temporal distribution patterns of earthquake-triggered landslides. The Arias intensity calculation involved in the assessment considers both regional seismic hazard aspects and individual earthquake scenarios (adjusted by topographic amplification factors). The known distribution of landslides, mapped through digital stereographic interpretation of high-resolution aerial photos, is compared with digital active fault maps and the computed ND maps to statistically outline the seismotectonic influence on slope stability in the study area. The importance of this approach resides in two main outputs. The first, of a fundamental nature, provides the first regional insight into the seismic landslide-triggering framework, allowing us to understand whether deep-focus earthquakes may trigger massive slope failures in an area with relatively smooth relief (compared with the high mountain regions of Central Asia and the Himalayas), considering possible geologic and topographic site effects. 
The second, more applied, output will support better accelerometer instrumentation and monitoring of slopes, and will provide a first correlation of different levels of seismic shaking with precipitation recurrences, an important relationship within a multi-hazard risk preparedness and prevention framework.
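    The Newmark screening described above can be sketched as follows. This is a minimal illustration, assuming Newmark's rigid-block critical acceleration for an infinite slope and the widely used Jibson (2007) regression of displacement on Arias intensity; the paper's own regional regression and GIS inputs may differ.

```python
import math

def critical_acceleration_g(factor_of_safety, slope_deg):
    """Newmark (1965) rigid-block critical acceleration, in units of g:
    a_c = (FS - 1) * sin(alpha) for sliding parallel to the slope."""
    return (factor_of_safety - 1.0) * math.sin(math.radians(slope_deg))

def newmark_displacement_cm(arias_m_s, a_c_g):
    """Jibson (2007) regression of Newmark displacement (cm) on Arias
    intensity Ia (m/s) and critical acceleration a_c (g). Used here as an
    illustrative displacement model only."""
    log_dn = 2.401 * math.log10(arias_m_s) - 3.481 * math.log10(a_c_g) - 3.230
    return 10.0 ** log_dn

# Example: a marginally stable slope (FS = 1.3, 25 deg) under Ia = 1.5 m/s
a_c = critical_acceleration_g(1.3, 25.0)
dn = newmark_displacement_cm(1.5, a_c)
```

    Cells whose predicted displacement exceeds a chosen threshold (often a few centimetres) would be flagged as susceptible in the ND map.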

  10. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    USGS Publications Warehouse

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.
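    The "probability of exceedance over 30 years" shown in such maps is conventionally derived from an annual exceedance rate under a Poisson (time-independent) assumption. A minimal sketch of that conversion, which also recovers the familiar 475-year return period from "10% in 50 years":

```python
import math

def prob_exceedance(annual_rate, years):
    """Poisson assumption: P(at least one exceedance in t years)
    = 1 - exp(-lambda * t)."""
    return 1.0 - math.exp(-annual_rate * years)

def return_period(prob, years):
    """Invert the Poisson relation to get the mean return period for a
    target probability of exceedance over a given window."""
    return years / -math.log(1.0 - prob)

# 10% in 50 years corresponds to a ~475-year return period
rp = return_period(0.10, 50.0)
p30 = prob_exceedance(1.0 / rp, 30.0)
```

    Note this is exactly the distinction the authors draw: the map states a hazard (an exceedance probability of a ground-motion level), not a forecast of where damaging earthquakes will occur.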

  11. Improvements on mapping soil liquefaction at a regional scale

    NASA Astrophysics Data System (ADS)

    Zhu, Jing

    Earthquake-induced soil liquefaction is an important secondary hazard during earthquakes and can lead to significant damage to infrastructure. Mapping liquefaction hazard is important both in planning for earthquake events and in guiding relief efforts by positioning resources once an event has occurred. This dissertation addresses two aspects of liquefaction hazard mapping at a regional scale: 1) predictive liquefaction hazard mapping and 2) post-earthquake liquefaction cataloging. First, current predictive liquefaction hazard mapping relies on detailed geologic maps and geotechnical data, which are not always available in at-risk regions. This dissertation improves predictive liquefaction hazard mapping through the development and validation of geospatial liquefaction models (Chapters 2 and 3) that predict liquefaction extent and are appropriate for global application. The geospatial liquefaction models are developed using logistic regression from a liquefaction database consisting of data from 27 earthquake events in six countries. The model that performs best over the entire dataset includes peak ground velocity (PGV), VS30, distance to river, distance to coast, and precipitation. The model that performs best over the noncoastal dataset includes PGV, VS30, water table depth, distance to water body, and precipitation. Second, post-earthquake liquefaction cataloging has historically relied on field investigation, which is often limited by time and expense and therefore yields limited and incomplete liquefaction inventories. This dissertation improves post-earthquake cataloging through the development and validation of a remote sensing-based method that can be applied quickly over a broad region after an earthquake and provide a detailed map of liquefaction surface effects (Chapter 4). Our method uses optical satellite images acquired before and after an earthquake event by the WorldView-2 satellite, with 2 m spatial resolution and eight spectral bands. 
It tracks changes in spectral variables that are sensitive to surface moisture and soil characteristics, paired with a supervised classification.
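    The predictive model described above has the standard logistic-regression form: a linear combination of geospatial predictors passed through the logistic function. The sketch below uses the noncoastal predictor set named in the abstract; the coefficients are placeholders for illustration, not the fitted values from the 27-event database.

```python
import math

def liquefaction_probability(pgv_cm_s, vs30_m_s, water_table_m,
                             dist_water_km, precip_mm):
    """Logistic-regression form of a geospatial liquefaction model.
    All coefficients below are hypothetical; real values are estimated
    from an observed liquefaction database."""
    x = (4.0
         + 0.30 * math.log(pgv_cm_s)      # stronger shaking -> more likely
         - 1.65 * math.log(vs30_m_s)      # stiffer ground -> less likely
         - 0.02 * water_table_m           # deeper water table -> less likely
         - 0.01 * dist_water_km           # farther from water -> less likely
         + 0.0005 * precip_mm)            # wetter climate -> more likely
    return 1.0 / (1.0 + math.exp(-x))     # logistic link

p = liquefaction_probability(30.0, 200.0, 2.0, 1.0, 1000.0)
```

    Because every predictor is available globally from remote sensing or models, such a formulation can be evaluated on a raster grid to map liquefaction extent without site-specific geotechnical data.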

  12. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites from Vacska cave, Pilis Mountains of Hungary

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Kovács, Károly; Mónus, Péter; Konecny, Pavel; Lednicka, Marketa; Novák, Attila

    2017-04-01

    Damaging earthquakes in central Europe are infrequent, but they do occur. This raises an important issue for society: how to react to this hazard, given that potential damages are enormous and the infrastructure costs of addressing them are huge as well. Obtaining unbiased expert knowledge of the seismic hazard (and risk) is therefore very important. Seismic activity in the Pannonian Basin is moderate. In territories with low or moderate seismic activity, the recurrence time of large earthquakes can be as long as 10,000 years. Therefore, we cannot draw well-grounded inferences in seismic hazard assessment exclusively from the 1,000- to 2,000-year observational period covered by our earthquake catalogues. Long-term information can be gained from intact and vulnerable stalagmites (IVSTM) in natural karstic caves. These fragile formations have survived all earthquakes that occurred over thousands of years, depending on their age. Their "survival" requires that horizontal ground acceleration never exceeded a certain critical value within that time period. Here we present such a stalagmite-based case study from the Pilis Mountains of Hungary. Evidence of historic events and of differential uplift (incision of the Danube at the River Bend and in the Buda and Gerecse Hills) exists in the vicinity of the investigated cave site. These observations imply that a better understanding of possible co-seismic ground motions in the nearby, densely populated areas of Budapest is needed. Specially shaped (tall, slim, and more or less cylindrical), intact and vulnerable stalagmites in the Vacska cave, Pilis Mountains, were examined. Our investigation includes in-situ examination of the IVSTM and mechanical laboratory measurements of broken stalagmite samples. 
This approach can yield significant new constraints on the seismic hazard of the investigated area: tectonic structures close to Vacska cave cannot have generated strong paleoearthquakes in the last few thousand years that produced horizontal ground accelerations larger than the upper threshold determined from the intact and vulnerable stalagmites. The study is of particular importance because of its implications for the seismic hazard of the Hungarian capital.
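    The critical acceleration of a slim stalagmite can be estimated to first order by treating it as a vertical cylindrical cantilever loaded by a uniform horizontal body force and failing in tension at its base, which gives a_c = sigma_t * d / (4 * rho * H^2). This quasi-static sketch is only a stand-in for the in-situ and laboratory procedure in the study (it neglects dynamic amplification of the stalagmite itself), and the input values below are assumptions.

```python
G = 9.81  # gravitational acceleration, m/s^2

def critical_pga_g(tensile_strength_pa, diameter_m, height_m, density=2600.0):
    """Quasi-static breaking acceleration of a cylindrical cantilever.
    Base bending moment M = rho*A*a*H^2/2, section modulus W = pi*d^3/32,
    so failure at sigma_t gives a_c = sigma_t * d / (4 * rho * H^2).
    Returned in units of g. Simplified sketch; dynamic effects ignored."""
    return tensile_strength_pa * diameter_m / (4.0 * density * height_m ** 2) / G

# Assumed example: a 1 m tall, 4 cm diameter stalagmite with ~1 MPa
# tensile strength (calcite tensile strengths of order 0.5-2 MPa)
a_c = critical_pga_g(1.0e6, 0.04, 1.0)
```

    The strong H^-2 dependence is why tall, slim stalagmites are the most informative: doubling the height cuts the survivable acceleration by a factor of four.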

  13. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 28. Recommended Accelerograms for Earthquake Ground Motions

    DTIC Science & Technology

    1992-06-01

    Rodolfo H. et al., December 1985. Analisis de los Acelerogramas del Terremoto del 3 de Marzo de 1985: University of Chile, Publication SES I 4/1985 (199...196741975 Records: Open-File Report (unpublished). Mexico 1974 Prince, Jorge et al., February 1976. Procesamiento de Acelerogramas Obtenidos en 1974: UNAM...engineering profession. The recent Mexican Guerrero data is a welcome exception to this generalization. 9 Calculations 24. Few calculations were required for

  14. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    USGS Publications Warehouse

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes, which occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large earthquake rates would be grossly underestimated by extrapolating small earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. 
Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude distribution similar to that including characteristic earthquakes. The island chain northwest of Hawaii Island is seismically and volcanically much less active. We model its seismic hazard with a combination of a linearly decaying ramp fit to the cataloged seismicity and spatially smoothed seismicity with a smoothing half-width of 10 km. We use a combination of up to four attenuation relations for each map because for either PGA or SA, there is no single relation that represents ground motion for all distance and magnitude ranges. Great slumps and landslides visible on the ocean floor correspond to catastrophes with effective energy magnitudes ME above 8.0. A crude estimate of their frequency suggests that the probabilistic earthquake hazard is at least an order of magnitude higher for flank earthquakes than that from submarine slumps.
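    The effect of the kinked magnitude distribution described above can be made concrete with a small sketch. Assuming a cumulative Gutenberg-Richter form that is continuous at the M 5.0 kink, with illustrative rates (the b slopes of 1.0 and 0.6 come from the abstract; the absolute rate is hypothetical):

```python
def annual_rate(m, rate_at_4=10.0, m_kink=5.0, b_low=1.0, b_high=0.6):
    """Cumulative annual rate N(>= m) for a kinked Gutenberg-Richter
    distribution, continuous at the kink magnitude. rate_at_4 is an
    illustrative anchor, not a Hawaiian catalog value."""
    rate_kink = rate_at_4 * 10.0 ** (-b_low * (m_kink - 4.0))
    if m <= m_kink:
        return rate_at_4 * 10.0 ** (-b_low * (m - 4.0))
    return rate_kink * 10.0 ** (-b_high * (m - m_kink))

# Extrapolating the small-earthquake slope (b = 1.0) all the way to M 7
# underestimates the kinked-model rate by a factor of 10**0.8 ~ 6.3:
extrapolated = annual_rate(7.0, b_high=1.0)
kinked = annual_rate(7.0)
```

    This is the quantitative sense in which "large earthquake rates would be grossly underestimated by extrapolating small earthquake rates".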

  15. A New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.

    2017-12-01

    We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. 
The resulting new seismic hazard maps can be used for seismic risk analysis and management.
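    The tapered Gutenberg-Richter (TGR) distribution used in the zone model is most simply written in seismic moment: a power law with exponent beta = (2/3)b, multiplied by an exponential taper at a corner moment. A minimal sketch (parameter values in a real zone come from the catalog and the geodetic moment-rate constraint, as the abstract describes):

```python
import math

def moment_nm(mw):
    """Seismic moment (N*m) from moment magnitude (Hanks & Kanamori):
    M0 = 10**(1.5*Mw + 9.05)."""
    return 10.0 ** (1.5 * mw + 9.05)

def tgr_ccdf(mw, mw_min, beta, mw_corner):
    """Tapered Gutenberg-Richter survivor function: fraction of events
    above mw_min whose magnitude exceeds mw.
    G(M0) = (M0_min / M0)**beta * exp((M0_min - M0) / M0_corner)."""
    m = moment_nm(mw)
    m_t = moment_nm(mw_min)
    m_c = moment_nm(mw_corner)
    return (m_t / m) ** beta * math.exp((m_t - m) / m_c)

# With b = 1 (beta = 2/3) and a corner magnitude of 8.0, the taper
# suppresses rates near and above the corner relative to a pure power law
frac_m7 = tgr_ccdf(7.0, 5.0, 2.0 / 3.0, 8.0)
```

    Because the corner magnitude controls how much moment the tail carries, constraining it with the geodetic moment rate (rather than the sparse large-earthquake record alone) is the key step in the zone-rate calculation.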

  16. Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S. A.; Spencer, B. D.

    2015-12-01

    The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered by arguing that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well these maps actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted ground motion should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that the factors controlling map performance are complicated, and that learning more about how maps perform and why would be valuable in making more effective policy.
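    The two performance metrics contrasted above are simple to state in code. A sketch (site values are illustrative, not the 510-year Japanese dataset):

```python
def fractional_exceedance_metric(observed, predicted, target_fraction):
    """Metric implicit in the hazard maps: over the interval, the mapped
    ground motion should be exceeded at a specific fraction of sites.
    Returns |actual exceedance fraction - target fraction|; lower is better."""
    f = sum(o > p for o, p in zip(observed, predicted)) / len(observed)
    return abs(f - target_fraction)

def squared_misfit_metric(observed, predicted):
    """Alternative metric: mean squared misfit between maximum observed
    shaking and the mapped prediction at each site; lower is better."""
    return sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)

# Toy example: four sites, a flat (uniform) map predicting level 2.0
obs = [1.0, 2.0, 3.0, 4.0]
pred = [2.0, 2.0, 2.0, 2.0]
m1 = fractional_exceedance_metric(obs, pred, 0.10)
m2 = squared_misfit_metric(obs, pred)
```

    As the abstract notes, a map can rank well on one metric and poorly on the other, which is why the choice of metric drives the conclusions.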

  17. Comments on potential geologic and seismic hazards affecting coastal Ventura County, California

    USGS Publications Warehouse

    Ross, Stephanie L.; Boore, David M.; Fisher, Michael A.; Frankel, Arthur D.; Geist, Eric L.; Hudnut, Kenneth W.; Kayen, Robert E.; Lee, Homa J.; Normark, William R.; Wong, Florence L.

    2004-01-01

    This report examines the regional seismic and geologic hazards that could affect proposed liquefied natural gas (LNG) facilities in coastal Ventura County, California. Faults throughout this area are thought to be capable of producing earthquakes of magnitude 6.5 to 7.5, which could produce surface fault offsets of as much as 15 feet. Many of these faults are sufficiently well understood to be included in the current generation of the National Seismic Hazard Maps; others may become candidates for inclusion in future revisions as research proceeds. Strong shaking is the primary hazard that causes damage from earthquakes and this area is zoned with a high level of shaking hazard. The estimated probability of a magnitude 6.5 or larger earthquake (comparable in size to the 2003 San Simeon quake) occurring in the next 30 years within 30 miles of Platform Grace is 50-60%; for Cabrillo Port, the estimate is a 35% likelihood. Combining these probabilities of earthquake occurrence with relationships that give expected ground motions yields the estimated seismic-shaking hazard. In parts of the project area, the estimated shaking hazard is as high as along the San Andreas Fault. The combination of long-period basin waves and LNG installations with large long-period resonances potentially increases this hazard.

  18. Earthquake Damage Assessment Using Very High Resolution Satelliteimagery

    NASA Astrophysics Data System (ADS)

    Chiroiu, L.; André, G.; Bahoken, F.; Guillande, R.

    Various studies have applied satellite imagery in recent years to assess damage from natural hazards, most of them analyzing floods, hurricanes, or landslides. For earthquakes, the medium or low spatial resolution data available until recently did not allow reliable identification of damage, because the elements at risk (e.g. buildings or other structures) are too small compared with the pixel size. Recent progress in remote sensing, in terms of spatial resolution and data processing, makes reliable damage detection for the elements at risk possible. Remote sensing techniques applied to IKONOS (1 m resolution) and IRS (5 m resolution) imagery were used to evaluate seismic vulnerability and post-earthquake damage. A fast estimation of losses was performed using a multidisciplinary approach based on earthquake engineering and geospatial analysis. The results, integrated into a GIS database, could be transferred via satellite networks to the rescue teams deployed in the affected zone, in order to better coordinate emergency operations. The methodology was applied to the cities of Bhuj and Anjar after the 2001 Gujarat (India) earthquake.

  19. Seismic risk assessment for road in Indonesia

    NASA Astrophysics Data System (ADS)

    Toyfur, Mona Foralisa; Pribadi, Krishna S.

    2016-05-01

    Road networks in Indonesia consist of 446,000 km of national, provincial, and local roads as well as toll highways. Indonesia is one of the countries exposed to various natural hazards, such as earthquakes, floods, and landslides. Within the Indonesian archipelago, several global tectonic plates interact, including the Indo-Australian, Pacific, and Eurasian plates, resulting in a complex geological setting characterized by seismically active faults and subduction zones and a chain of more than one hundred active volcanoes. Roads in Indonesia are vital infrastructure needed for the movement of people and goods, supporting community life and economic activities, including regional economic development. Road damage and losses due to earthquakes have not been studied widely, even though road disruption causes enormous economic damage. The aim of this research is to develop a method to analyse the risk posed by seismic hazard to roads. The seismic risk level of a road segment is defined using an earthquake risk index, adopting the Earthquake Disaster Risk Index method developed by Davidson (1997). Using this method, the risk levels of road segments can be defined and compared, and a road risk map can be developed as a tool for prioritizing risk mitigation programs for road networks in Indonesia.
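    Composite risk indices of the Davidson (1997) type are, in essence, weighted sums of normalized indicators. The sketch below shows that structure only; the indicator names and weights are hypothetical, not the ones used in this study.

```python
def risk_index(indicators, weights):
    """Composite earthquake risk index for a road segment: a weighted sum
    of indicators, each pre-scaled to [0, 1]. Indicator names and weights
    here are hypothetical placeholders."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * indicators[k] for k in weights)

# Illustrative segment with normalized indicator scores
segment = {"hazard": 0.8, "exposure": 0.6,
           "vulnerability": 0.5, "emergency_response": 0.4}
w = {"hazard": 0.4, "exposure": 0.25,
     "vulnerability": 0.25, "emergency_response": 0.1}
idx = risk_index(segment, w)
```

    Because the index is dimensionless, segments can be ranked directly, which is what makes it usable for prioritizing mitigation programs across a network.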

  20. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented become too wide to be useful. 
The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at facilitating rapid and proportionate earthquake response. For uncertainty representation, PAGER employs an Earthquake Impact Scale (EIS) that provides simple alerting thresholds, derived from systematic analyses of past earthquake impact and response levels. The alert levels are characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). We made a conscious attempt at both simple and intuitive color-coded alerting criteria; yet we preserve the necessary uncertainty measures (with simple histograms) by which one can gauge the likelihood for the alert to be over- or underestimated. In these hazard and loss modeling examples, both products are widely used across a range of technical as well as general audiences. Ironically, ShakeMap uncertainties--rigorously reported and portrayed for the primarily scientific portion of the audience--are rarely employed and are routinely misunderstood; for PAGER, uncertainties aimed at a wider user audience seem to be more easily digested. We discuss how differences in the way these uncertainties are portrayed may play into their acceptance and uptake, or lack thereof.

  1. Towards a robust framework for Probabilistic Tsunami Hazard Assessment (PTHA) for local and regional tsunami in New Zealand

    NASA Astrophysics Data System (ADS)

    Mueller, Christof; Power, William; Fraser, Stuart; Wang, Xiaoming

    2013-04-01

    Probabilistic Tsunami Hazard Assessment (PTHA) is conceptually closely related to Probabilistic Seismic Hazard Assessment (PSHA). The main difference is that PTHA needs to simulate propagation of tsunami waves through the ocean and cannot rely on attenuation relationships, which makes PTHA computationally more expensive. The wave propagation process can be assumed to be linear as long as water depth is much larger than the wave amplitude of the tsunami. Beyond this limit a non-linear scheme has to be employed with significantly higher algorithmic run times. PTHA considering far-field tsunami sources typically uses unit source simulations, and relies on the linearity of the process by later scaling and combining the wave fields of individual simulations to represent the intended earthquake magnitude and rupture area. Probabilistic assessments are typically made for locations offshore but close to the coast. Inundation is calculated only for significantly contributing events (de-aggregation). For local and regional tsunami it has been demonstrated that earthquake rupture complexity has a significant effect on the tsunami amplitude distribution offshore and also on inundation. In this case PTHA has to take variable slip distributions and non-linearity into account. A unit source approach cannot easily be applied. Rupture complexity is seen as an aleatory uncertainty and can be incorporated directly into the rate calculation. We have developed a framework that manages the large number of simulations required for local PTHA. As an initial case study the effect of rupture complexity on tsunami inundation and the statistics of the distribution of wave heights have been investigated for plate-interface earthquakes in the Hawke's Bay region in New Zealand. Assessing the probability that water levels will be in excess of a certain threshold requires the calculation of empirical cumulative distribution functions (ECDF). 
We compare our results with traditional estimates for tsunami inundation simulations that do not consider rupture complexity. De-aggregation based on moment magnitude alone might not be appropriate, because the hazard posed by any individual event can be underestimated locally if rupture complexity is ignored.
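    The empirical cumulative distribution function (ECDF) step described above reduces to counting, over the ensemble of complex-rupture simulations, how often the water level at a site exceeds a threshold. A minimal sketch with illustrative numbers:

```python
def exceedance_probability(simulated_heights, threshold):
    """Empirical exceedance probability 1 - ECDF(threshold) from an
    ensemble of simulated maximum water levels at one coastal point."""
    return sum(h > threshold for h in simulated_heights) / len(simulated_heights)

# Maximum water levels (metres) from eight hypothetical variable-slip
# scenarios of the same magnitude at one site
heights = [0.4, 0.9, 1.3, 2.1, 0.7, 1.8, 3.0, 1.1]
p_exceed_1m = exceedance_probability(heights, 1.0)
```

    Weighting each scenario by its annual rate, instead of counting equally, turns this conditional probability into a hazard rate; the spread across scenarios is exactly the aleatory rupture-complexity uncertainty the framework is built to capture.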

  2. Earthquake hazards in the Alaska transportation corridors

    DOT National Transportation Integrated Search

    1983-03-01

    Based on observations made by modern seismographic networks since 1967, and taking into consideration historical records of large Alaskan earthquakes in the past, it is judged that the hazards faced by transportation corridors in different areas of t...

  3. Predicting earthquake effects—Learning from Northridge and Loma Prieta

    USGS Publications Warehouse

    Holzer, Thomas L.

    1994-01-01

    The continental United States has been rocked by two particularly damaging earthquakes in the last 4.5 years, Loma Prieta in northern California in 1989 and Northridge in southern California in 1994. Combined losses from these two earthquakes approached $30 billion. Approximately half these losses were reimbursed by the federal government. Because large earthquakes typically overwhelm state resources and place unplanned burdens on the federal government, it is important to learn from these earthquakes how to reduce future losses. My purpose here is to explore a potential implication of the Northridge and Loma Prieta earthquakes for hazard-mitigation strategies: earth scientists should increase their efforts to map hazardous areas within urban regions. 

  4. The Implications of Strike-Slip Earthquake Source Properties on the Transform Boundary Development Process

    NASA Astrophysics Data System (ADS)

    Neely, J. S.; Huang, Y.; Furlong, K.

    2017-12-01

    Subduction-Transform Edge Propagator (STEP) faults, produced by the tearing of a subducting plate, allow us to study the development of a transform plate boundary and improve our understanding of both long-term geologic processes and short-term seismic hazards. The 280 km long San Cristobal Trough (SCT), formed by the tearing of the Australia plate as it subducts under the Pacific plate near the Solomon and Vanuatu subduction zones, shows along-strike variations in earthquake behaviors. The segment of the SCT closest to the tear rarely hosts earthquakes > Mw 6, whereas the SCT sections more than 80 - 100 km from the tear experience Mw7 earthquakes with repeated rupture along the same segments. To understand the effect of cumulative displacement on SCT seismicity, we analyze b-values, centroid-time delays and corner frequencies of the SCT earthquakes. We use the spectral ratio method based on Empirical Green's Functions (eGfs) to isolate source effects from propagation and site effects. We find high b-values along the SCT closest to the tear with values decreasing with distance before finally increasing again towards the far end of the SCT. Centroid time-delays for the Mw 7 strike-slip earthquakes increase with distance from the tear, but corner frequency estimates for a recent sequence of Mw 7 earthquakes are approximately equal, indicating a growing complexity in earthquake behavior with distance from the tear due to a displacement-driven transform boundary development process (see figure). The increasing complexity possibly stems from the earthquakes along the eastern SCT rupturing through multiple asperities resulting in multiple moment pulses. If not for the bounding Vanuatu subduction zone at the far end of the SCT, the eastern SCT section, which has experienced the most displacement, might be capable of hosting larger earthquakes. 
When assessing the seismic hazard of other STEP faults, cumulative fault displacement should be considered a key input in determining potential earthquake size.
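    The b-value analysis mentioned above is commonly done with the Aki (1965) maximum-likelihood estimator, with Utsu's half-bin correction for binned magnitudes. This standard estimator is sketched below as an illustration; the study's exact procedure may differ.

```python
import math

def b_value_mle(magnitudes, m_complete, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value:
    b = log10(e) / (mean(M) - (Mc - bin_width/2)),
    computed over events at or above the completeness magnitude Mc,
    with Utsu's correction for magnitude binning."""
    mags = [m for m in magnitudes if m >= m_complete]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_complete - bin_width / 2.0))

# Illustrative catalog segment; real estimates need many more events
b = b_value_mle([4.0, 4.2, 4.4, 4.6], m_complete=4.0)
```

    High b-values near the tear (relatively more small events) versus lower values farther along the SCT are what the abstract interprets as a signature of the maturing transform boundary.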

  5. Assessing community vulnerabilities to natural hazards on the Island of Hawaii

    NASA Astrophysics Data System (ADS)

    Nishioka, Chris; Delparte, Donna

    2010-05-01

    The island of Hawaii is susceptible to numerous natural hazards such as tsunamis, flooding, lava flow, earthquakes, hurricanes, landslides, wildfires and storm surge. The impact of a natural disaster on the island's communities has the potential to endanger peoples' lives and threaten critical infrastructure, homes, businesses and economic drivers such as tourism. A Geographic Information System (GIS) has the ability to assess community vulnerabilities by examining the spatial relationships between hazard zones, socioeconomic infrastructure and demographic data. By drawing together existing datasets, GIS was used to examine a number of community vulnerabilities. Key areas of interest were government services, utilities, property assets, industry and transportation. GIS was also used to investigate population dynamics in hazard zones. Identification of community vulnerabilities from GIS analysis can support mitigation measures and assist planning and response measures to natural hazards.

  6. Seismic hazard analysis for Jayapura city, Papua

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A.

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model, derived from the New Guinea Trench subduction zone (North Papuan Thrust); fault models, derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors obtained from geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D, and E, with amplification between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  7. Quaternary tectonic faulting in the Eastern United States

    USGS Publications Warehouse

    Wheeler, R.L.

    2006-01-01

    Paleoseismological study of geologic features thought to result from Quaternary tectonic faulting can characterize the frequencies and sizes of large prehistoric and historical earthquakes, thereby improving the accuracy and precision of seismic-hazard assessments. Greater accuracy and precision can reduce the likelihood of both underprotection and unnecessary design and construction costs. Published studies proposed Quaternary tectonic faulting at 31 faults, folds, seismic zones, and fields of earthquake-induced liquefaction phenomena in the Appalachian Mountains and Coastal Plain. Of the 31 features, seven are of known origin. Four of the seven have nontectonic origins and the other three features are liquefaction fields caused by moderate to large historical and Holocene earthquakes in coastal South Carolina, including Charleston; the Central Virginia Seismic Zone; and the Newbury, Massachusetts, area. However, the causal faults of the three liquefaction fields remain unclear. Charleston has the highest hazard because of large Holocene earthquakes in that area, but the hazard is highly uncertain because the earthquakes are uncertainly located. Of the 31 features, the remaining 24 are of uncertain origin. They require additional work before they can be clearly attributed either to Quaternary tectonic faulting or to nontectonic causes. Of these 24, 14 features, most of them faults, have little or no published geologic evidence of Quaternary tectonic faulting that could indicate the likely occurrence of earthquakes larger than those observed historically. Three more features of the 24 were suggested to have had Quaternary tectonic faulting, but paleoseismological and other studies of them found no evidence of large prehistoric earthquakes. The final seven features of uncertain origin require further examination because all seven are in or near urban areas. 
They are the Moodus Seismic Zone (Hartford, Connecticut), Dobbs Ferry fault zone and Mosholu fault (New York City), Lancaster Seismic Zone and the epicenter of the shallow Cacoosing Valley earthquake (Lancaster and Reading, Pennsylvania), Kingston fault (central New Jersey between New York and Philadelphia), and Everona fault-Mountain Run fault zone (Washington, D.C., and Arlington and Alexandria, Virginia).

  8. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
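
    The role of Mohr–Coulomb failure stress mentioned above can be illustrated with a one-line calculation. This is a minimal sketch under assumed values, not the authors' analysis; the stress changes and the effective friction coefficient `mu_eff` below are hypothetical.

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Change in Coulomb failure stress (delta-CFS) on a receiver fault, in MPa.

    d_shear:  change in shear stress resolved in the slip direction
              (positive promotes slip)
    d_normal: change in normal stress (positive = unclamping, in this
              sign convention)
    mu_eff:   effective friction coefficient (assumed; 0.4 is a common
              illustrative choice)
    """
    return d_shear + mu_eff * d_normal

# Hypothetical stress changes imparted by a slow slip event, in MPa
dcfs = coulomb_stress_change(0.05, 0.02)  # 0.058 MPa
```

    A positive delta-CFS loads the receiver fault toward failure; the abstract's conclusion is that the change produced by the pre-event SSE was probably too small to have triggered the mainshock.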

  9. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-02

The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.

  10. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  11. Vulnerability of populations and man-made facilities to seismic hazards

    NASA Astrophysics Data System (ADS)

    Badal, J.; Vazquez-Prada, M.; Gonzalez, A.; Chourak, M.; Samardzhieva, E.; Zhang, Z.

    2003-04-01

Earthquakes become major societal risks when they impinge on vulnerable populations. According to available worldwide data for the twentieth century (NEIC Catalog of Earthquakes 1980-1999), nearly 500 earthquakes caused more than 1,615,000 human victims. Besides human casualties, destructive earthquakes frequently inflict huge economic losses. An additional problem of a very different nature, but also worth considering in a damage and loss analysis, is the direct cost of the damage caused by a strong seismic impact. We focus on the rapid quantitative assessment of both aspects, with the aim of lessening earthquake disaster in areas affected by relatively strong earthquakes. Our final goal is knowledge of potential earthquake losses that can inform national emergency-management programs, minimize the loss of life due to earthquakes, and aid response and recovery tasks. For this purpose we follow a suitable and comprehensible methodology for risk-based loss analysis, and simulate the occurrence of a seismic event in densely populated areas of Spain.

  12. Surface Rupture Effects on Earthquake Moment-Area Scaling Relations

    NASA Astrophysics Data System (ADS)

    Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro

    2017-09-01

Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment (M0) and rupture area (A) currently in use for ground motion prediction in Japan features a transition regime of the form M0 ∝ A^2, between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M0-A scaling relations for strike-slip earthquakes.
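
    The three scaling regimes described above can be sketched as a piecewise log-log relation. This illustrates the regime structure only: the breakpoints `a1` and `a2` and the zero intercept are hypothetical placeholders, not the calibrated values used in Japanese ground motion prediction.

```python
def log_moment(log_area, a1=2.0, a2=3.0):
    """Piecewise log10(M0) as a function of log10(A) (arbitrary units).

    Slope 1.5: self-similar regime (M0 ~ A^1.5, constant stress drop)
    Slope 2.0: transition regime   (M0 ~ A^2)
    Slope 1.0: W-model regime      (M0 ~ A, rupture width saturated)
    Breakpoints a1, a2 (in log10 area) are hypothetical placeholders.
    """
    if log_area <= a1:
        return 1.5 * log_area
    if log_area <= a2:
        return 1.5 * a1 + 2.0 * (log_area - a1)
    return 1.5 * a1 + 2.0 * (a2 - a1) + 1.0 * (log_area - a2)
```

    The function is continuous at both breakpoints, so each regime simply changes the slope of the log-log curve, which is how such transition regimes appear in published M0-A plots.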

  13. Microzonation of Seismic Hazard Potential in Taipei, Taiwan

    NASA Astrophysics Data System (ADS)

    Liu, K. S.; Lin, Y. P.

    2017-12-01

The island of Taiwan lies at the boundary between the Philippine Sea plate and the Eurasia plate. Accordingly, the majority of seismic energy release near Taiwan originates from the two subduction zones. It is therefore not surprising that Taiwan has repeatedly been struck by large earthquakes, such as the 1986 Hualien, 1999 Chi-Chi, and 2002 Hualien earthquakes. Microzonation of seismic hazard potential has become necessary in Taipei City because the Central Geological Survey announced the Sanchiao active fault as Category II. In this study, a catalog of more than 2000 shallow earthquakes that occurred from 1900 to 2015 with Mw magnitudes ranging from 5.0 to 8.2, 11 disastrous earthquakes that occurred from 1683 to 1899, and the nearby Sanchiao active fault are used to estimate the seismic hazard potential in Taipei City for seismic microzonation. Furthermore, the probabilities of seismic intensity exceeding CWB intensity 5, 6, and 7 and MMI VI, VII, and VIII in 10-, 30-, and 50-year periods in the above areas are also analyzed for the seismic microzonation. Finally, by comparison with the seismic zoning map of Taiwan in the current building code, which was revised after the 1999 Chi-Chi (921) earthquake, the results of this study show which areas of Taipei City have higher earthquake hazard potential. They provide a valuable database for the seismic design of critical facilities, will help mitigate future earthquake disaster losses in Taipei City, and provide critical information for emergency response plans.
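
    Probabilities of exceedance over fixed 10-, 30-, and 50-year windows are commonly derived from an annual rate under a Poisson (time-independent) occurrence model. A minimal sketch with an assumed annual rate; the study itself combines rates from the catalog and the Sanchiao fault.

```python
import math

def exceedance_probability(annual_rate, years):
    """P(at least one exceedance in `years`) under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

# Assumed example: a shaking level with a 475-year return period
rate = 1.0 / 475.0
probs = {t: exceedance_probability(rate, t) for t in (10, 30, 50)}
# the 50-year probability is roughly 10%
```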

  14. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.
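
    The histogram-based reasoning described above can be sketched in a few lines. The binning and the add-an-increment Mmax proxy below are deliberate simplifications for illustration; the report itself applies a formal statistical treatment to the analog catalog.

```python
from collections import Counter

def magnitude_histogram(mags, bin_width=0.5):
    """Count catalog magnitudes in fixed-width bins so that the upper
    tail (the candidates constraining Mmax) can be inspected."""
    bins = Counter(round(m / bin_width) * bin_width for m in mags)
    return dict(sorted(bins.items()))

def mmax_proxy(mags, increment=0.5):
    """Crude Mmax proxy: largest observed magnitude plus an increment.
    (Hypothetical rule of thumb, not the estimator used in the report.)"""
    return max(mags) + increment

# Hypothetical analog-region catalog of moment magnitudes
catalog = [5.6, 6.0, 6.2, 6.8, 7.1]
```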

  15. Earthquake hazards to domestic water distribution systems in Salt Lake County, Utah

    USGS Publications Warehouse

    Highland, Lynn M.

    1985-01-01

A magnitude-7.5 earthquake occurring along the central portion of the Wasatch Fault, Utah, may cause significant damage to Salt Lake County's domestic water system. This system is composed of water treatment plants, aqueducts, distribution mains, and other facilities that are vulnerable to ground shaking, liquefaction, fault movement, and slope failures. Recent investigations into surface faulting, landslide potential, and earthquake intensity provide basic data for evaluating the potential earthquake hazards to water-distribution systems in the event of a large earthquake. Water supply system components may be vulnerable to one or more earthquake-related effects, depending on site geology and topography. Case studies of water-system damage in recent large earthquakes in Utah and in other regions of the United States offer valuable insights for evaluating water system vulnerability to earthquakes.

  16. USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.

    2000-01-01

The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of prehistoric earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations.
We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and distances.
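
    The correspondence quoted above between probability of exceedance and return time (10%, 5%, and 2% in 50 years, versus about 500, 1000, and 2500 years) follows from a Poisson occurrence model, and is easy to check:

```python
import math

def return_period(prob, years):
    """Return period implied by a probability of exceedance `prob`
    over an exposure time of `years`, under a Poisson model."""
    return -years / math.log(1.0 - prob)

for p in (0.10, 0.05, 0.02):
    t = return_period(p, 50)  # about 475, 975, and 2475 years
```

    The exact values (roughly 475, 975, and 2475 years) are conventionally rounded to 500, 1000, and 2500 in the hazard-map literature.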

  17. Convolutional neural network for earthquake detection and location

    PubMed Central

    Perol, Thibaut; Gharbi, Michaël; Denolle, Marine

    2018-01-01

    The recent evolution of induced seismicity in Central United States calls for exhaustive catalogs to improve seismic hazard assessment. Over the last decades, the volume of seismic data has increased exponentially, creating a need for efficient algorithms to reliably detect and locate earthquakes. Today’s most elaborate methods scan through the plethora of continuous seismic records, searching for repeating seismic signals. We leverage the recent advances in artificial intelligence and present ConvNetQuake, a highly scalable convolutional neural network for earthquake detection and location from a single waveform. We apply our technique to study the induced seismicity in Oklahoma, USA. We detect more than 17 times more earthquakes than previously cataloged by the Oklahoma Geological Survey. Our algorithm is orders of magnitude faster than established methods. PMID:29487899
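
    The core operation of a waveform-classification network such as ConvNetQuake is the 1-D convolution. The pure-Python sketch below shows that building block only; it is not the ConvNetQuake architecture, which stacks many such layers with kernels learned from labeled waveforms.

```python
def conv1d(signal, kernel, stride=1):
    """Valid 1-D cross-correlation: slide the kernel along the waveform
    and take dot products (the basic layer of a convolutional network)."""
    n = len(kernel)
    return [sum(s * k for s, k in zip(signal[i:i + n], kernel))
            for i in range(0, len(signal) - n + 1, stride)]

def relu(xs):
    """Rectified linear activation, applied elementwise."""
    return [max(0.0, x) for x in xs]

# Toy example: a hand-set two-sample kernel that responds to a sudden
# amplitude jump, such as a phase arrival (real kernels are learned)
waveform = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
feature_map = relu(conv1d(waveform, [-1.0, 1.0]))  # peaks at the onset
```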

  18. Seismic hazard in the South Carolina coastal plain: 2002 update of the USGS national seismic hazard maps

    USGS Publications Warehouse

    Cramer, C.H.; Mays, T.W.; ,

    2005-01-01

The damaging 1886 moment magnitude ~7 Charleston, South Carolina earthquake is indicative of the moderate likelihood of earthquake activity along this portion of the Atlantic Coast. A recurrence of such an earthquake today would have serious consequences for the nation. The national seismic hazard maps produced by the U.S. Geological Survey (USGS) provide a picture of the levels of seismic hazard across the nation based on the best and most current scientific information. The USGS national maps were updated in 2002 and will become part of the International Codes in 2006. In the past decade, improvements have occurred in the scientific understanding of the nature and character of earthquake activity and expected ground motions in the central and eastern U.S. The paper summarizes the new knowledge of expected earthquake locations, magnitudes, recurrence, and ground-motion decay with distance. New estimates of peak ground acceleration and 0.2 s and 1.0 s spectral acceleration are compared with those displayed in the 1996 national maps. The 2002 maps show increased seismic hazard in much of the coastal plain of South Carolina, but a decrease in long-period (1 s and greater) hazard by up to 20% at distances of over 50 km from the Charleston earthquake zone. Although the national maps do not account for the effects of local or regional sediments, deep coastal-plain sediments can significantly alter expected ground shaking, particularly at long periods, where it can be 100% higher than the national maps.

  19. Seismicity in the source areas of the 1896 and 1933 Sanriku earthquakes and implications for large near-trench earthquake faults

    NASA Astrophysics Data System (ADS)

    Obana, Koichiro; Nakamura, Yasuyuki; Fujie, Gou; Kodaira, Shuichi; Kaiho, Yuka; Yamamoto, Yojiro; Miura, Seiichi

    2018-03-01

    In the northern part of the Japan Trench, the 1933 Showa-Sanriku earthquake (Mw 8.4), an outer-trench, normal-faulting earthquake, occurred 37 yr after the 1896 Meiji-Sanriku tsunami earthquake (Mw 8.0), a shallow, near-trench, plate-interface rupture. Tsunamis generated by both earthquakes caused severe damage along the Sanriku coast. Precise locations of earthquakes in the source areas of the 1896 and 1933 earthquakes have not previously been obtained because they occurred at considerable distances from the coast in deep water beyond the maximum operational depth of conventional ocean bottom seismographs (OBSs). In 2015, we incorporated OBSs designed for operation in deep water (ultradeep OBSs) in an OBS array during two months of seismic observations in the source areas of the 1896 and 1933 Sanriku earthquakes to investigate the relationship of seismicity there to outer-rise normal-faulting earthquakes and near-trench tsunami earthquakes. Our analysis showed that seismicity during our observation period occurred along three roughly linear trench-parallel trends in the outer-trench region. Seismic activity along these trends likely corresponds to aftershocks of the 1933 Showa-Sanriku earthquake and the Mw 7.4 normal-faulting earthquake that occurred 40 min after the 2011 Tohoku-Oki earthquake. Furthermore, changes of the clarity of reflections from the oceanic Moho on seismic reflection profiles and low-velocity anomalies within the oceanic mantle were observed near the linear trends of the seismicity. The focal mechanisms we determined indicate that an extensional stress regime extends to about 40 km depth, below which the stress regime is compressional. These observations suggest that rupture during the 1933 Showa-Sanriku earthquake did not extend to the base of the oceanic lithosphere and that compound rupture of multiple or segmented faults is a more plausible explanation for that earthquake. 
The source area of the 1896 Meiji-Sanriku tsunami earthquake is characterized by an aseismic region landward of the trench axis. Spatial heterogeneity of seismicity and crustal structure might indicate the near-trench faults that could lead to future hazardous events such as the 1896 and 1933 Sanriku earthquakes, and should be taken into account in assessment of tsunami hazards related to large near-trench earthquakes.

  20. GEOS seismograms recorded for aftershocks of the earthquakes of December 7, 1988, near Spitak, Armenia SSR, during the time period 26 December 1988 14:00 through 29 December 1988 (UTC)

    USGS Publications Warehouse

    Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward

    1989-01-01

The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world. The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible. This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but in other regions of the world exposed to high seismic risk.
