Sample records for laboratory probabilistic seismic

  1. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  2. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  3. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries that produced significant damage to many population centres in the region. The highest hazard is associated with the External Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.
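
    The Monte Carlo hazard component mentioned above can be illustrated with a minimal stochastic event-set sketch; the Gutenberg-Richter parameters, the attenuation relation, and the damage function below are toy placeholders, not the model's values:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Toy Monte Carlo loss engine for one source zone: sample a synthetic
    # catalogue, attenuate to a site, convert shaking to a loss ratio.
    years, rate = 10_000, 0.2            # catalogue length (yr), events/yr
    b, m_min, m_max = 1.0, 5.0, 7.5      # Gutenberg-Richter (hypothetical)

    n = rng.poisson(rate * years)
    beta = b * np.log(10.0)
    u = rng.random(n)
    # inverse-CDF sampling of a truncated exponential (G-R) magnitude
    m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta
    r = rng.uniform(5.0, 150.0, n)       # site-to-source distance, km

    pga = np.exp(-2.0 + 0.9 * m - 1.3 * np.log(r))      # toy GMPE, in g
    loss_ratio = np.clip((pga - 0.05) / 0.6, 0.0, 1.0)  # toy damage function

    # annual frequency of exceeding a 10% loss ratio
    print((loss_ratio > 0.10).sum() / years)
    ```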

  4. Probabilistic Seismic Hazard Assessment for Iraq

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997, and an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years out of date, b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design, and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.

  5. Probabilistic seismic hazard characterization and design parameters for the Pantex Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernreuter, D. L.; Foxall, W.; Savy, J. B.

    1998-10-19

    The Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) updated the seismic hazard and design parameters at the Pantex Plant. The probabilistic seismic hazard (PSH) estimates were first updated using the latest available data and knowledge from LLNL (1993, 1998), Frankel et al. (1996), and other relevant recent studies from several consulting companies. Special attention was given to account for the local seismicity and for the system of potentially active faults associated with the Amarillo-Wichita uplift. Aleatory (random) uncertainty was estimated from the available data and the epistemic (knowledge) uncertainty was taken from results of similar studies. Special attention was given to soil amplification factors for the site. Horizontal Peak Ground Acceleration (PGA) and 5% damped uniform hazard spectra were calculated for six return periods (100 yr, 500 yr, 1000 yr, 2000 yr, 10,000 yr, and 100,000 yr). The design parameters were calculated following DOE standards (DOE-STD-1022 to 1024). Response spectra for design or evaluation of Performance Category 1 through 4 structures, systems, and components are presented.

  6. Incorporating seismic phase correlations into a probabilistic model of global-scale seismology

    NASA Astrophysics Data System (ADS)

    Arora, Nimar

    2013-04-01

    We present a probabilistic model of seismic phases whereby the attributes of the body-wave phases are correlated to those of the first-arriving P phase. This model has been incorporated into NET-VISA (Network processing Vertically Integrated Seismic Analysis), a probabilistic generative model of seismic events, their transmission, and detection on a global seismic network. In the earlier version of NET-VISA, seismic phases were assumed to be independent of each other. Although this didn't affect the quality of the inferred seismic bulletin for the most part, it did result in a few instances of anomalous phase association, for example an S phase with a smaller slowness than the corresponding P phase. We demonstrate that the phase attributes are indeed highly correlated; for example, the uncertainty in the S phase travel time is significantly reduced given the P phase travel time. Our new model exploits these correlations to produce better calibrated probabilities for the events, as well as fewer anomalous associations.

  7. Probabilistic seismic hazard zonation for the Cuban building code update

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Llanes-Buron, C.

    2013-05-01

    A probabilistic seismic hazard assessment has been performed in response to a revision and update of the Cuban building code (NC-46-99) for earthquake-resistant building construction. The hazard assessment has been done according to the standard probabilistic approach (Cornell, 1968), importing the procedures adopted by other nations dealing with the problem of revising and updating their national building codes. Problems of earthquake catalogue treatment, attenuation of peak and spectral ground acceleration, and seismic source definition have been rigorously analyzed, and a logic-tree approach was used to represent the inevitable uncertainties encountered throughout the seismic hazard estimation process. The seismic zonation proposed here consists of a map showing the spectral acceleration values for short (0.2 s) and long (1.0 s) periods on rock conditions with a 1642-year return period, which is considered the maximum credible earthquake (ASCE 7-05). In addition, three other design levels are proposed (severe earthquake: 808-year return period; ordinary earthquake: 475-year return period; and minimum earthquake: 225-year return period). The seismic zonation proposed here conforms to international standards (IBC-ICC) as well as current world practice.
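
    In the standard approach cited here (Cornell, 1968), the annual rate of exceeding a ground-motion level a is obtained by integrating over all sources, magnitudes and distances; in the usual notation (our rendering, not the paper's):

    ```latex
    \lambda(A > a) \;=\; \sum_{i} \nu_i
      \int_{m_{\min}}^{m_{\max}} \!\! \int_{r}
      P\left(A > a \mid m, r\right)\, f_{M_i}(m)\, f_{R_i}(r)\,
      \mathrm{d}r\,\mathrm{d}m
    ```

    Here ν_i is the activity rate of source i, f_M and f_R are the magnitude and distance densities, and P(A > a | m, r) comes from the attenuation model; the logic-tree branches weight alternative choices of these ingredients.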

  8. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters, which are input variables for seismic hazard estimation, are estimated for each of these source zones. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
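
    The pairing of exceedance probabilities with return periods quoted above follows from the Poisson assumption, T = -t / ln(1 - p); a quick check:

    ```python
    import math

    # Return period T for a probability of exceedance p over an exposure
    # of t years, under the Poisson assumption: T = -t / ln(1 - p).
    def return_period(p, t=50.0):
        return -t / math.log(1.0 - p)

    for p in (0.50, 0.20, 0.10, 0.02, 0.005):
        print(f"{p:6.1%} in 50 yr -> T = {return_period(p):8.1f} yr")
    # 72.1, 224.1, 474.6, 2475.0, 9975.1 yr; the abstract's 100, 225 and
    # 10,000 yr are the conventional rounded values.
    ```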

  9. A parimutuel gambling perspective to compare probabilistic seismicity forecasts

    NASA Astrophysics Data System (ADS)

    Zechar, J. Douglas; Zhuang, Jiancang

    2014-10-01

    Using analogies to gaming, we consider the problem of comparing multiple probabilistic seismicity forecasts. To measure relative model performance, we suggest a parimutuel gambling perspective which addresses shortcomings of other methods such as likelihood ratio, information gain and Molchan diagrams. We describe two variants of the parimutuel approach for a set of forecasts: head-to-head, in which forecasts are compared in pairs, and round table, in which all forecasts are compared simultaneously. For illustration, we compare the 5-yr forecasts of the Regional Earthquake Likelihood Models experiment for M4.95+ seismicity in California.
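
    One minimal reading of the head-to-head variant: each forecast stakes one unit per space-time-magnitude bin, split between "event" and "no event" according to its probability, and the two-unit pot per bin is shared pro rata among the stakes placed on the realized outcome. A sketch under that reading (function and variable names are ours, not the paper's):

    ```python
    def head_to_head_gain(p, q, outcomes):
        """Net parimutuel gain of forecast A (bin probabilities p) played
        against forecast B (probabilities q) over binary bin outcomes
        (1 = an event occurred in the bin, 0 = none did)."""
        gain = 0.0
        for pa, pb, o in zip(p, q, outcomes):
            stake_a = pa if o else 1.0 - pa   # A's bet on what happened
            stake_b = pb if o else 1.0 - pb   # B's bet on what happened
            # 2-unit pot split in proportion to stakes, minus A's 1-unit ante
            gain += 2.0 * stake_a / (stake_a + stake_b) - 1.0
        return gain  # positive => A outperforms B

    print(head_to_head_gain([0.9, 0.1], [0.5, 0.5], [1, 0]))  # ~0.57
    ```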

  10. Probabilistic seismic history matching using binary images

    NASA Astrophysics Data System (ADS)

    Davolio, Alessandra; Schiozer, Denis Jose

    2018-02-01

    Currently, the goal of history-matching procedures is not only to provide a model matching the observed data but also to generate multiple matched models to properly handle uncertainties. One such approach is a probabilistic history-matching methodology based on the discrete Latin Hypercube sampling algorithm, proposed in previous works, which was particularly efficient for matching well data (production rates and pressure). 4D seismic (4DS) data have been increasingly included in history-matching procedures. A key issue in seismic history matching (SHM) is to transfer data into a common domain: impedance, amplitude, or pressure and saturation. In any case, seismic inversions and/or modeling are required, which can be time consuming. An alternative that avoids these procedures is using binary images in SHM, as they allow the shape, rather than the physical values, of observed anomalies to be matched. This work presents the incorporation of binary images in SHM within the aforementioned probabilistic history matching. The application was performed with real data from a segment of the Norne benchmark case that presents strong 4D anomalies, including softening signals due to pressure build-up. The binary images are used to match the pressurized zones observed in time-lapse data. Three history matchings were conducted using: only well data, well and 4DS data, and only 4DS data. The methodology is very flexible and successfully utilized the addition of binary images for seismic objective functions. Results proved the good convergence of the method in a few iterations for all three cases. The matched models of the first two cases provided the best results, with similar well-matching quality. The second case provided models presenting pore pressure changes according to the expected dynamic behavior (pressurized zones) observed in 4DS data. The use of binary images in SHM is relatively new with few examples in the literature. This work enriches this discussion by presenting a new

  11. The Experimental Breeder Reactor II seismic probabilistic risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roglans, J; Hill, D J

    1994-02-01

    The Experimental Breeder Reactor II (EBR-II) is a US Department of Energy (DOE) Category A research reactor located at Argonne National Laboratory (ANL)-West in Idaho. EBR-II is a 62.5 MW-thermal Liquid Metal Reactor (LMR) that started operation in 1964 and is currently being used as a testbed in the Integral Fast Reactor (IFR) Program. ANL has completed a Level 1 Probabilistic Risk Assessment (PRA) for EBR-II. The Level 1 PRA for internal events and most external events was completed in June 1991, and the seismic PRA for EBR-II has recently been completed. The EBR-II reactor building contains the reactor, the primary system, and the decay heat removal systems. The reactor vessel, which contains the core, and the primary system, consisting of two primary pumps and an intermediate heat exchanger, are immersed in the sodium-filled primary tank, which is suspended by six hangers from a beam support structure. Three systems or functions in EBR-II were identified as the most significant from the standpoint of risk of seismically induced fuel damage: (1) the reactor shutdown system, (2) the structural integrity of the passive decay heat removal systems, and (3) the integrity of major structures, like the primary tank containing the reactor, that could threaten both the reactivity control and decay heat removal functions. As part of the seismic PRA, efforts were concentrated on studying these three functions or systems. The passive safety response of the EBR-II reactor -- both passive reactivity shutdown and passive decay heat removal, demonstrated in a series of tests in 1986 -- was explicitly accounted for in the seismic PRA, as it had been included in the internal events assessment.

  12. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can be appropriately used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  13. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area; the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
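
    Aggregating Poissonian area sources with a time-dependent fault source, as proposed here, reduces to combining per-source exceedance probabilities under an independence assumption; a minimal sketch with placeholder numbers:

    ```python
    import math

    def combine(probabilities):
        """P(at least one source causes exceedance), sources independent."""
        out = 1.0
        for p in probabilities:
            out *= (1.0 - p)
        return 1.0 - out

    t = 50.0                                # exposure time, years
    p_area = 1.0 - math.exp(-0.002 * t)     # Poisson source, 0.002/yr (toy)
    p_fault = 0.04                          # renewal-model conditional
                                            # probability (toy value)
    print(combine([p_area, p_fault]))       # aggregated 50-yr probability
    ```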

  14. Probabilistic Seismic Hazard Maps for Ecuador

    NASA Astrophysics Data System (ADS)

    Mariniere, J.; Beauval, C.; Yepes, H. A.; Laurence, A.; Nocquet, J. M.; Alvarado, A. P.; Baize, S.; Aguilar, J.; Singaucho, J. C.; Jomard, H.

    2017-12-01

    A probabilistic seismic hazard study has been carried out for Ecuador, a country facing high seismic hazard, both from megathrust subduction earthquakes and from shallow crustal moderate-to-large earthquakes. Building on the knowledge produced in recent years on historical seismicity, earthquake catalogs, active tectonics, geodynamics, and geodesy, several alternative earthquake recurrence models are developed. An area source model is first proposed, based on the seismogenic crustal and inslab sources defined in Yepes et al. (2016). A slightly different segmentation is proposed for the subduction interface with respect to Yepes et al. (2016). Three earthquake catalogs are used to account for the numerous uncertainties in the modeling of frequency-magnitude distributions. The hazard maps obtained highlight several source zones enclosing fault systems that exhibit low seismic activity, not representative of the geological and/or geodetic slip rates. Consequently, a fault model is derived, including faults with an earthquake recurrence model inferred from geological and/or geodetic slip rate estimates. The geodetic slip rates on the set of simplified faults are estimated from a GPS horizontal velocity field (Nocquet et al. 2014). Assumptions on the aseismic component of the deformation are required. Combining these alternative earthquake models in a logic tree, and using a set of selected ground-motion prediction equations adapted to Ecuador's different tectonic contexts, a mean hazard map is obtained. Hazard maps corresponding to the 16th and 84th percentiles are also derived, highlighting the zones where uncertainties on the hazard are highest.

  15. CPT-based probabilistic and deterministic assessment of in situ seismic soil liquefaction potential

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Der Kiureghian, A.; Cetin, K.O.

    2006-01-01

    This paper presents a complete methodology for both probabilistic and deterministic assessment of seismic soil liquefaction triggering potential based on the cone penetration test (CPT). A comprehensive worldwide set of CPT-based liquefaction field case histories was compiled and back-analyzed, and the data then used to develop probabilistic triggering correlations. Issues investigated in this study include improved normalization of CPT resistance measurements for the influence of effective overburden stress, and adjustment to CPT tip resistance for the potential influence of "thin" liquefiable layers. The effects of soil type and soil character (i.e., "fines" adjustment) for the new correlations are based on a combination of CPT tip and sleeve resistance. To quantify probability for performance-based engineering applications, Bayesian "regression" methods were used, and the uncertainties of all variables comprising both the seismic demand and the liquefaction resistance were estimated and included in the analysis. The resulting correlations were developed using a Bayesian framework and are presented in both probabilistic and deterministic formats. The results are compared to previous probabilistic and deterministic correlations. © 2006 ASCE.
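
    For context, the seismic-demand side of such triggering correlations is conventionally the Seed-Idriss cyclic stress ratio; a sketch of that textbook calculation (background only, not the paper's full probabilistic model):

    ```python
    def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
        """Seed-Idriss (1971) simplified cyclic stress ratio.

        a_max_g      peak ground surface acceleration (g)
        sigma_v      total vertical stress at depth (kPa)
        sigma_v_eff  effective vertical stress at depth (kPa)
        depth_m      depth (m), used for the stress-reduction factor r_d
        """
        r_d = 1.0 - 0.00765 * depth_m   # linear r_d, valid for z <= 9.15 m
        return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

    print(cyclic_stress_ratio(0.3, 100.0, 60.0, 5.0))  # ~0.31
    ```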

  16. Shear-wave velocity-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    USGS Publications Warehouse

    Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.

    2013-01-01

    Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.

  17. Probabilistic seismic hazard assessment for the effect of vertical ground motions on seismic response of highway bridges

    NASA Astrophysics Data System (ADS)

    Yilmaz, Zeynep

    Typically, the vertical component of the ground motion is not considered explicitly in seismic design of bridges, but in some cases the vertical component can have a significant effect on the structural response. The key question of when the vertical component should be incorporated in design is addressed by a probabilistic seismic hazard assessment incorporating probabilistic seismic demand models and ground motion models. Nonlinear simulation models with varying configurations of an existing bridge in California were considered in the analytical study. The simulation models were subjected to the set of selected ground motions in two stages: at first, only horizontal components of the motion were applied, while in the second stage the structures were subjected to both horizontal and vertical components applied simultaneously, and the ground motions that produced the largest adverse effects on the bridge system were identified. Moment demand at the mid-span and at the support of the longitudinal girder and the axial force demand in the column are found to be significantly affected by the vertical excitations. These response parameters can be modeled using simple ground motion parameters, such as horizontal spectral acceleration and vertical spectral acceleration, within a 5% to 30% error margin depending on the type of the parameter and the period of the structure. For a complete hazard assessment, both of these ground motion parameters explaining the structural behavior should also be modeled. For the horizontal spectral acceleration, the Abrahamson and Silva (2008) model was selected from among the available standard models. A new NGA vertical ground motion model consistent with the horizontal model was constructed. These models are combined in a vector probabilistic seismic hazard analysis. A series of hazard curves was developed and presented for different locations in the Bay Area for soil site conditions to provide a roadmap for the prediction of these features for future

  18. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismically prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for
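
    Damage-state fragility functions of the kind developed here are conventionally lognormal in the intensity measure; a generic sketch (the median and dispersion below are hypothetical, not the study's values):

    ```python
    import math

    def fragility(im, median, beta):
        """P(damage >= state | IM = im) = Phi(ln(im / median) / beta)."""
        z = math.log(im / median) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # e.g. spectral displacement of 30 mm against a damage state with a
    # 25 mm median threshold and dispersion beta = 0.6 (hypothetical)
    print(fragility(30.0, 25.0, 0.6))  # ~0.62
    ```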

  19. Probabilistic Seismic Hazard Analysis for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.; Sharia, T.; Arabidze, V.; Tibaldi, A.; Bonali, F. L. L.; Russo, E.; Pasquaré Mariotto, F.

    2017-12-01

    Nowadays, seismic hazard studies are developed in terms of the calculation of Peak Ground Acceleration (PGA), Spectral Acceleration (SA), Peak Ground Velocity (PGV) and other recorded parameters. In the frame of the EMME project, PSH was calculated for Georgia using GMPEs chosen on the basis of selection criteria. In the frame of Project N 216758 (supported by the Shota Rustaveli National Science Foundation (SRNF)), PSH maps were estimated using a hybrid-empirical ground motion prediction equation developed for Georgia. Due to the paucity of seismically recorded information, in this work we focused our research on a more robust dataset of macroseismic data, and attempted to calculate the probabilistic seismic hazard directly in terms of macroseismic intensity. For this reason, we started calculating new intensity prediction equations (IPEs) for Georgia taking into account different sets, belonging to the same new database, as well as distances from the seismic source. With respect to the seismic source, in order to improve the quality of the results, we have also hypothesized the size of faults from empirical relations, and calculated new IPEs by considering Joyner-Boore and rupture distances in addition to epicentral and hypocentral distances. Finally, site conditions have been included as variables for IPE calculation. Regarding the database, we used a brand-new revised set of macroseismic data and instrumental records for the significant earthquakes that struck Georgia between 1900 and 2002. In particular, a large amount of research and documents related to macroseismic effects of individual earthquakes, stored in the archives of the Institute of Geophysics, were used as sources for the new macroseismic data. The latter are reported in the Medvedev-Sponheuer-Karnik macroseismic scale (MSK-64). For each earthquake the magnitude, the focal depth and the epicenter location are also reported. An online version of the database, with the related metadata, has been produced for the 69

  20. Effect of time dependence on probabilistic seismic-hazard maps and deaggregation for the central Apennines, Italy

    USGS Publications Warehouse

    Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.; Perkins, D.

    2009-01-01

    We produce probabilistic seismic-hazard assessments for the central Apennines, Italy, using time-dependent models that are characterized using a Brownian passage time recurrence model. Using aperiodicity parameters (α) of 0.3, 0.5, and 0.7, we examine the sensitivity of the probabilistic ground motion and its deaggregation to these parameters. For the seismic source model we incorporate both smoothed historical seismicity over the area and geological information on faults. We use the maximum magnitude model for the fault sources together with a uniform probability of rupture along the fault (floating fault model) to model fictitious faults, to account for earthquakes that cannot be correlated with known geologic structural segmentation.
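
    The Brownian passage time density, with mean recurrence μ and aperiodicity α, and the conditional rupture probability it implies can be sketched as follows (the μ, α, elapsed time and horizon below are placeholders):

    ```python
    import math

    def bpt_pdf(t, mu, alpha):
        """Brownian passage time (inverse Gaussian) density."""
        return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
            math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

    def conditional_probability(t0, dt, mu, alpha, step=0.01):
        """P(t0 < T <= t0+dt | T > t0) by midpoint numerical integration."""
        def integral(a, b):
            n = max(1, int((b - a) / step))
            h = (b - a) / n
            return h * sum(bpt_pdf(a + (k + 0.5) * h, mu, alpha)
                           for k in range(n))
        return integral(t0, t0 + dt) / (1.0 - integral(1e-6, t0))

    # mean recurrence 1000 yr, alpha = 0.5, 700 yr elapsed, next 50 yr
    print(conditional_probability(700.0, 50.0, 1000.0, 0.5))
    ```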

  21. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in large-scale regional maps. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic mapping, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount and associated probabilities of displacement for each bin of peak ground acceleration in the seismic hazard curve at each pixel, as sketched below. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions pose a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then utilized in a performance-based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
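
    The sliding-block step sketched below pairs a Newmark-displacement regression (we use the ratio-based form of Jibson, 2007, for illustration; the study's own relations may differ) with the PGA bins of a hazard curve:

    ```python
    import math

    def newmark_disp_cm(ac, amax):
        """Median Newmark displacement (cm) from the critical-acceleration
        ratio ac/amax, after Jibson (2007) (our illustrative choice)."""
        if amax <= ac:
            return 0.0
        x = ac / amax
        return 10 ** (0.215 + math.log10((1 - x) ** 2.341 * x ** -1.438))

    # hazard-curve PGA bins (g) with annual occurrence rates (hypothetical)
    bins = [(0.1, 1e-2), (0.2, 3e-3), (0.4, 6e-4), (0.8, 8e-5)]
    ac = 0.15   # critical acceleration of the slope cell (g), hypothetical

    # annual rate, and 50-yr probability, of exceeding a 0.1 m threshold
    rate = sum(r for pga, r in bins if newmark_disp_cm(ac, pga) > 10.0)
    print(rate, 1.0 - math.exp(-50.0 * rate))
    ```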

  22. Planar seismic source characterization models developed for probabilistic seismic hazard assessment of Istanbul

    NASA Astrophysics Data System (ADS)

    Gülerce, Zeynep; Buğra Soyman, Kadir; Güner, Barış; Kaymakci, Nuretdin

    2017-12-01

    This contribution provides an updated planar seismic source characterization (SSC) model to be used in the probabilistic seismic hazard assessment (PSHA) for Istanbul. It defines planar rupture systems for the four main segments of the North Anatolian fault zone (NAFZ) that are critical for the PSHA of Istanbul: segments covering the rupture zones of the 1999 Kocaeli and Düzce earthquakes, central Marmara, and Ganos/Saros segments. In each rupture system, the source geometry is defined in terms of fault length, fault width, fault plane attitude, and segmentation points. Activity rates and the magnitude recurrence models for each rupture system are established by considering geological and geodetic constraints and are tested against the observed seismicity that is associated with the rupture system. Uncertainty in the SSC model parameters (e.g., b value, maximum magnitude, slip rate, weights of the rupture scenarios) is considered, whereas the uncertainty in the fault geometry is not included in the logic tree. To acknowledge the effect of earthquakes that are not associated with the defined rupture systems on the hazard, a background zone is introduced and the seismicity rates in the background zone are calculated using a smoothed-seismicity approach. The state-of-the-art SSC model presented here is the first fully documented and ready-to-use fault-based SSC model developed for the PSHA of Istanbul.
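
    Activity rates constrained by geological and geodetic slip rates, as described above, are commonly obtained by moment balancing; a hedged sketch (segment dimensions, slip rate and magnitude are placeholders, not the model's values):

    ```python
    def characteristic_rate(slip_rate_mm_yr, length_km, width_km, m_char,
                            mu_pa=3.0e10):
        """Annual rate of a characteristic earthquake releasing the moment
        accumulated on a fault plane (moment balancing), using
        Hanks-Kanamori: M0 [N*m] = 10 ** (1.5 * Mw + 9.05)."""
        area_m2 = (length_km * 1e3) * (width_km * 1e3)
        moment_rate = mu_pa * area_m2 * slip_rate_mm_yr * 1e-3  # N*m/yr
        return moment_rate / 10 ** (1.5 * m_char + 9.05)

    # e.g. a 100 km x 15 km segment slipping 20 mm/yr, Mw 7.4 events
    rate = characteristic_rate(20.0, 100.0, 15.0, 7.4)
    print(rate, 1.0 / rate)   # rate (1/yr) and recurrence interval (yr)
    ```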

  23. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    NASA Astrophysics Data System (ADS)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil Earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate
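
    The regression form behind such regional formulae, following Jibson (1998), links Newmark displacement to Arias intensity and critical acceleration; shown here with the original coefficients for illustration (the Iranian refits in the study differ):

    ```python
    import math

    def newmark_displacement_cm(ia_m_s, ac_g):
        """Jibson (1998)-style regression of Newmark displacement Dn (cm)
        on Arias intensity Ia (m/s) and critical acceleration ac (g);
        original coefficients, not the regional refits."""
        log_dn = (1.521 * math.log10(ia_m_s)
                  - 1.993 * math.log10(ac_g) - 1.546)
        return 10 ** log_dn

    print(newmark_displacement_cm(2.0, 0.2))  # Ia = 2 m/s, ac = 0.2 g -> ~2 cm
    ```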

  24. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared with those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, 2% and 10% POE in the next 50 years of 0.1-0.4 g and 0.1-0.2 g ground shaking, respectively, were found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison among the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was defined as an area of low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.

  25. Probabilistic seismic hazard assessment of southern part of Ghana

    NASA Astrophysics Data System (ADS)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2018-05-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, and seismic hazard assessment results for six cities. The seismic hazard map was prepared for 10% probability of exceedance of peak ground acceleration in 50 years. The input parameters used for the computation of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study: Allen (for south and eastern Australia), Silva et al. (for central and eastern North America), Campbell and Bozorgnia (for worldwide active shallow-crust regions) and Chiou and Youngs (for worldwide active shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. The OpenQuake software package was used for the hazard calculation, with the logic-tree branches combined as sketched below. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.
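
    The logic-tree treatment of the four attenuation relations amounts to a weighted average of the hazard curves computed per branch; a minimal sketch (weights and rates below are placeholders, not the study's numbers):

    ```python
    # Weighted logic-tree combination of per-GMPE hazard curves.
    pga_levels = [0.05, 0.10, 0.20]                 # g
    curves = {                                      # annual exceedance rates
        "Allen":             [2.0e-2, 6e-3, 1.0e-3],
        "SilvaEtAl":         [3.0e-2, 8e-3, 2.0e-3],
        "CampbellBozorgnia": [2.5e-2, 7e-3, 1.5e-3],
        "ChiouYoungs":       [2.0e-2, 5e-3, 1.0e-3],
    }
    weights = {"Allen": 0.25, "SilvaEtAl": 0.25,
               "CampbellBozorgnia": 0.25, "ChiouYoungs": 0.25}

    mean_curve = [sum(weights[g] * curves[g][i] for g in curves)
                  for i in range(len(pga_levels))]
    print(dict(zip(pga_levels, mean_curve)))
    ```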

  26. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, various methodologies and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporated the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on the databases and these GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near those faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony, then, is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015

  27. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

    A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.

  28. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Patricia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes responses, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis in natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks (magnitudes) are governed by the exponential distribution implied by the Gutenberg-Richter law. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors of hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely applied to IIS hazard problems or not. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity: IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution implied by the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready
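
    The first question, whether magnitudes follow the Gutenberg-Richter exponential model, is commonly examined by fitting a maximum-likelihood b-value and testing the fit; a sketch of one such check (toy catalog, our function names):

    ```python
    import math

    def aki_b_value(mags, m_c, dm=0.1):
        """Aki (1965) maximum-likelihood b-value above completeness m_c,
        with Utsu's half-bin correction for binned magnitudes."""
        m = [x for x in mags if x >= m_c]
        mean_excess = sum(m) / len(m) - (m_c - dm / 2.0)
        return math.log10(math.e) / mean_excess

    def ks_distance(mags, m_c, b):
        """Kolmogorov-Smirnov distance between the empirical magnitude CDF
        and the exponential (G-R) model, as a goodness-of-fit diagnostic."""
        m = sorted(x for x in mags if x >= m_c)
        beta, n = b * math.log(10.0), len(m)
        return max(abs((i + 1) / n - (1.0 - math.exp(-beta * (x - m_c))))
                   for i, x in enumerate(m))

    catalog = [1.2, 1.5, 1.1, 2.0, 1.4, 1.8, 1.3, 2.6, 1.6, 1.9]  # toy
    b = aki_b_value(catalog, m_c=1.0)
    print(b, ks_distance(catalog, 1.0, b))
    ```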

  29. On the use of faults and background seismicity in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Lorito, Stefano; Basili, Roberto; Tonini, Roberto; Tiberti, Mara Monica; Romano, Fabrizio; Perfetti, Paolo; Volpe, Manuela

    2017-04-01

    Most SPTHA studies and applications rely on several working assumptions: i) the - mostly offshore - tsunamigenic faults are sufficiently well known; ii) subduction zone earthquakes dominate the hazard; iii) and their location and geometry are sufficiently well constrained. Hence, a probabilistic model is constructed as regards the magnitude-frequency distribution, and sometimes the slip distribution, of earthquakes occurring on assumed known faults. Then, tsunami scenarios are usually constructed for all earthquake locations, sizes, and slip distributions included in the probabilistic model, through deterministic numerical modelling of tsunami generation, propagation and impact on realistic bathymetries. Here, we adopt a different approach (Selva et al., GJI, 2016) that relaxes some of the above assumptions, considering that i) non-subduction earthquakes may also contribute significantly to SPTHA, depending on the local tectonic context; ii) not all offshore faults are known or sufficiently well constrained; iii) and the faulting mechanism of future earthquakes cannot be considered strictly predictable. This approach uses as much information as possible from known faults which, depending on the amount of available information and on the local tectonic complexity, among other things, are either modelled as Predominant Seismicity (PS) or as Background Seismicity (BS). PS is used when it is possible to assume sufficiently known geometry and mechanism (e.g. for the main subduction zones). Conversely, within the BS approach, information on faults is merged with that on past seismicity, dominant stress regime, and tectonic characterisation, to determine a probability density function for the faulting mechanism. To illustrate the methodology and its impact on the hazard estimates, we present an application in the NEAM region (Northeast Atlantic, Mediterranean and connected seas), initially designed during the ASTARTE project and now applied for the

  30. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and provide a statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, which contains 7007 seismic events. Our research contributes to improving seismic risk management by evaluating the seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site and finally (v) hazard mapping for a region. In this study, the procedure for earthquake hazard evaluation developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.

  31. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and are applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided from the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as mean annual frequencies of unacceptable performance of 1×10⁻⁴, 4×10⁻⁵ and 1×10⁻⁵.
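
    The quoted performance goals are mean annual frequencies of unacceptable performance, conventionally obtained by convolving the site hazard curve with a component fragility; a numerical sketch with placeholder values:

    ```python
    import math

    pga = [0.1, 0.2, 0.4, 0.8, 1.6]          # g
    haz = [1e-2, 2e-3, 3e-4, 4e-5, 5e-6]     # annual exceedance frequencies

    def fragility(a, median=0.9, beta=0.45):
        """Lognormal fragility: P(failure | PGA = a) (toy parameters)."""
        return 0.5 * (1 + math.erf(math.log(a / median) / (beta * 2 ** 0.5)))

    # discrete convolution: fragility times the hazard-curve increments
    pf = sum(fragility(0.5 * (pga[i] + pga[i + 1])) * (haz[i] - haz[i + 1])
             for i in range(len(pga) - 1)) + fragility(pga[-1]) * haz[-1]
    print(f"{pf:.1e}")   # compare against goals such as 1e-4, 4e-5, 1e-5
    ```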

  32. A probabilistic seismic model for the European Arctic

    NASA Astrophysics Data System (ADS)

    Hauser, Juerg; Dyer, Kathleen M.; Pasyanos, Michael E.; Bungum, Hilmar; Faleide, Jan I.; Clark, Stephen A.; Schweitzer, Johannes

    2011-01-01

    The development of three-dimensional seismic models for the crust and upper mantle has traditionally focused on finding one model that provides the best fit to the data while observing some regularization constraints. In contrast to this, the inversion employed here fits the data in a probabilistic sense and thus provides a quantitative measure of model uncertainty. Our probabilistic model is based on two sources of information: (1) prior information, which is independent from the data, and (2) different geophysical data sets, including thickness constraints, velocity profiles, gravity data, surface wave group velocities, and regional body wave traveltimes. We use a Markov chain Monte Carlo (MCMC) algorithm to sample models from the prior distribution, the set of plausible models, and test them against the data to generate the posterior distribution, the ensemble of models that fit the data with assigned uncertainties. While being computationally more expensive, such a probabilistic inversion provides a more complete picture of solution space and allows us to combine various data sets. The complex geology of the European Arctic, encompassing oceanic crust, continental shelf regions, rift basins and old cratonic crust, as well as the nonuniform coverage of the region by data with varying degrees of uncertainty, makes it a challenging setting for any imaging technique and, therefore, an ideal environment for demonstrating the practical advantages of a probabilistic approach. Maps of depth to basement and depth to Moho derived from the posterior distribution are in good agreement with previously published maps and interpretations of the regional tectonic setting. The predicted uncertainties, which are as important as the absolute values, correlate well with the variations in data coverage and quality in the region. A practical advantage of our probabilistic model is that it can provide estimates for the uncertainties of observables due to model uncertainties. We will
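
    The MCMC sampling strategy described above — draw candidate models from the prior, then test them against the data — can be illustrated with a minimal Metropolis sampler. The one-parameter forward model, prior bounds, and noise level below are placeholders, not the parameterization of the European Arctic model.

        import numpy as np

        rng = np.random.default_rng(42)

        def log_prior(m):
            # Prior information: e.g., a plausible Moho depth between 10 and 60 km
            return 0.0 if 10.0 < m < 60.0 else -np.inf

        def log_likelihood(m, data, sigma=2.0):
            predicted = 0.5 * m                  # hypothetical linear forward model
            return -0.5 * np.sum((data - predicted) ** 2) / sigma**2

        data = np.array([15.2, 14.8, 15.5])      # synthetic observations
        m, samples = 30.0, []
        for _ in range(20000):
            proposal = m + rng.normal(0.0, 1.0)  # random-walk proposal
            log_a = (log_prior(proposal) + log_likelihood(proposal, data)
                     - log_prior(m) - log_likelihood(m, data))
            if np.log(rng.random()) < log_a:     # Metropolis acceptance rule
                m = proposal
            samples.append(m)

        posterior = np.array(samples[5000:])     # discard burn-in
        print(f"posterior mean {posterior.mean():.1f} km, std {posterior.std():.1f} km")

    The spread of the retained samples is the quantitative measure of model uncertainty that a single best-fit inversion does not provide.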

  14. Scalable Probabilistic Inference for Global Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Dear, T.; Russell, S.

    2011-12-01

    We describe a probabilistic generative model for seismic events, their transmission through the earth, and their detection (or mis-detection) at seismic stations. We also describe an inference algorithm that constructs the most probable event bulletin explaining the observed set of detections. The model and inference algorithm together are called NET-VISA (network processing vertically integrated seismic analysis) and are designed to replace the current automated network processing at the IDC, the SEL3 bulletin. Our results (attached table) demonstrate that NET-VISA significantly outperforms SEL3, reducing the missed events from 30.3% down to 12.5%. The difference is even more dramatic for smaller magnitude events. NET-VISA also has no difficulty locating nuclear explosions. The attached figure shows the location predicted by NET-VISA versus other bulletins for the second DPRK event. Further evaluation on dense regional networks demonstrates that NET-VISA finds many events missed in the LEB bulletin, which is produced by human analysts. Large aftershock sequences, such as those produced by the December 2004 Sumatra earthquake and the March 2011 Tohoku earthquake, can pose a significant load for automated processing, often delaying the IDC bulletins by weeks or months. Indeed, these sequences can overload the serial NET-VISA inference as well. We describe an enhancement that makes NET-VISA multi-threaded, and hence able to take full advantage of the processing power of multi-core and multi-CPU machines. Our experiments show that the new inference algorithm is able to achieve 80% efficiency in parallel speedup.

  15. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes is then followed by quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to represent series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard representing the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
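
    A schematic rendering of such a three-part hazard function is sketched below; the functional forms (exponential decay for the last cluster, power-law growth toward the next) and all parameter values are illustrative assumptions, not the model calibrated in the paper.

        import numpy as np

        def hazard(t, h0=0.02, a=0.5, tau=50.0, b=0.01, T_next=500.0):
            """Schematic hazard rate (per year) at time t (years) since the last cluster."""
            decreasing = a * np.exp(-t / tau)      # waning hazard of the last cluster
            increasing = b * (t / T_next) ** 2     # building hazard of the next cluster
            background = h0                        # constant small-to-moderate rate
            return decreasing + increasing + background

        for t in (1, 50, 200, 500):
            print(t, f"{hazard(t):.4f}")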

  16. A probabilistic framework for single-station location of seismicity on Earth and Mars

    NASA Astrophysics Data System (ADS)

    Böse, M.; Clinton, J. F.; Ceylan, S.; Euchner, F.; van Driel, M.; Khan, A.; Giardini, D.; Lognonné, P.; Banerdt, W. B.

    2017-01-01

    Locating the source of seismic energy from a single three-component seismic station is associated with large uncertainties, originating from challenges in identifying seismic phases, as well as inevitable pick and model uncertainties. The challenge is even higher for planets such as Mars, where interior structure is a priori largely unknown. In this study, we address the single-station location problem by developing a probabilistic framework that combines location estimates from multiple algorithms to estimate the probability density function (PDF) for epicentral distance, back azimuth, and origin time. Each algorithm uses independent and complementary information in the seismic signals. Together, the algorithms allow locating seismicity ranging from local to teleseismic quakes. Distances and origin times of large regional and teleseismic events (M > 5.5) are estimated from observed and theoretical body- and multi-orbit surface-wave travel times. The latter are picked from the maxima in the waveform envelopes in various frequency bands. For smaller events at local and regional distances, only first arrival picks of body waves are used, possibly in combination with fundamental Rayleigh R1 waveform maxima where detectable; depth phases, such as pP or PmP, help constrain source depth and improve distance estimates. Back azimuth is determined from the polarization of the Rayleigh- and/or P-wave phases. When seismic signals are good enough for multiple approaches to be used, estimates from the various methods are combined through the product of their PDFs, resulting in an improved event location and reduced uncertainty range estimate compared to the results obtained from each algorithm independently. To verify our approach, we use both earthquake recordings from existing Earth stations and synthetic Martian seismograms. The Mars synthetics are generated with a full-waveform scheme (AxiSEM) using spherically-symmetric seismic velocity, density and attenuation models of

  17. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

    Probabilistic Seismic Loss Estimation is a methodology used as a quantitative and explicit expression of the performance of buildings, in terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach for seismic loss assessment of buildings requires using Incremental Dynamic Analysis (IDA), which needs hundreds of time-consuming analyses, and this in turn hinders its wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand propagation prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to achieve this purpose and their appropriateness was evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA-driven response predictions of 34 code-conforming benchmark structures and was proven to be sufficiently precise while offering a great deal of efficiency. The loss values were estimated by replacing IDA with the proposed ETM-based procedure in the ATC 58 framework, and it was found that these values suffer from varying inaccuracies, attributed to the discretized nature of the damage and loss prediction functions provided by ATC 58.

  18. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios, and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, ISC-GEM, the global ANSS Comprehensive Catalogues, the Seismological Bureau of the Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize the earthquake parameters from the various catalogue sources, we remove duplicate events and unify magnitudes into the same scale. Our active fault database includes fault data from previous studies, e.g., the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on the parameters derived from analysis of the databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined the earthquake recurrence models of seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests matching the felt intensities of historical earthquakes to ground motions modelled with ground motion prediction equations (GMPEs). Incorporating the best-fitting GMPEs and site conditions, we included site effects and assessed probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through some major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially host an M8-class earthquake, brings significant hazard along the western coast of Myanmar and eastern Bangladesh. In addition, we find a notable hazard level in northern Vietnam and at the border between Myanmar, Thailand and Laos, due to a series of strike-slip faults, which could
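
    One step described above — determining Gutenberg-Richter recurrence parameters from a harmonized, declustered catalogue — is commonly done with the Aki (1965) maximum-likelihood b-value estimator. A minimal sketch on a synthetic catalogue (the completeness magnitude, bin width, and magnitudes are assumptions for illustration):

        import numpy as np

        rng = np.random.default_rng(0)
        Mc, dM = 4.0, 0.1                          # assumed completeness magnitude, bin width
        # Synthetic Gutenberg-Richter magnitudes above Mc (true b = 1.0)
        mags = Mc + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=2000)
        mags = np.round(mags / dM) * dM            # bin to 0.1 magnitude units

        # Aki (1965) maximum-likelihood estimate with half-bin correction
        b = np.log10(np.e) / (mags.mean() - (Mc - dM / 2.0))
        print(f"estimated b-value: {b:.2f}")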

  19. Evaluation of potential surface rupture and review of current seismic hazards program at the Los Alamos National Laboratory. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-12-09

    This report summarizes the authors' review and evaluation of the existing seismic hazards program at Los Alamos National Laboratory (LANL). The report recommends that the original program be augmented with a probabilistic analysis of seismic hazards involving assignment of weighted probabilities of occurrence to all potential sources. This approach yields a more realistic evaluation of the likelihood of large earthquake occurrence, particularly in regions where seismic sources may have recurrence intervals of several thousand years or more. The report reviews the locations and geomorphic expressions of identified fault lines along with the known displacements of these faults and the last known occurrence of seismic activity. Faults are mapped and categorized by their potential for actual movement. Based on geologic site characterization, recommendations are made for increased seismic monitoring; age-dating studies of faults and geomorphic features; increased use of remote sensing and aerial photography for surface mapping of faults; the development of a landslide susceptibility map; and the development of seismic design standards for all existing and proposed facilities at LANL.

  20. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  1. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    body, as well as to forecast of changes in its relief. As the volcanic and seismic processes are of cosmic nature and occurrence, it seems logical to investigate their chronological structure in terms of an astronomical time reference system, i.e., in parameters of the Earth's orbital movement. Gravitational interaction of the Earth with the Moon, the Sun and the planets of the Solar system forms the physical basis of this multidimensional system; it manifests itself in tidal deformations of the Earth's lithosphere and in periodic changes in the planet's rotation and orbital speed. A search for chronological correlation between the Earth's volcanism and seismicity on one hand and the dynamics of the orbital parameters on the other shows a certain promise for prognostic decisions. It should be kept in mind that the calculation of astronomical characteristics (ephemerides), one of the main lines of theoretical astronomy, spans many years both in the past and in the future. It therefore seems appropriate, from the methodical viewpoint as well as for retrospective and prognostic analyses, to apply the astronomical time reference system to investigations of the chronological structure of volcanic and seismic processes. To investigate the temporal pattern of the volcanic and seismic processes and to find the degree of their dependence on tidal forces, we used the astronomical time reference system related to the Earth's orbital movement. The system is based on substitution of calendar dates of eruptions and earthquakes with corresponding values of known astronomical characteristics, such as the Earth-to-Sun and Earth-to-Moon distances, the ecliptic latitude of the Moon, etc. In coordinates of astronomical parameters (JPL Planetary and Lunar Ephemerides, 1997, as compiled by the Jet Propulsion Laboratory, California Institute of Technology, on the basis of the DE 406 block developed by NASA), we analyzed large bodies of information, both volcanological (Catalogue of the World volcanic eruptions by I

  2. Some Probabilistic and Statistical Properties of the Seismic Regime of Zemmouri (Algeria) Seismoactive Zone

    NASA Astrophysics Data System (ADS)

    Baddari, Kamel; Bellalem, Fouzi; Baddari, Ibtihel; Makdeche, Said

    2016-10-01

    Statistical tests have been used to fit a distribution function to the Zemmouri seismic data. The Pareto law has been used, and the probabilities of various expected earthquakes were computed. A mathematical expression giving the quantiles was established. The limiting law of extreme values confirmed the accuracy of the fitting method. Using the moment magnitude scale, a probabilistic model was built to predict the occurrence of strong earthquakes. The seismic structure has been characterized by the slope of the recurrence plot γ, the fractal dimension D, the concentration parameter Ksr, and the Hurst exponents Hr and Ht. The values of D, γ, Ksr, Hr, and Ht diminished many months before the principal seismic shock (M = 6.9) of the studied seismoactive zone occurred. Three stages of deformation of the geophysical medium are manifested in the variation of the coefficient G% of clustering of minor seismic events.
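
    The quantile expression referred to above follows from inverting the Pareto distribution function; a minimal sketch with hypothetical scale and shape parameters:

        # For a Pareto law with P(X > x) = (x_m / x)**k, the p-quantile solves
        # F(Q) = p, giving Q(p) = x_m * (1 - p)**(-1/k).
        x_m, k = 3.0, 1.8                  # hypothetical scale and shape parameters
        for p in (0.5, 0.9, 0.99):
            print(p, round(x_m * (1 - p) ** (-1.0 / k), 2))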

  3. A probabilistic assessment of waste water injection induced seismicity in central California

    NASA Astrophysics Data System (ADS)

    Goebel, T.; Hauksson, E.; Ampuero, J. P.; Aminzadeh, F.; Cappa, F.; Saleeby, J.

    2014-12-01

    The recent, large increase in seismic activity within the central and eastern U.S. may be connected to an increase in fluid injection activity since ~2001. Anomalous seismic sequences can easily be identified in regions with low background seismicity rates. Here, we analyze seismicity in plate boundary regions, where tectonically driven earthquake sequences are common and can potentially mask injection-induced events. We show results from a comprehensive analysis of waste water disposal wells in Kern County, the largest oil-producing county in California. We focus on spatio-temporal correlations between seismic and injection activity and on seismicity-density changes due to injection. We perform a probabilistic assessment of induced vs. tectonic earthquakes, which can be applied to different regions independent of background rates and may provide insights into the probability of inducing earthquakes as a function of injection parameters and local geological conditions. Our results show that most earthquakes are caused by tectonic forcing; however, waste water injection contributes to seismic activity in four different regions, with several events above M4. The seismicity shows different migration characteristics relative to the injection sites, including linear and non-linear trends. The latter are indicative of diffusive processes that take advantage of reservoir properties and fault structures and can induce earthquakes at distances of up to 10 km. Our results suggest that injection-related triggering processes are complex, possibly involving creep and delayed triggering. Pore-pressure diffusion may be more extensive in the presence of active faults and high-permeability damage zones, altering the local seismic hazard in a non-linear fashion. As a consequence, generic "best practices" for fluid injection, such as a maximum distance from the nearest active fault, may not be sufficient to mitigate a potential seismic hazard increase.
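
    The diffusive migration described above is often summarized by a triggering front whose distance from the injection point grows as r(t) = sqrt(4*pi*D*t), following Shapiro-style analyses of induced seismicity. A quick numerical check under an assumed hydraulic diffusivity shows how distances of the order of 10 km can arise:

        import math

        D = 0.5                            # assumed hydraulic diffusivity, m^2/s
        for days in (10, 100, 1000):
            t = days * 86400.0             # elapsed injection time in seconds
            r = math.sqrt(4.0 * math.pi * D * t)
            print(f"{days:5d} days -> triggering front ~ {r / 1000.0:.1f} km")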

  4. Probabilistic seismic demand analysis using advanced ground motion intensity measures

    USGS Publications Warehouse

    Tothong, P.; Luco, N.

    2007-01-01

    One of the objectives in performance-based earthquake engineering is to quantify the seismic reliability of a structure at a site. For that purpose, probabilistic seismic demand analysis (PSDA) is used as a tool to estimate the mean annual frequency of exceeding a specified value of a structural demand parameter (e.g. interstorey drift). This paper compares and contrasts the use, in PSDA, of certain advanced scalar versus vector and conventional scalar ground motion intensity measures (IMs). One of the benefits of using a well-chosen IM is that more accurate evaluations of seismic performance are achieved without the need to perform detailed ground motion record selection for the nonlinear dynamic structural analyses involved in PSDA (e.g. record selection with respect to seismic parameters such as earthquake magnitude, source-to-site distance, and ground motion epsilon). For structural demands that are dominated by a first mode of vibration, using inelastic spectral displacement (Sdi) can be advantageous relative to the conventionally used elastic spectral acceleration (Sa) and the vector IM consisting of Sa and epsilon (ε). This paper demonstrates that this is true for ordinary and for near-source pulse-like earthquake records. The latter ground motions cannot be adequately characterized by either Sa alone or the vector of Sa and ε. For structural demands with significant higher-mode contributions (under either of the two types of ground motions), even Sdi (alone) is not sufficient, so an advanced scalar IM that additionally incorporates higher modes is used.
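
    The quantity PSDA estimates — the mean annual frequency (MAF) of exceeding a demand value — is obtained by integrating the probability of demand exceedance, conditioned on the IM, over the ground-motion hazard curve. A discretized sketch with hypothetical hazard and demand models:

        import numpy as np
        from scipy.stats import norm

        im = np.logspace(-2, 0.5, 300)             # IM levels, e.g. Sa or Sdi (g)
        lam_im = 2e-3 * (im / 0.1) ** -2.5         # assumed hazard curve lambda_IM(im)

        beta = 0.35                                # assumed lognormal demand dispersion
        def p_exceed(drift_limit, im):
            median_drift = 0.01 * (im / 0.1)       # hypothetical median drift vs. IM
            return 1.0 - norm.cdf(np.log(drift_limit / median_drift) / beta)

        # MAF of interstorey drift exceeding 2%: integrate fragility over |d lambda_IM|
        lam_dm = np.trapz(p_exceed(0.02, im), -lam_im)
        print(f"lambda(drift > 2%) ~ {lam_dm:.1e} per year")

    Roughly speaking, a well-chosen IM such as Sdi reduces the dispersion beta, which is what tightens the demand estimate without record-by-record selection.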

  5. Probabilistic Seismic Hazard Assessment for Iraq Using Complete Earthquake Catalogue Files

    NASA Astrophysics Data System (ADS)

    Ameer, A. S.; Sharma, M. L.; Wason, H. R.; Alsinawi, S. A.

    2005-05-01

    Probabilistic seismic hazard analysis (PSHA) has been carried out for Iraq. The earthquake catalogue used in the present study covers the area between latitudes 29°-38.5°N and longitudes 39°-50°E and contains more than a thousand events for the period 1905-2000. The entire Iraq region has been divided into thirteen seismogenic sources based on their seismic characteristics, geological setting and tectonic framework. The completeness of the seismicity catalogue has been checked using the method proposed by Stepp (1972). The analysis of completeness shows that the earthquake catalogue is not complete below Ms=4.8 for all of Iraq and for seismic source zones S1, S4, S5, and S8, while it varies for the other seismic zones. A statistical treatment of completeness of the data file was carried out in each of the magnitude classes. The frequency-magnitude distributions (FMD) for the study area, including all seismic source zones, were established and the minimum magnitude of complete reporting (Mc) was then estimated. For the whole of Iraq, Mc was estimated at about Ms=4.0, while S11 shows the lowest Mc, about Ms=3.5, and the highest Mc, about Ms=4.2, was observed for S4. The earthquake activity parameters (activity rate λ, b value, maximum regional magnitude mmax) as well as the mean return period (R) of earthquakes with magnitude m ≥ mmin, along with their probability of occurrence, have been determined for all thirteen seismic source zones of Iraq. The maximum regional magnitude mmax was estimated as 7.87 ± 0.86 for the whole of Iraq. The return period for magnitude 6.0 is largest for source zone S3, estimated at 705 years, while the smallest value, 9.9 years, is estimated for Iraq as a whole.

  6. Site-specific probabilistic seismic hazard analyses for the Idaho National Engineering Laboratory. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-05-01

    The identification of seismic sources is often based on a combination of geologic and tectonic considerations and patterns of observed seismicity; hence, a historical earthquake catalogue is important. A historical catalogue of earthquakes of approximate magnitude (M) 2.5 and greater for the time period 1850 through 1992 was compiled for the INEL region. The primary data source used was the Decade of North American Geology (DNAG) catalogue for the time period from about 1800 through 1985 (Engdahl and Rinehart, 1988). A large number of felt earthquakes, especially prior to the 1970s, which were below the threshold of completeness established in the DNAG catalogue (Engdahl and Rinehart, 1991), were taken from the state catalogues compiled by Stover and colleagues at the National Earthquake Information Center (NEIC) and combined with the DNAG catalogue for the INEL region. The state catalogues were those of Idaho, Montana, Nevada, Utah, and Wyoming. NEIC's Preliminary Determination of Epicenters (PDE) and the state catalogues compiled by the Oregon Department of Geology and Mineral Industries (DOGAMI) and the University of Nevada at Reno (UNR) were also used to supplement the pre-1986 time period. A few events reanalyzed by Jim Zollweg (Boise State University, written communication, 1994) were also modified in the catalogue. In the case of duplicate events, the DNAG entry was preferred over the Stover et al. entry for the period 1850 through 1985. A few events from Berg and Baker (1963) were also added to the catalogue. This information was and will be used in determining the seismic risk of buildings and facilities located at the Idaho National Engineering Laboratory.
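
    Identifying duplicate events when merging catalogues, as described above, is typically done with a time-distance window; the sketch below is a generic heuristic, and the 30-second and 50-km thresholds are hypothetical tuning choices, not those used for the INEL catalogue.

        from datetime import datetime, timedelta
        from math import radians, sin, cos, asin, sqrt

        def km_between(lat1, lon1, lat2, lon2):
            # Haversine great-circle distance in kilometres
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            h = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
            return 2.0 * 6371.0 * asin(sqrt(h))

        def is_duplicate(ev1, ev2, dt=timedelta(seconds=30), dr_km=50.0):
            # ev = (origin_time, lat, lon); windows reflect old-network timing accuracy
            return (abs(ev1[0] - ev2[0]) <= dt
                    and km_between(ev1[1], ev1[2], ev2[1], ev2[2]) <= dr_km)

        a = (datetime(1935, 6, 1, 4, 10, 5), 44.1, -112.8)
        b = (datetime(1935, 6, 1, 4, 10, 20), 44.2, -112.9)
        print(is_duplicate(a, b))   # True: same event reported by two catalogues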

  7. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructure, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method over traditional PSHA are (1) its flexibility, allowing different probabilistic models of earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its

  8. A preliminary probabilistic analysis of tsunami sources of seismic and non-seismic origin applied to the city of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Anita, G.

    2011-12-01

    In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, with a minor percentage attributed to all other non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can be different or even inverted, depending on the kinds of potential tsunamigenic sources that characterize the case study. So far, few probabilistic approaches treat the contribution of landslides and/or phenomena derived from volcanic activity, i.e. pyroclastic flows and flank collapses, as predominant in the PTHA, partly because of the difficulty of estimating the corresponding recurrence times. These considerations are valid, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method to estimate the uncertainties will be based on Bayesian inference. This is the first step towards a more comprehensive task, which will provide a tsunami risk quantification for the city in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.

  9. Variabilities in probabilistic seismic hazard maps for natural and induced seismicity in the central and eastern United States

    USGS Publications Warehouse

    Mousavi, S. Mostafa; Beroza, Gregory C.; Hoover, Susan M.

    2018-01-01

    Probabilistic seismic hazard analysis (PSHA) characterizes ground-motion hazard from earthquakes. Typically, the time horizon of a PSHA forecast is long, but in response to induced seismicity related to hydrocarbon development, the USGS developed one-year PSHA models. In this paper, we present the variability in USGS hazard curves due to epistemic uncertainty in the informed submodel, using a simple bootstrapping approach. We find that variability is highest in low-seismicity areas. On the other hand, areas of high seismic hazard, such as the New Madrid seismic zone or Oklahoma, exhibit relatively lower variability simply because of more available data and a better understanding of the seismicity. Comparing areas of high hazard, New Madrid, which has a history of large naturally occurring earthquakes, has lower forecast variability than Oklahoma, where the hazard is driven mainly by suspected induced earthquakes since 2009. Overall, the mean hazard obtained from bootstrapping is close to the published model, and variability increased in the 2017 one-year model relative to the 2016 model. Comparing the relative variations caused by individual logic-tree branches, we find that the highest hazard variation in the final model (as measured by the 95% confidence interval of the bootstrapping samples) is associated with the different ground-motion models and maximum magnitudes used in the logic tree, while the variability due to the smoothing distance is minimal. It should be pointed out that this study does not examine the uncertainty in the hazard in general, but only as it is represented in the USGS one-year models.
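
    A simple bootstrapping approach of the kind described can be sketched as resampling branch-level hazard values with replacement and reading off percentile bands; the branch hazard values below are synthetic placeholders, not USGS model output.

        import numpy as np

        rng = np.random.default_rng(1)
        # Stand-in: annual exceedance probabilities from 100 logic-tree branches
        branch_hazard = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=100)

        boot_means = np.array([
            rng.choice(branch_hazard, size=branch_hazard.size, replace=True).mean()
            for _ in range(5000)
        ])
        lo, hi = np.percentile(boot_means, [2.5, 97.5])
        print(f"mean hazard {branch_hazard.mean():.2e}, 95% CI [{lo:.2e}, {hi:.2e}]")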

  10. Probabilistic Seismic Hazard Analysis of Victoria, British Columbia, Canada: Considering an Active Leech River Fault

    NASA Astrophysics Data System (ADS)

    Kukovica, J.; Molnar, S.; Ghofrani, H.

    2017-12-01

    The Leech River fault is situated on Vancouver Island near the city of Victoria, British Columbia, Canada. The 60-km transpressional reverse fault zone runs east to west along the southern tip of Vancouver Island, dividing the lithologic units of the Jurassic-Cretaceous Leech River Complex schists to the north and the Eocene Metchosin Formation basalts to the south. This fault system poses a considerable hazard due to its proximity to Victoria and three major hydroelectric dams. The Canadian seismic hazard model for the 2015 National Building Code of Canada (NBCC) considered the fault system to be inactive. However, recent paleoseismic evidence suggests at least two surface-rupturing events exceeding moment magnitude (M) 6.5 within the last 15,000 years (Morell et al. 2017). We perform a Probabilistic Seismic Hazard Analysis (PSHA) for the city of Victoria with consideration of the Leech River fault as an active source. A PSHA for Victoria that replicates the 2015 NBCC estimates is first accomplished to calibrate our PSHA procedure, using the same seismic source zones, magnitude recurrence parameters, and Ground Motion Prediction Equations (GMPEs). We replicate the uniform hazard spectrum for a probability of exceedance of 2% in 50 years for a 500 km radial area around Victoria. An active Leech River fault zone, with its known length and dip, is then added. We are determining magnitude recurrence parameters for the Leech River fault based on a Gutenberg-Richter relationship fitted to catalogues of recorded seismicity (M 2-3) within the fault's vicinity and to the proposed paleoseismic events. We seek to understand whether inclusion of an active Leech River fault source will significantly increase the probabilistic seismic hazard for Victoria. Morell et al. 2017. Quaternary rupture of a crustal fault beneath Victoria, British Columbia, Canada. GSA Today, 27, doi: 10.1130/GSATG291A.1

  11. Probabilistic seismic hazard analyses for ground motions and fault displacement at Yucca Mountain, Nevada

    USGS Publications Warehouse

    Stepp, J.C.; Wong, I.; Whitney, J.; Quittmeyer, R.; Abrahamson, N.; Toro, G.; Young, S.R.; Coppersmith, K.; Savy, J.; Sullivan, T.

    2001-01-01

    Probabilistic seismic hazard analyses were conducted to estimate both ground motion and fault displacement hazards at the potential geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. The study is believed to be the largest and most comprehensive analysis ever conducted for ground-shaking hazard and is a first-of-a-kind assessment of probabilistic fault displacement hazard. The major emphasis of the study was on the quantification of epistemic uncertainty. Six teams of three experts performed seismic source and fault displacement evaluations, and seven individual experts provided ground motion evaluations. State-of-the-practice expert elicitation processes involving structured workshops, consensus identification of parameters and issues to be evaluated, common sharing of data and information, and open exchanges about the basis for preliminary interpretations were implemented. Ground-shaking hazard was computed for a hypothetical rock outcrop at -300 m, the depth of the potential waste emplacement drifts, at the designated design annual exceedance probabilities of 10^-3 and 10^-4. The fault displacement hazard was calculated at the design annual exceedance probabilities of 10^-4 and 10^-5.

  12. Including foreshocks and aftershocks in time-independent probabilistic seismic hazard analyses

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    Time-independent probabilistic seismic-hazard analysis treats each source as temporally and spatially independent; hence foreshocks and aftershocks, which are both spatially and temporally dependent on the mainshock, are removed from earthquake catalogs. Yet, intuitively, these earthquakes should be considered part of the seismic hazard, capable of producing damaging ground motions. In this study, I consider the mainshock and its dependents as a time-independent cluster, each cluster being temporally and spatially independent of any other. The cluster has the recurrence time of the mainshock; and, by considering the earthquakes in the cluster as a union of events, dependent events have an opportunity to contribute to seismic ground motions and hazard. Based on the methods of the U.S. Geological Survey for a high-hazard site, the inclusion of dependent events causes ground motions that are exceeded at probability levels of engineering interest to increase by about 10%, and by as much as 20% if variations in aftershock productivity can be accounted for reliably.
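
    Treating the cluster as a union of events implies that, if the individual exceedances are taken as independent, the probability that at least one event in the cluster exceeds a ground-motion level is one minus the product of the non-exceedance probabilities. The per-event probabilities below are hypothetical:

        # P(cluster exceeds level) = 1 - prod(1 - p_i) over mainshock and dependents
        p_events = [0.010, 0.002, 0.001, 0.0005]   # hypothetical per-event probabilities

        p_none = 1.0
        for p in p_events:
            p_none *= (1.0 - p)
        p_union = 1.0 - p_none
        print(f"cluster: {p_union:.4f} vs mainshock alone: {p_events[0]:.4f}")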

  13. Probabilistic seismic hazard maps for Sinai Peninsula, Egypt

    NASA Astrophysics Data System (ADS)

    Deif, A.; Abou Elenean, K.; El Hadidy, M.; Tealeb, A.; Mohamed, A.

    2009-09-01

    Sinai experienced the largest Egyptian earthquake, with moment magnitude (Mw) 7.2, in 1995 in the Gulf of Aqaba, 350 km from Cairo. The peninsula is characterized by the presence of many tourist projects in addition to different natural resources. The aim of the current study is to present, for the first time, probabilistic spectral hazard maps for Sinai. Revised earthquake catalogues for Sinai and its surroundings, from 112 BC to 2006 AD with magnitude equal to or greater than 3.0, are used to calculate seismic hazard in the region of interest, between 27°N and 31.5°N and 32°E and 36°E. We declustered these catalogues to include only independent events and tested them for the completeness of different magnitude ranges. 28 seismic source zones are used to define the seismicity. The recurrence rates and the maximum earthquakes across these zones were also determined from these modified catalogues. Strong ground motion relations for rock are used to produce 5% damped spectral acceleration values at four different periods (0.2, 0.5, 1.0 and 2.0 s) to define the uniform response spectra at each site (a grid of 0.2° × 0.2° over the whole area). Maps showing spectral acceleration values at 0.2, 0.5, 1.0 and 2.0 s periods as well as peak ground acceleration (PGA) for a return period of 475 years (equivalent to 90% probability of non-exceedance in 50 years) are presented. In addition, Uniform Hazard Spectra (UHS) at 25 different periods for the four main cities (Hurghada, Sharm El-Sheikh, Nuweibaa and Suez) are graphed. The highest hazard is found in the Gulf of Aqaba, with a maximum spectral acceleration of 356 cm s^-2 at a period of 0.22 s for a return period of 475 years.
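
    The equivalence quoted above — a 475-year return period corresponding to 90% probability of non-exceedance in 50 years — follows from the Poisson occurrence assumption, as a quick check shows:

        import math

        p_exceed, t = 0.10, 50.0                 # 10% chance of exceedance in 50 years
        T = -t / math.log(1.0 - p_exceed)        # Poisson model: P = 1 - exp(-t/T)
        print(f"return period ~ {T:.0f} years")  # ~475 years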

  14. Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska

    USGS Publications Warehouse

    Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.

    2007-01-01

    We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and on assumptions reflecting best current judgments about methodology and approach. The maps have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. The maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and, in particular, explain and justify the changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.

  15. Estimating Fault Friction From Seismic Signals in the Laboratory

    DOE PAGES

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.; ...

    2018-01-29

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault that progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time, from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. Finally, these results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.

  16. Estimating Fault Friction From Seismic Signals in the Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault that progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time, from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. Finally, these results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.

  17. Estimating Fault Friction From Seismic Signals in the Laboratory

    NASA Astrophysics Data System (ADS)

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.; Ren, Christopher X.; Riviere, Jacques; Marone, Chris; Guyer, Robert A.; Johnson, Paul A.

    2018-02-01

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault that progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time, from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. These results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.
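
    The machine-learning step described in these records — mapping instantaneous statistics of the continuous signal to the frictional state — can be sketched with an off-the-shelf regressor. The two features and the synthetic friction target below are placeholders for the laboratory acoustic data, and the random forest stands in for whatever learner the authors used:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        n = 2000
        # Placeholder windowed-signal statistics (e.g., power and kurtosis)
        signal_power = rng.lognormal(0.0, 0.5, n)
        kurt = rng.normal(3.0, 1.0, n)
        X = np.column_stack([signal_power, kurt])
        # Synthetic friction target correlated with signal power (illustrative only)
        friction = 0.4 + 0.1 * np.tanh(np.log(signal_power)) + rng.normal(0.0, 0.01, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, friction, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print(f"R^2 on held-out windows: {model.score(X_te, y_te):.2f}")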

  18. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    USGS Publications Warehouse

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes that have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large-earthquake rates would be grossly underestimated by extrapolating small-earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude

  19. Probabilistic seismic hazard estimates incorporating site effects - An example from Indiana, U.S.A

    USGS Publications Warehouse

    Hasse, J.S.; Park, C.H.; Nowack, R.L.; Hill, J.R.

    2010-01-01

    The U.S. Geological Survey (USGS) has published probabilistic earthquake hazard maps for the United States based on current knowledge of past earthquake activity and geological constraints on earthquake potential. These maps for the central and eastern United States assume standard site conditions with S-wave velocities of 760 m/s in the top 30 m. For urban and infrastructure planning and long-term budgeting, the public is interested in similar probabilistic seismic hazard maps that take into account near-surface geological materials. We have implemented a probabilistic method for incorporating site effects into the USGS seismic hazard analysis that takes into account the first-order effects of the surface geologic conditions. The thicknesses of sediments, which play a large role in amplification, were derived from a P-wave refraction database with over 13,000 profiles, and a preliminary geology-based velocity model was constructed from available information on S-wave velocities. An interesting feature of the preliminary hazard maps incorporating site effects is the approximate factor-of-two increase in the 1-Hz spectral acceleration with 2 percent probability of exceedance in 50 years for parts of the greater Indianapolis metropolitan region and surrounding parts of central Indiana. This effect is primarily due to the relatively thick sequence of sediments infilling ancient bedrock topography that has been deposited since the Pleistocene Epoch. As expected, the Late Pleistocene and Holocene depositional systems of the Wabash and Ohio Rivers produce additional amplification in the southwestern part of Indiana. Ground motions decrease, as would be expected, toward the bedrock units in south-central Indiana, where motions are significantly lower than the values on the USGS maps.

  20. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

    Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for the zonation of area sources. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each grid cell by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) corresponding to 10% and 2% probability of exceedance in 50 years was calculated for all grid points for a generic rock site with Vs = 760 m/s. Obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd-generation building code of Ethiopia.

  1. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.

  2. Sequential Data Assimilation for Seismicity: a Proof of Concept

    NASA Astrophysics Data System (ADS)

    van Dinther, Y.; Fichtner, A.; Kuensch, H. R.

    2015-12-01

    Our physical understanding and probabilistic forecasting ability for earthquakes are significantly hampered by limited indications of the state of stress and strength on faults and of their governing parameters. Using the sequential data assimilation framework developed in meteorology and oceanography (e.g., Evensen, JGR, 1994) and a seismic cycle forward model based on the Navier-Stokes partial differential equations (van Dinther et al., JGR, 2013), we show that such information, with its uncertainties, is within reach, at least for laboratory setups. We aim to provide the first thorough proof of concept for seismicity-related PDE applications via a perfect model test of seismic cycles in a simplified wedge-like subduction setup. By evaluating the performance with respect to known numerical input and output, we aim to answer whether there is any probabilistic forecast value for this laboratory-like setup, which and how many parameters can be constrained, and how much data in both space and time would be needed to do so. Thus far, our implementation of an Ensemble Kalman Filter has demonstrated that probabilistic estimates of both the state of stress and strength on a megathrust fault can be obtained and utilized even when assimilating surface velocity data at a single point in time and space. An ensemble-based error covariance matrix containing velocities, stresses and pressure links surface velocity observations to fault stresses and strengths well enough to update fault coupling accordingly. Depending on what the synthetic data show, coseismic events can then be triggered or inhibited.
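
    The Ensemble Kalman Filter update mentioned above adjusts each ensemble member using the cross-covariance between the observed surface velocity and the unobserved fault stress; below is a minimal single-observation sketch (the two-component state, ensemble size, and noise levels are hypothetical):

        import numpy as np

        rng = np.random.default_rng(3)
        N = 50                                    # ensemble members
        # State per member: [surface velocity, fault shear stress] from a forward model
        ens = np.column_stack([rng.normal(1.0, 0.2, N), rng.normal(30.0, 5.0, N)])

        obs, obs_std = 1.15, 0.05                 # observed surface velocity and its error
        H = np.array([1.0, 0.0])                  # only the velocity component is observed

        C = np.cov(ens.T)                         # ensemble covariance (2 x 2)
        K = C @ H / (H @ C @ H + obs_std**2)      # Kalman gain
        perturbed = obs + rng.normal(0.0, obs_std, N)
        ens += np.outer(perturbed - ens[:, 0], K) # update both state components
        print(f"posterior stress: {ens[:, 1].mean():.1f} +/- {ens[:, 1].std():.1f}")

    The off-diagonal term of the ensemble covariance is what lets a velocity observation move the stress estimate, mirroring the coupling described in the abstract.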

  3. Preliminary Seismic Probabilistic Tsunami Hazard Map for Italy

    NASA Astrophysics Data System (ADS)

    Lorito, Stefano; Selva, Jacopo; Basili, Roberto; Grezio, Anita; Molinari, Irene; Piatanesi, Alessio; Romano, Fabrizio; Tiberti, Mara Monica; Tonini, Roberto; Bonini, Lorenzo; Michelini, Alberto; Macias, Jorge; Castro, Manuel J.; González-Vida, José Manuel; de la Asunción, Marc

    2015-04-01

    We present a preliminary release of the first seismic probabilistic tsunami hazard map for Italy. The map aims to become an important tool for the Italian Department of Civil Protection (DPC), as well as a support tool for the NEAMTWS Tsunami Service Provider, the Centro Allerta Tsunami (CAT) at INGV, Rome. The map shows the offshore maximum tsunami elevation expected for several average return periods. Both crustal and subduction earthquakes are considered. The probability for each scenario (location, depth, mechanism, source size, magnitude and temporal rate) is defined on a uniform grid covering the entire Mediterranean for crustal earthquakes and on the plate interface for subduction earthquakes. Activity rates are assigned from seismic catalogues and based on a tectonic regionalization of the Mediterranean area. The methodology explores the associated aleatory uncertainty through the innovative application of an event tree. The main sources of epistemic uncertainty are also addressed, although in a preliminary way. The whole procedure relies on a database of pre-calculated Gaussian-shaped Green's functions for the sea level elevation, to be used also as a real-time hazard assessment tool by CAT. Tsunami simulations are performed using the non-linear shallow water multi-GPU code HySEA, over a 30 arcsec bathymetry (from the SRTM30+ dataset); the maximum elevations are stored at the 50-meter isobath and then extrapolated to 1 m depth through Green's law. This work is partially funded by project ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839, and by the Italian flagship project RITMARE.
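
    The Green's-law step mentioned above scales wave amplitude with the fourth root of the depth ratio; shoaling from the 50-m isobath to 1-m depth therefore amplifies the stored elevations by a fixed factor:

        h0, h1 = 50.0, 1.0
        amp = (h0 / h1) ** 0.25                      # Green's law: A1/A0 = (h0/h1)**(1/4)
        print(f"amplification factor ~ {amp:.2f}")   # ~2.66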

  4. Reassessment of probabilistic seismic hazard in the Marmara region

    USGS Publications Warehouse

    Kalkan, Erol; Gulkan, Polat; Yilmaz, Nazan; Çelebi, Mehmet

    2009-01-01

    In 1999, the eastern coastline of the Marmara region (Turkey) witnessed increased seismic activity on the North Anatolian fault (NAF) system with two damaging earthquakes (M 7.4 Kocaeli and M 7.2 Düzce) that occurred almost three months apart. These events reduced stress on the western segment of the NAF where it continues under the Marmara Sea. The undersea fault segments have recently been explored using bathymetric and reflection surveys. These findings have helped scientists understand the seismotectonic environment of the Marmara basin, which has remained a perplexing tectonic domain. On the basis of the newly collected data, the seismic hazard of the Marmara region is reassessed using a probabilistic approach. Two different earthquake source models, (1) a smoothed-gridded seismicity model and (2) a fault model, with alternative magnitude-frequency relations (Gutenberg-Richter and characteristic), were used with local and imported ground-motion prediction equations. Regional exposure is computed and quantified on a set of hazard maps that provide peak horizontal ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 sec on a uniform firm-rock site condition (760 m/sec average shear wave velocity in the upper 30 m). These acceleration levels were computed for ground motions having 2% and 10% probabilities of exceedance in 50 yr, corresponding to return periods of about 2475 and 475 yr, respectively. The maximum PGA computed (at rock site) is 1.5g along the fault segments of the NAF zone extending into the Marmara Sea. The new maps generally show a 10% to 15% increase in PGA and in 0.2 and 1.0 sec spectral acceleration values across much of Marmara compared to previous regional hazard maps. Hazard curves and smooth design spectra for three site conditions (rock, soil, and soft soil) are provided for the Istanbul metropolitan area as possible tools in future risk estimates.

  5. Laboratory simulation of volcano seismicity.

    PubMed

    Benson, Philip M; Vinciguerra, Sergio; Meredith, Philip G; Young, R Paul

    2008-10-10

    The physical processes generating seismicity within volcanic edifices are highly complex and not fully understood. We report results from a laboratory experiment in which basalt from Mount Etna volcano (Italy) was deformed and fractured. The experiment was monitored with an array of transducers around the sample to permit full-waveform capture, location, and analysis of microseismic events. Rapid post-failure decompression of the water-filled pore volume and damage zone triggered many low-frequency events, analogous to volcanic long-period seismicity. The low frequencies were associated with pore fluid decompression and were located in the damage zone in the fractured sample; these events exhibited a weak component of shear (double-couple) slip, consistent with fluid-driven events occurring beneath active volcanoes.

  6. Tsunamigenic scenarios for southern Peru and northern Chile seismic gap: Deterministic and probabilistic hybrid approach for hazard assessment

    NASA Astrophysics Data System (ADS)

    González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.

    2017-12-01

    The definition of plausible worst-case tsunamigenic scenarios plays a relevant role in tsunami hazard assessment focused on emergency preparedness and evacuation planning for coastal communities. During the last decade, the occurrence of major and moderate tsunamigenic earthquakes along worldwide subduction zones has given clues about the critical parameters involved in near-field tsunami inundation processes, i.e. slip spatial distribution, shelf resonance of edge waves and local geomorphology effects. To analyze the effects of these seismic and hydrodynamic variables on the epistemic uncertainty of coastal inundation, we implement a combined methodology using deterministic and probabilistic approaches to construct 420 tsunamigenic scenarios in a mature seismic gap of southern Peru and northern Chile, extending from 17ºS to 24ºS. The deterministic scenarios are calculated using a regional distribution of trench-parallel gravity anomaly (TPGA) and trench-parallel topography anomaly (TPTA), the three-dimensional Slab 1.0 worldwide subduction zone geometry model and published interseismic coupling (ISC) distributions. As a result, we find four high slip-deficit zones interpreted as the major seismic asperities of the gap, which are used in a hierarchical tree scheme to generate ten tsunamigenic scenarios with magnitudes ranging from Mw 8.4 to Mw 8.9. Additionally, we construct ten homogeneous-slip scenarios as an inundation baseline. For the probabilistic approach, we implement a Karhunen-Loève expansion to generate 400 stochastic tsunamigenic scenarios over the maximum extension of the gap, with the same magnitude range as the deterministic sources. All the scenarios are simulated with the non-hydrostatic tsunami model Neowave 2D, using a classical nesting scheme, for five major coastal cities in northern Chile (Arica, Iquique, Tocopilla, Mejillones and Antofagasta), obtaining high-resolution data of inundation depth, runup, coastal currents and sea level elevation.
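
    A minimal sketch of the Karhunen-Loève step on a one-dimensional chain of subfaults (the study works on a two-dimensional fault plane; the exponential kernel, correlation length and slip values below are assumptions for illustration only):

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.linspace(0.0, 400.0, 80)      # subfault centers along strike (km)
      corr_len = 40.0                      # assumed correlation length (km)
      cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

      # Karhunen-Loeve expansion: eigen-decompose the covariance and combine
      # the leading modes with standard normal coefficients.
      lam, phi = np.linalg.eigh(cov)
      lam, phi = lam[::-1], phi[:, ::-1]   # sort eigenpairs descending
      k = 20                               # truncation order

      mean_slip = 5.0                      # assumed mean slip (m)
      z = rng.standard_normal(k)
      slip = mean_slip + (phi[:, :k] * np.sqrt(lam[:k])) @ z
      slip = np.clip(slip, 0.0, None)      # disallow negative slip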

  7. Probabilistic seismic hazard at the archaeological site of Gol Gumbaz in Vijayapura, south India

    NASA Astrophysics Data System (ADS)

    Patil, Shivakumar G.; Menon, Arun; Dodagoudar, G. R.

    2018-03-01

    Probabilistic seismic hazard analysis (PSHA) is carried out for the archaeological site of Vijayapura in south India in order to obtain hazard-consistent seismic input ground motions for seismic risk assessment and for the design of seismic protection measures for monuments, where warranted. For this purpose the standard Cornell-McGuire approach, based on seismogenic zones with uniformly distributed seismicity, is employed. The main features of this study are the use of an updated and unified seismic catalogue based on moment magnitude, new seismogenic source models and recent ground motion prediction equations (GMPEs) in a logic tree framework. Seismic hazard at the site is evaluated for a level, rock site condition with 10% and 2% probabilities of exceedance in 50 years; the corresponding peak ground accelerations (PGAs) are 0.074 and 0.142 g, respectively. In addition, the uniform hazard spectra (UHS) of the site are compared to the Indian code-defined spectrum. Comparisons are also made with results from the National Disaster Management Authority (NDMA 2010), in terms of PGA and pseudo-spectral accelerations (PSAs) at T = 0.2, 0.5, 1.0 and 1.25 s for 475- and 2475-yr return periods. Results of the present study are in good agreement with the PGA calculated from the isoseismal map of the Killari earthquake, Mw = 6.4 (1993). Disaggregation of the PSHA results for PGA and spectral acceleration (Sa) at 0.5 s identifies the controlling scenario earthquake for the study region as one of low to moderate magnitude at a short distance from the study site. Deterministic seismic hazard analysis (DSHA) is also carried out, taking into account three scenario earthquakes. The UHS corresponding to the 475-yr return period (RP) is used to define the target spectrum, and spectrum-compatible natural accelerograms are selected accordingly from a suite of recorded accelerograms.
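
    For readers unfamiliar with the standard Cornell-McGuire recipe used above, the sketch below discretizes the hazard integral, rate(PGA > a) = nu * sum over (M, R) of P(M) * P(R) * P(PGA > a | M, R), for a single areal source; the rate, geometry and toy GMPE coefficients are invented for illustration and are not those of the study.

      import math
      import numpy as np

      nu = 0.05                              # rate of M >= 4 events per year
      mags = np.arange(4.05, 7.0, 0.1)       # magnitude bins
      dists = np.arange(5.0, 150.0, 5.0)     # site-to-source distances (km)

      # Truncated Gutenberg-Richter magnitude PMF (b = 0.9, assumed)
      beta = 0.9 * math.log(10.0)
      pm = beta * np.exp(-beta * (mags - 4.0))
      pm /= pm.sum()
      pr = np.full(dists.size, 1.0 / dists.size)  # uniform distance PMF

      def p_exceed(a, m, r, sigma=0.6):
          """P(PGA > a | m, r) for a toy lognormal GMPE (not a published one)."""
          ln_median = -4.0 + 1.0 * m - 1.2 * math.log(r + 10.0)
          return 0.5 * math.erfc((math.log(a) - ln_median) / (sigma * math.sqrt(2.0)))

      a = 0.1  # PGA threshold (g)
      rate = nu * sum(pm[i] * pr[j] * p_exceed(a, mags[i], dists[j])
                      for i in range(mags.size) for j in range(dists.size))
      print(f"annual rate of PGA > {a} g: {rate:.2e}")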

  8. A Moore's cellular automaton model to get probabilistic seismic hazard maps for different magnitude releases: A case study for Greece

    NASA Astrophysics Data System (ADS)

    Jiménez, A.; Posadas, A. M.

    2006-09-01

    Cellular automata are simple mathematical idealizations of natural systems and supply useful models for many investigations in natural science. Examples include sandpile models, forest fire models, and slider-block models used in seismology. In the present paper, they are used to establish temporal relations between the energy releases of seismic events that occurred in neighboring parts of the crust. The catalogue is divided into time intervals, and the region is divided into cells which are declared active or inactive by means of a threshold energy-release criterion. Thus, a pattern of active and inactive cells which evolves over time is determined. A stochastic cellular automaton is constructed from these patterns in order to simulate their spatio-temporal evolution, assuming a Moore's neighborhood interaction between the cells. The best model is chosen by maximizing the mutual information between the past and the future states. Finally, a Probabilistic Seismic Hazard Map is given for the different energy releases considered. The method has been applied to the Greece catalogue from 1900 to 1999. The Probabilistic Seismic Hazard Maps for energies corresponding to m = 4 and m = 5 are close to the real seismicity observed after the catalogue period, and they correspond to a background seismicity in the whole area. This background seismicity seems to cover the whole area in periods of around 25-50 years. The optimum cell size is in agreement with other studies; for m > 6 the optimum area increases according to the threshold of clear spatial resolution, and the active cells are not so clustered. The results are coherent with other hazard studies in the zone and with the seismicity recorded after the data set, and they provide an interaction model which points out the large-scale nature of earthquake occurrence.
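
    The model-selection criterion above, maximizing mutual information between past and future cell states, can be sketched for binary activity patterns as follows (the arrays are synthetic placeholders for the catalogue-derived patterns):

      import numpy as np

      def mutual_information(past, future):
          """Mutual information (bits) between two binary state arrays."""
          mi = 0.0
          for a in (0, 1):
              for b in (0, 1):
                  p_ab = np.mean((past == a) & (future == b))
                  p_a, p_b = np.mean(past == a), np.mean(future == b)
                  if p_ab > 0.0:
                      mi += p_ab * np.log2(p_ab / (p_a * p_b))
          return mi

      rng = np.random.default_rng(1)
      past = rng.integers(0, 2, size=1000)       # active/inactive cells
      future = past ^ (rng.random(1000) < 0.1)   # noisy persistence
      print(mutual_information(past, future))    # ~0.5 bits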

  9. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  10. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating subduction and background (crustal) earthquakes separately allows for optimal use of the available information and avoids significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
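
    Step 4, merging alternative implementations into an ensemble, amounts to weighted statistics over the hazard curves; a toy illustration with three model variants (the curves and weights below are invented inputs, not the study's models):

      import numpy as np

      heights = np.array([0.5, 1.0, 2.0, 4.0])           # thresholds (m)
      curves = np.array([[0.020, 0.008, 0.002, 0.0004],  # P(exceedance)
                         [0.030, 0.012, 0.003, 0.0006],  # per model variant
                         [0.015, 0.005, 0.001, 0.0002]])
      weights = np.array([0.5, 0.3, 0.2])                # epistemic weights

      mean_curve = weights @ curves                      # ensemble mean

      def weighted_percentile(values, w, q):
          """Percentile q of a weighted ensemble of values."""
          order = np.argsort(values)
          cum = np.cumsum(w[order])
          idx = min(int(np.searchsorted(cum, q)), values.size - 1)
          return values[order][idx]

      p84_curve = [weighted_percentile(curves[:, j], weights, 0.84)
                   for j in range(heights.size)]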

  11. An Application of the SSHAC Level 3 Process to the Probabilistic Seismic Hazard Analysis for Nuclear Facilities at the Hanford Site, Eastern Washington, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coppersmith, Kevin J.; Bommer, Julian J.; Bryce, Robert W.

    Under the sponsorship of the US Department of Energy (DOE) and the electric utility Energy Northwest, the Pacific Northwest National Laboratory (PNNL) is conducting a probabilistic seismic hazard analysis (PSHA) within the framework of a SSHAC Level 3 procedure (Senior Seismic Hazard Analysis Committee; Budnitz et al., 1997). Specifically, the project is being conducted following the guidelines and requirements specified in NUREG-2117 (USNRC, 2012b) and consistent with the approach given in the American Nuclear Standard ANSI/ANS-2.29-2008, Probabilistic Seismic Hazard Analysis. The collaboration between DOE and Energy Northwest is spawned by the needs of both organizations for an accepted PSHA with high levels of regulatory assurance that can be used for the design and safety evaluation of nuclear facilities. DOE committed to this study after performing a ten-year review of the existing PSHA, as required by DOE Order 420.1C. The study will also be used by Energy Northwest as a basis for fulfilling the NRC's 10CFR50.54(f) requirement that the western US nuclear power plants conduct PSHAs in conformance with SSHAC Level 3 procedures. The study was planned and is being carried out in conjunction with a project Work Plan, which identifies the purpose of the study, the roles and responsibilities of all participants, tasks and their associated schedules, Quality Assurance (QA) requirements, and project deliverables. New data collection and analysis activities are being conducted as a means of reducing the uncertainties in key inputs to the PSHA. It is anticipated that the results of the study will provide inputs to the site response analyses at multiple nuclear facility sites within the Hanford Site and at the Columbia Generating Station.

  12. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages in implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations.
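
    Because the superposition described above is linear, a scenario waveform is just a slip-weighted sum of the stored unit-slip waveforms; a schematic version (the array shapes and values are placeholders for the pre-computed database):

      import numpy as np

      n_subfaults, n_samples = 12, 2048
      # Pre-computed unit-slip tsunami waveforms at one coastal point,
      # one row per subfault (zeros here stand in for the real database).
      unit_waveforms = np.zeros((n_subfaults, n_samples))
      slip = np.full(n_subfaults, 2.0)   # scenario slip (m) on each subfault

      # Scenario waveform = sum of subfault waveforms weighted by their slip
      scenario = slip @ unit_waveforms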

  13. Development of Laboratory Seismic Exploration Experiment for Education and Demonstration

    NASA Astrophysics Data System (ADS)

    Kuwano, O.; Nakanishi, A.

    2016-12-01

    We developed a laboratory experiment that simulates a seismic refraction survey for educational purposes. It is a tabletop-scale experiment that uses soft hydrogel as an analogue material for a layered crust, so the seismic exploration experiment can be conducted in a laboratory or a classroom. The softness and transparency of the gel enable us to observe the wave propagation with the naked eye using the photoelastic technique. By analyzing the waveforms obtained from image analysis of the movie of the experiment, one can estimate the velocities and the structure of the gel specimen in the same way as in an actual seismic survey. We report details of the practical course and the public outreach activities using the experiment.

  14. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    NASA Astrophysics Data System (ADS)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most other Malaysian states, is characterized by frequent seismic activity; generally, an earthquake of moderate magnitude is experienced roughly every 20 years, originating mainly from two types of source, either local (e.g. Ranau and Lahad Datu) or regional (e.g. the Kalimantan and South Philippines subductions). The seismicity map of Sabah shows two zones of distinctive seismicity, one near Ranau (near Kota Kinabalu) and one near Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Center), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area, and the seismicity of the area is therefore modeled as line sources along these faults. Two main fault systems are believed to be the source of such activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important and needed study for the extensive development projects in Sabah, especially in the presence of earthquake activity. A probabilistic approach is adopted for the present work since it can provide the probability of exceeding various ground-motion levels expected from future large earthquakes. The output results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first time that a complete hazard study has been done for the area, the output will serve as a baseline and standard for future strategic planning in the area.

  15. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

    The Chiapas State, in southeastern Mexico, is a very seismically active region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas state. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic-tree procedure for the consideration of different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Center and the Servicio Sismológico Nacional de México sources. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps. The SSZ were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were revised, and for each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000- and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain and the Motagua and Polochic Fault Zone, with intermediate hazard values in the Chiapas Batholith Zone and the Strike-Slip Faults Province.

  16. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  17. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records reaching back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude Mmax, are estimated using the maximum likelihood algorithm, taking the incompleteness of the catalog into account. To compute the hazard value, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree that accounts for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2% and 10% probabilities of exceedance over 50 years assuming bedrock conditions. The resulting PGA and SA maps show a significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
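
    For a complete catalog above a threshold magnitude Mc, the maximum-likelihood step for the Gutenberg-Richter b value reduces to the Aki-Utsu estimator, b = log10(e) / (mean(M) - Mc); a sketch on synthetic magnitudes (the study additionally corrects for catalog incompleteness, which this toy version omits):

      import numpy as np

      def b_value_ml(mags, m_c, dm=0.0):
          """Aki-Utsu maximum-likelihood b value; dm is the magnitude
          binning width (0 for continuous magnitudes)."""
          m = mags[mags >= m_c]
          return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

      rng = np.random.default_rng(2)
      # Synthetic catalog with true b = 1 above Mc = 4.0
      mags = 4.0 + rng.exponential(scale=np.log10(np.e), size=5000)
      print(b_value_ml(mags, m_c=4.0))   # ~1.0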

  18. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities of exceeding different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions in the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions no longer hold, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses a cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site (sketched below); iii) perform high resolution numerical simulations only for these representative scenarios and for a subset of near-field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
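
    Filtering step (ii) can be sketched as a plain k-means clustering of scenario footprints, keeping the member closest to each centroid as the representative scenario; the footprints below are random stand-ins for the simulated offshore wave fields.

      import numpy as np

      rng = np.random.default_rng(3)
      footprints = rng.random((400, 60))   # 400 scenarios x 60 coastal points
      k = 25                               # representatives to keep

      # Lloyd's algorithm (plain k-means, no external dependencies)
      centroids = footprints[rng.choice(400, size=k, replace=False)]
      for _ in range(50):
          d = np.linalg.norm(footprints[:, None, :] - centroids[None, :, :],
                             axis=2)
          labels = d.argmin(axis=1)
          centroids = np.array([footprints[labels == j].mean(axis=0)
                                if np.any(labels == j) else centroids[j]
                                for j in range(k)])

      # Representative scenario = cluster member closest to its centroid
      reps = [int(np.where(labels == j)[0][d[labels == j, j].argmin()])
              for j in range(k) if np.any(labels == j)]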

  19. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.

  20. PSHAe (Probabilistic Seismic Hazard enhanced): the case of Istanbul.

    NASA Astrophysics Data System (ADS)

    Stupazzini, Marco; Allmann, Alexander; Infantino, Maria; Kaeser, Martin; Mazzieri, Ilario; Paolucci, Roberto; Smerzini, Chiara

    2016-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) relying only on GMPEs tends to be insufficiently constrained at short distances, and the data only partially account for the rupture process, seismic wave propagation and complex three-dimensional (3D) configurations. Given a large and representative set of numerical results from 3D scenarios, analysing the resulting database from a statistical point of view and implementing the results as a generalized attenuation function (GAF) in the classical PSHA might be an appealing way to deal with this problem (Villani et al., 2014). Nonetheless, the limited amount of computational resources or time available tends to pose substantial constraints on a broad application of this method, which is moreover only partially suitable for taking into account the spatial correlation of ground motion as modelled by each forward physics-based simulation (PBS). Given that, we envision a streamlined, alternative implementation of the previous approach, aiming at selecting a limited number of wisely chosen scenarios and associating with them a probability of occurrence. The experience gathered in the past years on 3D modelling of seismic wave propagation in complex alluvial basins (Pilz et al., 2011; Guidotti et al., 2011; Smerzini and Villani, 2012) allowed us to refine the choice of simulated scenarios so as to explore the variability of ground motion, preserving on the one hand the full spatial correlation necessary for risk modelling, and on the other the simulated losses for a given location and a given building stock. 3D numerical modelling of scenarios occurring on the North Anatolian Fault in the proximity of Istanbul is carried out with the spectral element code SPEED (http://speed.mox.polimi.it). The results are introduced in a PSHA, exploiting the capabilities of the proposed methodology against a traditional approach based on GMPEs.

  1. Probabilistic Simulation of Territorial Seismic Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baratta, Alessandro; Corbi, Ileana

    2008-07-08

    The paper focuses on a stochastic process for the forecasting of seismic scenarios over a territory, developed by means of some basic assumptions in the procedure and by elaborating the fundamental parameters recorded during ground motions that occurred in a seismic area.

  2. Considering the ranges of uncertainties in the New Probabilistic Seismic Hazard Assessment of Germany - Version 2016

    NASA Astrophysics Data System (ADS)

    Grunthal, Gottfried; Stromeyer, Dietrich; Bosse, Christian; Cotton, Fabrice; Bindi, Dino

    2017-04-01

    The seismic load parameters for the upcoming National Annex to Eurocode 8 result from the reassessment of seismic hazard supported by the German Institution for Civil Engineering. This 2016 version of the hazard assessment for Germany as target area was based on a comprehensive incorporation of all accessible uncertainties in models and parameters into the approach, and on the provision of a rational framework for treating the uncertainties in a transparent way. The developed seismic hazard model represents significant improvements; i.e., it is based on updated and extended databases, comprehensive ranges of models, robust methods and a selection of a set of ground motion prediction equations of the latest generation. The output specifications were designed according to user-oriented needs as suggested by two review teams supervising the entire project. In particular, seismic load parameters were calculated for rock conditions with a vS30 of 800 m/s for three hazard levels (10%, 5% and 2% probability of occurrence or exceedance within 50 years), in the form of, e.g., uniform hazard spectra (UHS) based on 19 spectral periods in the range of 0.01-3 s, and seismic hazard maps for spectral response accelerations for different spectral periods or for macroseismic intensities. The developed hazard model consists of a logic tree with 4040 end branches and essential innovations employed to capture epistemic uncertainties and aleatory variabilities. The computation scheme enables the sound calculation of the mean and any quantile of the required seismic load parameters. Mean, median and 84th percentiles of load parameters were provided together with the full calculation model to clearly illustrate the uncertainties of such a probabilistic assessment for a region with a low-to-moderate level of seismicity. The regional variations of these uncertainties (e.g. ratios between the mean and median hazard estimates) were analyzed and discussed.

  3. Probabilistic Seismic Hazard Assessment for a NPP in the Upper Rhine Graben, France

    NASA Astrophysics Data System (ADS)

    Clément, Christophe; Chartier, Thomas; Jomard, Hervé; Baize, Stéphane; Scotti, Oona; Cushing, Edward

    2015-04-01

    The southern part of the Upper Rhine Graben (URG), straddling the border between eastern France and western Germany, presents relatively significant seismic activity for an intraplate area. An earthquake of magnitude 5 or greater shakes the URG every 25 years on average, and in 1356 an earthquake of magnitude greater than 6.5 struck the city of Basel. Several potentially active faults have been identified in the area and documented in the French Active Fault Database (web site under construction). These faults are located along the Graben boundaries and also inside the Graben itself, beneath heavily populated areas and critical facilities (including the Fessenheim Nuclear Power Plant), and are prone to producing earthquakes of magnitude 6 and above. Published regional models and preliminary geomorphological investigations provided a provisional assessment of slip rates for the individual faults (0.1-0.001 mm/a), resulting in recurrence times of 10 000 years or greater for magnitude 6+ earthquakes. Using a fault model, ground motion response spectra are calculated for annual frequencies of exceedance (AFE) ranging from 10^-4 to 10^-8 per year, typical for design basis and probabilistic safety analyses of NPPs. A logic tree is implemented to evaluate uncertainties in the seismic hazard assessment. The choice of ground motion prediction equations (GMPEs) and the range of slip rate uncertainty are the main sources of seismic hazard variability at the NPP site. In fact, the hazard for AFE lower than 10^-4 is mostly controlled by the potentially active nearby Rhine River fault. Compared with areal source zone models, a fault model localizes the hazard around the active faults and changes the shape of the Uniform Hazard Spectrum at the site. Seismic hazard deaggregations are performed to identify the earthquake scenarios (including magnitude, distance and the number of standard deviations from the median ground motion as predicted by GMPEs) that contribute to the exceedance of spectral acceleration for the different AFE.
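
    The order-of-magnitude link quoted above between slow slip rates and long recurrence times can be reproduced by simple moment balancing, Tr = M0(Mw) / (mu * A * slip rate); the fault dimensions and rigidity below are assumed values for illustration, not those of the URG faults.

      def recurrence_time(mw, area_m2, slip_rate_m_yr, mu_pa=3.0e10):
          """Mean recurrence time (yr) from moment balance, using the
          Hanks-Kanamori moment M0 = 10**(1.5*Mw + 9.05) N*m."""
          m0 = 10.0 ** (1.5 * mw + 9.05)
          return m0 / (mu_pa * area_m2 * slip_rate_m_yr)

      # A 20 km x 12 km fault patch slipping at 0.1 mm/yr gives ~9000 yr
      # between Mw 6.5 events, consistent with the 10 000-yr figure above.
      print(round(recurrence_time(6.5, 20e3 * 12e3, 1.0e-4)))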

  4. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy

    NASA Astrophysics Data System (ADS)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe

    2014-05-01

    The Ischia island is a large, complex, partly submerged, active volcanic field located about 20 km west of the Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE and NE-SW trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline alternating beaches and tuff/lava cliffs that is continuously reshaped by weathering and sea erosion. Volcano-tectonic processes are a main factor for slope stability, as they produce seismic activity and have generated steep slopes in volcanic deposits (lava, tuff, pumice and ash layers) characterized by variable strength. In the Campi Flegrei and surrounding areas the possible occurrence of a moderate/large seismic event represents a serious threat to the inhabitants, the infrastructures and the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5 km long fault located below the island's north coast. However, those sources are difficult to constrain: the first because its onshore and offshore extent is not yet completely defined; the second, characterized by only a few large historical events, is difficult to parameterize in the framework of a probabilistic hazard approach. The high population density, the presence of many infrastructures and relevant archaeological sites, together with the natural and artistic values, make this area a strategic natural laboratory for developing new methodologies. Moreover, Ischia represents the only sector in the Campi Flegrei area with documented historical landslides triggered by earthquakes, allowing the adequacy and stability of the method to be tested. In the framework of the Italian project MON.I.C.A (infrastructural coastlines monitoring), an innovative and dedicated probabilistic methodology has been applied.

  5. Probabilistic inversion of AVO seismic data for reservoir properties and related uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zunino, Andrea; Mosegaard, Klaus

    2017-04-01

    Sought-after reservoir properties are linked only indirectly to the observable geophysical data recorded at the earth's surface. In this framework, seismic data represent one of the most reliable tools to study the structure and properties of the subsurface for natural resources. Nonetheless, seismic analysis is not an end in itself, as physical properties such as porosity are often of more interest for reservoir characterization. Inference of those properties therefore implies also taking into account rock-physics models linking porosity and other physical properties to elastic parameters. In the framework of seismic reflection data, we address this challenge for a reservoir target zone employing a probabilistic method characterized by a multi-step, complex nonlinear forward modeling that combines: (1) a rock-physics model, (2) the solution of the full Zoeppritz equations and (3) convolutional seismic forward modeling. The target property of this work is porosity, which is inferred using a Monte Carlo approach where porosity models, i.e., solutions to the inverse problem, are directly sampled from the posterior distribution. From a theoretical point of view, the Monte Carlo strategy is particularly useful in the presence of nonlinear forward models, which is often the case when employing sophisticated rock-physics models and the full Zoeppritz equations, and it allows the related uncertainty to be estimated. However, the resulting computational challenge is huge. We propose to alleviate this computational burden by assuming some smoothness of the subsurface parameters and consequently parameterizing the model in terms of spline bases. This allows a certain flexibility, in that the number of spline bases, and hence the resolution in each spatial direction, can be controlled. The method is tested on a 3-D synthetic case and on a 2-D real data set.
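
    A stripped-down version of the sampling idea, with porosity expressed in a small smooth basis and coefficient models drawn from the posterior by a random-walk Metropolis sampler, is sketched below; the linear forward operator, noise level and Gaussian-bump basis (standing in for the paper's splines and full rock-physics chain) are assumptions.

      import numpy as np

      rng = np.random.default_rng(4)
      depths = np.linspace(0.0, 1.0, 120)
      centers = np.linspace(0.0, 1.0, 8)
      # Smooth Gaussian-bump basis standing in for the spline bases
      basis = np.exp(-0.5 * ((depths[:, None] - centers[None, :]) / 0.08) ** 2)

      true_c = rng.uniform(0.1, 0.3, size=8)
      data = basis @ true_c + 0.005 * rng.standard_normal(depths.size)

      def log_post(c, sigma=0.005):
          if np.any(c < 0.0) or np.any(c > 0.4):   # uniform prior bounds
              return -np.inf
          r = data - basis @ c                      # linear toy forward model
          return -0.5 * np.sum((r / sigma) ** 2)

      c, lp = np.full(8, 0.2), log_post(np.full(8, 0.2))
      samples = []
      for it in range(20000):
          prop = c + 0.002 * rng.standard_normal(8)  # random-walk proposal
          lp_prop = log_post(prop)
          if np.log(rng.random()) < lp_prop - lp:    # Metropolis acceptance
              c, lp = prop, lp_prop
          if it % 100 == 0:
              samples.append(c.copy())               # posterior porosity models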

  6. Characterization of seismic properties across scales: from the laboratory- to the field scale

    NASA Astrophysics Data System (ADS)

    Grab, Melchior; Quintal, Beatriz; Caspari, Eva; Maurer, Hansruedi; Greenhalgh, Stewart

    2016-04-01

    When exploring geothermal systems, the main interest is in the factors controlling the efficiency of the heat exchanger. This includes the energy state of the pore fluids and the presence of permeable structures forming part of the fluid transport system. Seismic methods are amongst the most common exploration techniques for imaging the deep subsurface in order to evaluate such a geothermal heat exchanger. They make use of the fact that a seismic wave carries information on the properties of the rocks through which it passes, which enables the derivation of the stiffness and density of the host rock from the seismic velocities. Moreover, it is well known that seismic waveforms are modulated while propagating through the subsurface by visco-elastic effects due to wave-induced fluid flow, thus delivering information about the fluids in the rock's pore space. To constrain the interpretation of seismic data, that is, to link seismic properties with the fluid state and host rock permeability, it is common practice to measure the rock properties of small rock specimens in the laboratory under in-situ conditions. However, in magmatic geothermal systems or in systems situated in the crystalline basement, the host rock is often highly impermeable and fluid transport predominantly takes place in fracture networks consisting of fractures larger than the rock samples investigated in the laboratory. Therefore, laboratory experiments only provide the properties of relatively intact rock, and an up-scaling procedure is required to characterize the seismic properties of large rock volumes containing fractures and fracture networks and to study the effects of fluids in such fractured rock. We present a technique to parameterize fractured rock volumes as typically encountered in Icelandic magmatic geothermal systems by combining laboratory experiments with effective medium calculations.

  7. Probabilistic seismic hazard assessment of the Eastern and Central groups of the Azores - Portugal

    NASA Astrophysics Data System (ADS)

    Fontiela, João; Bezzeghoud, Mourad; Rosset, Philippe; Borges, José; Rodrigues, Francisco; Caldeira, Bento

    2017-04-01

    The Azores islands of the Eastern and Central groups are located at the triple junction of the American, Eurasian and Nubian plates, which induces a large number of low-magnitude earthquakes. Since the settlement of the islands in the 15th century, 33 earthquakes with intensity ≥ VII have caused severe damage and a high death toll. The most severe ones occurred in 1522 at São Miguel Island, with a maximum MM intensity of X; in 1614 at Terceira Island (X); in 1757 at São Jorge Island (XI); in 1852 at São Miguel Island (VIII); in 1926 at Faial Island (Mb 5.3-5.9); in 1980 at Terceira Island (Mw 7.1); and in 1998 at Faial Island (Mw 6.2). The probabilistic seismic hazard assessment (PSHA) was carried out using the classical Cornell-McGuire approach with seismogenic zones recently defined by Fontiela et al. (2014). We created a new earthquake catalogue merging local and global datasets over a large time span (1522-2016) to calculate recurrence times and maximum magnitudes. In order to reduce the epistemic uncertainties, we test several ground motion prediction equations in agreement with the geological heterogeneities typical of young volcanic islands. Probabilistic seismic hazard maps are proposed for 475- and 975-year return periods, as well as hazard curves and uniform hazard spectra for the main cities. REFERENCES: Fontiela, J. et al., 2014. Azores seismogenic zones. Comunicações Geológicas, 101(1), pp. 351-354. ACKNOWLEDGMENTS: João Fontiela is supported by grant M3.1.2/F/060/2011 of the Regional Science Fund of the Regional Government of the Azores, and this study is co-funded by the European Union through the European Regional Development Fund, framed in COMPETE 2020 (Operational Competitiveness Programme and Internationalization) through the ICT project (UID/GEO/04683/2013) with the reference POCI-01-0145-FEDER-007690.

  8. Applying a probabilistic seismic-petrophysical inversion and two different rock-physics models for reservoir characterization in offshore Nile Delta

    NASA Astrophysics Data System (ADS)

    Aleardi, Mattia

    2018-01-01

    We apply a two-step probabilistic seismic-petrophysical inversion for the characterization of a clastic, gas-saturated reservoir located in the offshore Nile Delta. In particular, we discuss and compare the results obtained when two different rock-physics models (RPMs) are employed in the inversion. The first RPM is an empirical, linear model directly derived from the available well log data by means of an optimization procedure. The second RPM is a theoretical, non-linear model based on the Hertz-Mindlin contact theory. The first step of the inversion procedure is a Bayesian linearized amplitude versus angle (AVA) inversion in which the elastic properties, and the associated uncertainties, are inferred from pre-stack seismic data. The estimated elastic properties constitute the input to the second step, a probabilistic petrophysical inversion, in which we account for the noise contaminating the recorded seismic data and the uncertainties affecting both the derived rock-physics models and the estimated elastic parameters. In particular, a Gaussian mixture a priori distribution is used to properly take into account the facies-dependent behavior of petrophysical properties, related to the different fluid and rock properties of the different litho-fluid classes. In the synthetic and field data tests, the very minor differences between the results obtained employing the two RPMs, and the good match between the estimated properties and well log information, confirm the applicability of the inversion approach and the suitability of the two RPMs for reservoir characterization in the investigated area.

  9. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    USGS Publications Warehouse

    Paces, James B.

    2014-01-01

    This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  10. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It displays the probabilistic estimate of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels by using the SEISRISK III software.

  11. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications, such as risk mitigation, disaster management, post-disaster recovery planning, catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical facility for better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling; examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for the assessment and mapping of regional seismic hazard. It makes the best use of recent advancements in computer technology, in both software and hardware, and is well structured to be implemented using conventional GIS tools.

  12. On the use of a laser ablation as a laboratory seismic source

    NASA Astrophysics Data System (ADS)

    Shen, Chengyi; Brito, Daniel; Diaz, Julien; Zhang, Deyuan; Poydenot, Valier; Bordes, Clarisse; Garambois, Stéphane

    2017-04-01

    Mimicking near-surface seismic imaging in well-controlled laboratory conditions is potentially a powerful tool to study large-scale wave propagation in geological media by means of upscaling. Laboratory measurements are indeed particularly suited for testing theoretical models and for comparisons with numerical approaches. We have developed an automated Laser Doppler Vibrometer (LDV) platform, which is able to detect and register broadband nano-scale displacements on the surface of various materials. This laboratory equipment has already been validated in experiments where piezoelectric transducers were used as seismic sources. We are currently exploring a new seismic source in our experiments, laser ablation, in order to compensate for some drawbacks encountered with piezoelectric sources. The laser ablation source has been considered an interesting ultrasound wave generator since the 1960s, with numerous potential applications such as Non-Destructive Testing (NDT) and the measurement of velocities and attenuations in solid samples. We aim at adapting and developing this technique for geophysical experimental investigations in order to produce and explore complete micro-seismic data sets in the laboratory. We will first present the laser characteristics, including its mechanism, stability and reproducibility, and will evaluate in particular the directivity patterns of such a seismic source. We have started by applying the laser ablation source to the surfaces of multi-scale homogeneous aluminum samples and are now testing it on heterogeneous and fractured limestone cores. Some results of data processing will also be shown, especially the 2D-slice VP and VS tomographic images obtained in limestone samples. Apart from the experimental records, numerical simulations will be carried out for both the laser source modelling and the wave propagation in different media, and first attempts will be made to compare the two quantitatively.

  13. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...

  14. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined in the epistemic uncertainty analysis with existing source models for the region and with models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach bring a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. A similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced, and the different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault type are also integrated into the hazard calculation.
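
    The Monte Carlo approach described above replaces the hazard integral with direct simulation: draw many synthetic catalogs, attach a ground motion to each event, and read exceedance levels from the simulated maxima. A toy single-source version (the rate, magnitude bounds and GMPE are invented, not the study's models):

      import math
      import numpy as np

      rng = np.random.default_rng(5)
      nu, b, mmin, mmax = 0.2, 1.0, 4.5, 7.5  # source rate and G-R parameters
      beta = b * math.log(10.0)
      n_sim, window = 5000, 50.0              # 5000 catalogs of 50 years

      def sample_pga(m, r=25.0, sigma=0.6):
          """Toy lognormal GMPE (not one of the models used in the study)."""
          ln_median = -4.0 + 1.0 * m - 1.2 * math.log(r + 10.0)
          return math.exp(ln_median + sigma * rng.standard_normal())

      maxima = np.zeros(n_sim)
      trunc = 1.0 - math.exp(-beta * (mmax - mmin))
      for i in range(n_sim):
          n_eq = rng.poisson(nu * window)
          u = rng.random(n_eq)
          mags = mmin - np.log(1.0 - u * trunc) / beta  # truncated G-R draws
          if n_eq:
              maxima[i] = max(sample_pga(m) for m in mags)

      # PGA with a 10% probability of being exceeded in 50 years
      print(np.quantile(maxima, 0.90))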

  15. Probabilistic seismic hazard assessment for the two layer fault system of Antalya (SW Turkey) area

    NASA Astrophysics Data System (ADS)

    Dipova, Nihat; Cangir, Bülent

    2017-09-01

    Southwest Turkey, along the Mediterranean coast, is prone to large earthquakes resulting from the subduction of the African plate under the Eurasian plate and from shallow crustal faults. The maximum observed magnitude of subduction earthquakes is Mw = 6.5, whereas that of crustal earthquakes is Mw = 6.6. Crustal earthquakes are sourced from faults related to the Isparta Angle and Cyprus Arc tectonic structures. The primary goal of this study is to assess the seismic hazard of the Antalya area (SW Turkey) using a probabilistic approach. A new earthquake catalog for the Antalya area, with a unified moment magnitude scale, was prepared in the scope of the study. The seismicity of the area has been evaluated through the Gutenberg-Richter recurrence relationship. For the hazard computation, the CRISIS2007 software was used following the standard Cornell-McGuire methodology. The attenuation model of Youngs et al. (1997, Seismol Res Lett 68(1):58-73) was used for deep subduction earthquakes, and that of Chiou and Youngs (2008, Earthq Spectra 24(1):173-215) for shallow crustal earthquakes. A seismic hazard map was developed for peak ground acceleration on rock ground with a hazard level of a 10% probability of exceedance in 50 years. Results of the study show that peak ground acceleration values on bedrock range between 0.215 and 0.23 g in the center of Antalya.

  16. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paces, James B.

    This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  17. Bayesian identification of multiple seismic change points and varying seismic rates caused by induced seismicity

    NASA Astrophysics Data System (ADS)

    Montoya-Noguera, Silvana; Wang, Yu

    2017-04-01

    The Central and Eastern United States (CEUS) has experienced an abnormal increase in seismic activity, which is believed to be related to anthropogenic activities. The U.S. Geological Survey has acknowledged this situation and developed the CEUS 2016 one-year seismic hazard model using the 2015 catalog, assuming stationary seismicity over that period. However, due to the nonstationary nature of induced seismicity, it is essential to identify change points for accurate probabilistic seismic hazard analysis (PSHA). We present a Bayesian procedure to identify the most probable change points in seismicity and to define their respective seismic rates. It uses prior distributions in agreement with conventional PSHA and updates them with recent data to identify seismicity changes. It can determine change points at a regional scale and may incorporate different types of information in an objective manner. It is first successfully tested with simulated data, and then used to evaluate Oklahoma's regional seismicity.
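
    The paper's procedure handles multiple change points with PSHA-consistent priors; as a minimal single-change-point sketch under a conjugate Gamma-Poisson assumption (all counts, rates, and priors here are illustrative, not from the study):

    ```python
    import numpy as np
    from scipy.special import gammaln

    rng = np.random.default_rng(1)
    # synthetic yearly earthquake counts: rate jumps from 2 to 10 at year 30
    counts = np.concatenate([rng.poisson(2.0, 30), rng.poisson(10.0, 15)])
    n = len(counts)

    a0, b0 = 1.0, 1.0  # Gamma(a0, b0) prior on each segment's Poisson rate

    def log_marginal(segment):
        """log p(data | one constant rate), rate integrated out (Gamma-Poisson)."""
        s, t = segment.sum(), len(segment)
        return (gammaln(a0 + s) - gammaln(a0) + a0 * np.log(b0)
                - (a0 + s) * np.log(b0 + t) - gammaln(segment + 1).sum())

    # posterior over the change-point location tau (uniform prior over 1..n-1)
    log_post = np.array([log_marginal(counts[:tau]) + log_marginal(counts[tau:])
                         for tau in range(1, n)])
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    tau_hat = np.argmax(post) + 1
    print(f"most probable change point: year {tau_hat}")
    print(f"posterior mean rates: {(a0 + counts[:tau_hat].sum()) / (b0 + tau_hat):.2f}, "
          f"{(a0 + counts[tau_hat:].sum()) / (b0 + n - tau_hat):.2f}")
    ```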

  18. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy - Part 1: Model components for sources parameterization

    NASA Astrophysics Data System (ADS)

    Azzaro, Raffaele; Barberi, Graziella; D'Amico, Salvatore; Pace, Bruno; Peruzza, Laura; Tuvè, Tiziana

    2017-11-01

    The volcanic region of Mt. Etna (Sicily, Italy) represents a perfect laboratory for testing innovative approaches to seismic hazard assessment. This is largely due to the long record of historical and recent observations of seismic and tectonic phenomena, the high quality of various geophysical monitoring systems, and particularly the rapid geodynamics, which clearly exposes some seismotectonic processes. We present here the model components and the procedures adopted for defining the seismic sources to be used in a new generation of probabilistic seismic hazard assessment (PSHA), the first results and maps of which are presented in a companion paper, Peruzza et al. (2017). The sources include, with increasing complexity, seismic zones, individual faults, and gridded point sources that are obtained by integrating geological field data with long and short earthquake datasets (the historical macroseismic catalogue, which covers about three centuries, and a high-quality instrumental location database for the last decades). The analysis of the frequency-magnitude distribution identifies two main fault systems within the volcanic complex featuring different seismic rates that are controlled essentially by volcano-tectonic processes. We discuss the variability of the mean occurrence times of major earthquakes along the main Etnean faults using both a historical approach and a purely geologic method. We derive a magnitude-size scaling relationship specifically for this volcanic area, which has been implemented in a recently developed software tool, FiSH (Pace et al., 2016), that we use to calculate the characteristic magnitudes and the related mean recurrence times expected for each fault. Results suggest that for the Mt. Etna area the traditional assumptions of uniform and Poissonian seismicity can be relaxed; a time-dependent fault-based model, joined with 3-D imaging of volcano-tectonic sources depicted by the recent instrumental seismicity, can therefore be implemented in PSHA maps

  19. Fully probabilistic seismic source inversion - Part 2: Modelling errors and station covariances

    NASA Astrophysics Data System (ADS)

    Stähler, Simon C.; Sigloch, Karin

    2016-11-01

    …cross-correlation measurements usable for fully probabilistic sampling strategies in source inversion and related applications such as seismic tomography.

  20. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region, and the transition fault system between them, as well as local crustal faults in the UAE. The PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) is 0.17 g for the 475-year return period spectrum and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.
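
    For reference, the return periods quoted above follow from the standard Poisson relation between probability of exceedance and exposure time; a two-line check (a textbook formula, not code from the study):

    ```python
    import math

    def return_period(p_exceed, t_years):
        """Return period for probability p_exceed in t_years (Poisson occurrence)."""
        return -t_years / math.log(1.0 - p_exceed)

    print(round(return_period(0.10, 50)))  # -> 475 (10% in 50 years)
    print(round(return_period(0.02, 50)))  # -> 2475 (2% in 50 years)
    ```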

  1. First USGS urban seismic hazard maps predict the effects of soils

    USGS Publications Warehouse

    Cramer, C.H.; Gomberg, J.S.; Schweig, E.S.; Waldron, B.A.; Tucker, K.

    2006-01-01

    Probabilistic and scenario urban seismic hazard maps have been produced for Memphis, Shelby County, Tennessee, covering a six-quadrangle area of the city. The nine probabilistic maps are for peak ground acceleration and 0.2 s and 1.0 s spectral acceleration, and for 10%, 5%, and 2% probability of being exceeded in 50 years. Six scenario maps for these three ground motions have also been generated for both an M7.7 and an M6.2 earthquake on the southwest arm of the New Madrid seismic zone ending at Marked Tree, Arkansas. All maps include the effect of local geology. Relative to the national seismic hazard maps, the effect of the thick sediments beneath Memphis is to decrease 0.2 s probabilistic ground motions by 0-30% and increase 1.0 s probabilistic ground motions by approximately 100%. Probabilistic peak ground accelerations remain at levels similar to the national maps, although the ground-motion gradient across Shelby County is reduced and ground motions are more uniform within the county. The M7.7 scenario maps show ground motions similar to the 5%-in-50-year probabilistic maps. As an effect of local geology, both the M7.7 and M6.2 scenario maps show a more uniform seismic ground-motion hazard across Shelby County than scenario maps with constant site conditions (i.e., the NEHRP B/C boundary).

  2. The Evolving Role of Field and Laboratory Seismic Measurements in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Stokoe, K. H.

    2017-12-01

    The geotechnical engineering profession has been faced with the problem of characterizing geological materials for site-specific design in the built environment since the profession began. Once design requirements included determining the dynamic response of important and critical facilities to earthquake shaking or other types of dynamic loads, seismically based measurements in the field and laboratory became important tools for direct characterization of the stiffnesses and energy dissipation (material damping) of these materials. In the 1960s, field seismic measurements using small-strain body waves were adapted from exploration geophysics. At the same time, laboratory measurements began using dynamic, torsional, resonant-column devices to measure shear stiffness and material damping in shear. The laboratory measurements also allowed parameters such as material type, confinement state, and nonlinear straining to be evaluated. Today, seismic measurements are widely used and evolving because: (1) the measurements have a strong theoretical basis, (2) they can be performed in both the field and the laboratory, thus forming an important link between these measurements, and (3) recent developments in field testing involving surface waves are noninvasive, which makes them cost-effective in comparison with other methods. Active field seismic measurements are used today over depths ranging from about 5 to 1000 m. Examples of shear-wave velocity (VS) profiles evaluated using boreholes, penetrometers, suspension logging, and Rayleigh-type surface waves are presented. The VS measurements were performed in materials ranging from uncemented soil to unweathered rock. The coefficients of variation (COVs) in the VS profiles are generally less than 0.15 over sites with surface areas of 50 km2 or more as long as material types are not laterally mixed. Interestingly, the largest COVs often occur around layer boundaries, which vary vertically. It is also interesting to observe how the

  3. Near Field Observations of Seismicity in Volcanic Environments: A Ready-Made Field Laboratory

    NASA Astrophysics Data System (ADS)

    Bean, C. J.; Thun, J.; Eibl, E. P. S.; Benson, P. M.; Rowley, P.; Lokmer, I.; Cauchie, L.

    2017-12-01

    Volcanic environments experience periods of rapid stress fluctuation and consequent seismicity. This volcano seismicity is diverse in character, spanning the range from discrete high-frequency events through low-frequency earthquakes to tremor. The inter-relationships between these events appear to be controlled by edifice rheology, stress state, and the presence of fluids (which help modulate the stress field). In general, volcanoes are accessible to instrumentation, allowing near-field access to the seismicity at play. Here we present results from a range of field, numerical, and laboratory experiments that demonstrate the controls that rheology and strain rate exert on seismicity type. In particular, we demonstrate the role played by internal friction angles in the initiation and evolution of seismicity in dry, weak, compliant volcanic materials. Furthermore, we show the importance of near-field observation in constraining details of the seismic source, in a meso-scale field setting.

  4. The Dependency of Probabilistic Tsunami Hazard Assessment on Magnitude Limits of Seismic Sources in the South China Sea and Adjoining Basins

    NASA Astrophysics Data System (ADS)

    Li, Hongwei; Yuan, Ye; Xu, Zhiguo; Wang, Zongchen; Wang, Juncheng; Wang, Peitao; Gao, Yi; Hou, Jingming; Shan, Di

    2017-06-01

    The South China Sea (SCS) and its adjacent small basins, including the Sulu Sea and Celebes Sea, are commonly identified as a tsunami-prone region based on historical records of seismicity and tsunamis. However, quantification of tsunami hazard in the SCS region has remained an intractable issue due to the highly complex tectonic setting and multiple seismic sources within and surrounding the area. Probabilistic Tsunami Hazard Assessment (PTHA) is performed in the present study to evaluate tsunami hazard in the SCS region based on a brief review of seismological and tsunami records. Five regional and local potential tsunami sources are tentatively identified, and earthquake catalogs are generated using Monte Carlo simulation following the Tapered Gutenberg-Richter relationship for each zone. Considering the lack of consensus on the magnitude upper bound for each seismic source, as well as its critical role in PTHA, the major concern of the present study is to define the upper and lower limits of tsunami hazard in the SCS region comprehensively by adopting different corner magnitudes derived using multiple principles and approaches, including TGR regression of the historical catalog, fault-length scaling, tectonic and seismic moment balance, and repetition of the historical largest event. The results show that tsunami hazard in the SCS and adjoining basins is subject to large variations when adopting different corner magnitudes, with the upper bounds 2-6 times the lower. The probabilistic tsunami hazard maps for specified return periods reveal a much higher threat from the Cotabato Trench and Sulawesi Trench in the Celebes Sea, whereas the tsunami hazard received by the coasts of the SCS and Sulu Sea is relatively moderate, yet non-negligible. By combining empirical methods with numerical study of historical tsunami events, the present PTHA results are tentatively validated. The correspondence lends confidence to our study. Considering the proximity of major sources to population-laden cities
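
    One convenient way to draw magnitudes from the Tapered Gutenberg-Richter law used for such Monte Carlo catalogs: in seismic-moment space the tapered-Pareto survival function factors into a Pareto term and an exponential taper, so the minimum of one draw from each has exactly the right distribution. A sketch with placeholder beta and corner magnitude, not the study's zone parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def mag_to_moment(mw):            # N*m (Hanks & Kanamori)
        return 10 ** (1.5 * mw + 9.05)

    def moment_to_mag(m0):
        return (np.log10(m0) - 9.05) / 1.5

    def sample_tapered_gr(n, beta=0.65, m_min=5.0, m_corner=8.5):
        """Draw magnitudes whose moments follow a tapered Pareto law.

        Survival: S(M) = (M_t/M)^beta * exp((M_t - M)/M_c).  If Y is
        Pareto(beta) with scale M_t and Z is M_t plus an exponential with
        mean M_c, then min(Y, Z) has exactly S.
        """
        mt, mc = mag_to_moment(m_min), mag_to_moment(m_corner)
        y = mt * rng.pareto(beta, n) + mt   # numpy's pareto is Lomax: shift to Pareto
        z = mt + rng.exponential(mc, n)
        return moment_to_mag(np.minimum(y, z))

    mags = sample_tapered_gr(100_000)
    print("fraction above Mw 8.0: %.5f" % (mags > 8.0).mean())
    ```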

  5. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.

  6. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (e.g., Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong-motion records are limited spatially and temporally relative to the areas and time windows hazard maps encompass. We develop a framework to test the predictive power of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and the strong-motion receiver coverage. Using a combination of recorded and interpolated strong-motion records produced through the ShakeMap environment, we compile a record of ground-motion intensity measures for California from 2002 to present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have been or can be produced, this testing procedure is limited by the relatively short lifetime of strong-motion recordings and by the desire to test only with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time-independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that the accuracy of this testing procedure will only improve as more data are collected, or as the time horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.
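
    A minimal sketch of the kind of area-based test described (inspired by Ward, 1995), assuming for simplicity that the observation window equals the map's time window; the synthetic inputs are placeholders:

    ```python
    import numpy as np
    from scipy.stats import binom

    def area_test(map_values, observed_maxima, p_map=0.10):
        """Two-sided binomial test of a PSHA map against observed shaking.

        map_values      : mapped ground motion at each cell (e.g. 10%-in-50-yr PGA)
        observed_maxima : maximum observed/interpolated PGA per cell over the
                          same time window (assumed here to match the map's)
        """
        n = len(map_values)
        k = int((observed_maxima > map_values).sum())  # cells where map was exceeded
        # two-sided p-value for k exceedances out of n at rate p_map
        p_lo = binom.cdf(k, n, p_map)
        p_hi = binom.sf(k - 1, n, p_map)
        return k / n, 2 * min(p_lo, p_hi)

    rng = np.random.default_rng(3)
    map_vals = rng.uniform(0.1, 0.6, 1000)
    obs = map_vals * rng.lognormal(-0.3, 0.4, 1000)    # synthetic "observations"
    frac, pval = area_test(map_vals, obs)
    print(f"observed exceedance fraction: {frac:.3f}, p-value: {pval:.3f}")
    ```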

  7. Impacts of potential seismic landslides on lifeline corridors.

    DOT National Transportation Integrated Search

    2015-02-01

    This report presents a fully probabilistic method for regional seismically induced landslide hazard analysis and mapping. The method considers the most current predictions for strong ground motions and seismic sources through use of the U.S.G.S. ...

  8. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    USGS Publications Warehouse

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  9. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions

    NASA Astrophysics Data System (ADS)

    Vermeulen, Petrus

    2017-04-01

    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and problems therein are therefore often overlooked. Here, many of these problems are identified, and some of them addressed. The parameters that need to be identified are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, not with GMPEs. The Gutenberg-Richter law is usually adopted for the frequency-magnitude law, and a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value; the maximum magnitude at which the Gutenberg-Richter law applies, mmax; and the mean recurrence frequency, λ, of earthquakes. If the "Parametric-Historic procedure" is used instead of the Cornell-McGuire procedure, these parameters do not have to be known before the PSHA computations; they are estimated directly during the PSHA computation. The resulting relation for the frequency of ground-motion vibration parameters has a functional form analogous to the frequency-magnitude law, described by a parameter γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally, the approach could be applied only to simple GMPEs, but the method was recently extended to incorporate more complex forms of GMPEs. With regard to the parameter mmax, there are numerous methods of estimation
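
    For the b-value specifically, the standard maximum-likelihood estimator (Aki, 1965; with Utsu's correction for binned magnitudes) is a one-liner; a sketch on synthetic Gutenberg-Richter data:

    ```python
    import numpy as np

    def b_value_mle(mags, m_c, dm=0.1):
        """Aki (1965) maximum-likelihood b-value, with Utsu's correction for
        magnitudes binned at width dm; returns (b, standard error)."""
        m = np.asarray(mags)
        m = m[m >= m_c]
        b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
        return b, b / np.sqrt(len(m))   # Aki's asymptotic standard error

    rng = np.random.default_rng(0)
    true_b = 1.0
    # continuous G-R magnitudes above completeness Mc = 4.0
    sample = 4.0 + rng.exponential(np.log10(np.e) / true_b, 5000)
    print("b = %.3f +/- %.3f" % b_value_mle(sample, m_c=4.0, dm=0.0))
    ```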

  10. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is based exclusively on active faults and geodynamic finite element input models, with seismic catalogues used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow-deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters together with estimates of its slip rate. By default, in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates, and final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters through corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree, with the mean value at the 10% probability of exceedance in 50 years hazard level; the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which of the input parameters influence the final hazard results, and to what degree. The results of this comparison evidence the deformation model and

  11. Variational Bayesian Inversion of Quasi-Localized Seismic Attributes for the Spatial Distribution of Geological Facies

    NASA Astrophysics Data System (ADS)

    Nawaz, Muhammad Atif; Curtis, Andrew

    2018-04-01

    We introduce a new Bayesian inversion method that estimates the spatial distribution of geological facies from attributes of seismic data, showing how the usual probabilistic inverse problem can be solved within an optimization framework while still providing fully probabilistic results. Our mathematical model treats the seismic attributes as observed data, which are assumed to have been generated by the geological facies. The method infers the post-inversion (posterior) probability density of the facies, plus some other unknown model parameters, from the seismic attributes and geological prior information. Most previous research in this domain is based on the localized-likelihoods assumption, whereby the seismic attributes at a location are assumed to depend on the facies only at that location. Such an assumption is unrealistic because of imperfect seismic data acquisition and processing, and fundamental limitations of seismic imaging methods. In this paper we relax this assumption: we allow probabilistic dependence between the seismic attributes at a location and the facies in any neighbourhood of that location through a spatial filter. We term such likelihoods quasi-localized.

  12. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C.

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system that includes consideration of the aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has been engaged in a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this collaboration is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work, and was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider a further degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined in a non-perfectly correlated manner. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 tasks, a simulation method was applied using optimum Latin Hypercube samples to characterize the deterioration of the fragility capacity as a function of age-related degradations. The results are summarized in
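
    The report's fragility simulation uses optimum Latin Hypercube samples; as a generic illustration only (not the BNL implementation), a basic Latin Hypercube sampler propagating three notional degradation variables into a capacity estimate, with placeholder medians and lognormal betas:

    ```python
    import numpy as np
    from scipy.stats import norm

    def latin_hypercube(n_samples, n_vars, rng):
        """Stratified uniform [0,1) samples: one per stratum, shuffled per column."""
        u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
        for j in range(n_vars):
            rng.shuffle(u[:, j])
        return u

    rng = np.random.default_rng(11)
    u = latin_hypercube(1000, 3, rng)

    # map strata to lognormal degradation factors (placeholder medians/betas)
    shell    = np.exp(norm.ppf(u[:, 0]) * 0.10 + np.log(0.95))  # shell thinning
    bolts    = np.exp(norm.ppf(u[:, 1]) * 0.15 + np.log(0.90))  # degraded anchor bolts
    concrete = np.exp(norm.ppf(u[:, 2]) * 0.20 + np.log(0.92))  # cracked anchorage

    capacity_g = 0.8 * shell * bolts * concrete  # notional undegraded 0.8 g capacity
    print("median capacity: %.3f g, 1%% quantile: %.3f g"
          % (np.median(capacity_g), np.quantile(capacity_g, 0.01)))
    ```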

  13. A New Insight into Probabilistic Seismic Hazard Analysis for Central India

    NASA Astrophysics Data System (ADS)

    Mandal, H. S.; Shukla, A. K.; Khan, P. K.; Mishra, O. P.

    2013-12-01

    The Son-Narmada-Tapti lineament and its surroundings in Central India (CI) constitute the second most important tectonic regime after the converging margin along the Himalayas-Myanmar-Andaman of the Indian subcontinent, and have attracted several geoscientists to assess the region's seismic hazard potential. Our study area, a part of CI, is bounded between latitudes 18°-26°N and longitudes 73°-83°E, representing a stable part of Peninsular India. Past damaging moderate-magnitude earthquakes as well as continuing microseismicity in the area provide enough data for seismological study. Our estimates based on the regional Gutenberg-Richter relationship showed b values (between 0.68 and 0.76) lower than the average for the study area. The probabilistic seismic hazard analysis carried out over the area within a radius of ~300 km encircling Bhopal yielded a conspicuous relationship between earthquake return period (T) and peak ground acceleration (PGA). Analysis of T and PGA shows that the PGA value at bedrock varies from 0.08 to 0.15 g for 10% (T = 475 years) and 2% (T = 2,475 years) probabilities of exceedance in 50 years, respectively. We establish empirical relationships between zero-period acceleration (ZPA) and shear-wave velocity down to a depth of 30 m [Vs(30)] for the two return periods. These demonstrate that ZPA values decrease with increasing shear-wave velocity, suggesting a diagnostic indicator for designing structures at a specific site of interest. The predictive design response spectra generated at a site for periods up to 4.0 s at 10 and 2% probability of exceedance of ground motion in 50 years can be used for designing duration-dependent structures of variable vertical dimension. We infer that this concept of assimilating uniform hazard response spectra and predictive design at 10 and 2% probability of exceedance in 50 years at 5% damping at bedrocks of different categories may offer potential inputs for designing earthquake resistant

  14. Long-period amplification in deep alluvial basins and consequences for site-specific probabilistic seismic-hazard: the case of Castelleone in the Po Plain (Northern Italy)

    NASA Astrophysics Data System (ADS)

    Barani, S.; Mascandola, C.; Massa, M.; Spallarossa, D.

    2017-12-01

    The recent Emilia seismic sequence (Northern Italy), which occurred at the end of the first half of 2012 with a main shock of Mw 6.1, highlighted the importance of studying site effects in the Po Plain, the largest and deepest sedimentary basin in Italy. As has long been known, long-period amplification related to deep sedimentary basins can significantly affect the characteristics of the ground motion induced by strong earthquakes. It follows that the effects of deep sedimentary deposits on ground shaking require special attention during the definition of the design seismic action. The work presented here analyzes the impact of deep-soil discontinuities on ground-motion amplification, with particular focus on long-period probabilistic seismic-hazard assessment. The study focuses on the site of Castelleone, where a seismic station of the Italian National Seismic Network has been recording since 2009. Our study includes both experimental and numerical site response analyses. Specifically, extensive active and passive geophysical measurements were carried out in order to define a detailed shear-wave velocity (VS) model to be used in the numerical analyses, which are needed to assess the site-specific ground-motion hazard. Besides classical seismic refraction profiles and multichannel analysis of surface waves, we analyzed ambient vibration measurements in both single-station and array configurations. The VS profile was determined via joint inversion of the experimental phase-velocity dispersion curve with the ellipticity curve derived from horizontal-to-vertical spectral ratios. The profile shows two main discontinuities, at depths of around 160 and 1350 m, respectively. The probabilistic site-specific hazard was assessed in terms of both spectral acceleration and displacement. A partially non-ergodic approach was adopted. We have found that the spectral acceleration hazard is barely sensitive to long-period (up to 10 s) amplification related to the deeper discontinuity, whereas the

  15. Seismic and geodetic signatures of fault slip at the Slumgullion Landslide Natural Laboratory

    USGS Publications Warehouse

    Gomberg, J.; Schulz, W.; Bodin, P.; Kean, J.

    2011-01-01

    We tested the hypothesis that the Slumgullion landslide is a useful natural laboratory for observing fault slip, specifically that slip along its basal surface and side-bounding strike-slip faults occurs with comparable richness of aseismic and seismic modes as along crustal- and plate-scale boundaries. Our study provides new constraints on models governing landslide motion. We monitored landslide deformation with temporary deployments of a 29-element prism array surveyed by a robotic theodolite and an 88-station seismic network that complemented permanent extensometers and environmental instrumentation. Aseismic deformation observations show that large blocks of the landslide move steadily at approximately centimeters per day, possibly punctuated by variations of a few millimeters, while localized transient slip episodes of blocks less than a few tens of meters across occur frequently. We recorded a rich variety of seismic signals, nearly all of which originated outside the monitoring network boundaries or from the side-bounding strike-slip faults. The landslide basal surface beneath our seismic network likely slipped almost completely aseismically. Our results provide independent corroboration of previous inferences that dilatant strengthening along sections of the side-bounding strike-slip faults controls the overall landslide motion, acting as seismically radiating brakes that limit acceleration of the aseismically slipping basal surface. Dilatant strengthening has also been invoked in recent models of transient slip and tremor sources along crustal- and plate-scale faults suggesting that the landslide may indeed be a useful natural laboratory for testing predictions of specific mechanisms that control fault slip at all scales.

  16. Validating induced seismicity forecast models—Induced Seismicity Test Bench

    NASA Astrophysics Data System (ADS)

    Király-Proag, Eszter; Zechar, J. Douglas; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph

    2016-08-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). These models incorporate a different mix of physics-based elements and stochastic representation of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in but is only mediocre at forecasting the spatial distribution. On the other hand, SaSS forecasts the spatial distribution better and gives better seismicity rate estimates before shut-in. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in.
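
    One simple way such a test bench can score competing rate forecasts is the joint Poisson log-likelihood over space-time bins (the actual test bench uses a richer set of metrics); a sketch with synthetic data:

    ```python
    import numpy as np
    from scipy.special import gammaln

    def poisson_loglik(forecast_rates, observed_counts):
        """Joint Poisson log-likelihood of observed bin counts given forecast rates."""
        lam = np.clip(np.asarray(forecast_rates, float), 1e-12, None)
        k = np.asarray(observed_counts, float)
        return float(np.sum(k * np.log(lam) - lam - gammaln(k + 1)))

    rng = np.random.default_rng(5)
    truth = rng.uniform(0.1, 3.0, 200)                # unknown true rates per bin
    obs = rng.poisson(truth)
    model_a = truth * rng.lognormal(0.0, 0.3, 200)    # two synthetic "models"
    model_b = np.full(200, obs.mean())
    print("model A:", round(poisson_loglik(model_a, obs), 1))
    print("model B:", round(poisson_loglik(model_b, obs), 1))
    ```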

  17. Stress-dependent elastic properties of shales—laboratory experiments at seismic and ultrasonic frequencies

    NASA Astrophysics Data System (ADS)

    Szewczyk, Dawid; Bauer, Andreas; Holt, Rune M.

    2018-01-01

    Knowledge of the stress sensitivity of the elastic properties and velocities of shales is important for the interpretation of seismic time-lapse data acquired as part of reservoir and caprock surveillance of both unconventional and conventional oil and gas fields (e.g. during 4-D monitoring of CO2 storage). Rock physics models are often developed from laboratory measurements at ultrasonic frequencies. However, as shown previously, shales exhibit large seismic dispersion, and it is possible that the stress sensitivities of velocities are also frequency dependent. In this work, we report on a series of seismic and ultrasonic laboratory tests in which the stress sensitivity of the elastic properties of Mancos shale and Pierre shale I was investigated. The shales were tested at different water saturations. Dynamic rock engineering parameters and elastic wave velocities were examined on core plugs exposed to isotropic loading. Experiments were carried out in an apparatus allowing static-compaction and dynamic measurements at seismic and ultrasonic frequencies within a single test. For both shale types, we present and discuss experimental results that demonstrate dispersion and stress sensitivity of the rock stiffness, as well as of P- and S-wave velocities and stiffness anisotropy. Our experimental results show that the stress sensitivity of shales differs at seismic and ultrasonic frequencies, which can be linked with simultaneously occurring changes in dispersion with applied stress. The measured stress sensitivity of elastic properties for relatively dry samples was higher at seismic frequencies; however, increasing saturation of the shales decreases the difference between seismic and ultrasonic stress sensitivities, and for moist samples the stress sensitivity is higher at ultrasonic frequencies. At the same time, increased saturation greatly increases the dispersion in shales. We have also found that the stress sensitivity is highly anisotropic in both shales and that in

  18. Laboratory meter-scale seismic monitoring of varying water levels in granular media

    NASA Astrophysics Data System (ADS)

    Pasquet, S.; Bodet, L.; Bergamo, P.; Guérin, R.; Martin, R.; Mourgues, R.; Tournat, V.

    2016-12-01

    Laboratory physical modelling and non-contacting ultrasonic techniques are frequently proposed to tackle theoretical and methodological issues related to geophysical prospecting. Following recent developments illustrating the ability of seismic methods to image spatial and/or temporal variations of water content in the vadose zone, we developed laboratory experiments aimed at testing the sensitivity of seismic measurements (i.e., pressure-wave travel times and surface-wave phase velocities) to water saturation variations. Ultrasonic techniques were used to simulate typical seismic acquisitions on small-scale controlled granular media presenting different water levels. Travel-time and phase-velocity measurements obtained in the dry state were validated with both theoretical models and numerical simulations and serve as reference datasets. The increasing water level clearly affects the recorded wave field in both its phase and amplitude, but the collected data cannot yet be inverted in the absence of a comprehensive theoretical model for such partially saturated, unconsolidated granular media. The differences in travel time and phase velocity observed between the dry and wet models show patterns that coincide with the observed water level and the depth of the capillary fringe, offering attractive perspectives for studying soil water content variations in the field.

  19. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the subsequent damaging earthquakes in Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
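
    A fixed-bandwidth Gaussian-kernel version of the smoothed-seismicity rate model described above (the cited method uses more sophisticated, e.g. adaptive, kernels); the epicentres and bandwidth are placeholders:

    ```python
    import numpy as np

    def smoothed_rate_grid(epicenters_km, grid_x, grid_y, bandwidth_km=20.0,
                           years=100.0):
        """Annual earthquake rate density (events/yr/km^2) on a grid, from
        Gaussian kernel smoothing of catalog epicentres."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        rate = np.zeros_like(gx)
        h2 = bandwidth_km ** 2
        for ex, ey in epicenters_km:
            d2 = (gx - ex) ** 2 + (gy - ey) ** 2
            rate += np.exp(-0.5 * d2 / h2) / (2 * np.pi * h2)
        return rate / years

    rng = np.random.default_rng(2)
    quakes = rng.normal(0, 30, size=(120, 2))         # synthetic epicentres, km
    x = y = np.linspace(-100, 100, 81)
    grid = smoothed_rate_grid(quakes, x, y)
    # integrating the grid should recover ~120 events / 100 yr (minus edge leakage)
    print("total rate check (events/yr): %.2f" % (grid.sum() * (x[1] - x[0]) ** 2))
    ```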

  20. Fault2SHA- A European Working group to link faults and Probabilistic Seismic Hazard Assessment communities in Europe

    NASA Astrophysics Data System (ADS)

    Scotti, Oona; Peruzza, Laura

    2016-04-01

    The key questions we ask are: What is the best strategy to fill the gap in knowledge and know-how in Europe when considering faults in seismic hazard assessments? Are field geologists providing the relevant information for seismic hazard assessment? Are seismic hazard analysts interpreting field data appropriately? Is the full range of uncertainties associated with the characterization of faults correctly understood and propagated through the computations? How can fault modellers contribute to a better representation of the long-term behaviour of fault networks in seismic hazard studies? Providing answers to these questions is fundamental in order to reduce the consequences of future earthquakes and improve the reliability of seismic hazard assessments. An informal working group was thus created at a meeting in Paris in November 2014, partly financed by the Institute of Radioprotection and Nuclear Safety, with the aim of motivating exchanges between field geologists, fault modellers, and seismic hazard practitioners. A variety of approaches were presented at the meeting, and a clear gap emerged between some field geologists, who are not necessarily familiar with probabilistic seismic hazard assessment methods and needs, and practitioners, who do not necessarily propagate the full uncertainty associated with the characterization of faults. The group thus decided to meet again a year later in Chieti (Italy) to share concepts and ideas through a specific exercise on a test case study. Some solutions emerged, but many problems of seismic source characterization remained, both for people working in the field and for people tackling models of interacting faults. Now, in Vienna, we want to open the group and launch a call for the European community at large to contribute to the discussion. The 2016 EGU session Fault2SHA is motivated by this urgency to increase the number of round tables on this topic and to debate the peculiarities of using faults in seismic hazard

  1. Seismogenic zones and attenuation laws for probabilistic seismic hazard assessment in low deformation areas

    NASA Astrophysics Data System (ADS)

    Le Goff, Boris

    Seismic Hazard Analysis (PSHA), rather than the subjective methodologies that are currently used. This study focuses particularly on the definition of seismic sources, through seismotectonic zoning, and on the determination of historical earthquake locations. An important step in Probabilistic Seismic Hazard Analysis consists of defining the seismic source model. Such a model expresses the association of seismicity characteristics with the tectonically active geological structures evidenced by seismotectonic studies. Given that most faults in low-seismicity regions are not characterized well enough, source models are generally defined as areal zones, delimited by finite boundary polygons, within which the seismicity and geological features are deemed homogeneous (e.g., focal depth, seismicity rate). Besides the lack of data (a short period of instrumental seismicity), such a method generates several problems for regions with low seismic activity: 1) resulting hazard maps are highly sensitive to the location of zone boundaries, yet these boundaries are set by expert decision; 2) the zoning cannot represent any variability or structural complexity in seismic parameters; 3) the seismicity rate is distributed throughout the zone, and the location of the determinant information used for its calculation is lost. We investigate an alternative approach to modelling the seismotectonic zoning, with three main objectives: 1) obtaining a reproducible method that 2) preserves the information on the sources and the extent of the uncertainties, so as to allow them to be propagated (through ground motion prediction equations) onto the hazard maps, and that 3) redefines the seismic source concept to better represent our knowledge of seismogenic structures and clustering. To do so, Bayesian methods are favored. First, a generative model with two zones, differentiated by two different surface activity rates, was developed, creating synthetic catalogs drawn
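
    A minimal sketch of such a two-zone generative model: each zone has its own surface activity rate, and synthetic catalogs are drawn as Poisson counts with uniform locations (the geometry and rates are placeholders, not the thesis values):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # two rectangular zones (x0, x1, y0, y1) with different rates (events/yr/km^2)
    zones = [dict(box=(0, 100, 0, 100), rate=2e-4),
             dict(box=(100, 200, 0, 100), rate=8e-4)]

    def synthetic_catalog(years=500.0):
        """One synthetic catalog: Poisson counts per zone, uniform locations."""
        events = []
        for z in zones:
            x0, x1, y0, y1 = z["box"]
            area = (x1 - x0) * (y1 - y0)
            n = rng.poisson(z["rate"] * area * years)
            xs = rng.uniform(x0, x1, n)
            ys = rng.uniform(y0, y1, n)
            events.append(np.column_stack([xs, ys]))
        return np.vstack(events)

    cat = synthetic_catalog()
    print(f"{len(cat)} events; east/west split: "
          f"{(cat[:, 0] >= 100).sum()} / {(cat[:, 0] < 100).sum()}")
    ```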

  2. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing the varied seismogenic sources found therein into OpenQuake's own definitions. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological, and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling: which elements of the fault source model impact the hazard at a site most, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  3. Lawrence Livermore National Laboratory Site Seismic Safety Program: Summary of Findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savy, J B; Foxall, W

    The Lawrence Livermore National Laboratory (LLNL) Site Seismic Safety Program was conceived in 1979 during the preparation of the site Draft Environmental Impact Statement. The impetus for the program came from the development of new methodologies and geologic data that affect assessments of geologic hazards at the LLNL site; it was designed to develop a new assessment of the seismic hazard to the LLNL site and LLNL employees. Secondarily, the program was also intended to provide the technical information needed to make ongoing decisions about design criteria for future construction at LLNL and about the adequacy of existing facilities. This assessment was intended to be of the highest technical quality and to make use of the most recent and accepted hazard assessment methodologies. The basic purposes and objectives of the current revision are similar to those of the previous studies. Although all the data and experience assembled in the previous studies were utilized to their fullest, the large quantity of new information and new methodologies led to the formation of a new team that includes LLNL staff and outside consultants from academia and private consulting firms. A peer-review panel composed of individuals from academia (A. Cornell, Stanford University), the Department of Energy (DOE; Jeff Kimball), and consulting (Kevin Coppersmith) provided review and guidance. This panel was involved from the beginning of the project in a ''participatory'' type of review. The Senior Seismic Hazard Analysis Committee (SSHAC, a committee sponsored by the U.S. Nuclear Regulatory Commission, DOE, and the Electric Power Research Institute) strongly recommends the use of participatory reviews, in which the reviewers follow the progress of a project from the beginning rather than waiting until the end to provide comments (Budnitz et al., 1997). Following the requirements for probabilistic seismic hazard analysis (PSHA) stipulated in the DOE standard DOE-STD-1023-95, a

  4. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects—where the consequences of failure are more serious, such as dams and chemical plants—it is more usual to obtain the seismic-design loads from a site-specific PSHA, generally using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
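
    Reading a design value off a hazard curve ties these pieces together: the target annual frequency of exceedance follows from the Poisson assumption, and the curve is interpolated (typically in log-log space, where it is near-linear). A sketch with a synthetic hazard curve, not CRISIS output:

    ```python
    import numpy as np

    # synthetic hazard curve: annual frequency of exceedance vs PGA (g)
    pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.8])
    afe = np.array([2e-1, 6e-2, 1e-2, 3e-3, 5e-4, 5e-5])

    def design_pga(p_exceed=0.10, t_years=50.0):
        """Design PGA whose annual exceedance frequency matches the target
        probability of exceedance in t_years (Poisson occurrence assumed)."""
        target_afe = -np.log(1.0 - p_exceed) / t_years  # ~1/475 per yr for 10%-in-50
        # interpolate in log-log space (np.interp needs increasing x: reverse arrays)
        return float(np.exp(np.interp(np.log(target_afe), np.log(afe[::-1]),
                                      np.log(pga[::-1]))))

    print("475-yr design PGA:  %.3f g" % design_pga())
    print("2475-yr design PGA: %.3f g" % design_pga(0.02))
    ```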

  5. Investigating Brittle Rock Failure and Associated Seismicity Using Laboratory Experiments and Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Zhao, Qi

    Rock failure is a complex process that involves elastic and plastic deformation, microscopic cracking, macroscopic fracturing, and frictional slipping of fractures. Understanding this complex behaviour has been the focus of a significant amount of research. In this work, the combined finite-discrete element method (FDEM) was first employed to study (1) the influence of rock discontinuities on hydraulic fracturing and associated seismicity and (2) the influence of in-situ stress on seismic behaviour. Simulated seismic events were analyzed using post-processing tools including frequency-magnitude distribution (b-value), spatial fractal dimension (D-value), seismic rate, and fracture clustering. These simulations demonstrated that at the local scale, fractures tend to propagate following the rock mass discontinuities, while at reservoir scale they develop in the direction parallel to the maximum in-situ stress. Moreover, the seismic signature (i.e., b-value, D-value, and seismic rate) can help to distinguish different phases of the failure process. The FDEM modelling technique and the developed analysis tools were then coupled with laboratory experiments to further investigate the different phases of the progressive rock failure process. Firstly, a uniaxial compression experiment, monitored using a time-lapse ultrasonic tomography method, was carried out and reproduced by the numerical model. Using this combination of technologies, the entire deformation and failure processes were studied at macroscopic and microscopic scales. The results not only illustrated the rock failure and seismic behaviours at different stress levels, but also suggested several precursory behaviours indicating the catastrophic failure of the rock. Secondly, rotary shear experiments were conducted using a newly developed rock physics experimental apparatus (ERDμ-T) that was paired with X-ray micro-computed tomography (μCT). This combination of technologies has significant advantages

  6. Ultrasonic laboratory measurements of the seismic velocity changes due to CO2 injection

    NASA Astrophysics Data System (ADS)

    Park, K. G.; Choi, H.; Park, Y. C.; Hwang, S.

    2009-04-01

    Monitoring the behavior and movement of carbon dioxide (CO2) in the subsurface is quite important in the sequestration of CO2 in geological formations, because such information provides a basis for demonstrating the safety of CO2 sequestration. Several recent applications in commercial and pilot-scale projects show that 4D surface or borehole seismic methods are among the most promising techniques for this purpose. However, information interpreted from seismic velocity changes can be quite subjective and qualitative without petrophysical characterization of the effect of CO2 saturation on those changes, since seismic wave velocity depends on various factors and parameters such as mineralogical composition, hydrogeological factors, and in-situ conditions. In this respect, we have developed an ultrasonic laboratory measurement system and have carried out measurements on a porous sandstone sample to characterize the effects of CO2 injection on seismic velocity and amplitude. Measurements are made by ultrasonic piezoelectric transducers mounted on both ends of a cylindrical core sample under various pressure, temperature, and saturation conditions. According to our experiments, injected CO2 causes a decrease in seismic velocity and amplitude. We identified that the velocity decreases by about 6% or more until the sample is fully saturated by CO2, and that the attenuation of seismic amplitude is more drastic than the velocity decrease. We also identified that Vs/Vp, or the elastic modulus, is more sensitive to CO2 saturation. We note that this means changes in seismic amplitude and elastic modulus can be an alternative target anomaly for seismic techniques in CO2 sequestration monitoring. Thus, we expect that more quantitative petrophysical relationships between the changes of seismic attributes and CO2 concentration can be estimated through further research, providing a basis for the quantitative assessment of CO2 sequestration.

  7. Exploring uncertainties in probabilistic seismic hazard estimates for Quito

    NASA Astrophysics Data System (ADS)

    Beauval, Celine; Yepes, Hugo; Audin, Laurence; Alvarado, Alexandra; Nocquet, Jean-Mathieu

    2016-04-01

    In the present study, probabilistic seismic hazard estimates at a 475-year return period for Quito, capital city of Ecuador, show that the crustal host zone is the only source zone that determines the city's hazard levels for such a return period. Therefore, the emphasis is put on identifying the uncertainties characterizing the host zone, i.e. uncertainties in the recurrence of earthquakes expected in the zone and uncertainties in the ground motions that these earthquakes may produce. As the number of local strong-motion records is still scant, ground-motion prediction equations are imported from other regions. Exploring recurrence models for the host zone based on different observations and assumptions, and including three GMPE candidates (Akkar and Bommer 2010, Zhao et al. 2006, Boore and Atkinson 2008), we obtain a significant variability in the estimated acceleration at 475 years (site coordinates: -78.51° longitude, -0.2° latitude; VS30 760 m/s): 1) Considering historical earthquake catalogs, and relying on frequency-magnitude distributions where rates for magnitudes 6-7 are extrapolated from statistics of magnitudes 4.5-6.0 mostly in the 20th century, the PGA varies between 0.28 g and 0.55 g, with a mean value around 0.4 g. The results show that both the uncertainties in the GMPE choice and in the seismicity model are responsible for this variability. 2) Considering slip rates inferred from geodetic measurements across the Quito fault system, and assuming that most of the deformation occurs seismically (a conservative hypothesis), leads to a much greater range of accelerations, 0.43 to 0.73 g for the PGA (with a mean of 0.55 g). 3) Considering slip rates inferred from geodetic measurements, and assuming that only 50% of the deformation is released in earthquakes (partially locked fault, a model based on 15 years of GPS data), leads to a range of accelerations of 0.32 to 0.58 g for the PGA, with a mean of 0.42 g. These accelerations are in agreement
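
    A sketch of the logic behind options 2) and 3): a geodetic slip rate is converted to a seismic moment rate (scaled by a seismic coupling fraction) and balanced against the moment of a characteristic earthquake to yield a mean recurrence time. All numbers here are placeholders, not values from the study:

    ```python
    MU = 3.0e10  # crustal shear modulus, Pa

    def characteristic_recurrence(slip_rate_mm_yr, fault_area_km2, mw_char,
                                  coupling=1.0):
        """Mean recurrence time (yr) of a characteristic Mw event releasing
        the seismic fraction of the geodetic moment accumulation."""
        moment_rate = MU * fault_area_km2 * 1e6 * slip_rate_mm_yr * 1e-3 * coupling
        m0_char = 10 ** (1.5 * mw_char + 9.05)       # N*m (Hanks & Kanamori)
        return m0_char / moment_rate

    # placeholder slip rate, fault area, and characteristic magnitude
    for c in (1.0, 0.5):  # fully vs. 50% seismically coupled, as in options 2) and 3)
        t = characteristic_recurrence(4.0, fault_area_km2=900.0, mw_char=6.5,
                                      coupling=c)
        print(f"coupling {c:.0%}: recurrence ~ {t:.0f} yr")
    ```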

  8. Seismic hazard assessment for Guam and the Northern Mariana Islands

    USGS Publications Warehouse

    Mueller, Charles S.; Haller, Kathleen M.; Luco, Nicholas; Petersen, Mark D.; Frankel, Arthur D.

    2012-01-01

    We present the results of a new probabilistic seismic hazard assessment for Guam and the Northern Mariana Islands. The Mariana island arc has formed in response to northwestward subduction of the Pacific plate beneath the Philippine Sea plate, and this process controls seismic activity in the region. Historical seismicity, the Mariana megathrust, and two crustal faults on Guam were modeled as seismic sources, and ground motions were estimated by using published relations for a firm-rock site condition. Maps of peak ground acceleration, 0.2-second spectral acceleration for 5 percent critical damping, and 1.0-second spectral acceleration for 5 percent critical damping were computed for exceedance probabilities of 2 percent and 10 percent in 50 years. For 2 percent probability of exceedance in 50 years, probabilistic peak ground acceleration is 0.94 gravitational acceleration at Guam and 0.57 gravitational acceleration at Saipan, 0.2-second spectral acceleration is 2.86 gravitational acceleration at Guam and 1.75 gravitational acceleration at Saipan, and 1.0-second spectral acceleration is 0.61 gravitational acceleration at Guam and 0.37 gravitational acceleration at Saipan. For 10 percent probability of exceedance in 50 years, probabilistic peak ground acceleration is 0.49 gravitational acceleration at Guam and 0.29 gravitational acceleration at Saipan, 0.2-second spectral acceleration is 1.43 gravitational acceleration at Guam and 0.83 gravitational acceleration at Saipan, and 1.0-second spectral acceleration is 0.30 gravitational acceleration at Guam and 0.18 gravitational acceleration at Saipan. The dominant hazard source at the islands is upper Benioff-zone seismicity (depth 40–160 kilometers). The large probabilistic ground motions reflect the strong concentrations of this activity below the arc, especially near Guam.
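
    As a quick aside (not part of the assessment itself), the quoted probability levels map to Poisson return periods in a few lines of Python:

      import math

      def return_period(prob, years):
          """Return period T such that prob = 1 - exp(-years / T)."""
          return -years / math.log(1.0 - prob)

      for p, t in ((0.02, 50), (0.10, 50)):
          print(f"{p:.0%} in {t} yr -> return period ~ {return_period(p, t):,.0f} yr")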

  9. Mean and modal ϵ in the deaggregation of probabilistic ground motion

    USGS Publications Warehouse

    Harmsen, Stephen C.

    2001-01-01

    Mean and modal ϵ exhibit a wide variation geographically for any specified PE. Modal ϵ for the 2% in 50 yr PE exceeds 2 near the most active western California faults, is less than –1 near some less active faults of the western United States (principally in the Basin and Range), and may be less than 0 in areal fault zones of the central and eastern United States (CEUS). This geographic variation is useful for comparing probabilistic ground motions with ground motions from scenario earthquakes on dominating faults, often used in seismic-resistant provisions of building codes. An interactive seismic-hazard deaggregation menu item has been added to the USGS probabilistic seismic-hazard analysis Web site, http://geohazards.cr.usgs.gov/eq/, allowing visitors to compute mean and modal distance, magnitude, and ϵ corresponding to ground motions having mean return times from 250 to 5000 yr for any site in the United States.
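
    For readers unfamiliar with ϵ: it is the number of logarithmic standard deviations by which a target ground motion exceeds the GMPE median for a given magnitude-distance pair. A toy computation with invented values:

      import math

      target_pga_g = 0.40   # map-level ground motion (g), assumed
      median_pga_g = 0.25   # GMPE median for a given (M, R), assumed
      sigma_ln = 0.6        # GMPE log-standard deviation, assumed

      eps = (math.log(target_pga_g) - math.log(median_pga_g)) / sigma_ln
      print(f"epsilon = {eps:.2f}")  # ~0.78: target lies <1 sigma above the median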

  10. Seismically induced landslides: current research by the US Geological Survey.

    USGS Publications Warehouse

    Harp, E.L.; Wilson, R.C.; Keefer, D.K.; Wieczorek, G.F.

    1986-01-01

    We have produced a regional seismic slope-stability map and a probabilistic prediction of landslide distribution from a postulated earthquake. For liquefaction-induced landslides, in situ measurements of seismically induced pore-water pressures have been used to establish an elastic model of pore pressure generation. -from Authors

  11. Why aftershock duration matters for probabilistic seismic hazard assessment

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2018-01-01

    Most hazard assessments assume that high background seismicity rates indicate a higher probability of large shocks and, therefore, of strong shaking. However, in slowly deforming regions, such as eastern North America, Australia, and inner Honshu, this assumption breaks down if the seismicity clusters are instead aftershocks of historic and prehistoric mainshocks. We therefore probe the circumstances under which aftershocks can last for 100-1000 years. Basham and Adams (1983) and Ebel et al. (2000) proposed that intraplate seismicity in eastern North America could be aftershocks of mainshocks that struck hundreds of years beforehand, a view consonant with rate-state friction (Dieterich, 1994), in which aftershock duration varies inversely with fault-stressing rate. To test these hypotheses, we estimate aftershock durations of the 2011 Mw 9 Tohoku-Oki rupture at 12 sites up to 250 km from the source, as well as for the near-fault aftershocks of eight large Japanese mainshocks, sampling faults slipping 0.01 to 80 mm/yr. Whereas aftershock productivity increases with mainshock magnitude, we find that aftershock duration, the time until the aftershock rate decays to the premainshock rate, does not. Instead, aftershock sequences lasted a month on the fastest-slipping faults and are projected to persist for more than 2000 years on the slowest. Thus, long aftershock sequences can misguide and inflate hazard assessments in intraplate regions if misinterpreted as background seismicity, whereas areas between seismicity clusters may instead harbor a higher chance of large mainshocks, the opposite of what is being assumed today.
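
    The paper's operational definition of duration (time for the aftershock rate to decay back to the pre-mainshock rate) can be illustrated with a modified Omori law; this is our substitution for illustration, not the authors' rate-state formulation, and the productivity, c, p, and background rates below are invented:

      # Duration = time t at which K / (t + c)^p falls to the background rate.
      def duration(K, background, c=0.1, p=1.0):
          return (K / background) ** (1.0 / p) - c

      K = 1000.0  # aftershock productivity (events/yr at ~1 yr), assumed
      # Faster-loaded faults carry higher background rates, hence shorter durations.
      for background in (5000.0, 10.0, 0.5):  # events/yr, assumed
          print(f"background={background:7.1f}/yr -> duration ~ {duration(K, background):7.1f} yr")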

  12. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    NASA Astrophysics Data System (ADS)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-01

    In this paper a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is performed. Tehran is the capital and most populated city of Iran. From economic, political and social points of view, Tehran is the most significant city of Iran. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city for the Arias intensity parameter is useful. Iso-intensity contour line maps of Tehran on the basis of different attenuation relationships for different earthquake periods are plotted. Maps of iso-intensity points in the Tehran region are presented using attenuation relationships appropriate for rock and soil beds for 2 hazard levels of 10% and 2% in 50 years. Seismicity parameters on the basis of historical and instrumental earthquakes, for a time period that begins in the 4th century BC and ends at the present time, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used. The SEISRISK III software has been employed. Effects of different parameters such as seismicity parameters, fault rupture length relationships and attenuation relationships are considered using a logic tree.

  13. Laboratory investigations of seismicity caused by iceberg calving and capsize

    NASA Astrophysics Data System (ADS)

    Cathles, L. M. M., IV; Kaluzienski, L. M.; Burton, J. C.

    2015-12-01

    The calving and capsize of cubic-kilometer-sized icebergs in both Greenland and Antarctica are known to be the source of long-period seismic events classified as glacial earthquakes. The ability to monitor both calving events and the mass of ice calved using the Global Seismographic Network is quite attractive; however, the basic physics of these large calving events must be understood to develop a robust relationship between seismic magnitude and mass of ice calved. The amplitude and duration of the seismic signal are expected to be related to the mass of the calved iceberg and the magnitude of the acceleration of the iceberg's center of mass, yet a simple relationship between these quantities has proved difficult to develop from in situ observations or numerical models. To address this, we developed and carried out a set of experiments on a laboratory-scale model of iceberg calving. These experiments were designed to measure several aspects of the post-fracture calving process. Our results show that a combination of mechanical contact forces and hydrodynamic pressure forces is generated by the capsize of an iceberg adjacent to a glacier's terminus. These forces combine to produce the net horizontal centroid single force (CSF) which is often used to model glacial earthquake sources. We find that although the amplitude and duration of the force applied to the terminus generally increase with the iceberg mass, the details depend on the geometry of the iceberg and the depth of the water. The resulting seismic signal is thus crucially dependent on the hydrodynamics of the capsize process.

  14. How well can we test probabilistic seismic hazard maps?

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart

    2017-04-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with an assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T (i.e., increasing observation time) and with increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. This scatter is due to the variability of earthquake
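
    The verification described here is easy to reproduce in miniature (a toy sketch, not the authors' code): draw exponential first-exceedance times at many sites, tabulate the fraction that exceed the map level within the observation window, and compare with p = 1 - exp(-t/T); the site and history counts are arbitrary.

      import math, random

      random.seed(0)
      T = 2475.0    # map return period (yr)
      t = 50.0      # observation window (yr)
      n_sites, n_histories = 200, 2000

      fractions = []
      for _ in range(n_histories):
          exceeded = sum(random.expovariate(1.0 / T) <= t for _ in range(n_sites))
          fractions.append(exceeded / n_sites)

      mean_frac = sum(fractions) / len(fractions)
      print(f"simulated mean fraction = {mean_frac:.4f}")
      print(f"theory 1 - exp(-t/T)    = {1 - math.exp(-t / T):.4f}")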

  15. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    USGS Publications Warehouse

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.

  16. USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.

    2000-01-01

    The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of prehistoric earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations. We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and

  17. Expected Seismicity and the Seismic Noise Environment of Europa

    NASA Astrophysics Data System (ADS)

    Panning, Mark P.; Stähler, Simon C.; Huang, Hsin-Hua; Vance, Steven D.; Kedar, Sharon; Tsai, Victor C.; Pike, William T.; Lorenz, Ralph D.

    2018-01-01

    Seismic data will be a vital geophysical constraint on the internal structure of Europa if we land instruments on the surface. Quantifying expected seismic activity on Europa, both in terms of large, recognizable signals and ambient background noise, is important for understanding the dynamics of the moon, as well as for the interpretation of potential future data. Seismic energy sources will likely include cracking in the ice shell and turbulent motion in the oceans. We define a range of models of seismic activity in Europa's ice shell by assuming each model follows a Gutenberg-Richter relationship with varying parameters. A range of cumulative seismic moment release between 10¹⁶ and 10¹⁸ Nm/yr is defined by scaling tidal dissipation energy to tectonic events on the Earth's moon. Random catalogs are generated and used to create synthetic continuous noise records through numerical wave propagation in thermodynamically self-consistent models of the interior structure of Europa. Spectral characteristics of the noise are calculated by determining probabilistic power spectral densities of the synthetic records. While the range of seismicity models predicts noise levels that vary by 80 dB, we show that most noise estimates are below the self-noise floor of high-frequency geophones but may be recorded by more sensitive instruments. The largest expected signals exceed background noise by ~50 dB. Noise records may allow for constraints on interior structure through autocorrelation. Models of seismic noise generated by pressure variations at the base of the ice shell due to turbulent motions in the subsurface ocean may also generate observable seismic noise.
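
    Drawing a random catalog from a Gutenberg-Richter relationship, as described, reduces to inverse-CDF sampling of an exponential magnitude distribution truncated at Mmax; the b-value, magnitude bounds, and event count below are invented, and the moment conversion uses the standard Hanks-Kanamori relation:

      import math, random

      random.seed(1)
      b, m_min, m_max = 1.0, 2.0, 5.0
      beta = b * math.log(10)
      n_events = 5000

      def sample_magnitude():
          # inverse-CDF draw from the Gutenberg-Richter law truncated at m_max
          u = random.random()
          return m_min - math.log(1 - u * (1 - math.exp(-beta * (m_max - m_min)))) / beta

      mags = [sample_magnitude() for _ in range(n_events)]
      moment = sum(10 ** (1.5 * m + 9.1) for m in mags)  # N*m
      print(f"largest sampled magnitude: {max(mags):.2f}")
      print(f"cumulative moment: {moment:.2e} N*m")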

  18. Evaluation of seismic hazard at the northwestern part of Egypt

    NASA Astrophysics Data System (ADS)

    Ezzelarab, M.; Shokry, M. M. F.; Mohamed, A. M. E.; Helal, A. M. A.; Mohamed, Abuoelela A.; El-Hadidy, M. S.

    2016-01-01

    The objective of this study is to evaluate the seismic hazard in the northwestern part of Egypt using the probabilistic seismic hazard assessment approach. The probabilistic approach was carried out based on a recent data set in order to take into account the historical seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a proposed seismic source model is presented. The doubly truncated exponential model was adopted for the calculation of the recurrence parameters. Ground-motion prediction equations recently recommended by experts and developed from earthquake data obtained in tectonic environments similar to those in and around the studied area were weighted and used for the assessment of seismic hazard in the frame of a logic tree approach. Considering a grid of 0.2° × 0.2° covering the study area, seismic hazard curves were calculated for every node. Hazard maps at bedrock conditions were produced for peak ground acceleration, in addition to six spectral periods (0.1, 0.2, 0.3, 1.0, 2.0 and 3.0 s), for return periods of 72, 475 and 2475 years. The uniform hazard spectra of two selected rock sites at Alexandria and Mersa Matruh cities are provided. Finally, the hazard curves were de-aggregated to determine the sources that contribute most to the hazard at the level of 10% probability of exceedance in 50 years for the selected sites.
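
    A sketch of the doubly truncated exponential recurrence model named here, giving the annual rate of events at or above magnitude m; the a-value, b-value, and magnitude bounds are invented for illustration:

      import math

      a, b = 3.5, 0.9            # assumed Gutenberg-Richter parameters
      m_min, m_max = 4.0, 7.0    # assumed truncation magnitudes
      beta = b * math.log(10)
      rate_min = 10 ** (a - b * m_min)   # annual rate of M >= m_min

      def rate_ge(m):
          """Annual rate of M >= m under the doubly truncated exponential model."""
          num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
          den = 1.0 - math.exp(-beta * (m_max - m_min))
          return rate_min * num / den

      for m in (4.0, 5.0, 6.0, 7.0):     # rate tapers to zero at m_max
          print(f"M>={m:.1f}: {rate_ge(m):.4f} events/yr")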

  19. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
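
    The crossing of fragility with hazard is, at bottom, a convolution of the conditional failure probability with the hazard curve's annual occurrence rates. A schematic version with an invented power-law hazard curve and an invented lognormal fragility (not the paper's site or tanks):

      import math
      from statistics import NormalDist

      def hazard(pga):   # annual rate of exceeding a PGA level (g); assumed form
          return 1e-3 * (0.1 / pga) ** 2.5

      def fragility(pga, median=0.6, beta=0.5):   # P(failure | PGA), lognormal
          return NormalDist().cdf(math.log(pga / median) / beta)

      # P_f = integral of fragility(x) * |dH/dx| dx on a log-spaced PGA grid
      xs = [10 ** (-2 + 4 * i / 400) for i in range(401)]
      pf = sum(fragility(0.5 * (x0 + x1)) * (hazard(x0) - hazard(x1))
               for x0, x1 in zip(xs, xs[1:]))
      print(f"annual probability of seismic failure ~ {pf:.2e}")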

  20. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
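
    None of the reviewed methods is reproduced here, but the contrast with the safety-factor approach can be made concrete with a toy Monte Carlo reliability estimate, load and strength both lognormal with invented parameters:

      import math, random

      random.seed(2)
      N = 200_000
      failures = sum(
          random.lognormvariate(math.log(50.0), 0.25)      # load (e.g., MPa)
          > random.lognormvariate(math.log(100.0), 0.15)   # strength
          for _ in range(N)
      )
      print(f"estimated failure probability ~ {failures / N:.1e}")
      # The nominal safety factor is 2.0, yet the failure probability is finite:
      # the gap that probabilistic methods are meant to quantify.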

  1. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. These deficiencies result in the lack of a correct treatment of dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, the current method is based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design

  2. Evaluation of induced seismicity forecast models in the Induced Seismicity Test Bench

    NASA Astrophysics Data System (ADS)

    Király, Eszter; Gischig, Valentin; Zechar, Jeremy; Doetsch, Joseph; Karvounis, Dimitrios; Wiemer, Stefan

    2016-04-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. Here, we propose an Induced Seismicity Test Bench to test and rank such models. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models that incorporate a different mix of physical understanding and stochastic representation of the induced sequences: Shapiro in Space (SiS) and Hydraulics and Seismics (HySei). SiS is based on three pillars: the seismicity rate is computed with the help of the seismogenic index and a simple exponential decay of the seismicity; the magnitude distribution follows the Gutenberg-Richter relation; and seismicity is distributed in space by smoothing seismicity during the learning period with 3D Gaussian kernels. The HySei model describes seismicity triggered by pressure diffusion with irreversible permeability enhancement. Our results show that neither model is fully superior to the other. HySei forecasts the seismicity rate well, but is only mediocre at forecasting the spatial distribution. On the other hand, SiS forecasts the spatial distribution well but not the seismicity rate. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in. Ensemble models that combine HySei's rate forecast with SiS's spatial forecast outperform each individual model.
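
    The seismogenic-index idea behind SiS's rate pillar can be sketched as follows: the expected number of induced events above magnitude m grows linearly with injected volume and log-linearly with m (the post-shut-in decay term is omitted here). The index, b-value, and volumes are invented:

      # N(M >= m) = V_injected * 10 ** (sigma_index - b * m)
      sigma_index = -0.5   # seismogenic index, site-specific (assumed)
      b = 1.2              # Gutenberg-Richter b-value (assumed)

      for volume_m3 in (1_000, 10_000, 30_000):   # cumulative injected volume
          n_m1 = volume_m3 * 10 ** (sigma_index - b * 1.0)
          n_m2 = volume_m3 * 10 ** (sigma_index - b * 2.0)
          print(f"V={volume_m3:>6} m^3: N(M>=1) ~ {n_m1:7.1f}, N(M>=2) ~ {n_m2:6.2f}")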

  3. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is performed. Tehran is the capital and most populated city of Iran. From economic, political and social points of view, Tehran is the most significant city of Iran. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city for the Arias intensity parameter is useful. Iso-intensity contour line maps of Tehran on the basis of different attenuation relationships for different earthquake periods are plotted. Maps of iso-intensity points in the Tehran region are presented using attenuation relationships appropriate for rock and soil beds for 2 hazard levels of 10% and 2% in 50 years. Seismicity parameters on the basis of historical and instrumental earthquakes, for a time period that begins in the 4th century BC and ends at the present time, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used. The SEISRISK III software has been employed. Effects of different parameters such as seismicity parameters, fault rupture length relationships and attenuation relationships are considered using a logic tree.

  4. NSR&D Program Fiscal Year (FY) 2015 Call for Proposals Mitigation of Seismic Risk at Nuclear Facilities using Seismic Isolation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin

    2015-02-01

    Seismic isolation (SI) has the potential to drastically reduce the seismic response of structures, systems, or components (SSCs) and therefore the risk associated with large seismic events (a large seismic event could be defined as the design basis earthquake (DBE) and/or the beyond design basis earthquake (BDBE), depending on the site location). This would correspond to a potential increase in nuclear safety by minimizing the structural response and thus minimizing the risk of material release during large seismic events that have uncertainty associated with their magnitude and frequency. The national consensus standard, American Society of Civil Engineers (ASCE) Standard 4, Seismic Analysis of Safety Related Nuclear Structures, recently incorporated language and commentary for seismically isolating a large light water reactor or similar large nuclear structure. Some potential benefits of SI are: 1) substantially decoupling the SSC from the earthquake hazard, thus decreasing the risk of material release during large earthquakes; 2) cost savings for the facility and/or equipment; and 3) applicability to both nuclear (current and next generation) and high-hazard non-nuclear facilities. Issue: To date, no one has evaluated how the benefit of seismic risk reduction reduces the cost to construct a nuclear facility. Objective: Use seismic probabilistic risk assessment (SPRA) to evaluate the reduction in seismic risk and estimate the potential cost savings of seismic isolation of a generic nuclear facility. This project would leverage ongoing Idaho National Laboratory (INL) activities that are developing advanced SPRA methods using Nonlinear Soil-Structure Interaction (NLSSI) analysis. Technical Approach: The proposed study is intended to obtain an estimate of the reduction in seismic risk and construction cost that might be achieved by seismically isolating a nuclear facility. The nuclear facility is a representative pressurized water reactor building nuclear power plant (NPP

  5. Large-band seismic characterization of the INFN Gran Sasso National Laboratory

    NASA Astrophysics Data System (ADS)

    Acernese, F.; Canonico, R.; De Rosa, R.; Giordano, G.; Romano, R.; Barone, F.

    2013-04-01

    In this paper we present the scientific data recorded by tunable mechanical monolithic horizontal seismometers located in the Gran Sasso National Laboratory of the INFN, within thermally insulating enclosures on concrete slabs connected to the bedrock. The main goals of these long-term, large-band measurements are the seismic characterization of the site in the frequency band 10⁻⁶ to 10 Hz and the acquisition of all the relevant information for the optimization of the sensors.

  6. Impact of fault models on probabilistic seismic hazard assessment: the example of the West Corinth rift.

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène

    2016-04-01

    Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results due to the intrinsically uncertain nature of fault data. This is especially the case in the low-to-moderate seismicity regions of Europe, where slowly slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow us to identify the important parameters that need to be better constrained in order to reduce the resulting uncertainty in hazard, and also to provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool is illustrated through the example of the West Corinth rift fault models. Recent work performed in the gulf has shown the complexity of the normal faulting system that is accommodating the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and the multiplicity of scientifically defendable interpretations. At the nodes of the logic tree, different options that could be considered at each step of the fault-related seismic hazard computation are explored. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults. The idea is to allow the possibility of several fault segments breaking together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e. minimum distance between faults) and a second one that relies on physically

  7. Seismic probabilistic tsunami hazard: from regional to local analysis and use of geological and historical observations

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Lorito, S.; Orefice, S.; Graziani, L.; Brizuela, B.; Smedile, A.; Volpe, M.; Romano, F.; De Martini, P. M.; Maramai, A.; Selva, J.; Piatanesi, A.; Pantosti, D.

    2016-12-01

    Site-specific probabilistic tsunami hazard analyses demand very high computational efforts that are often reduced by introducing approximations on tsunami sources and/or tsunami modeling. On one hand, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could easily lead to important bias in the analysis. On the other hand, detailed inundation maps computed by tsunami numerical simulations require very long running times. When tsunami effects are calculated at regional scale, a common practice is to propagate tsunami waves in deep water (up to 50-100 m depth), neglecting non-linear effects and using coarse bathymetric meshes. Then, maximum wave heights at the coast are empirically extrapolated, saving a significant amount of computational time. However, moving to local scale, such assumptions drop out and tsunami modeling requires much greater computational resources. In this work, we perform a local Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) for the 50 km long coastal segment between Augusta and Siracusa, a touristic and commercial area along the south-eastern Sicily coast, Italy. The procedure consists in using the outcomes of a regional SPTHA as input for a two-step filtering method to select and substantially reduce the number of scenarios contributing to the specific target area. These selected scenarios are modeled using high-resolution topo-bathymetry to produce detailed inundation maps. Results are presented as probabilistic hazard curves and maps, with the goal of analyzing, comparing and highlighting the different results provided by regional and local hazard assessments. Moreover, the analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, the tsunami data sets available for the selected target areas are particularly rich with respect to the scarce and heterogeneous data sets usually available elsewhere. Therefore

  8. Correlation of field seismic refraction data with 3-D laboratory ultrasonic sounding data during exploration of a dimension stone deposit

    NASA Astrophysics Data System (ADS)

    Přikryl, Richard; Vilhelm, Jan; Lokajíček, Tomáš; Pros, Zdeněk; Klíma, Karel

    2004-05-01

    Multidirectional field seismic refraction data have been combined with 3-D laboratory ultrasonic sounding data in a preliminary exploration of a new dimension stone deposit in the Czech Republic. Rock fabric was interpreted from a detailed laboratory analysis of a 3-D P-wave velocity pattern and can be classified as pronounced orthorhombic due to a complex tectonometamorphic history of the rock. The P-wave velocity pattern recorded from laboratory measurements can be satisfactorily correlated with the anisotropy of P-wave velocity data acquired from field seismic refraction data. Rock fabric anisotropy also contributes to the observed anisotropy of strength and static deformational properties.

  9. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude-correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.
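
    Such correlations take a standard reliability form: the probability of liquefaction is the probability that the cyclic stress ratio (CSR, the load) exceeds the cyclic resistance ratio (CRR, the resistance) in log space. The sketch below uses that generic form with an invented toy CRR model, not the Moss et al. coefficients:

      import math
      from statistics import NormalDist

      def p_liquefaction(qc1, csr, sigma=0.35):
          """P_L = Phi(-(ln CRR - ln CSR) / sigma); toy CRR model, assumed."""
          crr = math.exp(0.03 * qc1 - 3.5)   # invented resistance curve
          return NormalDist().cdf(-(math.log(crr) - math.log(csr)) / sigma)

      for qc1 in (60.0, 90.0, 120.0):        # normalized CPT tip resistance
          print(f"qc1={qc1:5.1f}: P_L = {p_liquefaction(qc1, csr=0.20):.2f}")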

  10. Dominant seismic sources for the cities in South Sumatra

    NASA Astrophysics Data System (ADS)

    Sunardi, Bambang; Sakya, Andi Eka; Masturyono; Murjaya, Jaya; Rohadi, Supriyanto; Sulastri; Putra, Ade Surya

    2017-07-01

    The subduction zone along the west coast of Sumatra and the Sumatran fault zone are active seismic sources. Seismotectonically, South Sumatra could be affected by earthquakes triggered by these seismic sources. This paper discusses the contribution of each seismic source to the earthquake hazard for the cities of Palembang, Prabumulih, Banyuasin, Ogan Ilir, Ogan Komering Ilir, South Oku, Musi Rawas and Empat Lawang. These hazards are presented in the form of seismic hazard curves. The study was conducted using Probabilistic Seismic Hazard Analysis (PSHA) at 2% probability of exceedance in 50 years. Seismic sources used in the analysis included the megathrust zones M2 of Sumatra and South Sumatra, background seismic sources, and shallow crustal seismic sources consisting of the Ketaun, Musi, Manna and Kumering faults. The results of the study showed that for cities relatively far from the seismic sources, the subduction/megathrust seismic source at depths ≤ 50 km contributed greatly to the seismic hazard, while in the other areas deep background seismic sources at depths of more than 100 km dominated the seismic hazard.

  11. The Spatial Assessment of the Current Seismic Hazard State for Hard Rock Underground Mines

    NASA Astrophysics Data System (ADS)

    Wesseloo, Johan

    2018-06-01

    Mining-induced seismic hazard assessment is an important component in the management of safety and financial risk in mines. As the seismic hazard is a response to the mining activity, it is non-stationary and variable both in space and time. This paper presents an approach for implementing a probabilistic seismic hazard assessment to assess the current hazard state of a mine. Each of the components of the probabilistic seismic hazard assessment is considered within the context of hard rock underground mines. The focus of this paper is the assessment of the in-mine hazard distribution and does not consider the hazard to nearby public or structures. A rating system and methodologies to present hazard maps, for the purpose of communicating to different stakeholders in the mine, i.e. mine managers, technical personnel and the work force, are developed. The approach allows one to update the assessment with relative ease and within short time periods as new data become available, enabling the monitoring of the spatial and temporal change in the seismic hazard.

  12. Hierarchical Bayesian Modeling of Fluid-Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.

    2017-11-01

    In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We tested the framework on the Basel 2006 fluid-induced seismicity case study to show that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.

  13. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    USGS Publications Warehouse

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    The ground motion hazard for Sumatra and the Malaysian peninsula is calculated in a probabilistic framework, using procedures developed for the US National Seismic Hazard Maps. We constructed regional earthquake source models and used standard published and modified attenuation equations to calculate peak ground acceleration at 2% and 10% probability of exceedance in 50 years for rock site conditions. We developed or modified earthquake catalogs and declustered these catalogs to include only independent earthquakes. The resulting catalogs were used to define four source zones that characterize earthquakes in four tectonic environments: subduction zone interface earthquakes, subduction zone deep intraslab earthquakes, strike-slip transform earthquakes, and intraplate earthquakes. The recurrence rates and sizes of historical earthquakes on known faults and across zones were also determined from this modified catalog. In addition to the source zones, our seismic source model considers two major faults that are known historically to generate large earthquakes: the Sumatran subduction zone and the Sumatran transform fault. Several published studies were used to describe earthquakes along these faults during historical and pre-historical time, as well as to identify segmentation models of faults. Peak horizontal ground accelerations were calculated using ground motion prediction relations that were developed from seismic data obtained from the crustal interplate environment, crustal intraplate environment, along the subduction zone interface, and from deep intraslab earthquakes. Most of these relations, however, have not been developed for large distances that are needed for calculating the hazard across the Malaysian peninsula, and none were developed for earthquake ground motions generated in an interplate tectonic environment that are propagated into an intraplate tectonic environment. For the interplate and intraplate crustal earthquakes, we have applied ground

  14. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications.

  15. Sequential Data Assimilation for Seismicity: a Proof of Concept

    NASA Astrophysics Data System (ADS)

    van Dinther, Ylona; Kuensch, Hans Rudolf; Fichtner, Andreas

    2017-04-01

    Integrating geological and geophysical observations, laboratory results and physics-based numerical modeling is crucial to improve our understanding of the occurrence of large subduction earthquakes. How to do this integration is less obvious, especially in light of the scarcity and uncertainty of natural and laboratory data and the difficulty of modeling the physics governing earthquakes. One way to efficiently combine information from these sources in order to estimate states and/or parameters is data assimilation, a mathematically sound framework extensively developed for weather forecasting purposes. We demonstrate the potential of using data assimilation by applying an Ensemble Kalman Filter to recover the current and forecast the future state of stress and strength on the megathrust based on data from a single borehole. Data and its errors are for the first time assimilated to - using the least-squares solution of Bayes theorem - update a Partial Differential Equation-driven seismic cycle model. This visco-elasto-plastic continuum forward model solves Navier-Stokes equations with a rate-dependent friction coefficient. To prove this concept we perform a perfect model test in an analogue subduction zone setting. Synthetic numerical data from a single analogue borehole are assimilated into 150 ensemble models. Since we know the true state of the numerical data model, a quantitative and qualitative evaluation shows that meaningful information on the stress and strength is available, even when only data from a single borehole is assimilated over only a part of a seismic cycle. This is possible, since the sampled error covariance matrix contains prior information on the physics that relates velocities, stresses, and pressures at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed in such a way that fault coupling can be updated to either inhibit or trigger events. In the subsequent forward
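
    A miniature analogue of the analysis step described (every number invented): a two-variable state, stress and strength, of which only the first is observed, as with a single borehole; the ensemble covariance is what carries the update to the unobserved variable.

      import random

      random.seed(3)
      n_ens, obs_err = 150, 0.5
      truth = (10.0, 4.0)
      obs = truth[0] + random.gauss(0, obs_err)   # single noisy observation

      # Prior ensemble: strength correlated with stress through the "physics".
      ens = []
      for _ in range(n_ens):
          stress = random.gauss(8.0, 2.0)
          ens.append([stress, 0.5 * stress + random.gauss(0, 0.3)])

      mean = [sum(m[i] for m in ens) / n_ens for i in range(2)]
      cov_xx = sum((m[0] - mean[0]) ** 2 for m in ens) / (n_ens - 1)
      cov_yx = sum((m[1] - mean[1]) * (m[0] - mean[0]) for m in ens) / (n_ens - 1)

      gain = [cov_xx / (cov_xx + obs_err ** 2), cov_yx / (cov_xx + obs_err ** 2)]
      for m in ens:   # perturbed-observation EnKF analysis step
          innovation = obs + random.gauss(0, obs_err) - m[0]
          m[0] += gain[0] * innovation
          m[1] += gain[1] * innovation   # unobserved variable updated via covariance

      post = [sum(m[i] for m in ens) / n_ens for i in range(2)]
      print(f"prior mean     ~ ({mean[0]:.2f}, {mean[1]:.2f})")
      print(f"posterior mean ~ ({post[0]:.2f}, {post[1]:.2f}); truth {truth}")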

  16. Accuracy of finite-difference modeling of seismic waves : Simulation versus laboratory measurements

    NASA Astrophysics Data System (ADS)

    Arntsen, B.

    2017-12-01

    The finite-difference technique for numerical modeling of seismic waves is still important and, for some areas, extensively used. For exploration purposes, finite-difference simulation is at the core of both traditional imaging techniques, such as reverse-time migration, and more elaborate full-waveform inversion techniques. The accuracy and fidelity of finite-difference simulation of seismic waves are hard to quantify, and meaningful error analysis is really only easily available for simplistic media. A possible alternative to theoretical error analysis is provided by comparing finite-difference simulated data with laboratory data created using a scale model. The advantage of this approach is the accurate knowledge of the model, within measurement precision, and of the locations of sources and receivers. We use a model made of PVC immersed in water and containing horizontal and tilted interfaces together with several spherical objects to generate ultrasonic pressure reflection measurements. The physical dimensions of the model are of the order of a meter, which after scaling represents a model with dimensions of the order of 10 kilometers and frequencies in the range of one to thirty hertz. We find that for plane horizontal interfaces the laboratory data can be reproduced by the finite-difference scheme with relatively small error, but for steeply tilted interfaces the error increases. For spherical interfaces the discrepancy between laboratory data and simulated data is sometimes much more severe, to the extent that it is not possible to simulate reflections from parts of highly curved bodies. The results are important in view of the fact that finite-difference modeling is often at the core of imaging and inversion algorithms tackling complicated geological areas with highly curved interfaces.
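
    For orientation, the simplest member of the family, a 1-D acoustic, second-order finite-difference scheme, fits in a few lines; the grid, two-layer velocity model, and source wavelet below are invented and are unrelated to the study's setting:

      import math

      nx, dx = 400, 5.0        # grid points and spacing (m)
      c = [2000.0] * (nx // 2) + [3000.0] * (nx - nx // 2)   # two-layer model (m/s)
      dt = 0.8 * dx / max(c)   # time step obeying the CFL stability limit
      nt, src, f0 = 500, nx // 4, 25.0

      p_prev, p = [0.0] * nx, [0.0] * nx
      for it in range(nt):
          p_next = [0.0] * nx
          for i in range(1, nx - 1):
              lap = (p[i + 1] - 2 * p[i] + p[i - 1]) / dx ** 2
              p_next[i] = 2 * p[i] - p_prev[i] + (c[i] * dt) ** 2 * lap
          # Ricker wavelet source injected at a single grid point
          arg = (math.pi * f0 * (it * dt - 1.2 / f0)) ** 2
          p_next[src] += (1 - 2 * arg) * math.exp(-arg) * dt ** 2
          p_prev, p = p, p_next

      print(f"max |p| at final step: {max(abs(x) for x in p):.3e}")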

  17. The exponential rise of induced seismicity with increasing stress levels in the Groningen gas field and its implications for controlling seismic risk

    NASA Astrophysics Data System (ADS)

    Bourne, S. J.; Oates, S. J.; van Elk, J.

    2018-06-01

    Induced seismicity typically arises from the progressive activation of recently inactive geological faults by anthropogenic activity. Faults are mechanically and geometrically heterogeneous, so their extremes of stress and strength govern the initial evolution of induced seismicity. We derive a statistical model of Coulomb stress failures and associated aftershocks within the tail of the distribution of fault stress and strength variations to show initial induced seismicity rates will increase as an exponential function of induced stress. Our model provides operational forecasts consistent with the observed space-time-magnitude distribution of earthquakes induced by gas production from the Groningen field in the Netherlands. These probabilistic forecasts also match the observed changes in seismicity following a significant and sustained decrease in gas production rates designed to reduce seismic hazard and risk. This forecast capability allows reliable assessment of alternative control options to better inform future induced seismic risk management decisions.

  18. Uncertainties in evaluation of hazard and seismic risk

    NASA Astrophysics Data System (ADS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru; Ortanza Cioflan, Carmen; Manea, Elena-Florinela

    2015-04-01

    Two methods are commonly used for seismic hazard assessment: probabilistic (PSHA) and deterministic (DSHA) seismic hazard analysis. Selection of a ground motion for engineering design requires a clear understanding of seismic hazard and risk among stakeholders, seismologists and engineers. What is wrong with traditional PSHA or DSHA? The PSHA commonly used in engineering rests on four assumptions developed by Cornell in 1968: (1) a constant-in-time average occurrence rate of earthquakes; (2) a single point source; (3) variability of ground motion at a site that is independent; (4) Poisson (or "memory-less") behavior of earthquake occurrences. It is a probabilistic method, and "when the causality dies, its place is taken by probability, a prestigious term meant to define our inability to predict the course of nature" (Niels Bohr). The DSHA method was used for the original design of Fukushima Daiichi, but the Japanese authorities moved to probabilistic assessment methods, and the probability of exceeding the design basis acceleration was expected to be 10⁻⁴-10⁻⁶. It was exceeded, and this was a violation of the principles of deterministic hazard analysis (ignoring historical events) (Klügel, J.-U., EGU, 2014). PSHA was developed from mathematical statistics and is not based on earthquake science (invalid physical models - point source and Poisson distribution; invalid mathematics; misinterpretation of the annual probability of exceedance or return period, etc.) and has become a pure numerical "creation" (Wang, PAGEOPH 168 (2011), 11-25). An uncertainty which is a key component of seismic hazard assessment, for both PSHA and DSHA, is the ground motion attenuation relationship, or the so-called ground motion prediction equation (GMPE), which describes a relationship between a ground motion parameter (i.e., PGA, MMI, etc.), earthquake magnitude M, source-to-site distance R, and an uncertainty. So far, no one is taking into consideration the strong nonlinear behavior of soils during strong earthquakes. But

  19. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy - Part 2: Computational implementation and first results

    NASA Astrophysics Data System (ADS)

    Peruzza, Laura; Azzaro, Raffaele; Gee, Robin; D'Amico, Salvatore; Langer, Horst; Lombardo, Giuseppe; Pace, Bruno; Pagani, Marco; Panzera, Francesco; Ordaz, Mario; Suarez, Miguel Leonardo; Tusa, Giuseppina

    2017-11-01

    This paper describes the model implementation and presents results of a probabilistic seismic hazard assessment (PSHA) for the Mt. Etna volcanic region in Sicily, Italy, considering local volcano-tectonic earthquakes. Working in a volcanic region presents new challenges not typically faced in standard PSHA, which are broadly due to the nature of the local volcano-tectonic earthquakes, the cone shape of the volcano and the attenuation properties of seismic waves in the volcanic region. These have been accounted for through the development of a seismic source model that integrates data from different disciplines (historical and instrumental earthquake datasets, tectonic data, etc.; presented in Part 1, by Azzaro et al., 2017) and through the development and software implementation of original tools for the computation, such as a new ground-motion prediction equation and magnitude-scaling relationship specifically derived for this volcanic area, and the capability to account for the surficial topography in the hazard calculation, which influences source-to-site distances. Hazard calculations have been carried out after updating the most recent releases of two widely used PSHA software packages (CRISIS, as in Ordaz et al., 2013; the OpenQuake engine, as in Pagani et al., 2014). Results are computed for short- to mid-term exposure times (10% probability of exceedance in 5 and 30 years, Poisson and time-dependent) and spectral amplitudes of engineering interest. A preliminary exploration of the impact of site-specific response is also presented for Etna's densely inhabited eastern flank, and the change in expected ground motion is commented on. These results do not account for M > 6 regional seismogenic sources which control the hazard at long return periods. However, by focusing on the impact of M < 6 local volcano-tectonic earthquakes, which dominate the hazard at the short- to mid-term exposure times considered in this study, we present a different

  20. Seismic Structure of Perth Basin (Australia) and surroundings from Passive Seismic Deployments

    NASA Astrophysics Data System (ADS)

    Issa, N.; Saygin, E.; Lumley, D. E.; Hoskin, T. E.

    2016-12-01

    We image the subsurface structure of the Perth Basin, Western Australia and surroundings by using ambient seismic noise data from 14 seismic stations recently deployed by the University of Western Australia (UWA) and other available permanent stations from the Geoscience Australia seismic network and the Australian Seismometers in Schools program. Each of these 14 UWA seismic stations comprises a broadband sensor and a high-fidelity 3-component 10 Hz geophone, recording in tandem at 250 Hz and 1000 Hz. The other stations used in this study are equipped with short-period and broadband sensors. In addition, one shallow borehole station is operated with eight 3-component geophones at depths of between 2 and 44 m. The network is deployed to characterize natural seismicity in the basin and to identify any microseismic activity across the Darling Fault Zone (DFZ), bounding the basin to the east. The DFZ stretches approximately 1000 km north-south in Western Australia, and is one of the longest fault zones on Earth with a limited number of detected earthquakes. We use seismic noise cross- and auto-correlation methods to map seismic velocity perturbations across the basin and the transition from the DFZ to the basin. Retrieved Green's functions are stable and show clearly dispersed waveforms. Travel times of the surface-wave Green's functions from noise cross-correlations are inverted with a two-step probabilistic framework to map the absolute shear-wave velocities as a function of depth. The single-station auto-correlations of the seismic noise yield the P-wave reflectivity under each station, marking the major discontinuities. The resulting images show the shear velocity perturbations across the region. We also quantify the variation of ambient seismic noise at different depths in the near surface using the geophones in the shallow borehole array.
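
    The essence of the noise cross-correlation step is that correlating two stations' recordings of the same diffuse field produces a peak at the inter-station travel time. A toy demonstration (all values invented) in which one record is simply a delayed copy of the other:

      import random

      random.seed(4)
      n, delay = 4000, 25     # samples; true inter-station lag in samples
      field = [random.gauss(0, 1) for _ in range(n + delay)]
      sta_a = field[delay:]   # station A
      sta_b = field[:n]       # station B sees the same field `delay` samples later

      def xcorr_peak(a, b, max_lag):
          best = (0, float("-inf"))
          for lag in range(-max_lag, max_lag + 1):
              s = sum(a[i] * b[i + lag] for i in range(max_lag, n - max_lag))
              best = max(best, (lag, s), key=lambda kv: kv[1])
          return best[0]

      print(f"correlation peak at lag {xcorr_peak(sta_a, sta_b, 60)} (true delay {delay})")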

  1. Interpreting intraplate tectonics for seismic hazard: a UK historical perspective

    NASA Astrophysics Data System (ADS)

    Musson, R. M. W.

    2012-04-01

    It is notoriously difficult to construct seismic source models for probabilistic seismic hazard assessment in intraplate areas on the basis of geological information, and many practitioners have given up the task in favour of purely seismicity-based models. This risks losing potentially valuable information in regions where the earthquake catalogue is short compared to the seismic cycle. It is interesting to survey how attitudes to this issue have evolved over the past 30 years. This paper takes the UK as an example, and traces the evolution of seismic source models through generations of hazard studies. It is found that in the UK, while the earliest studies did not consider regional tectonics in any way, there has been a gradual evolution towards more tectonically based models. Experience in other countries, of course, may differ.

  2. Seismic variability of subduction thrust faults: Insights from laboratory models

    NASA Astrophysics Data System (ADS)

    Corbi, F.; Funiciello, F.; Faccenna, C.; Ranalli, G.; Heuret, A.

    2011-06-01

    Laboratory models are used to investigate the role of interface roughness, driving rate, and pressure on friction dynamics. The setup consists of a gelatin block driven at constant velocity over sand paper. The interface roughness is quantified in terms of amplitude and wavelength of protrusions, jointly expressed by a reference roughness parameter obtained from their product. Frictional behavior shows a systematic dependence on system parameters. Both stick slip and stable sliding occur, depending on driving rate and interface roughness. Stress drop and frequency of slip episodes vary directly and inversely, respectively, with the reference roughness parameter, reflecting the fundamental role of the amplitude of protrusions. An increase in pressure tends to favor stick slip. Static friction is a steeply decreasing function of the reference roughness parameter. The velocity strengthening/weakening parameter in the rate- and state-dependent dynamic friction law becomes negative for values of the reference roughness parameter that are intermediate within the explored range. Despite the simplifications of the adopted setup, which does not address the problem of off-fault fracturing, a comparison of the experimental results with the depth distribution of seismic energy release along subduction thrust faults leads to the hypothesis that their behavior is primarily controlled by the depth- and time-dependent distribution of protrusions. A rough subduction fault at shallow depths, unable to produce significant seismicity because of low lithostatic pressure, evolves into a moderately rough, velocity-weakening fault at intermediate depths. The magnitude of events in this range is calibrated by the interplay between surface roughness and subduction rate. At larger depths, the roughness further decreases and stable sliding becomes gradually more predominant. Thus, although interplate seismicity is ultimately controlled by tectonic parameters (velocity of
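
    The velocity strengthening/weakening parameter referred to above is the (a - b) term of the Dieterich-Ruina rate- and state-dependent friction law. A minimal numerical illustration with invented parameter values, not ones measured in these experiments:

    ```python
    import numpy as np

    # Dieterich-Ruina friction:  mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc),
    # with aging-law state evolution  dtheta/dt = 1 - V*theta/Dc.
    # At steady state (theta = Dc/V):  mu_ss = mu0 + (a - b)*ln(V/V0), so
    # a - b < 0 means velocity weakening (stick-slip prone), a - b > 0 strengthening.
    mu0, V0 = 0.6, 1e-6          # reference friction and velocity (assumed)

    def mu_steady(V, a, b):
        return mu0 + (a - b) * np.log(V / V0)

    for a, b in ((0.010, 0.015), (0.015, 0.010)):
        dmu = mu_steady(1e-5, a, b) - mu_steady(1e-6, a, b)  # 10x velocity step
        kind = "weakening" if a < b else "strengthening"
        print(f"a-b = {a - b:+.3f} ({kind}): mu_ss changes by {dmu:+.4f}")
    ```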

  3. Seismic risk assessment and application in the central United States

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic risk is a somewhat subjective, but important, concept in earthquake engineering and other related decision-making. Another important concept that is closely related to seismic risk is seismic hazard. Although seismic hazard and seismic risk have often been used interchangeably, they are fundamentally different: seismic hazard describes the natural phenomenon or physical property of an earthquake, whereas seismic risk describes the probability of loss or damage that could be caused by a seismic hazard. The distinction between seismic hazard and seismic risk is of practical significance because measures for seismic hazard mitigation may differ from those for seismic risk reduction. Seismic risk assessment is a complicated process and starts with seismic hazard assessment. Although probabilistic seismic hazard analysis (PSHA) is the most widely used method for seismic hazard assessment, recent studies have found that PSHA is not scientifically valid. Use of PSHA will lead to (1) artifact estimates of seismic risk, (2) misleading use of the annual probability of exceedance (i.e., the probability of exceedance in one year) as a frequency (per year), and (3) numerical creation of extremely high ground motion. An alternative approach, which is similar to those used for flood and wind hazard assessments, has been proposed. © 2011 ASCE.

  4. Seismic Risk Assessment for the Kyrgyz Republic

    NASA Astrophysics Data System (ADS)

    Pittore, Massimiliano; Sousa, Luis; Grant, Damian; Fleming, Kevin; Parolai, Stefano; Fourniadis, Yannis; Free, Matthew; Moldobekov, Bolot; Takeuchi, Ko

    2017-04-01

    The Kyrgyz Republic is one of the most socially and economically dynamic countries in Central Asia, and one of the most endangered by earthquake hazard in the region. In order to support the government of the Kyrgyz Republic in the development of a country-level Disaster Risk Reduction strategy, a comprehensive seismic risk study has been developed with the support of the World Bank. As part of this project, state-of-the-art hazard, exposure and vulnerability models have been developed and combined into the assessment of direct physical and economic risk on residential, educational and transportation infrastructure. The seismic hazard has been modelled with three different approaches, in order to provide a comprehensive overview of the possible consequences. A probabilistic seismic hazard assessment (PSHA) approach has been used to quantitatively evaluate the distribution of expected ground shaking intensity, as constrained by the compiled earthquake catalogue and associated seismic source model. A set of specific seismic scenarios based on events generated from known fault systems has also been considered, in order to provide insight on the expected consequences of strong events in proximity to densely inhabited areas. Furthermore, long-span catalogues of events have been generated stochastically and employed in the probabilistic analysis of expected losses over the territory of the Kyrgyz Republic. Damage and risk estimates have been computed by using an exposure model recently developed for the country, combined with the assignment of suitable fragility/vulnerability models. The risk estimation has been carried out with spatial aggregation at the district (rayon) level. The obtained results confirm the high level of seismic risk throughout the country, also pinpointing the location of several risk hotspots, particularly in the southern districts around the Ferghana Valley. The outcome of this project will further support the local
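
    The stochastic-catalogue loss analysis described above can be sketched in a few lines: sample magnitudes from a truncated Gutenberg-Richter distribution, convert each event to a loss, and read exceedance rates off the synthetic history. The rates, b-value, and loss scaling below are placeholders, not values from the Kyrgyz study:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    b, m_min, m_max = 1.0, 5.0, 8.0      # Gutenberg-Richter b-value and bounds
    annual_rate = 2.0                    # events/yr with M >= m_min (assumed)
    years = 10_000

    # Inverse-CDF sampling of the truncated G-R magnitude distribution
    n_events = rng.poisson(annual_rate * years)
    beta = b * np.log(10.0)
    u = rng.random(n_events)
    mags = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

    losses = 10.0 ** (1.5 * (mags - m_min))   # toy loss-vs-magnitude scaling
    threshold = 100.0
    print(f"annual rate of loss > {threshold}: "
          f"{np.sum(losses > threshold) / years:.4f} /yr")
    ```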

  5. Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS

    NASA Astrophysics Data System (ADS)

    Ahmad, Raed; Adris, Ahmad; Singh, Ramesh

    2016-07-01

    In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for the evaluation of seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map for Syria with the help of GIS. In the proposed approach, we have used ASTER satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for the evaluation of seismic hazard were identified and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme has been developed for spatial data analysis using GIS to rank the parameters to be included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained classes with the local probability produced using conventional analysis of observed earthquakes. Earthquake data for Syria and peak ground acceleration (PGA) data were then introduced into the model to develop the final seismic hazard map, based on Gutenberg-Richter parameters (a and b values) and the concepts of local probability and recurrence time. The application of the proposed technique to the Syrian region indicates that this method provides a good estimate of the seismic hazard compared to maps developed with traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time, we have combined numerous parameters from remote sensing and GIS in the preparation of a seismic hazard map, which is found to be very realistic.
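
    The numerical rating scheme amounts to a weighted overlay of ranked raster layers. A minimal sketch of that operation; the layer names, ranks, and weights are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    rows, cols = 100, 100
    # Each thematic layer already converted to an integer rank 1 (low) .. 4 (high)
    layers = {
        "epicenter_density": rng.integers(1, 5, (rows, cols)),
        "fault_proximity":   rng.integers(1, 5, (rows, cols)),
        "slope":             rng.integers(1, 5, (rows, cols)),
    }
    weights = {"epicenter_density": 0.5, "fault_proximity": 0.3, "slope": 0.2}

    score = sum(weights[k] * layers[k].astype(float) for k in layers)
    # Bin the weighted score into the four susceptibility classes used above
    classes = np.digitize(score, bins=[1.75, 2.5, 3.25])  # 0=very low .. 3=high
    print("cells per class:", np.bincount(classes.ravel(), minlength=4))
    ```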

  6. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential of threatening life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was included in the World Heritage List by UNESCO in 1985 due to its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been addressed in many previous studies. However, there are few studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years for bedrock. In this connection, the seismic hazard of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes is prepared and used in this study. The seismic sources have been identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for the different seismic sources using available attenuation relationships applicable to Turkey. The result of the present study reveals that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach
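
    Evaluating an attenuation relationship of the usual functional form is the core of the PGA calculation step. A generic sketch with invented coefficients, not those of any GMPE actually used for Turkey:

    ```python
    import math

    # Generic GMPE form:  ln(PGA) = c1 + c2*M - c3*ln(R + c4); coefficients assumed
    c1, c2, c3, c4 = -3.5, 1.0, 1.2, 10.0

    def median_pga(mag: float, dist_km: float) -> float:
        """Median PGA (g) for moment magnitude `mag` at distance `dist_km`."""
        return math.exp(c1 + c2 * mag - c3 * math.log(dist_km + c4))

    for m, r in ((6.0, 10.0), (6.0, 50.0), (7.0, 10.0)):
        print(f"M{m:.1f}, R={r:5.1f} km -> median PGA ~ {median_pga(m, r):.3f} g")
    ```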

  7. Characterization of granular flow dynamics from the generated high-frequency seismic signal: insights from laboratory experiments

    NASA Astrophysics Data System (ADS)

    Mangeney, A.; Farin, M.; de Rosny, J.; Toussaint, R.; Trinh, P. T.

    2017-12-01

    Landslides, rock avalanches and rockfalls represent a major natural hazard in steep environments. However, owing to the lack of visual observations, the dynamics of these gravitational events is still not well understood. A pressing challenge is to deduce the landslide dynamics (flow potential energy, involved volume, particle size, etc.) from the characteristics of the generated seismic signal (radiated seismic energy, maximum amplitude, frequencies, etc.). Laboratory experiments of granular column collapse are conducted on an inclined plane. The seismic signal generated by the collapse is recorded by piezoelectric accelerometers sensitive over a wide frequency range (1 Hz - 56 kHz). The granular flows consist of steel beads of the same diameter. We compare the dynamic parameters of the granular flows, deduced from movies of the experiments, to the seismic parameters deduced from the measured seismic signals. The ratio of radiated seismic energy to potential energy lost is shown to slightly decrease with slope angle and lies between 0.2% and 9%. It decreases as time, slope angle and flow volume increase and as the particle diameter decreases. These results explain the dispersion over several orders of magnitude of the seismic efficiency of natural landslides. We distinguish two successive phases of rise and decay in the time profiles of the amplitude of the seismic signal and of the mean frequency of the signal generated by the granular flows. The rise phase and the maximum are shown to be independent of the slope angle. The maximum seismic amplitude coincides with the maximum flow speed in the direction normal to the slope, but not with the maximum downslope speed. We observe that the shape of the seismic envelope and of the frequencies as a function of time changes after a critical slope angle, between 10° and 15° with respect to the horizontal, with a decay phase lasting much longer as slope angle increases, due to a change in the flow regime, from a dense to a more
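
    The reported seismic efficiency is simply the radiated energy divided by the potential energy lost by the column. A back-of-the-envelope check with invented mass and drop height; only the 0.2-9% range comes from the text:

    ```python
    g = 9.81
    mass = 2.0           # kg of beads (assumed)
    drop_height = 0.10   # m, effective centre-of-mass drop (assumed)

    e_potential = mass * g * drop_height          # ~1.96 J lost
    for efficiency in (0.002, 0.09):              # 0.2% and 9% bounds
        print(f"radiated seismic energy ~ {efficiency * e_potential * 1e3:.1f} mJ")
    ```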

  8. A method for producing digital probabilistic seismic landslide hazard maps

    USGS Publications Warehouse

    Jibson, R.W.; Harp, E.L.; Michael, J.A.

    2000-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include: (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24 000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10 m grid spacing using ARC/INFO GIS software on a UNIX computer. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure. © 2000 Elsevier Science B.V. All rights reserved.
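
    The dynamic model is built on Newmark's sliding-block analysis. A minimal sketch of that integration, with a synthetic accelerogram and an assumed yield acceleration in place of the study's gridded inputs:

    ```python
    import numpy as np

    def newmark_displacement(acc, dt, a_crit):
        """Permanent one-way sliding displacement (m) of a rigid block.
        acc: ground acceleration (m/s^2); a_crit: yield acceleration (m/s^2)."""
        vel, disp = 0.0, 0.0
        for a in acc:
            if vel > 0.0 or a > a_crit:     # sliding while there is excess drive
                vel = max(vel + (a - a_crit) * dt, 0.0)
                disp += vel * dt
        return disp

    dt = 0.005
    t = np.arange(0.0, 10.0, dt)
    acc = 3.0 * np.sin(2 * np.pi * t) * np.exp(-0.3 * t)   # toy accelerogram
    print(f"Newmark displacement: {newmark_displacement(acc, dt, 1.0):.3f} m")
    ```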

  9. Seismic hazard in the Nation's breadbasket

    USGS Publications Warehouse

    Boyd, Oliver; Haller, Kathleen; Luco, Nicolas; Moschetti, Morgan P.; Mueller, Charles; Petersen, Mark D.; Rezaeian, Sanaz; Rubinstein, Justin L.

    2015-01-01

    The USGS National Seismic Hazard Maps were updated in 2014 and included several important changes for the central United States (CUS). Background seismicity sources were improved using a new moment-magnitude-based catalog; a new adaptive, nearest-neighbor smoothing kernel was implemented; and maximum magnitudes for background sources were updated. Areal source zones developed by the Central and Eastern United States Seismic Source Characterization for Nuclear Facilities project were simplified and adopted. The weighting scheme for ground motion models was updated, giving more weight to models with a faster attenuation with distance compared to the previous maps. Overall, hazard changes (2% probability of exceedance in 50 years, across a range of ground-motion frequencies) were smaller than 10% in most of the CUS relative to the 2008 USGS maps despite new ground motion models and their assigned logic tree weights that reduced the probabilistic ground motions by 5–20%.

  10. Seismic Hazard Assessment at Esfarayen-Bojnurd Railway, North-East of Iran

    NASA Astrophysics Data System (ADS)

    Haerifard, S.; Jarahi, H.; Pourkermani, M.; Almasian, M.

    2018-01-01

    The objective of this study is to evaluate the seismic hazard along the Esfarayen-Bojnurd railway using the probabilistic seismic hazard assessment (PSHA) method. The assessment was carried out on a recent data set that takes into account the historical seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a proposed seismic source model was presented. Attenuation equations recently recommended by experts, developed from earthquake data obtained in tectonic environments similar to those in and around the study area, were weighted and used for the assessment of seismic hazard within a logic tree framework. Considering a grid of 1.2 × 1.2 km covering the study area, ground acceleration was calculated for every node. Hazard maps at bedrock conditions were produced for peak ground acceleration for return periods of 74, 475 and 2475 years.
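
    The logic-tree step described here reduces, at each grid node, to weighting the exceedance-rate curves from the alternative attenuation equations and reading off ground motions at the target return periods. A sketch with invented curves and weights:

    ```python
    import numpy as np

    pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])           # g
    rates = {                                            # annual exceedance rates
        "gmpe_A": np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5]),
        "gmpe_B": np.array([3e-2, 1e-2, 3e-3, 6e-4, 8e-5]),
    }
    weights = {"gmpe_A": 0.6, "gmpe_B": 0.4}             # logic-tree weights

    mean_rate = sum(weights[k] * rates[k] for k in rates)
    for rp in (74, 475, 2475):                           # return periods used above
        pga_rp = np.interp(np.log(1.0 / rp), np.log(mean_rate[::-1]), pga[::-1])
        print(f"T = {rp:4d} yr -> PGA ~ {pga_rp:.2f} g")
    ```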

  11. Three-dimensional Probabilistic Earthquake Location Applied to 2002-2003 Mt. Etna Eruption

    NASA Astrophysics Data System (ADS)

    Mostaccio, A.; Tuve', T.; Zuccarello, L.; Patane', D.; Saccorotti, G.; D'Agostino, M.

    2005-12-01

    Seismicity recorded at the Mt. Etna volcano during the 2002-2003 eruption has been relocated using a probabilistic, non-linear, earthquake location approach. We used the software package NonLinLoc (Lomax et al., 2000), adopting the 3D velocity model obtained by Cocina et al., 2005. We applied three different algorithms: (1) a grid search; (2) a Metropolis-Gibbs sampler; and (3) an Oct-tree search. The Oct-tree algorithm gives an efficient, fast and accurate mapping of the PDF (Probability Density Function) of the earthquake location problem. More than 300 seismic events were analyzed in order to compare the non-linear location results with the ones obtained by using a traditional, linearized earthquake location algorithm such as Hypoellipse, and a 3D linearized inversion (Thurber, 1983). Moreover, we compare 38 focal mechanisms, chosen following strict selection criteria, with the ones obtained from the 3D and 1D results. Although the presented approach is more of a traditional relocation application, probabilistic earthquake location could also be used in routine surveys.
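
    The essence of the probabilistic location approach is to turn travel-time misfit into a posterior PDF over candidate hypocenters. A toy grid-search version with a uniform velocity model standing in for the 3D model:

    ```python
    import numpy as np

    vp = 5.0                                       # km/s, assumed uniform Vp
    stations = np.array([[0., 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, 0]])  # km
    true_src = np.array([4.0, 6.0, 3.0])
    t_obs = np.linalg.norm(stations - true_src, axis=1) / vp
    sigma = 0.05                                   # s, assumed pick uncertainty

    xs = ys = np.linspace(0, 10, 41)
    zs = np.linspace(0, 8, 33)
    X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
    pts = np.stack([X, Y, Z], axis=-1)
    t_pred = np.linalg.norm(pts[..., None, :] - stations, axis=-1) / vp
    res = t_obs - t_pred
    res -= res.mean(axis=-1, keepdims=True)        # removes unknown origin time
    pdf = np.exp(-0.5 * np.sum(res**2, axis=-1) / sigma**2)
    pdf /= pdf.sum()

    i = np.unravel_index(np.argmax(pdf), pdf.shape)
    print("max-PDF node:", X[i], Y[i], Z[i])       # close to (4, 6, 3)
    ```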

  12. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    USGS Publications Warehouse

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near‐surface shear‐wave velocity model in a 1D equivalent‐linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm‐rock‐site condition, the new probabilistic seismic‐hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess‐covered till and drift deposits) show up to twice the ground‐motion values for peak ground acceleration (PGA), and similar ground‐motion values for 1.0 s spectral acceleration (SA). Probabilistic ground‐motion levels for lowland alluvial floodplain sites (generally the 20–40‐m‐thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground‐motion levels for PGA, and up to three times the ground‐motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%‐in‐50‐year probabilistic ground‐shaking model. The liquefaction hazard ranges from low in the uplands to high (60% or more of the area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated

  13. A Probabilistic Tsunami Hazard Assessment Methodology and Its Application to Crescent City, CA

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Leveque, R. J.; Waagan, K.; Adams, L.; Lin, G.

    2012-12-01

    A PTHA methodology, based in large part on Probabilistic Seismic Hazard Assessment methods (e.g., Cornell, 1968; SSHAC, 1997; Geist and Parsons, 2005), was previously applied to Seaside, OR (González et al., 2009). This initial version of the method has been updated to include: a revised method to estimate tidal uncertainty; an improved method for generating stochastic realizations to estimate slip distribution uncertainty (Mai and Beroza, 2002; Blair et al., 2011); additional near-field sources in the Cascadia Subduction Zone, based on the work of Goldfinger et al. (2012); and far-field sources in Japan, based on information updated since the 11 March 2011 Tohoku tsunami (Japan Earthquake Research Committee, 2011). The GeoClaw tsunami model (Berger et al., 2011) is used to simulate generation, propagation and inundation. We will discuss this revised PTHA methodology and the results of its application to Crescent City, CA. Berger, M.J., D. L. George, R. J. LeVeque, and K. T. Mandli, The GeoClaw software for depth-averaged flows with adaptive refinement, Adv. Water Res. 34 (2011), pp. 1195-1206. Blair, J.L., McCrory, P.A., Oppenheimer, D.H., and Waldhauser, F. (2011): A geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity: U.S. Geological Survey Data Series 633, v.1.0, available at http://pubs.usgs.gov/ds/633/. Cornell, C. A. (1968): Engineering seismic risk analysis, Bull. Seismol. Soc. Am., 58, 1583-1606. Geist, E. L., and T. Parsons (2005): Probabilistic Analysis of Tsunami Hazards, Nat. Hazards, 37 (3), 277-314. Goldfinger, C., Nelson, C.H., Morey, A.E., Johnson, J.E., Patton, J.R., Karabanov, E., Gutiérrez-Pastor, J., Eriksson, A.T., Gràcia, E., Dunhill, G., Enkin, R.J., Dallimore, A., and Vallier, T. (2012): Turbidite event history—Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone: U.S. Geological Survey Professional Paper 1661-F, 170 p. (Available at http://pubs.usgs.gov/pp/pp1661f/). González, F

  14. Evaluation of the Seismic Hazard in Venezuela with a revised seismic catalog that seeks for harmonization along the country borders

    NASA Astrophysics Data System (ADS)

    Rendon, H.; Alvarado, L.; Paolini, M.; Olbrich, F.; González, J.; Ascanio, W.

    2013-05-01

    Probabilistic Seismic Hazard Assessment is a complex endeavor that relies on the quality of information coming from different sources: the seismic catalog, active fault parameters, strain rates, etc. With this in mind, during the last several months the FUNVISIS seismic hazard group has been working on a review and update of the local database that forms the basis for a reliable PSHA calculation. In particular, the seismic catalog, which provides the information needed to evaluate the critical b-value controlling how earthquake occurrence is distributed with magnitude, has received particular attention. The seismic catalog is the result of the effort of several generations of researchers over the years; therefore, the catalog necessarily suffers from a lack of consistency, homogeneity and completeness for all ranges of magnitude over any seismic study area. Merging the FUNVISIS instrumental catalog with the ones obtained from international agencies, we present the work that we have been doing to produce a consistent seismic catalog that covers Venezuela entirely, with seismic events from 1910 until 2012, and report the magnitude of completeness for the different periods. We also present preliminary results of the seismic hazard evaluation that takes into account this instrumental catalog, the historical catalog, updated known fault geometries and their corresponding parameters, and the new seismic sources that have been defined accordingly. Within the spirit of the Global Earthquake Model (GEM), all these efforts look for possible bridges with neighboring countries to establish consistent hazard maps across the borders.
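
    The b-value evaluation mentioned above is usually done with Aki's (1965) maximum-likelihood estimator on the part of the catalog above the magnitude of completeness. A sketch on synthetic magnitudes:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    b_true, mc = 1.0, 4.0
    mags = mc + rng.exponential(1.0 / (b_true * np.log(10.0)), 5000)  # G-R sample

    def b_value_aki(m, mc):
        """Aki (1965) ML estimator: b = log10(e) / (mean(M) - Mc).
        (Binned catalogs additionally need Utsu's half-bin-width correction.)"""
        m = np.asarray(m)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - mc)

    print(f"estimated b ~ {b_value_aki(mags, mc):.2f} (true {b_true})")
    ```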

  15. Seismic Evaluation of A Historical Structure In Kastamonu - Turkey

    NASA Astrophysics Data System (ADS)

    Usta, Pınar; Çarhoğlu, Asuman Işıl; Evci, Ahmet

    2018-01-01

    The Kastamonu province is a seismically active zone, and the city has many historical buildings made of stone masonry. In case of probable future earthquakes, existing buildings may suffer substantial or heavy damage. In the present study, one of the traditional historical houses located in Kastamonu was structurally investigated through a probabilistic seismic risk assessment methodology. In the study, the building was modeled using the Finite Element Modeling (FEM) software SAP2000. Time history analyses were carried out on the FEM models using 10 different ground motion records. Displacements were interpreted, and the results were displayed graphically and discussed.

  16. Characterizing the Benefits of Seismic Isolation for Nuclear Structures: A Framework for Risk-Based Decision Making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolisetti, Chandrakanth; Yu, Chingching; Coleman, Justin

    This report provides a framework for assessing the benefits of seismic isolation and exercises the framework on a Generic Department of Energy Nuclear Facility (GDNF). These benefits are (1) a reduction in the risk of unacceptable seismic performance, including a dramatic reduction in the probability of unacceptable performance at beyond-design-basis shaking, and (2) a reduction in capital cost at sites with moderate to high seismic hazard. The framework includes probabilistic risk assessment and estimates of overnight capital cost for the GDNF.

  17. Quantifying the uncertainty in site amplification modeling and its effects on site-specific seismic-hazard estimation in the upper Mississippi embayment and adjacent areas

    USGS Publications Warehouse

    Cramer, C.H.

    2006-01-01

    The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.

  18. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example of the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a catalogue representing 10,000 years of activity and consisting of some 84,000 earthquakes. The optimization model is then performed multiple times with various input data, taking probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could speed up full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
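
    The paper's selection step is a mixed-integer linear program; as a much cruder surrogate that still shows the objective, non-negative least squares can pick a sparse set of scenario weights that reproduces the full catalogue's hazard curve. Everything below is invented for illustration:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(7)
    n_events, n_levels = 200, 12
    sev = rng.random(n_events)                    # toy per-event ground motions
    levels = np.linspace(0.05, 0.6, n_levels)
    exceed = (sev[:, None] > levels[None, :]).astype(float)

    rate = 1.0 / n_events                         # equal annual rate per event
    full_curve = exceed.sum(axis=0) * rate        # full-set hazard curve

    # weights >= 0 that best reproduce the curve; most come out exactly zero
    w, resid = nnls(exceed.T, full_curve)
    kept = np.flatnonzero(w > 1e-10)
    print(f"kept {kept.size} of {n_events} scenarios, curve misfit {resid:.2e}")
    ```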

  19. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.

  20. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  1. Testing for ontological errors in probabilistic forecasting models of natural systems

    PubMed Central

    Marzocchi, Warner; Jordan, Thomas H.

    2014-01-01

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265
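
    The simplest concrete instance of the frequentist testing discussed here is a number test of a Poisson forecast; the forecast rate and observed count below are invented:

    ```python
    from scipy.stats import poisson

    forecast_rate = 12.0   # expected event count in the test window (assumed)
    n_observed = 20        # events actually observed (assumed)

    # Two-sided p-value: how surprising is the count under the forecast?
    p_low = poisson.cdf(n_observed, forecast_rate)
    p_high = poisson.sf(n_observed - 1, forecast_rate)   # P(N >= n_observed)
    p_value = min(1.0, 2.0 * min(p_low, p_high))
    print(f"N-test p-value ~ {p_value:.3f}")
    ```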

  2. Probabilistic tsunami hazard assessment in Greece for seismic sources along the segmented Hellenic Arc

    NASA Astrophysics Data System (ADS)

    Novikova, Tatyana; Babeyko, Andrey; Papadopoulos, Gerassimos

    2017-04-01

    Greece and adjacent coastal areas are characterized by a high population exposure to tsunami hazard. The Hellenic Arc is the most active geotectonic structure for the generation of earthquakes and tsunamis in the region. We performed a probabilistic tsunami hazard assessment for selected locations along the Greek coastline, namely the forecasting points officially used in tsunami warning operations by the Hellenic National Tsunami Warning Center and the NEAMTWS/IOC/UNESCO. In our analysis we considered seismic sources for tsunami generation along the western, central and eastern segments of the Hellenic Arc. We first created a synthetic catalog spanning 10,000 years for all significant earthquakes with magnitudes in the range from 6.0 to 8.5, the real events being included in this catalog. For each event included in the synthetic catalog, a tsunami was generated and propagated using a Boussinesq model. The probability of occurrence of each event was determined from a Gutenberg-Richter magnitude-frequency distribution. The results of our study are expressed as hazard curves and hazard maps. The hazard curves were obtained for the selected sites and present the annual probability of exceedance as a function of peak coastal tsunami amplitude. Hazard maps represent the distribution of peak coastal tsunami amplitudes corresponding to a fixed annual probability. In this form, our results can easily be compared to the ones obtained in other studies and further employed for the development of tsunami risk management plans. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no: 603839, 2013-10-30.

  3. Central and Eastern United States (CEUS) Seismic Source Characterization (SSC) for Nuclear Facilities Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin J. Coppersmith; Lawrence A. Salomone; Chris W. Fuller

    2012-01-31

    This report describes a new seismic source characterization (SSC) model for the Central and Eastern United States (CEUS). It will replace the Seismic Hazard Methodology for the Central and Eastern United States, EPRI Report NP-4726 (July 1986), and the Seismic Hazard Characterization of 69 Nuclear Plant Sites East of the Rocky Mountains, Lawrence Livermore National Laboratory Model (Bernreuter et al., 1989). The objective of the CEUS SSC Project is to develop a new seismic source model for the CEUS using a Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 assessment process. The goal of the SSHAC process is to represent the center, body, and range of technically defensible interpretations of the available data, models, and methods. Input to a probabilistic seismic hazard analysis (PSHA) consists of both seismic source characterization and ground motion characterization. These two components are used to calculate probabilistic hazard results (or seismic hazard curves) at a particular site. This report provides a new seismic source model. Results and Findings: The product of this report is a regional CEUS SSC model. This model includes consideration of an updated database, full assessment and incorporation of uncertainties, and the range of diverse technical interpretations from the larger technical community. The SSC model will be widely applicable to the entire CEUS, so this project uses a ground motion model that includes generic variations to allow for a range of representative site conditions (deep soil, shallow soil, hard rock). Hazard and sensitivity calculations were conducted at seven test sites representative of different CEUS hazard environments. Challenges and Objectives: The regional CEUS SSC model will be of value to readers who are involved in PSHA work, and who wish to use an updated SSC model. This model is based on a comprehensive and traceable process, in accordance with SSHAC guidelines in NUREG/CR-6372, Recommendations for

  4. Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data

    NASA Astrophysics Data System (ADS)

    Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.

    2017-12-01

    While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated 100 million RM (approximately US$23 million) of damage to buildings, roads, and infrastructure from shaking, as well as flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic nonlinear earthquake location program to locate the earthquakes and then refine their relative locations using a double difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, as well as aiding in the delivery of outreach activities regarding seismic hazard

  5. Seismic hazard and risk assessment for large Romanian dams situated in the Moldavian Platform

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Popescu, Emilia; Otilia Placinta, Anica; Petruta Constantin, Angela; Toma Danila, Dragos; Borleanu, Felix; Emilian Toader, Victorin; Moldoveanu, Traian

    2016-04-01

    Besides periodic technical inspections and the monitoring and surveillance of dams' related structures and infrastructures, there are further seismic-specific requirements for dam safety. The most important one is the seismic risk assessment, which can be accomplished by rating dams into seismic risk classes using the theory of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam site - values obtained using probabilistic hazard assessment approaches (Moldovan et al., 2008) - the structures' vulnerability, and the downstream risk characteristics (human, economic, historic and cultural heritage, etc.) in the areas that might be flooded in the case of a dam failure. Probabilistic seismic hazard (PSH), vulnerability and risk studies will be carried out for dams situated in the Moldavian Platform, starting from the Izvorul Muntelui Dam and continuing down the Bistrita and Siret Rivers and their tributaries. The most vulnerable dams will be studied in detail and flooding maps will be drawn to find the most exposed downstream localities, both for risk assessment studies and for warnings. GIS maps that clearly indicate potentially flooded areas are sufficient for these studies, giving information on the number of inhabitants and the goods that may be affected; the topography included in geospatial servers is sufficient to produce them, and no further studies are necessary for downstream risk assessment. The results will consist of local and regional seismic information, dam-specific characteristics and locations, seismic hazard maps and risk classes for all dam sites (more than 30 dams), inundation maps (for the most vulnerable dams of the region) and the localities possibly affected. The final goal of the studies presented in this paper is to provide the local emergency services with warnings of a potential dam failure and ensuing flood as a result of a large earthquake, allowing further

  6. Seismic hazard and risk assessment in the intraplate environment: The New Madrid seismic zone of the central United States

    USGS Publications Warehouse

    Wang, Z.

    2007-01-01

    Although the causes of large intraplate earthquakes are still not fully understood, they pose a certain hazard and risk to societies. Estimating hazard and risk in these regions is difficult because of the lack of earthquake records. The New Madrid seismic zone is one such region where large and rare intraplate earthquakes (M = 7.0 or greater) pose significant hazard and risk. Many different definitions of hazard and risk have been used, and the resulting estimates differ dramatically. In this paper, seismic hazard is defined as the natural phenomenon generated by earthquakes, such as ground motion, and is quantified by two parameters: a level of hazard and its occurrence frequency or mean recurrence interval; seismic risk is defined as the probability of occurrence of a specific level of seismic hazard over a certain time and is quantified by three parameters: probability, a level of hazard, and exposure time. Probabilistic seismic hazard analysis (PSHA), a commonly used method for estimating seismic hazard and risk, derives a relationship between a ground motion parameter and its return period (the hazard curve). The return period is not an independent temporal parameter but a mathematical extrapolation of the recurrence interval of earthquakes and the uncertainty of ground motion. Therefore, it is difficult to understand and use PSHA. A new method is proposed and applied here for estimating seismic hazard in the New Madrid seismic zone. This method provides hazard estimates that are consistent with the state of our knowledge and can be easily applied to other intraplate regions. © 2007 The Geological Society of America.

  7. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The contour maps show that the highest values of peak ground acceleration are concentrated in the eastern districts (e.g., El Nozha) and the lowest values in the northern and western districts (e.g., El Sharabiya and El Khalifa).

  8. Black Thunder Coal Mine and Los Alamos National Laboratory experimental study of seismic energy generated by large scale mine blasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, R.L.; Gross, D.; Pearson, D.C.

    In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international Comprehensive Test Ban Treaty (CTBT), which bans nuclear explosion tests, a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments producing results of mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer-generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in the radiation of seismic energy from overburden casting shots; (4) identification of, as yet unexplained, out-of-sequence, simultaneous detonations in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements, leading to determination of the relationship of local and regional seismic amplitude to explosive yield for overburden cast, coal bulking and single-fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.

  9. Seismic hazard assessment in the Catania and Siracusa urban areas (Italy) through different approaches

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria

    2010-05-01

    Seismic hazard assessment (SHA) can be performed using either deterministic or probabilistic approaches. In the present study, a probabilistic analysis was carried out for the towns of Catania and Siracusa using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell, 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard through the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted for the Esteva-Cornell procedure. According to current international conventions for PSHA (SSHAC, 1997), a logic tree approach was followed to consider and reduce the epistemic uncertainties, for both the seismotectonic and site methods. The SASHA code handles the intensity data taking into account the macroseismic information of past earthquakes. The CRISIS2007 code needs, as input elements, a seismic catalogue tested for completeness, a seismogenic zonation and ground motion prediction equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to defining source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above-mentioned approaches allowed us to obtain useful elements to define the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. It is indeed observed that when the site intensity data are used, the town of Catania shows hazard values higher than the ones found for Siracusa, for each considered return period. On the contrary, when the Esteva-Cornell method is used, the Siracusa urban area shows higher hazard than Catania for return periods greater than one hundred years. The higher hazard observed

  10. An in-situ stimulation experiment in crystalline rock - assessment of induced seismicity levels during stimulation and related hazard for nearby infrastructure

    NASA Astrophysics Data System (ADS)

    Gischig, Valentin; Broccardo, Marco; Amann, Florian; Jalali, Mohammadreza; Esposito, Simona; Krietsch, Hannes; Doetsch, Joseph; Madonna, Claudio; Wiemer, Stefan; Loew, Simon; Giardini, Domenico

    2016-04-01

    A decameter-scale in-situ stimulation experiment is currently being performed at the Grimsel Test Site in Switzerland by the Swiss Competence Center for Energy Research - Supply of Electricity (SCCER-SoE). The underground research laboratory lies in crystalline rock at a depth of 480 m and exhibits well-documented geology that presents some analogies with the crystalline basement targeted for the exploitation of deep geothermal energy resources in Switzerland. The goal is to perform a series of stimulation experiments, spanning from hydraulic fracturing to controlled fault-slip experiments, in an experimental volume approximately 30 m in diameter. The experiments will contribute to a better understanding of hydro-mechanical phenomena and induced seismicity associated with high-pressure fluid injections. Comprehensive monitoring during stimulation will include observation of injection rate and pressure, pressure propagation in the reservoir, permeability enhancement, 3D dislocation along the faults, rock mass deformation near the fault zone, as well as micro-seismicity. The experimental volume is surrounded by other in-situ experiments (at 50 to 500 m distance) and by infrastructure of the local hydropower company (at ~100 m to several kilometres distance). Although it is generally agreed among stakeholders that levels of induced seismicity may be low given the small total injection volumes of less than 1 m3, a detailed analysis of the potential impact of the stimulation on other experiments and surrounding infrastructure is essential to ensure operational safety. In this contribution, we present a procedure for estimating induced seismic hazard in an experimental situation that is atypical of injection-induced seismicity in terms of injection volumes, injection depths and proximity to affected objects. Both deterministic and probabilistic methods are employed to estimate the maximum possible and the maximum expected induced
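
    One widely used deterministic screening bound for such small injections is McGarr's (2014) cap on cumulative seismic moment, G times the injected volume. This is a generic relation, not necessarily the procedure adopted in the Grimsel analysis:

    ```python
    import math

    G = 30e9      # Pa, assumed shear modulus of crystalline rock
    dV = 1.0      # m^3, total injected volume (the text quotes < 1 m3)

    m0_max = G * dV                                      # max cumulative moment, N*m
    mw_max = (2.0 / 3.0) * (math.log10(m0_max) - 9.1)    # moment magnitude
    print(f"M0_max ~ {m0_max:.1e} N*m  ->  Mw_max ~ {mw_max:.1f}")
    ```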

  11. Deaggregation of Probabilistic Ground Motions in the Central and Eastern United States

    USGS Publications Warehouse

    Harmsen, S.; Perkins, D.; Frankel, A.

    1999-01-01

    Probabilistic seismic hazard analysis (PSHA) is a technique for estimating the annual rate of exceedance of a specified ground motion at a site due to known and suspected earthquake sources. The relative contributions of the various sources to the total seismic hazard are determined as a function of their occurrence rates and their ground-motion potential. The separation of the exceedance contributions into bins whose base dimensions are magnitude and distance is called deaggregation. We have deaggregated the hazard analyses for the new USGS national probabilistic ground-motion hazard maps (Frankel et al., 1996). For points on a 0.2° grid in the central and eastern United States (CEUS), we show color maps of the geographical variation of mean and modal magnitudes (M̄, M̂) and distances (D̄, D̂) for ground motions having a 2% chance of exceedance in 50 years. These maps are displayed for peak horizontal acceleration and for spectral response accelerations of 0.2, 0.3, and 1.0 sec. We tabulate M̄, D̄, M̂, and D̂ for 49 CEUS cities for 0.2- and 1.0-sec response. Thus, these maps and tables are PSHA-derived estimates of the potential earthquakes that dominate seismic hazard at short and intermediate periods in the CEUS. The contribution to hazard of the New Madrid and Charleston sources dominates over much of the CEUS; for 0.2-sec response, over 40% of the area; for 1.0-sec response, over 80% of the area. For 0.2-sec response, D̄ ranges from 20 to 200 km, for 1.0 sec, 30 to 600 km. For sites influenced by New Madrid or Charleston, D̄ is less than the distance to these sources, and M̄ is less than the characteristic magnitude of these sources, because averaging takes into account the effect of smaller magnitude and closer sources. On the other hand, D̂ is directly the distance to New Madrid or Charleston and M̂ for 0.2- and 1.0-sec response corresponds to the dominating source over much of the CEUS. For some cities in the North Atlantic states, short
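
    A toy version of the deaggregation operation itself: weight each source contribution by its share of the exceedance rate, then report the weighted-mean and modal magnitude-distance bins. Contributions are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 500
    mags = rng.uniform(5.0, 8.0, n)
    dists = rng.uniform(10.0, 600.0, n)                  # km
    contrib = np.exp(1.2 * (mags - 5.0)) / dists**2      # fake rate contributions

    w = contrib / contrib.sum()
    m_mean, d_mean = np.sum(w * mags), np.sum(w * dists)

    H, m_edges, d_edges = np.histogram2d(mags, dists, bins=(6, 10), weights=w)
    i, j = np.unravel_index(np.argmax(H), H.shape)
    print(f"mean (M, D) = ({m_mean:.2f}, {d_mean:.0f} km); modal bin = "
          f"M {m_edges[i]:.1f}-{m_edges[i+1]:.1f}, "
          f"D {d_edges[j]:.0f}-{d_edges[j+1]:.0f} km")
    ```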

  12. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes any kind of reliable probabilistic statement about narrowly localized seismic hazard premature. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when statistical significance testing is applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable or computer simulations and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can lead to scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention. The self-evident shortcomings and failures of GSHAP call on all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a

  13. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
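
    The risk integral referred to above couples a hazard curve with a fragility curve. A minimal numerical version is sketched below; the power-law hazard curve, the fragility median and dispersion, and the intensity grid are placeholders, not values from the paper:

```python
import numpy as np
from scipy.stats import norm

# Risk-integral sketch: lambda_collapse = integral of P(collapse | im) |dH(im)|,
# where H(im) is the annual ground-motion exceedance (hazard) curve.
im = np.logspace(-2, 0.5, 200)                   # intensity measure grid, g
hazard = 4e-4 * (im / 0.1) ** -2.0               # assumed hazard curve H(im)
theta, beta = 0.9, 0.5                           # assumed fragility median (g), dispersion
fragility = norm.cdf(np.log(im / theta) / beta)  # P(collapse | IM = im)

density = -np.gradient(hazard, im)               # occurrence density of IM levels
lam_collapse = np.trapz(fragility * density, im)
print(f"annual collapse rate ~ {lam_collapse:.2e}")
# In the post-mainshock setting, H(im) would be replaced by a time-decaying
# aftershock hazard and the fragility by a mainshock-damaged fragility.
```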

  14. Disaggregated seismic hazard and the elastic input energy spectrum: An approach to design earthquake selection

    NASA Astrophysics Data System (ADS)

    Chapman, Martin Colby

    1998-12-01

    The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions for the analysis may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression
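
    As a flavor of the regression workflow, the sketch below fits the same functional form to two ground-motion parameters by least squares and compares their residual standard deviations; the data, functional form, and coefficients are synthetic stand-ins, not the study's strong-motion set or results:

```python
import numpy as np

# Fit ln(Y) = c0 + c1*M + c2*ln(sqrt(R^2 + h^2)) to two synthetic parameters
# standing in for V_ea and PSV, then compare their sigmas.
rng = np.random.default_rng(0)
n = 400
M = rng.uniform(5.0, 7.5, n)                    # magnitudes
R = rng.uniform(5.0, 200.0, n)                  # distances, km
ln_dist = np.log(np.sqrt(R**2 + 100.0))         # assumed depth term h = 10 km

def synth(c, sigma):
    """Generate synthetic 'observations' from assumed coefficients."""
    return c[0] + c[1] * M + c[2] * ln_dist + rng.normal(0.0, sigma, n)

ln_vea = synth([-2.0, 1.1, -1.2], 0.55)
ln_psv = synth([-1.5, 1.0, -1.1], 0.65)

X = np.column_stack([np.ones(n), M, ln_dist])
for name, y in [("V_ea", ln_vea), ("PSV", ln_psv)]:
    coef, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma = np.sqrt(res[0] / (n - X.shape[1]))
    print(name, np.round(coef, 2), "sigma =", round(float(sigma), 2))
```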

  15. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-01-01

    Egypt is located in the northeastern corner of Africa in a sensitive seismotectonic setting. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions, so it is important to evaluate the seismic hazard in order to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic-tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic-tree formulation to compute the regional exposure and develop a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration, as well as for 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods, for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by seismic activity grading from very low in the west to high in the eastern part of the country. Uniform hazard spectra are estimated at some important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.
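
    The logic-tree machinery mentioned above amounts to weighting hazard curves computed under alternative branch assumptions and reading design values off the weighted-mean curve; in the sketch below the branch curves and weights are invented for illustration:

```python
import numpy as np

# Logic-tree sketch: combine branch hazard curves with weights, then read a
# design value off the mean curve for a 475-year return period.
pga = np.logspace(-2, 0, 200)                   # g

branches = {                                    # name: (weight, lambda(pga) in 1/yr)
    "model_A": (0.5, 2e-4 * (pga / 0.1) ** -1.8),
    "model_B": (0.3, 3e-4 * (pga / 0.1) ** -2.1),
    "model_C": (0.2, 1e-4 * (pga / 0.1) ** -1.5),
}
mean_hazard = sum(w * lam for w, lam in branches.values())

# smallest PGA whose mean annual exceedance rate drops below 1/475
idx = np.searchsorted(-mean_hazard, -1.0 / 475.0)
print(f"PGA(475 yr) ~ {pga[idx]:.3f} g")
```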

  17. New seismic hazard maps for Puerto Rico and the U.S. Virgin Islands

    USGS Publications Warehouse

    Mueller, C.; Frankel, A.; Petersen, M.; Leyendecker, E.

    2010-01-01

    The probabilistic methodology developed by the U.S. Geological Survey is applied to a new seismic hazard assessment for Puerto Rico and the U.S. Virgin Islands. Modeled seismic sources include gridded historical seismicity, subduction-interface and strike-slip faults with known slip rates, and two broad zones of crustal extension with seismicity rates constrained by GPS geodesy. We use attenuation relations from western North American and worldwide data, as well as a Caribbean-specific relation. Results are presented as maps of peak ground acceleration and 0.2- and 1.0-second spectral response acceleration for 2% and 10% probabilities of exceedance in 50 years (return periods of about 2,500 and 500 years, respectively). This paper describes the hazard model and maps that were balloted by the Building Seismic Safety Council and recommended for the 2003 NEHRP Provisions and the 2006 International Building Code. © 2010, Earthquake Engineering Research Institute.
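
    The mapped probabilities and the quoted return periods are related through the usual Poisson assumption; the short check below reproduces the numbers in parentheses:

```python
import math

# Poisson conversion: P = 1 - exp(-t/T)  =>  T = -t / ln(1 - P), for t = 50 yr.
for p in (0.02, 0.10):
    T = -50.0 / math.log(1.0 - p)
    print(f"{p:.0%} in 50 yr -> return period ~ {T:,.0f} yr")
# prints ~2,475 yr and ~475 yr, i.e. the "about 2,500 and 500 years" in the text.
```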

  18. Seismic Hazard Maps for Seattle, Washington, Incorporating 3D Sedimentary Basin Effects, Nonlinear Site Response, and Rupture Directivity

    USGS Publications Warehouse

    Frankel, Arthur D.; Stephenson, William J.; Carver, David L.; Williams, Robert A.; Odum, Jack K.; Rhea, Susan

    2007-01-01

    This report presents probabilistic seismic hazard maps for Seattle, Washington, based on over 500 3D simulations of ground motions from scenario earthquakes. These maps include 3D sedimentary basin effects and rupture directivity. Nonlinear site response for soft-soil sites of fill and alluvium was also applied in the maps. The report describes the methodology for incorporating source and site dependent amplification factors into a probabilistic seismic hazard calculation. 3D simulations were conducted for the various earthquake sources that can affect Seattle: Seattle fault zone, Cascadia subduction zone, South Whidbey Island fault, and background shallow and deep earthquakes. The maps presented in this document used essentially the same set of faults and distributed-earthquake sources as in the 2002 national seismic hazard maps. The 3D velocity model utilized in the simulations was validated by modeling the amplitudes and waveforms of observed seismograms from five earthquakes in the region, including the 2001 M6.8 Nisqually earthquake. The probabilistic seismic hazard maps presented here depict 1 Hz response spectral accelerations with 10%, 5%, and 2% probabilities of exceedance in 50 years. The maps are based on determinations of seismic hazard for 7236 sites with a spacing of 280 m. The maps show that the most hazardous locations for this frequency band (around 1 Hz) are soft-soil sites (fill and alluvium) within the Seattle basin and along the inferred trace of the frontal fault of the Seattle fault zone. The next highest hazard is typically found for soft-soil sites in the Duwamish Valley south of the Seattle basin. In general, stiff-soil sites in the Seattle basin exhibit higher hazard than stiff-soil sites outside the basin. Sites with shallow bedrock outside the Seattle basin have the lowest estimated hazard for this frequency band.

  19. Staging of the Acoustic Response at Laboratory Modelling of Tidal Influence upon Seismicity

    NASA Astrophysics Data System (ADS)

    Saltykov, Vadim; Patonin, Andrey; Kugaenko, Yulia

    2010-05-01

    INTRODUCTION Seismic radiation spans a wide range of seismic energy, from seismic emission (high-frequency seismic noise, HFSN) to earthquakes. Common features of the response to external forcing at different scales allow the medium to be considered as a single, whole seismoactive object. The Earth tide is a prominent example of an external exciting field, and the tidal topic has a long history in seismology. Results obtained by different scientists are often contradictory and ambiguous; we identify the instability with which tidal effects manifest themselves as a possible reason for this situation. In view of the aforesaid, it is significant that tidal effects in weak seismicity and HFSN manifest themselves more strongly during the preparation stage of a large earthquake [Rykunov et al., 1998; Saltykov et al., 2004, 2007]. It is presumed that a metastable medium has higher tidal sensitivity; examples are the sources of impending earthquakes and the extensive near-surface zones of micro-fissuring and dilatancy that appear during source formation and stretch far enough [Alekseev et al., 2001; Goldin, 2004, 2005]. Common features of the observed effects suggest the existence of a tidal modulation mechanism that is similar (perhaps identical) across seismic scales. Modelling of these processes can improve our understanding of the nature of tidal effects. LABORATORY EXPERIMENT Results of controlled rock-sample destruction experiments are presented. Acoustic emission (AE) pulses act as an analogue of seismic events. Tides are simulated by weak long-period variations added to a quasi-stationary subcritical loading. Earlier tidal modeling confirmed synchronization of AE intensity with an external periodic influence for large (5-10%) variations of loading [Lockner, Beeler, 1999; Ponomarev et al., 2007]. Real tidal stress and strain variations in nature, however, are much smaller, amounting to fractions of a percent. Therefore, investigation of the influence of weak modulation upon deformed rock is one of the main purposes proposed here. Used software

  20. Evaluation of stress and saturation effects on seismic velocity and electrical resistivity - laboratory testing of rock samples

    NASA Astrophysics Data System (ADS)

    Vilhelm, Jan; Jirků, Jaroslav; Slavík, Lubomír; Bárta, Jaroslav

    2016-04-01

    A repository located in a deep geological formation is today considered the most suitable solution for disposal of spent nuclear fuel and high-level waste. The geological formations, in combination with an engineered barrier system, should ensure isolation of the waste from the environment for thousands of years. Special monitoring systems are being developed for long-term monitoring of such underground excavations. In our research we developed and tested a monitoring system based on repeated ultrasonic time-of-flight measurements and electrical resistivity tomography (ERT). The Bedřichov gallery in northern Bohemia was selected as a test site. This underground gallery in granitic rock was excavated using a Tunnel Boring Machine (TBM). The high-frequency seismic measurements are performed by the pulse-transmission technique directly on the rock wall, using one seismic source and three receivers at distances of 1, 2, and 3 m. The ERT measurement is also performed on the rock wall, using 48 electrodes with a spacing of 20 cm. An analysis of the dependence of seismic velocity and electrical resistivity on the water saturation and stress state of the granitic rock is necessary for the interpretation of both the seismic monitoring and the ERT. Laboratory seismic and resistivity measurements were therefore performed. One series of experiments was based on uniaxial loading of dry and saturated granitic samples, and the relation between stress state and ultrasonic wave velocities was tested separately for dry and saturated rock samples. Other experiments focused on the relation between the electrical resistivity of a rock sample and its saturation level. Rock samples with different porosities were tested. Acknowledgments: This work was partially supported by the Technology Agency of the Czech Republic, project No. TA 0302408.

  1. Probabilistic Sensitivity Analysis of Fretting Fatigue (Preprint)

    DTIC Science & Technology

    2009-04-01

    AFRL-RX-WP-TP-2009-4091, Probabilistic Sensitivity Analysis of Fretting Fatigue (Preprint). Patrick J. Golden, Harry R. Millwater, and... Sensitivity Analysis of Fretting Fatigue. Patrick J. Golden, Air Force Research Laboratory, Wright-Patterson AFB, OH 45433; Harry R. Millwater and

  2. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.
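
    As an illustration of the kriging component described above, the sketch below runs a minimal Gaussian-process regression of a scalar signal property against event-station distance; the RBF kernel, noise level, and data points are invented stand-ins, not SIG-VISA's learned models:

```python
import numpy as np

# Toy GP (kriging) regression of, e.g., log envelope amplitude vs. distance.
def rbf(a, b, ell=80.0, var=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return var * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

x_train = np.array([30.0, 80.0, 150.0, 300.0])   # distances, km
y_train = np.array([-1.0, -1.8, -2.6, -3.9])     # observed log amplitudes
noise = 0.1

x_test = np.linspace(10.0, 400.0, 5)
K = rbf(x_train, x_train) + noise**2 * np.eye(x_train.size)
Ks = rbf(x_test, x_train)
mu = Ks @ np.linalg.solve(K, y_train)                      # posterior mean
cov = rbf(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)  # posterior covariance
sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))
for x, m, s in zip(x_test, mu, sd):
    print(f"r = {x:5.0f} km: {m:+.2f} +/- {s:.2f}")
# A learned physical trend (e.g., a linear-in-distance mean function) would be
# added as the GP mean to extrapolate beyond the data, as described above.
```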

  3. Global Infrasound Association Based on Probabilistic Clutter Categorization

    NASA Astrophysics Data System (ADS)

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine the signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, and surf (breaking ocean waves). These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic, and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.

  4. New strong motion network in Georgia: basis for specifying seismic hazard

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.

    2017-12-01

    Risk created by hazardous natural events is closely related to the sustainable development of society. Global observations have confirmed a tendency of growing losses resulting from natural disasters, among the most dangerous and destructive of which are earthquakes. Georgia is located in a seismically active region, so it is imperative to evaluate probabilistic seismic hazard and seismic risk with proper accuracy. The national network of Georgia includes 35 stations, all of which are seismometers. There are significant gaps in strong-motion recordings, which are essential for seismic hazard assessment. To gather more accelerometer recordings, we have built a strong-motion network distributed over the territory of Georgia. The network currently includes six stations, each with a Basalt 4x datalogger and an EpiSensor ES-T strong-motion sensor. For each site, Vs30 and soil resonance frequencies have been measured. Since all but one station (Tabakhmela, near Tbilisi) are located far from power and internet lines, a special system was created for instrument operation: solar power supplies the electricity, and GSM/LTE modems provide internet access. A VPN tunnel was set up using a Raspberry Pi for two-way communication with the stations. The Tabakhmela station is located on the grounds of the Ionosphere Observatory, TSU, and is used as a hub for the network. This location also includes a broadband seismometer and a VLF electromagnetic-wave observation antenna for possible earthquake-precursor studies. On a server located in Tabakhmela, continuous data are collected from all the stations for later use. The recordings will later be used in different seismological and engineering problems, namely selecting and creating a GMPE model for the Caucasus and evaluating probabilistic seismic hazard and seismic risk. These stations are a start, and expansion of the strong-motion network is planned. Along with this, electromagnetic-wave observations will continue and additional antennas will be implemented

  5. Laboratory scale micro-seismic monitoring of rock faulting and injection-induced fault reactivation

    NASA Astrophysics Data System (ADS)

    Sarout, J.; Dautriat, J.; Esteban, L.; Lumley, D. E.; King, A.

    2017-12-01

    The South West Hub CCS project in Western Australia aims to evaluate the feasibility and impact of geosequestration of CO2 in the Lesueur sandstone formation. Part of this evaluation focuses on the feasibility and design of a robust passive seismic monitoring array. Micro-seismicity monitoring can be used to image the injected CO2 plume or any geomechanical fracture/fault activity, and can thus serve as an early warning system by measuring low-level (unfelt) seismicity that may precede potentially larger (felt) earthquakes. This paper describes laboratory deformation experiments replicating typical field scenarios of fluid injection into faulted reservoirs. Two pairs of cylindrical core specimens were recovered from the Harvey-1 well at depths of 1924 m and 2508 m. In each specimen, a fault is first generated at the in situ stress, pore pressure, and temperature by increasing the vertical stress beyond the peak stress in a triaxial stress vessel at CSIRO's Geomechanics & Geophysics Lab. The faulted specimen is then stabilized by decreasing the vertical stress. The freshly formed fault is subsequently reactivated by brine injection and an increase of the pore pressure until slip occurs again. This second slip event is then controlled in displacement and allowed to develop for a few millimeters. The micro-seismic (MS) response of the rock during the initial fracturing and subsequent reactivation is monitored using an array of 16 ultrasonic sensors attached to the specimen's surface. The recorded MS events are relocated in space and time and correlate well with 3D X-ray CT images of the specimen obtained post-mortem. The time evolution of the structural changes induced within the triaxial stress vessel is therefore reliably inferred. The recorded MS activity shows that, as expected, the increase of the vertical stress beyond the peak led to an inclined shear fault. The injection of fluid and the resulting increase in pore pressure led first to a reactivation of the pre

  6. The New Italian Seismic Hazard Model

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.

    2017-12-01

    In 2015, the Seismic Hazard Center (Centro Pericolosità Sismica, CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community with the aim of elaborating a new reference seismic hazard model, mainly aimed at the update of the seismic code. The CPS designed a roadmap for releasing, within three years, a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with the experts on earthquake engineering who will then participate in the revision of the building code. The activities were organized in six tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task has been selecting the most updated information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models have been elaborated in terms of classic source areas, fault sources, and gridded seismicity, based on different approaches. The GMPE task has selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase has been planned to design statistical procedures for testing, with the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy. We use a weighting scheme

  7. Risk-targeted versus current seismic design maps for the conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
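
    The adjustment can be pictured as searching for the design motion whose implied 50-year collapse probability matches a target, given the site's hazard curve and an uncertain (lognormal) collapse capacity. In the sketch below the hazard curve, dispersion, target, and fragility anchoring are simplified assumptions, not the Provisions' actual procedure:

```python
import numpy as np
from scipy.stats import norm

# Risk-targeted ground-motion sketch: pick the design value whose implied
# 50-yr collapse probability hits a target, via hazard-fragility convolution.
im = np.logspace(-2, 0.7, 400)               # intensity grid, g
hazard = 1e-4 * (im / 0.4) ** -2.5           # assumed annual exceedance curve
beta = 0.6                                   # assumed capacity dispersion
target_50yr = 0.01                           # assumed target collapse risk

def collapse_rate(design_im):
    # fragility median anchored at the design motion (a simplification)
    frag = norm.cdf(np.log(im / design_im) / beta)
    return np.trapz(frag * -np.gradient(hazard, im), im)

candidates = np.linspace(0.2, 2.0, 200)
risks = np.array([1 - np.exp(-50 * collapse_rate(c)) for c in candidates])
design = candidates[int(np.argmin(np.abs(risks - target_50yr)))]
print(f"risk-targeted design motion ~ {design:.2f} g")
```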

  8. Physically-Based Probabilistic Seismic Hazard Analysis Using Broad-Band Ground Motion Simulation: a Case Study for Prince Islands Fault, Marmara Sea

    NASA Astrophysics Data System (ADS)

    Mert, A.

    2016-12-01

    The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For the frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGF), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitude earthquakes to obtain ground-motion parameters. PSHA results are produced for 2%, 10%, and 50% hazards for all studied sites in the Marmara Region.

  9. Physically based probabilistic seismic hazard analysis using broadband ground motion simulation: a case study for the Prince Islands Fault, Marmara Sea

    NASA Astrophysics Data System (ADS)

    Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali

    2016-08-01

    The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, and this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50% hazards for all sites studied in the Marmara region.

  10. Probabilistic analysis of the torsional effects on the tall building resistance due to earthquake event

    NASA Astrophysics Data System (ADS)

    Králik, Juraj; Králik, Juraj

    2017-07-01

    The paper presents the results of the deterministic and probabilistic analyses of the accidental torsional effect of reinforced concrete tall buildings due to an earthquake event. A core-column structural system was considered with various configurations in plan. The methodology of the seismic analysis of building structures in Eurocode 8 and JCSS 2000 is discussed. The possibility of utilizing the LHS method to analyze extensive and robust tasks in FEM is presented. The influence of various input parameters (material, geometry, soil, masses, and others) is considered. The deterministic and probabilistic analyses of the seismic resistance of the structure were calculated in the ANSYS program.
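
    Latin hypercube sampling (LHS) stratifies each input distribution so that a modest number of analysis runs covers the input space evenly. The sketch below shows the sampling step and a made-up closed-form response standing in for an ANSYS/FEM run; all variables, ranges, and the limit value are invented:

```python
import numpy as np

# LHS sketch: stratified uniform samples, mapped to input ranges, pushed
# through a placeholder response function.
rng = np.random.default_rng(1)

def lhs(n, dims):
    """One Latin hypercube sample of size n in [0, 1)^dims."""
    u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
    return np.stack([rng.permutation(u[:, j]) for j in range(dims)], axis=1)

n = 200
u = lhs(n, 3)
E = 30e9 * (1 + 0.1 * (u[:, 0] - 0.5))      # stiffness, Pa
m = 5e5 * (1 + 0.2 * (u[:, 1] - 0.5))       # mass, kg
pga = 0.3 * 10 ** (0.3 * (u[:, 2] - 0.5))   # seismic demand, g

drift = m * pga * 9.81 / E                  # placeholder for the FEM response
print(f"P(drift > limit) ~ {np.mean(drift > 7e-5):.3f}")
```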

  11. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

    CRISIS is a computer tool for probabilistic seismic hazard analysis (PSHA), whose development started in the late 1980s at the Instituto de Ingeniería, UNAM, Mexico. It started circulating outside the Mexican borders at the beginning of the 1990s, when it was first distributed as part of the SEISAN tools. Throughout the years, CRISIS has been used for seismic hazard studies in several countries in Latin America (Mexico, Guatemala, Belize, El Salvador, Honduras, Nicaragua, Costa Rica, Panama, Colombia, Venezuela, Ecuador, Peru, Argentina, and Chile), and in many other countries of the world. CRISIS has always circulated free of charge for non-commercial applications. It is worth noting that CRISIS has been written mainly by people who are, at the same time, PSHA practitioners. Therefore, the development loop has been relatively short, and most of the modifications and improvements have been made to satisfy the needs of the developers themselves. CRISIS has evolved from a rather simple FORTRAN code to a relatively complex program with a friendly graphical interface, able to handle a variety of modeling possibilities for source geometries, seismicity descriptions, and ground motion prediction models (GMPM). We will describe some of the improvements made for the newest version of the code, CRISIS 2012. These improvements, some of which were made in the frame of the Italian research project INGV-DPC S2 (http://nuovoprogettoesse2.stru.polimi.it/), funded by the Dipartimento della Protezione Civile (DPC; National Civil Protection Department), include: a wider variety of source geometries; a wider variety of seismicity models, including the ability to handle non-Poissonian occurrence models and Poissonian smoothed-seismicity descriptions; and enhanced capabilities for using different kinds of GMPM: attenuation tables, built-in models, and generalized attenuation models. In the case of built-in models there is, by default, a set ready to use in CRISIS, but additional custom GMPMs

  12. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered, because we had not determined how to treat these earthquakes properly in the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after

  13. LANL seismic screening method for existing buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.

  14. A probabilistic seismic risk assessment procedure for nuclear power plants: (II) Application

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    This paper presents the procedures and results of intensity- and time-based seismic risk assessments of a sample nuclear power plant (NPP) to demonstrate the risk-assessment methodology proposed in its companion paper. The intensity-based assessments include three sets of sensitivity studies to identify the impact of the following factors on the seismic vulnerability of the sample NPP, namely: (1) the description of fragility curves for primary and secondary components of NPPs, (2) the number of simulations of NPP response required for risk assessment, and (3) the correlation in responses between NPP components. The time-based assessment is performed as a series of intensity-based assessments. The studies illustrate the utility of the response-based fragility curves and the inclusion of the correlation in the responses of NPP components directly in the risk computation. © 2011 Published by Elsevier B.V.

  15. Considering potential seismic sources in earthquake hazard assessment for Northern Iran

    NASA Astrophysics Data System (ADS)

    Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila

    2014-07-01

    Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of the Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historical and instrumental documented seismicity indicates the occurrence of severe earthquakes leading to many deaths and large losses in the region. With the growth of seismological and tectonic data, an updated seismic hazard assessment is a worthwhile input to emergency management programs and long-term development plans in urban and rural areas of this region. In the present study, armed with the up-to-date information required for seismic hazard assessment, including geological data and the active tectonic setting for thorough investigation of the active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue, a probabilistic seismic hazard assessment is carried out for the region using three recent ground motion prediction equations. The logic tree method is utilized to capture the epistemic uncertainty of the seismic hazard assessment in the delineation of the seismic sources and the selection of attenuation relations. The results are compared with recent code-prescribed seismic hazard practice for the region and are discussed in detail to explore their variation in each branch of the logic tree approach. Also, seismic hazard maps of peak ground acceleration at rock sites for 475- and 2,475-year return periods are provided for the region.

  16. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data were derived from studies of the effects of toxins on rats, using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples, as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error, accompanied by improved insight from the learned results, compared with the PILP models learned from non-probabilistic examples.

  18. Propagation of the velocity model uncertainties to the seismic event location

    NASA Astrophysics Data System (ADS)

    Gesret, A.; Desassis, N.; Noble, M.; Romary, T.; Maisons, C.

    2015-01-01

    Earthquake hypocentre locations are crucial in many domains of application (academic and industrial), as seismic event location maps are commonly used to delineate faults or fractures. The interpretation of these maps depends on location accuracy and on the reliability of the associated uncertainties. The largest contribution to location and uncertainty errors is the fact that velocity model errors are usually not correctly taken into account. We propose a new Bayesian formulation that properly integrates knowledge of the velocity model into the formulation of the probabilistic earthquake location. In this work, the velocity model uncertainties are first estimated with a Bayesian tomography of active shot data. We implement a Monte Carlo sampling algorithm to generate velocity models distributed according to the posterior distribution. In a second step, we propagate the velocity model uncertainties to the seismic event location in a probabilistic framework. This enables more reliable hypocentre locations, with associated uncertainties that account for both picking and velocity model uncertainties, to be obtained. We illustrate the tomography results and the gain in accuracy of earthquake location for two synthetic examples and one real-data case study in the context of induced microseismicity.
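
    The propagation step can be pictured as a loop over velocity models drawn from the tomography posterior, relocating the event under each draw and summarizing the scatter. The 1-D geometry, Gaussian "posterior", and grid-search locator below are invented simplifications of the Bayesian machinery described above:

```python
import numpy as np

# Monte Carlo propagation of velocity-model uncertainty to an event location.
rng = np.random.default_rng(3)
stations = np.array([0.0, 4.0, 10.0, 16.0])        # km along a line
true_x, true_v = 6.0, 3.5                           # event position (km), velocity (km/s)
t_obs = np.abs(stations - true_x) / true_v + rng.normal(0.0, 0.01, 4)

xs = np.linspace(0.0, 16.0, 801)                    # candidate locations
locations = []
for _ in range(500):
    v = rng.normal(3.5, 0.2)                        # one velocity-model draw
    # grid-search location under this model (least-squares travel-time misfit)
    misfit = ((np.abs(stations[None, :] - xs[:, None]) / v - t_obs) ** 2).sum(axis=1)
    locations.append(xs[misfit.argmin()])

loc = np.array(locations)
print(f"location = {loc.mean():.2f} +/- {loc.std():.2f} km")
```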

  19. Fully probabilistic earthquake source inversion on teleseismic scales

    NASA Astrophysics Data System (ADS)

    Stähler, Simon; Sigloch, Karin

    2017-04-01

    solutions treated as a quality-controlled reference, we derive the noise distribution on signal decorrelation D of the broadband seismogram fits between observed and modelled waveforms. The noise on D is found to approximately follow a log-normal distribution, a fortunate fact that readily accommodates the formulation of an empirical likelihood function for D for our multivariate problem. The first and second moments of this multivariate distribution are shown to depend mostly on the signal-to-noise ratio (SNR) of the CC measurements and on the back-azimuthal distances of seismic stations. References: Stähler, S. C. and Sigloch, K.: Fully probabilistic seismic source inversion - Part 1: Efficient parameterisation, Solid Earth, 5, 1055-1069, doi:10.5194/se-5-1055-2014, 2014. Stähler, S. C. and Sigloch, K.: Fully probabilistic seismic source inversion - Part 2: Modelling errors and station covariances, Solid Earth, 7, 1521-1536, doi:10.5194/se-7-1521-2016, 2016.

  20. Maps Showing Seismic Landslide Hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.; Michael, John A.

    2009-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ≈300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazard zones were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.
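
    The Newmark-type analysis used for the shallow-landslide zones integrates the part of the ground acceleration exceeding a slope's critical (yield) acceleration to obtain a permanent displacement. The rigid-block sketch below uses a synthetic record and an assumed critical acceleration, not the Anchorage inputs:

```python
import numpy as np

# Newmark rigid-block sliding: the block slides downslope whenever ground
# acceleration exceeds the critical acceleration; relative velocity is
# integrated (and clipped at zero) to accumulate permanent displacement.
g = 9.81
dt = 0.01
t = np.arange(0.0, 20.0, dt)
acc = 0.4 * g * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.15 * t)  # synthetic record, m/s^2

ac = 0.15 * g                     # assumed critical (yield) acceleration
vel = disp = 0.0
for a in acc:
    drive = a - ac if vel > 0.0 or a > ac else 0.0  # slides in one direction only
    vel = max(vel + drive * dt, 0.0)                # re-sticks when velocity hits zero
    disp += vel * dt
print(f"Newmark displacement ~ {100 * disp:.1f} cm")
```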

  1. Maps showing seismic landslide hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.

    2014-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ≈300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazards were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.

  2. Stress drop with constant, scale independent seismic efficiency and overshoot

    USGS Publications Warehouse

    Beeler, N.M.

    2001-01-01

    To model dissipated and radiated energy during earthquake stress drop, I calculate dynamic fault slip using a single degree of freedom spring-slider block and a laboratory-based static/kinetic fault strength relation with a dynamic stress drop proportional to effective normal stress. The model is scaled to earthquake size assuming a circular rupture; stiffness varies inversely with rupture radius, and rupture duration is proportional to radius. Calculated seismic efficiency, the ratio of radiated to total energy expended during stress drop, is in good agreement with laboratory and field observations. Predicted overshoot, a measure of how much the static stress drop exceeds the dynamic stress drop, is higher than previously published laboratory and seismic observations and fully elasto-dynamic calculations. Seismic efficiency and overshoot are constant, independent of normal stress and scale. Calculated variation of apparent stress with seismic moment resembles the observational constraints of McGarr [1999].
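
    A minimal version of such a spring-slider calculation, with static/kinetic friction and a dashpot standing in for radiative energy loss, is sketched below; all parameter values are illustrative rather than the paper's scaled ones:

```python
# Single-degree-of-freedom spring-slider "earthquake": slip starts when the
# spring force reaches static friction F_s, slides against kinetic friction
# F_k plus a radiation-damping dashpot, and re-sticks when velocity hits zero.
m, k, c = 1.0, 50.0, 2.0          # mass, stiffness, radiation damping
F_s, F_k = 10.0, 8.0              # static and kinetic friction forces

dt, x, v, E_rad = 1e-5, 0.0, 0.0, 0.0
while True:                        # one slip episode, onset to re-stick
    a = (F_s - k * x - F_k - c * v) / m   # spring force is F_s - k*x during slip
    v = max(v + a * dt, 0.0)
    if v == 0.0 and a <= 0.0:
        break
    E_rad += c * v**2 * dt         # energy lost to the dashpot ("radiated")
    x += v * dt

E_total = F_s * x - 0.5 * k * x**2          # work released by the spring
static_drop, dynamic_drop = k * x, F_s - F_k
print(f"seismic efficiency ~ {E_rad / E_total:.3f}, "
      f"overshoot ~ {(static_drop - dynamic_drop) / dynamic_drop:.2f}")
```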

  3. Seismic hazard in the Intermountain West

    USGS Publications Warehouse

    Haller, Kathleen; Moschetti, Morgan P.; Mueller, Charles; Rezaeian, Sanaz; Petersen, Mark D.; Zeng, Yuehua

    2015-01-01

    The 2014 national seismic-hazard model for the conterminous United States incorporates new scientific results and important model adjustments. The current model includes updates to the historical catalog, which is spatially smoothed using both fixed-length and adaptive-length smoothing kernels. Fault-source characterization improved by adding faults, revising rates of activity, and incorporating new results from combined inversions of geologic and geodetic data. The update also includes a new suite of published ground motion models. Changes in probabilistic ground motion are generally less than 10% in most of the Intermountain West compared to the prior assessment, and ground-motion hazard in four Intermountain West cities illustrates the range and magnitude of change in the region. Seismic hazard at reference sites in Boise and Reno increased as much as 10%, whereas hazard in Salt Lake City decreased 5–6%. The largest change was in Las Vegas, where hazard increased 32–35%.

  4. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effect of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining acceptable from the social point of view, it must be possible to manage the system so that possible impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Techniques have recently been proposed for this purpose, mainly based on the concept of the traffic light: such a system provides a tool for deciding the level of the stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels on different time scales can help to better manage the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate, as well as the propagation-medium properties, is not constant in time. We use a non-homogeneous seismicity model for modeling purposes, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that, on average, peak ground-motion values attenuate in the same way; as a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and introduced directly into the hazard integral. We applied the proposed
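
    A non-homogeneous variant of the hazard computation can be sketched by letting the activity rate and b-value vary window by window and recomputing the exceedance probability in each window. Every number below (rates, b-values, toy ground-motion model, threshold) is invented for illustration:

```python
import numpy as np
from scipy.stats import norm

# Time-dependent hazard sketch for induced seismicity: weekly nu(t) and b(t)
# feed a per-window exceedance probability at a fixed near-field site.
weeks = np.arange(8)
nu_t = np.array([5, 20, 60, 90, 70, 40, 15, 5], float)   # events/week >= m0
b_t = np.array([1.2, 1.1, 1.0, 0.9, 0.9, 1.0, 1.1, 1.2]) # weekly b-values
m0, m_max = 1.0, 4.5
pga_lim = 0.05                                            # g, alert threshold

def p_exceed_given_event(b):
    """P(PGA > pga_lim | one event), toy model at a fixed short distance."""
    m = np.linspace(m0, m_max, 200)
    pdf = b * np.log(10) * 10.0 ** (-b * (m - m0))
    pdf /= np.trapz(pdf, m)                               # truncated G-R magnitude pdf
    ln_med = -6.0 + 1.1 * m                               # toy median ln(PGA in g)
    return np.trapz(norm.sf(np.log(pga_lim), ln_med, 0.7) * pdf, m)

for w, nu, b in zip(weeks, nu_t, b_t):
    lam = nu * p_exceed_given_event(b)                    # weekly exceedance rate
    print(f"week {w}: P(PGA > {pga_lim} g) = {1 - np.exp(-lam):.3f}")
```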

  5. Seismic Hazard Maps for the Maltese Archipelago: Preliminary Results

    NASA Astrophysics Data System (ADS)

    D'Amico, S.; Panzera, F.; Galea, P. M.

    2013-12-01

    The Maltese islands form an archipelago of three major islands lying in the Sicily Channel, about 140 km south of Sicily and 300 km north of Libya. So far, very few investigations have been carried out on seismicity around the Maltese islands, and no seismic hazard maps are available for the archipelago. Assessing the seismic hazard for the region is currently of prime interest for the near-future development of industrial and touristic facilities as well as for urban expansion. A culture of seismic risk awareness has never really been developed in the country, and the public perception is that the islands are relatively safe and that any earthquake phenomena are mild and infrequent. However, the archipelago has been struck by several moderate-to-large events. Although recent constructions of structural and strategic importance have been built according to high engineering standards, the same probably cannot be said for all residential buildings, many higher than three storeys, which have mushroomed rapidly in recent years. Such buildings are mostly of unreinforced masonry, with heavy concrete floor slabs, which are known to be highly vulnerable to even moderate ground shaking. In this context, planning and design should be based on national hazard maps; unfortunately, such maps are not available for the Maltese islands. In this paper we attempt a first, preliminary probabilistic seismic hazard assessment of the Maltese islands in terms of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at different periods. Seismic hazard has been computed using the Esteva-Cornell (1968) approach, which is the most widely utilized probabilistic method. It is a zone-dependent approach: seismotectonic and geological data are coupled with earthquake catalogues to identify seismogenic zones within which earthquakes occur at certain rates. Therefore the earthquake catalogues can be reduced to the

  6. Updating the USGS seismic hazard maps for Alaska

    USGS Publications Warehouse

    Mueller, Charles; Briggs, Richard; Wesson, Robert L.; Petersen, Mark D.

    2015-01-01

    The U.S. Geological Survey makes probabilistic seismic hazard maps and engineering design maps for building codes, emergency planning, risk management, and many other applications. The methodology considers all known earthquake sources with their associated magnitude and rate distributions. Specific faults can be modeled if slip-rate or recurrence information is available. Otherwise, areal sources are developed from earthquake catalogs or GPS data. Sources are combined with ground-motion estimates to compute the hazard. The current maps for Alaska were developed in 2007, and included modeled sources for the Alaska-Aleutian megathrust, a few crustal faults, and areal seismicity sources. The megathrust was modeled as a segmented dipping plane with segmentation largely derived from the slip patches of past earthquakes. Some megathrust deformation is aseismic, so recurrence was estimated from seismic history rather than plate rates. Crustal faults included the Fairweather-Queen Charlotte system, the Denali–Totschunda system, the Castle Mountain fault, two faults on Kodiak Island, and the Transition fault, with recurrence estimated from geologic data. Areal seismicity sources were developed for Benioff-zone earthquakes and for crustal earthquakes not associated with modeled faults. We review the current state of knowledge in Alaska from a seismic-hazard perspective, in anticipation of future updates of the maps. Updated source models will consider revised seismicity catalogs, new information on crustal faults, new GPS data, and new thinking on megathrust recurrence, segmentation, and geometry. Revised ground-motion models will provide up-to-date shaking estimates for crustal earthquakes and subduction earthquakes in Alaska.

  7. Re-evaluation and updating of the seismic hazard of Lebanon

    NASA Astrophysics Data System (ADS)

    Huijer, Carla; Harajli, Mohamed; Sadek, Salah

    2016-01-01

    This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system for the seismic hazard of Lebanon and for the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT lies in close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activity that affect Lebanon were integrated, along with all newly established characteristics, into an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10% probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 and 1 s peak spectral acceleration contours, based on 2% probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.
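
    Both design levels quoted above have simple Poisson-model return-period equivalents; the snippet below is a generic conversion, not anything from the paper, and simply makes the arithmetic explicit.

```python
import math

def return_period(p_exceed, t_years):
    """Poisson-model return period for exceedance probability p in t years."""
    return -t_years / math.log(1.0 - p_exceed)

print(return_period(0.10, 50))  # ~475 years (UBC 1997 design level)
print(return_period(0.02, 50))  # ~2475 years (IBC 2012 MCE level)
```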

  8. Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels

    NASA Astrophysics Data System (ADS)

    Kurniasih, R.; Sujadi, I.

    2017-09-01

    This paper investigates students' levels of probabilistic thinking, that is, thinking about probabilistic or uncertain matters in probability material. The subjects were 8th-grade junior high school students. The main instrument was the researcher; the supporting instruments were a probabilistic thinking skills test and interview guidelines. Data were analyzed using the triangulation method. The results showed that before instruction, students' probabilistic thinking was at the subjective and transitional levels; after instruction, the levels changed, and some 8th-grade students reached the numerical level, the highest of the levels. Students' levels of probabilistic thinking can be used as a reference for designing learning materials and strategies.

  9. Fast, Nonlinear, Fully Probabilistic Inversion of Large Geophysical Problems

    NASA Astrophysics Data System (ADS)

    Curtis, A.; Shahraeeni, M.; Trampert, J.; Meier, U.; Cho, G.

    2010-12-01

    Almost all geophysical inverse problems are in reality nonlinear. Fully nonlinear inversion including non-approximated physics, and solving for probability distribution functions (pdfs) that describe the solution uncertainty, generally requires sampling-based Monte-Carlo style methods that are computationally intractable in most large problems. In order to solve such problems, physical relationships are usually linearized, leading to efficiently solved (possibly iterated) linear inverse problems. However, it is well known that linearization can lead to erroneous solutions, and in particular to overly optimistic uncertainty estimates. What is needed across many geophysical disciplines is a method to invert large inverse problems (or potentially tens of thousands of small inverse problems) fully probabilistically and without linearization. This talk shows how very large nonlinear inverse problems can be solved fully probabilistically, incorporating any available prior information, using mixture density networks (driven by neural network banks), provided the problem can be decomposed into many small inverse problems. In this talk I will explain the methodology, compare multi-dimensional pdf inversion results to full Monte Carlo solutions, and illustrate the method with two applications: first, inverting surface wave group and phase velocities for a fully probabilistic global tomography model of the Earth's crust and mantle, and second, inverting industrial 3D seismic data for petrophysical properties throughout and around a subsurface hydrocarbon reservoir. The latter problem is typically decomposed into 10⁴ to 10⁵ individual inverse problems, each solved fully probabilistically and without linearization. The results in both cases are sufficiently close to the Monte Carlo solution to exhibit realistic uncertainty, multimodality and bias. This provides far greater confidence in the results, and in decisions made on their basis.
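
    As a rough illustration of the idea (not the authors' implementation, which used banks of trained networks), a mixture density network maps each datum to the parameters of a Gaussian-mixture posterior pdf over a model parameter. The PyTorch sketch below shows the core pieces for a 1-D toy problem; all names and sizes are illustrative.

```python
import torch
import torch.nn as nn

class MDN(nn.Module):
    """Toy 1-D mixture density network: maps a datum d to a Gaussian-mixture
    posterior pdf over a model parameter m (illustrative sketch only)."""
    def __init__(self, n_hidden=32, n_comp=5):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(1, n_hidden), nn.Tanh())
        self.alpha = nn.Linear(n_hidden, n_comp)      # mixture weights (logits)
        self.mu = nn.Linear(n_hidden, n_comp)         # component means
        self.log_sigma = nn.Linear(n_hidden, n_comp)  # component log-spreads

    def forward(self, d):
        h = self.body(d)
        return (torch.softmax(self.alpha(h), dim=-1),
                self.mu(h), torch.exp(self.log_sigma(h)))

def mdn_nll(alpha, mu, sigma, m):
    """Negative log-likelihood of targets m under the predicted mixture."""
    comp = torch.distributions.Normal(mu, sigma)
    log_p = comp.log_prob(m) + torch.log(alpha + 1e-12)
    return -torch.logsumexp(log_p, dim=-1).mean()

# Training pairs (d_i, m_i) would come from sampling the prior and running the
# forward physics d = g(m) + noise; after training, one network evaluation per
# datum yields a full posterior pdf, with no linearization required.
```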

  10. Forecasting induced seismicity rate and Mmax using calibrated numerical models

    NASA Astrophysics Data System (ADS)

    Dempsey, D.; Suckale, J.

    2016-12-01

    At Groningen, The Netherlands, several decades of induced seismicity from gas extraction have culminated in a M 3.6 event (mid 2012). From a public safety and commercial perspective, it is desirable to anticipate future seismicity outcomes at Groningen. One way to quantify earthquake risk is Probabilistic Seismic Hazard Analysis (PSHA), which requires an estimate of the future seismicity rate and its magnitude-frequency distribution (MFD). This approach is effective at quantifying risk from tectonic events because the seismicity rate, once measured, is almost constant over timescales of interest. In contrast, rates of induced seismicity vary significantly over building lifetimes, largely in response to changes in injection or extraction. Thus, the key to extending PSHA to induced earthquakes is to estimate future changes of the seismicity rate in response to some proposed operating schedule. Numerical models can describe the physical link between fluid pressure, effective stress change, and the earthquake process (triggering and propagation). However, models with predictive potential for individual earthquakes face the difficulty of characterizing specific heterogeneity (stress, strength, roughness, etc.) at locations of interest. Modeling catalogs of earthquakes provides a means of averaging over this uncertainty, focusing instead on the collective features of the seismicity, e.g., its rate and MFD. The model we use incorporates fluid pressure and stress changes to describe nucleation and crack-like propagation of earthquakes on stochastically characterized 1D faults. This enables simulation of synthetic catalogs of induced seismicity from which the seismicity rate, location and MFD are extracted. A probability distribution for Mmax (the largest event in some specified time window) is also computed. Because the model captures the physics linking seismicity to changes in the reservoir, earthquake observations and operating information can be used to calibrate a

  11. Global Infrasound Association Based on Probabilistic Clutter Categorization

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Mialle, P.

    2015-12-01

    The IDC collects waveforms from a global network of infrasound sensors maintained by the IMS, and automatically detects signal onsets and associates them to form event hypotheses. However, a large number of signal onsets are due to local clutter sources such as microbaroms (from standing waves in the oceans), waterfalls, dams, gas flares, and surf (ocean breaking waves). These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NET-VISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. Notes: the attached figure shows all the unassociated arrivals detected at IMS station I09BR for 2012, distributed by azimuth and center frequency (the title displays the bandwidth of the kernel density estimate along the azimuth and frequency dimensions). The plot shows multiple microbarom sources as well as other sources of infrasound clutter. A diverse clutter field such as this one is quite common for most IMS infrasound stations, and it highlights the dangers of forming events without due consideration of this source of noise. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NET-VISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.
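
    A minimal version of the per-station clutter density is a two-dimensional kernel density estimate over detection azimuth and center frequency. The sketch below uses synthetic detections and scipy's `gaussian_kde` as a stand-in for the production model (azimuth wrap-around at 360 degrees is ignored for brevity).

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical unassociated detections at one infrasound station:
# azimuth in degrees and log10 of center frequency in Hz.
rng = np.random.default_rng(0)
azimuth = np.concatenate([rng.normal(250, 5, 500), rng.normal(80, 10, 200)])
log_freq = np.concatenate([rng.normal(-0.7, 0.1, 500), rng.normal(0.3, 0.2, 200)])

# The 2-D KDE approximates the station's clutter density p(azimuth, frequency);
# a new detection falling in a high-density region would be down-weighted
# when forming network event hypotheses.
clutter_pdf = gaussian_kde(np.vstack([azimuth, log_freq]))
print(clutter_pdf([[250.0], [-0.7]]))  # high density: likely microbarom clutter
```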

  12. Long term seismic noise acquisition and analysis with tunable monolithic horizontal sensors at the INFN Gran Sasso National Laboratory

    NASA Astrophysics Data System (ADS)

    Acernese, F.; De Rosa, R.; Giordano, G.; Romano, R.; Barone, F.

    2012-04-01

    In this paper we present the scientific data recorded by tunable mechanical monolithic horizontal seismometers located in the Gran Sasso National Laboratory of the INFN, within thermally insulating enclosures onto concrete slabs connected to the bedrock. The main goals of this long-term test are a preliminary seismic characterization of the site in the frequency band 10⁻⁵ to 1 Hz and the acquisition of all the relevant information for the optimization of the sensors.

  13. Long term seismic noise acquisition and analysis with tunable monolithic horizontal sensors at the INFN Gran Sasso National Laboratory

    NASA Astrophysics Data System (ADS)

    Acernese, F.; Canonico, R.; De Rosa, R.; Giordano, G.; Romano, R.; Barone, F.

    2012-10-01

    In this paper we present the scientific data recorded by tunable mechanical monolithic horizontal seismometers located in the Gran Sasso National Laboratory of the INFN, within thermally insulating enclosures onto concrete slabs connected to the bedrock. The main goals of this long-term test are a preliminary seismic characterization of the site in the frequency band 10⁻⁷ to 1 Hz and the acquisition of all the relevant information for the optimization of the sensors.

  14. Seismic Characterization of EGS Reservoirs

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.

    2014-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller-magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation is real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  15. Borehole seismic monitoring of seismic stimulation at Occidental Permian Ltd's South Wasson Clear Fork Unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daley, Tom; Majer, Ernie

    2007-04-30

    Seismic stimulation is a proposed enhanced oil recovery (EOR) technique which uses seismic energy to increase oil production. As part of an integrated research effort (theory, lab and field studies), LBNL has been measuring the seismic amplitude of various stimulation sources in various oil fields (Majer et al., 2006; Roberts et al., 2001; Daley et al., 1999). The amplitude of the seismic waves generated by a stimulation source is an important parameter for increased oil mobility in both theoretical models and laboratory core studies. The seismic amplitude, typically in units of seismic strain, can be measured in situ by use of a borehole seismometer (geophone). Measuring the distribution of amplitudes within a reservoir could allow improved design of stimulation source deployment. In March 2007, we provided in-field monitoring of two stimulation sources operating in Occidental (Oxy) Permian Ltd's South Wasson Clear Fork Unit (SWCU), located near Denver City, TX. The stimulation source is a downhole fluid pulsation device developed by Applied Seismic Research Corp. (ASR). Our monitoring used a borehole wall-locking 3-component geophone operating in two nearby wells.

  16. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods (single-attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm) have been used to predict porosity in the inter-well region of the Blackfoot field, Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored-inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well-log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher-resolution porosity sections. A low-impedance (6000-8000 (m/s)·(g/cc)) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections between the 1060 and 1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient method for predicting porosity in the inter-well region.
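
    For readers unfamiliar with the method, the probabilistic neural network used in such porosity mapping is essentially a Parzen-window (Gaussian-kernel) regression over the well-log training samples. The sketch below is a generic illustration under that assumption, with hypothetical array names.

```python
import numpy as np

def pnn_predict(attr_train, por_train, attr_new, sigma=1.0):
    """Kernel (Parzen-window) regression as used in PNN-style porosity
    mapping: each training sample is weighted by a Gaussian of its
    attribute-space distance to the new sample. Illustrative only."""
    # squared distances between new points and training points
    d2 = ((attr_new[:, None, :] - attr_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w * por_train[None, :]).sum(axis=-1) / w.sum(axis=-1)

# attr_train: (n_wells, n_attributes) seismic attributes at well locations,
# por_train:  (n_wells,) porosity from well logs,
# attr_new:   (n_samples, n_attributes) attributes on the 3-D seismic volume.
```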

  17. Seismic Anisotropy from Surface Refraction Measurements

    NASA Astrophysics Data System (ADS)

    Vilhelm, J.; Hrdá, J.; Klíma, K.; Lokajícek, T.; Pros, Z.

    2003-04-01

    The contribution deals with methods of determining P- and S-wave velocities in shallow refraction seismics. A comparison of P-wave anisotropy from samples and from field surface measurements is performed. The laboratory measurement of P-wave velocity is realized as an omnidirectional ultrasound measurement on oriented spherical samples (diameter 5 cm) under hydrostatic pressures up to 400 MPa. The field measurement is based on the processing of at least one pair of reversed time-distance curves of refracted waves. Different velocity-calculation techniques are involved, including a tomographic approach from the surface. It is shown that field seismic measurement can reflect internal rock fabric (lineation, mineral anisotropy) as well as effects connected with fracturing and weathering. The elastic constants derived from laboratory measurements exhibit transverse isotropy. To estimate the influence of anisotropy we perform ray tracing with the software package ANRAY (Consortium Seismic Waves in Complex 3-D Structures). The use of P- and S-wave anisotropy measurements to characterize a hard-rock hydrogeological collector (water resource) is presented. In a relatively homogeneous lutaceous sedimentary medium we identified a transversely isotropic layer which exhibits an increased value of permeability (transmissivity). The seismic measurement is realized with three-component geophones and both vertical and shear seismic sources. VLF and resistivity profiling accompany the field survey.

  18. Probabilistic Reasoning Over Seismic Time Series: Volcano Monitoring by Hidden Markov Models at Mt. Etna

    NASA Astrophysics Data System (ADS)

    Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea; Montalto, Placido; Patanè, Domenico; Privitera, Eugenio

    2016-07-01

    From January 2011 to December 2015, Mt. Etna was mainly characterized by cyclic eruptive behavior, with more than 40 lava fountains from the New South-East Crater. Using the RMS (Root Mean Square) of the seismic signal recorded by stations close to the summit area, automatic recognition of the different states of volcanic activity (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN) has been applied for monitoring purposes. Since the values of the RMS time series calculated on the seismic signal are generated by a stochastic process, we can model the system generating its sampled values, assumed to be a Markov process, using Hidden Markov Models (HMMs). HMM analysis seeks to recover the sequence of hidden states from the observations. In our framework, the observations are characters generated by the Symbolic Aggregate approXimation (SAX) technique, which maps RMS time-series values to symbols of a pre-defined alphabet. The main advantages of the proposed framework, based on HMMs and SAX, with respect to other automatic systems applied to seismic signals at Mt. Etna, are the use of multiple stations and of static thresholds to characterize the volcano states well. Its application to a wide seismic dataset of Etna volcano shows that the volcano states can be inferred. The experimental results show that, in most cases, lava fountains were detected in advance.
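
    The two core ingredients, SAX symbolization of the RMS series and HMM state inference, can be sketched as follows. This is a generic illustration (a forward filtering recursion with assumed model matrices), not the monitoring system's code.

```python
import numpy as np
from scipy.stats import norm

def sax_symbols(x, alphabet_size=4, word_len=50):
    """Symbolic Aggregate approXimation: z-normalize, piecewise-aggregate,
    then map segment means to symbols via Gaussian quantile breakpoints."""
    z = (x - x.mean()) / x.std()
    paa = z[: len(z) // word_len * word_len].reshape(word_len, -1).mean(axis=1)
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    return np.searchsorted(breakpoints, paa)   # integers 0..alphabet_size-1

def hmm_forward(obs, pi, A, B):
    """Forward recursion: filtered P(state_t | obs_1..t) for an HMM with
    initial distribution pi, row-stochastic transition matrix A, and
    emission matrix B[state, symbol]."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        alpha /= alpha.sum()
    return alpha   # e.g. distribution over QUIET/PRE/FOUNTAIN/POST states
```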

  19. Numerical modeling of seismic anomalies at impact craters on a laboratory scale

    NASA Astrophysics Data System (ADS)

    Wuennemann, K.; Grosse, C. U.; Hiermaier, S.; Gueldemeister, N.; Moser, D.; Durr, N.

    2011-12-01

    Almost all terrestrial impact craters exhibit a typical geophysical signature. The usually observed circular negative gravity anomaly and reduced seismic velocities in the vicinity of crater structures are presumably related to an approximately hemispherical zone underneath craters where rocks have experienced intense brittle-plastic deformation and fracturing during formation (see Fig. 1). In the framework of the "MEMIN" (multidisciplinary experimental and modeling impact crater research network) project we carried out hypervelocity cratering experiments at the Fraunhofer Institute for High-Speed Dynamics on a decimeter scale to study the spatiotemporal evolution of the damage zone using ultrasound, acoustic emission techniques, and numerical modeling of crater formation. Iron projectiles of 2.5-10 mm were shot at 2-5.5 km/s at dry and water-saturated sandstone targets. The target material was characterized before, during and after the impact with high-spatial-resolution acoustic techniques to detect the extent of the damage zone and the state of rocks therein, and to record the growth of cracks. The ultrasound measurements are applied analogously to seismic surveys at natural craters but on a different, much smaller, scale. We compare the measured data with dynamic models of crater formation; shock, plastic and elastic wave propagation; and tensile/shear failure of rocks in the impacted sandstone blocks. The presence of porosity and pore water significantly affects the propagation of waves. In particular, the crushing of pores due to shock compression has to be taken into account. We present preliminary results showing good agreement between the experiments and the numerical models. In a next step we plan to use the numerical models to upscale the results from laboratory dimensions to the scale of natural impact craters.

  20. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario-based or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than on the empirical attenuation relationships used in PSHA, to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and for sites with little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary means of establishing tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
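
    For the region-wide case, a Monte Carlo hazard curve can be sketched as follows. Everything here is a placeholder (a truncated Gutenberg-Richter source with b = 1 and an invented log-linear runup scaling with lognormal scatter), intended only to show how sampled catalogs become exceedance probabilities.

```python
import numpy as np

rng = np.random.default_rng(42)
N_YEARS, RATE = 100_000, 0.05            # simulated catalog length; events/yr
n_events = rng.poisson(RATE * N_YEARS)

# Sample magnitudes from a truncated Gutenberg-Richter source (b = 1) by
# inverting its CDF, then convert to runup with a placeholder scaling.
u = rng.random(n_events)
beta, m0, m1 = np.log(10.0), 7.0, 9.0
mags = m0 - np.log(1.0 - u * (1.0 - np.exp(-beta * (m1 - m0)))) / beta
runup = np.exp(-12.0 + 1.8 * mags + rng.normal(0, 0.5, n_events))  # metres

# Empirical hazard curve: annual exceedance rate, then Poisson probability.
levels = np.linspace(0.5, 10, 40)
annual_rate = np.array([(runup > h).sum() / N_YEARS for h in levels])
p50 = 1.0 - np.exp(-50 * annual_rate)    # 50-year exceedance probability
```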

  1. Forecasting probabilistic seismic shaking for greater Tokyo from 400 years of intensity observations

    USGS Publications Warehouse

    Bozkurt, S.B.; Stein, R.S.; Toda, S.

    2007-01-01

    The long recorded history of earthquakes in Japan affords an opportunity to forecast seismic shaking exclusively from past shaking. We calculate the time-averaged (Poisson) probability of severe shaking by using more than 10,000 intensity observations recorded since AD 1600 in a 350-km-wide box centered on Tokyo. Unlike other hazard-assessment methods, source and site effects are included without modeling, and we do not need to know the size or location of any earthquake, nor the location and slip rate of any fault. The two key assumptions are that the slope of the observed frequency-intensity relation at every site is the same, and that the 400-year record is long enough to encompass the full range of seismic behavior. Tests we conduct here suggest that both assumptions are sound. The resulting 30-year probability of I_JMA ≥ 6 shaking (approximately PGA ≥ 0.4 g or MMI ≥ IX) is 30%-40% in Tokyo, Kawasaki, and Yokohama, and 10%-15% in Chiba and Tsukuba. This result means that there is a 30% chance that 4 million people will be subjected to I_JMA ≥ 6 shaking during an average 30-year period. We also produce exceedance maps of PGA for building-code regulations, and calculate the short-term hazard associated with a hypothetical catastrophe bond. Our results resemble an independent assessment developed from conventional seismic hazard analysis for greater Tokyo. © 2007, Earthquake Engineering Research Institute.
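
    The essence of the method, fitting a frequency-intensity slope at a site and converting the extrapolated rate of I_JMA ≥ 6 into a Poisson probability, can be sketched with hypothetical counts as follows.

```python
import numpy as np

# Hypothetical counts of shakings with JMA intensity >= I at one site, 400 yr.
intensity = np.array([2, 3, 4, 5])
n_obs = np.array([160, 55, 18, 6])
rate = n_obs / 400.0                      # annual exceedance rates

# Fit log10(rate) = a + b*I, a Gutenberg-Richter-like frequency-intensity law
# (the paper assumes the slope b is common to all sites).
b, a = np.polyfit(intensity, np.log10(rate), 1)   # b is negative
rate_I6 = 10.0 ** (a + b * 6.0)           # extrapolated rate of I >= 6
p30 = 1.0 - np.exp(-30.0 * rate_I6)       # 30-year Poisson probability
print(f"30-yr probability of I>=6: {p30:.0%}")
```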

  2. Mammoth Mountain, California broadband seismic experiment

    NASA Astrophysics Data System (ADS)

    Dawson, P. B.; Pitt, A. M.; Wilkinson, S. K.; Chouet, B. A.; Hill, D. P.; Mangan, M.; Prejean, S. G.; Read, C.; Shelly, D. R.

    2013-12-01

    Mammoth Mountain is a young cumulo-volcano located on the southwest rim of Long Valley caldera, California. Current volcanic processes beneath Mammoth Mountain are manifested in a wide range of seismic signals, including swarms of shallow volcano-tectonic earthquakes, upper and mid-crustal long-period earthquakes, swarms of brittle-failure earthquakes in the lower crust, and shallow (3-km depth) very-long-period earthquakes. Diffuse emissions of CO2 began after a magmatic dike injection beneath the volcano in 1989 and continue to the present. These indications of volcanic unrest drive an extensive monitoring effort of the volcano by the USGS Volcano Hazards Program. As part of this effort, eleven broadband seismometers were deployed on Mammoth Mountain in November 2011. This temporary deployment is expected to run through the fall of 2013. These stations supplement the local short-period and broadband seismic stations of the Northern California Seismic Network (NCSN) and provide a combined network of eighteen broadband stations operating within 4 km of the summit of Mammoth Mountain. Data from the temporary stations are not available in real time, requiring the merging of the data from the temporary and permanent networks, the timing of phases, and the relocation of seismic events to be accomplished outside of the standard NCSN processing scheme. The timing of phases is accomplished through an interactive Java-based phase-picking routine, and the relocation of seismicity is achieved using the probabilistic non-linear software package NonLinLoc, distributed under the GNU General Public License by Alomax Scientific. Several swarms of shallow volcano-tectonic earthquakes, spasmodic bursts of high-frequency earthquakes, a few long-period events located within or below the edifice of Mammoth Mountain and numerous mid-crustal long-period events have been recorded by the network. To date, about 900 of the ~2400 events occurring beneath Mammoth Mountain since November 2011 have

  3. A GIS-based time-dependent seismic source modeling of Northern Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear, or fault, sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatio-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 km by 400 km around Tehran. Previous research and reports are studied to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to a uniform magnitude scale; duplicate events and dependent shocks are removed. Completeness and the time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
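
    A renewal-model conditional probability, the quantity underlying such time-dependent fault sources, can be sketched generically as below; the lognormal recurrence distribution and its parameters are stand-ins, not values from the study.

```python
from scipy.stats import lognorm

def conditional_probability(t_elapsed, dt, mean_ri=250.0, aperiodicity=0.5):
    """Renewal-model probability of an event in (t, t+dt] given quiescence
    up to t, using a lognormal recurrence distribution as a stand-in
    (median ~ mean_ri years, shape = aperiodicity; values illustrative)."""
    dist = lognorm(s=aperiodicity, scale=mean_ri)
    survive = dist.sf(t_elapsed)                    # P(T > t_elapsed)
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / survive

# e.g. 200 years elapsed on a fault with a ~250-year recurrence:
print(conditional_probability(200.0, 50.0))        # probability in next 50 yr
```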

  4. Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; van Dinther, Y.; Kuensch, H. R.

    2017-12-01

    Our physical understanding and forecasting ability of earthquakes, and other solid Earth dynamic processes, is significantly hampered by limited indications of the evolving state of stress and strength on faults. Integrating observations and physics-based numerical modeling to quantitatively estimate this evolution of a fault's state is crucial. However, systematic attempts are limited and tenuous, especially in light of the scarcity and uncertainty of natural data and the difficulty of modelling the physics governing earthquakes. We adopt the statistical framework of sequential data assimilation, extensively developed for weather forecasting, to efficiently integrate observations and prior knowledge in a forward model, while acknowledging errors in both. To prove this concept we perform a perfect-model test in a simplified subduction-zone setup, where we assimilate synthetic noised data on velocities and stresses from a single location. Using an Ensemble Kalman Filter, these data and their errors are assimilated to update 150 ensemble members from a Partial Differential Equation-driven seismic cycle model. Probabilistic estimates of fault stress and dynamic strength evolution capture the truth exceptionally well. This is possible because the sampled error covariance matrix contains prior information from the physics that relates velocities, stresses and pressure at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed such that fault coupling can be updated to either inhibit or trigger events. In the subsequent forecast step the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next event. At subsequent assimilation steps, the system's forecasting ability turns out to be significantly better than that of a periodic recurrence model (requiring an alarm 17% vs. 68% of the time). This thus provides distinct
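
    The analysis step described here is the standard stochastic Ensemble Kalman Filter update. The sketch below is a generic implementation, not the study's code; in their setting, X would stack fault stresses and strengths across ensemble members, and H would extract the observed surface velocities and stresses.

```python
import numpy as np

def enkf_update(X, d_obs, H, R, rng):
    """Stochastic Ensemble Kalman Filter analysis step.
    X: (n_state, n_ens) forecast ensemble; d_obs: (n_obs,) observations;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs-error cov."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    HA = H @ A
    P_dd = HA @ HA.T / (n_ens - 1) + R               # innovation covariance
    K = (A @ HA.T / (n_ens - 1)) @ np.linalg.inv(P_dd)   # Kalman gain
    # perturbed observations, one noisy copy per ensemble member
    D = d_obs[:, None] + rng.multivariate_normal(np.zeros(len(d_obs)), R, n_ens).T
    return X + K @ (D - H @ X)                       # analysis ensemble

# usage: X_post = enkf_update(X_prior, d_obs, H, R, np.random.default_rng(0))
```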

  5. Evaluation of Seismic Risk of Siberia Territory

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The outcomes of modern geophysical research by the Geophysical Survey SB RAS, aimed at studying the geodynamic situation in large industrial and civil centers on the territory of Siberia in order to evaluate the seismic risk of territories and predict the origin of extreme situations of natural and man-made character, are presented in the paper. First of all, this concerns the testing and updating of a geoinformation system developed by the Russian Emergency Ministry for calculations regarding seismic hazard and response to destructive earthquakes. The GIS database contains catalogues of earthquakes and faults, seismic zonation maps, vectorized city maps, information on industrial and housing stock, data on the character of buildings and population in inhabited places, etc. On the basis of probabilistic approaches, the geoinformation system allows the following problems to be solved: estimating the earthquake impact and the forces, facilities and supplies required for life-support of the injured population; determining the consequences of failures at chemically and explosion-dangerous objects; and optimizing the assurance technology for salvage operations. Using this computer program, maps of earthquake risk have been constructed for several seismically dangerous regions of Siberia. These maps display data on the probable number of injured people and the relative economic damage from an earthquake that can occur at various sites of the territory according to the seismic zonation map. The obtained maps have made it possible to determine places where detailed seismological observations should be arranged. Along with this, wide-ranging investigations are carried out on the territory of Siberia using new methods for evaluating the physical state of industrial and civil structures (buildings and constructions, hydroelectric power stations, bridges, dams, etc.), together with high-performance detailed electromagnetic surveys of ground conditions of city

  6. Assessment of pre-crisis and syn-crisis seismic hazard at Campi Flegrei and Mt. Vesuvius volcanoes, Campania, southern Italy

    NASA Astrophysics Data System (ADS)

    Convertito, Vincenzo; Zollo, Aldo

    2011-08-01

    In this study, we address the issue of short-term to medium-term probabilistic seismic hazard analysis for two volcanic areas, Campi Flegrei caldera and Mt. Vesuvius in the Campania region of southern Italy. Two different phases of the volcanic activity are considered. The first, which we term the pre-crisis phase, concerns the present quiescent state of the volcanoes that is characterized by low-to-moderate seismicity. The second phase, syn-crisis, concerns the unrest phase that can potentially lead to eruption. For the Campi Flegrei case study, we analyzed the pattern of seismicity during the 1982-1984 ground uplift episode (bradyseism). For Mt. Vesuvius, two different time-evolutionary models for seismicity were adopted, corresponding to different ways in which the volcano might erupt. We performed a site-specific analysis, linked with the hazard map, to investigate the effects of input parameters, in terms of source geometry, mean activity rate, periods of data collection, and return periods, for the syn-crisis phase. The analysis in the present study of the pre-crisis phase allowed a comparison of the results of probabilistic seismic hazard analysis for the two study areas with those provided in the Italian national hazard map. For the Mt. Vesuvius area in particular, the results show that the hazard can be greater than that reported in the national hazard map when information at a local scale is used. For the syn-crisis phase, the main result is that the data recorded during the early months of the unrest phase are substantially representative of the seismic hazard during the whole duration of the crisis.

  7. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad-impact users of CyberShake data, such as seismologists, utility companies, and building-code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation among the NSF Track 1 system NCSA Blue Waters, the DOE leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and

  8. Explaining differences between bioaccumulation measurements in laboratory and field data through use of a probabilistic modeling approach

    USGS Publications Warehouse

    Selck, Henriette; Drouillard, Ken; Eisenreich, Karen; Koelmans, Albert A.; Palmqvist, Annemette; Ruus, Anders; Salvito, Daniel; Schultz, Irv; Stewart, Robin; Weisbrod, Annie; van den Brink, Nico W.; van den Heuvel-Greve, Martine

    2012-01-01

    In the regulatory context, bioaccumulation assessment is often hampered by substantial data uncertainty as well as by the poorly understood differences often observed between results from laboratory and field bioaccumulation studies. Bioaccumulation is a complex, multifaceted process, which calls for accurate error analysis. Yet, attempts to quantify and compare propagation of error in bioaccumulation metrics across species and chemicals are rare. Here, we quantitatively assessed the combined influence of physicochemical, physiological, ecological, and environmental parameters known to affect bioaccumulation for 4 species and 2 chemicals, to assess whether uncertainty in these factors can explain the observed differences among laboratory and field studies. The organisms evaluated in the simulations (mayfly larvae, deposit-feeding polychaetes, yellow perch, and little owl) represented a range of ecological conditions and biotransformation capacities. The chemicals, pyrene and the polychlorinated biphenyl congener PCB-153, represented medium and highly hydrophobic chemicals with different susceptibilities to biotransformation. An existing state-of-the-art probabilistic bioaccumulation model was improved by accounting for bioavailability and absorption-efficiency limitations due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability in bioaccumulation was mainly driven by sediment exposure, sediment composition and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance) and chemical concentration in the diet became more important, particularly for the most persistent compound, PCB-153. These results suggest that variation in bioaccumulation

  9. Explaining differences between bioaccumulation measurements in laboratory and field data through use of a probabilistic modeling approach.

    PubMed

    Selck, Henriette; Drouillard, Ken; Eisenreich, Karen; Koelmans, Albert A; Palmqvist, Annemette; Ruus, Anders; Salvito, Daniel; Schultz, Irv; Stewart, Robin; Weisbrod, Annie; van den Brink, Nico W; van den Heuvel-Greve, Martine

    2012-01-01

    In the regulatory context, bioaccumulation assessment is often hampered by substantial data uncertainty as well as by the poorly understood differences often observed between results from laboratory and field bioaccumulation studies. Bioaccumulation is a complex, multifaceted process, which calls for accurate error analysis. Yet, attempts to quantify and compare propagation of error in bioaccumulation metrics across species and chemicals are rare. Here, we quantitatively assessed the combined influence of physicochemical, physiological, ecological, and environmental parameters known to affect bioaccumulation for 4 species and 2 chemicals, to assess whether uncertainty in these factors can explain the observed differences among laboratory and field studies. The organisms evaluated in the simulations (mayfly larvae, deposit-feeding polychaetes, yellow perch, and little owl) represented a range of ecological conditions and biotransformation capacities. The chemicals, pyrene and the polychlorinated biphenyl congener PCB-153, represented medium and highly hydrophobic chemicals with different susceptibilities to biotransformation. An existing state-of-the-art probabilistic bioaccumulation model was improved by accounting for bioavailability and absorption-efficiency limitations due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability in bioaccumulation was mainly driven by sediment exposure, sediment composition and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance) and chemical concentration in the diet became more important, particularly for the most persistent compound, PCB-153. These results suggest that variation in bioaccumulation

  10. Seismic and Restoration Assessment of Monumental Masonry Structures

    PubMed Central

    Asteris, Panagiotis G.; Douvika, Maria G.; Apostolopoulou, Maria; Moropoulou, Antonia

    2017-01-01

    Masonry structures are complex systems that require detailed knowledge and information regarding their response under seismic excitations. Appropriate modelling of a masonry structure is a prerequisite for a reliable earthquake-resistant design and/or assessment. However, modelling a real structure with a robust quantitative (mathematical) representation is a very difficult, complex and computationally demanding task. The paper herein presents a new stochastic computational framework for earthquake-resistant design of masonry structural systems. The proposed framework is based on the probabilistic behavior of crucial parameters, such as material strength and seismic characteristics, and utilizes fragility analysis based on different failure criteria for the masonry material. The application of the proposed methodology is illustrated in the case of a historical and monumental masonry structure, namely the assessment of the seismic vulnerability of the Kaisariani Monastery, a Byzantine church built in Athens, Greece, at the end of the 11th to the beginning of the 12th century. Useful conclusions are drawn regarding the effectiveness of the intervention techniques used for the reduction of the vulnerability of the case-study structure, by means of comparison of the results obtained. PMID:28767073

  11. Seismic and Restoration Assessment of Monumental Masonry Structures.

    PubMed

    Asteris, Panagiotis G; Douvika, Maria G; Apostolopoulou, Maria; Moropoulou, Antonia

    2017-08-02

    Masonry structures are complex systems that require detailed knowledge and information regarding their response under seismic excitations. Appropriate modelling of a masonry structure is a prerequisite for a reliable earthquake-resistant design and/or assessment. However, modelling a real structure with a robust quantitative (mathematical) representation is a very difficult, complex and computationally demanding task. The paper herein presents a new stochastic computational framework for earthquake-resistant design of masonry structural systems. The proposed framework is based on the probabilistic behavior of crucial parameters, such as material strength and seismic characteristics, and utilizes fragility analysis based on different failure criteria for the masonry material. The application of the proposed methodology is illustrated in the case of a historical and monumental masonry structure, namely the assessment of the seismic vulnerability of the Kaisariani Monastery, a Byzantine church built in Athens, Greece, at the end of the 11th to the beginning of the 12th century. Useful conclusions are drawn regarding the effectiveness of the intervention techniques used for the reduction of the vulnerability of the case-study structure, by means of comparison of the results obtained.

  12. A test of various site-effect parameterizations in probabilistic seismic hazard analyses of southern California

    USGS Publications Warehouse

    Field, E.H.; Petersen, M.D.

    2000-01-01

    We evaluate the implications of several attenuation relationships, including three customized for southern California, in terms of accounting for site effects in probabilistic seismic hazard studies. The analysis is carried out at 43 sites along a profile spanning the Los Angeles basin with respect to peak acceleration and 0.3-, 1.0-, and 3.0-sec response spectral acceleration values that have a 10% chance of being exceeded in 50 years. The variability among currently viable attenuation relationships (epistemic uncertainty) is approximately a factor of 2. Biases between several commonly used attenuation relationships and southern California strong-motion data imply hazard differences that exceed 10%. However, correcting each relationship for the southern California bias does not necessarily bring hazard estimates into better agreement. A detailed subclassification of site types (beyond rock versus soil) is found both to be justified by data and to make important distinctions in terms of hazard levels. A basin-depth effect is also shown to be important, implying a difference of up to a factor of 2 in ground motion between the deepest and shallowest parts of the Los Angeles basin. In fact, for peak acceleration, the basin-depth effect is even more influential than the surface site condition. Questions remain, however, whether basin depth is a proxy for some other site attribute such as distance from the basin edge. The reduction in prediction error (sigma) produced by applying detailed site and/or basin-depth corrections does not have an important influence on the hazard. In fact, the sigma reduction is less than the epistemic uncertainties on sigma itself. Due to data limitations, it is impossible to determine which attenuation relationship is best. However, our results do indicate which site conditions seem most influential. This information should prove useful to those developing or updating attenuation relationships and to those attempting to make more refined

  13. Laboratory Investigation of the Effect of Water-Saturation on Seismic Wave Dispersion in Carbonates

    NASA Astrophysics Data System (ADS)

    Li, W.; Pyrak-Nolte, L. J.

    2009-12-01

    In subsurface rock, fluid content changes with time through natural causes or because of human interactions, such as extraction or sequestration of fluids. The ability to monitor fluid migration in the subsurface seismically requires an understanding of the effects that the degree of saturation and the spatial distribution of fluids have on wave propagation in rock. In this study, we find that the seismic dispersion of a dry carbonate rock can be masked by saturating the sample. We used a laboratory mini-seismic array to monitor fluid invasion and withdrawal in a carbonate rock with fabric-controlled layering. Experiments were performed on prismatic samples of Austin Chalk measuring 50 mm x 50 mm x 100 mm. The epoxy-sealed samples contained an inlet and an outlet port to enable fluid invasion/withdrawal along the long axis of the sample. Water was infused and withdrawn from the sample at a rate of 1 ml/hr. The mini-seismic array consisted of a set of 12 piezoelectric contact transducers, each with a central frequency of 1.0 MHz. Three compressional-wave source-receiver pairs and three shear-wave source-receiver pairs were used to probe along the length of the sample prior to invasion and during invasion and withdrawal of water. A pressure transducer was used to record the fluid pressure simultaneously with the full transmitted waveforms every 15-30 minutes. A wavelet analysis determined the effect of fluid invasion on velocity dispersion. We observed that the compressional-wave dispersion was more sensitive to changes in saturation than the shear-wave dispersion. When the sample was unsaturated, the high-frequency components of the compressional wave (1.2 MHz to 2 MHz) had lower velocities (~2750 m/s) than the low-frequency components (0.2 MHz to 1.2 MHz), whose velocities decreased monotonically from 2890 m/s. As water infused the sample, the dispersion weakened. When the sample was fully saturated, the compressional-wave velocity was frequency independent. The

  14. Application of Gumbel I and Monte Carlo methods to assess seismic hazard in and around Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2018-05-01

    A proper assessment of seismic hazard is of considerable importance in order to achieve suitable building construction criteria. This paper presents a probabilistic seismic hazard assessment of the region in and around Pakistan (23° N-39° N; 59° E-80° E) in terms of peak ground acceleration (PGA). Ground motion is calculated in terms of PGA for a return period of 475 years using a seismogenic-zone-free method based on Gumbel's first asymptotic distribution of extreme values, together with Monte Carlo simulation. Appropriate attenuation relations of both universal and local types have been used in this study. The results show that for many parts of Pakistan the expected seismic hazard is broadly comparable with the level specified in the existing PGA maps.
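
    The Gumbel I route can be sketched generically: fit the two distribution parameters to a series of annual-maximum PGA values and read off the 475-year quantile. The data below are synthetic placeholders, not values from the study.

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical 60-year series of annual maximum PGA (in g) at one site.
rng = np.random.default_rng(1)
annual_max_pga = rng.gumbel(loc=0.04, scale=0.02, size=60)

# Fit Gumbel I (extreme value type I) and evaluate the 475-year level,
# i.e. the value with annual non-exceedance probability 1 - 1/475.
loc, scale = gumbel_r.fit(annual_max_pga)
pga_475 = gumbel_r.ppf(1.0 - 1.0 / 475.0, loc=loc, scale=scale)
print(f"475-yr PGA ~ {pga_475:.2f} g")
```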

  15. A new approach to geographic partitioning of probabilistic seismic hazard using seismic source distance with earthquake extreme and perceptibility statistics: an application to the southern Balkan region

    NASA Astrophysics Data System (ADS)

    Bayliss, T. J.

    2016-02-01

    The southeastern European cities of Sofia and Thessaloniki are explored as example site-specific scenarios by geographically zoning their individual localized seismic sources based on the highest probabilities of magnitude exceedance, with the aim of determining the major components contributing to each city's seismic hazard. Discrete contributions from the selected input earthquake catalogue are investigated to determine the areas that dominate each city's prevailing seismic hazard with respect to magnitude and source-to-site distance. This work is based on an earthquake catalogue developed and described in a previously published paper by the author, together with components of a magnitude probability density function. Binned magnitude and distance classes are defined using a joint magnitude-distance distribution. The seismicity prevailing at each city, as defined by a child data set extracted from the parent earthquake catalogue for each city considered, is divided into distinct constrained data bins of small discrete magnitude and source-to-site distance intervals. These are then used to describe seismic hazard in terms of univariate modal values, M* and D*, the modal magnitude and the modal source-to-site distance in each city's local historical seismicity. This work highlights that Sofia's dominating seismic hazard, that is, the modal magnitudes possessing the highest probabilities of occurrence, is located in zones confined to two regions at 60-80 km and 170-180 km from the city, for magnitude intervals of 5.75-6.00 Mw and 6.00-6.25 Mw respectively. Similarly, Thessaloniki appears prone to the highest levels of hazard over a wider epicentral distance interval, from 80 to 200 km, in the moment magnitude range 6.00-6.25 Mw.

  16. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peace, Gerald; Goering, Timothy James; Miller, Mark Laverne

    2007-01-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.

  17. Epistemic uncertainty in California-wide synthetic seismicity simulations

    USGS Publications Warehouse

    Pollitz, Fred F.

    2011-01-01

    The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock–mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each of ∼12 square kilometers in size, has been rediscretized into ∼100,000 patches, each of ∼1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M∼5–8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs on input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.

  18. Epistemic uncertainty in California-wide synthetic seismicity simulations

    USGS Publications Warehouse

    Pollitz, F.F.

    2011-01-01

    The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock-mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each of ~12 square kilometers in size, has been rediscretized into ~100,000 patches, each of ~1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M ~ 5-8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs on input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.

  19. Setting the Stage for Harmonized Risk Assessment by Seismic Hazard Harmonization in Europe (SHARE)

    NASA Astrophysics Data System (ADS)

    Woessner, Jochen; Giardini, Domenico; SHARE Consortium

    2010-05-01

    Probabilistic seismic hazard assessment (PSHA) is arguably one of the most useful products that seismology can offer to society. PSHA characterizes the best available knowledge on the seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results form the baseline for informed decision making, such as building codes or insurance rates, and provide essential input to every risk assessment application. Several large-scale national and international projects have recently been launched that aim at improving and harmonizing PSHA standards around the globe. SHARE (www.share-eu.org) is the European Commission-funded project in Framework Programme 7 (FP-7) that will create an updated, living seismic hazard model for the Euro-Mediterranean region. SHARE is a regional component of the Global Earthquake Model (GEM, www.globalquakemodel.org), a public/private partnership initiated and approved by the Global Science Forum of the OECD-GSF. GEM aims to be the uniform, independent and open-access standard to calculate and communicate earthquake hazard and risk worldwide. SHARE itself will deliver measurable progress in all steps leading to a harmonized assessment of seismic hazard - in the definition of engineering requirements, in the collection of input data, in procedures for hazard assessment, and in engineering applications. SHARE scientists will create a unified framework and computational infrastructure for seismic hazard assessment and produce an integrated European probabilistic seismic hazard assessment (PSHA) model and specific scenario-based modeling tools. The results will deliver long-lasting structural impact in areas of societal and economic relevance; they will serve as a reference for Eurocode 8 (EC8) application, and will provide homogeneous input for the correct seismic safety assessment of critical industry, such as energy infrastructures and the re-insurance sector. SHARE will cover the whole European territory, the
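
    At its core, the hazard integration such a model feeds is the classical Cornell-type PSHA calculation: the annual rate of exceeding a ground-motion level equals the source activity rate times the probability of exceedance under the ground-motion model, integrated over the magnitude distribution. A single-source sketch under stated assumptions (toy GMPE coefficients, fixed distance), not SHARE's implementation:

        import numpy as np
        from scipy import stats

        nu = 0.05                  # annual rate of M >= mmin (assumption)
        mmin, mmax, b = 5.0, 7.5, 1.0
        r_km = 20.0                # fixed source-site distance for simplicity

        mags = np.linspace(mmin, mmax, 200)
        beta = b * np.log(10)      # truncated Gutenberg-Richter magnitude pdf
        pdf = beta * np.exp(-beta * (mags - mmin)) / (1 - np.exp(-beta * (mmax - mmin)))

        def gmpe_ln_mean(m, r):
            # Toy attenuation relation with hypothetical coefficients, ln(PGA in g)
            return -3.5 + 1.0 * m - 1.2 * np.log(r + 10.0)

        sigma_ln = 0.6             # aleatory variability of the GMPE (assumption)

        pga = np.logspace(-2, 0, 50)          # 0.01 g .. 1 g
        lam = np.empty_like(pga)
        for i, a in enumerate(pga):
            p_exc = 1 - stats.norm.cdf(np.log(a), gmpe_ln_mean(mags, r_km), sigma_ln)
            lam[i] = nu * np.trapz(p_exc * pdf, mags)   # annual exceedance rate

        # Ground motion with a 475-year return period (where lam = 1/475)
        print(np.interp(1 / 475, lam[::-1], pga[::-1]))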

  20. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zoning appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
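
    The logic-tree combination behind such maps reduces, at each site, to weighted moments across branches; the sketch below uses invented branch values and weights, not the study's results:

        import numpy as np

        # 475-yr PGA (g) at one site, 9 branches: 3 zonings x 3 GMPEs (hypothetical)
        branch_pga = np.array([0.18, 0.21, 0.25,
                               0.17, 0.20, 0.24,
                               0.19, 0.23, 0.28])
        w_zoning = np.array([0.4, 0.3, 0.3])
        w_gmpe = np.array([0.5, 0.3, 0.2])
        weights = np.outer(w_zoning, w_gmpe).ravel()   # joint weights, sum to 1

        mean = np.sum(weights * branch_pga)
        var = np.sum(weights * (branch_pga - mean) ** 2)
        print(f"weighted mean PGA = {mean:.3f} g, COV = {np.sqrt(var) / mean:.2f}")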

  1. Berkeley Seismological Laboratory Seismic Moment Tensor Report for the August 6, 2007 M3.9 Seismic event in central Utah

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S; Dreger, D; Hellweg, P

    2007-08-08

    We have performed a complete moment tensor analysis of the seismic event, which occurred on Monday, August 6, 2007 at 08:48:40 UTC, 21 km from Mt. Pleasant, Utah. In our analysis we utilized complete three-component seismic records recorded by the USArray, University of Utah, and EarthScope seismic arrays. The seismic waveform data were integrated to displacement and filtered between 0.02 and 0.10 Hz following instrument removal. We used the Song et al. (1996) velocity model to compute Green's functions used in the moment tensor inversion. A map of the stations we used and the location of the event is shown in Figure 1. In our moment tensor analysis we assumed a shallow source depth of 1 km, consistent with the shallow depth reported for this event. As shown in Figure 2, the results point to a source mechanism with negligible double-couple radiation, composed of dominant CLVD and implosive isotropic components. The total scalar seismic moment is 2.12e22 dyne cm, corresponding to a moment magnitude (Mw) of 4.2. The long-period records are very well matched by the model (Figure 2), with a variance reduction of 73.4%. An all-dilational (down) first-motion radiation pattern is predicted by the moment tensor solution, and observations of first motions are in agreement.
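
    The reported moment magnitude can be checked against the scalar moment with the standard Hanks-Kanamori relation, Mw = (2/3) log10(M0) - 10.7 for M0 in dyne cm:

        import math

        M0 = 2.12e22  # total scalar seismic moment from the abstract, dyne cm
        Mw = (2.0 / 3.0) * math.log10(M0) - 10.7
        print(round(Mw, 1))  # -> 4.2, matching the reported value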

  2. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with log-normal distribution of PGA or response spectrum. The main strength of this approach is that it is presently the standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors like site effects and source characteristics such as strong-motion duration and directivity, which could significantly influence the expected motion at the site, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes less than 6. We focus on sites prone to liquefaction because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable is the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of the approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions

  3. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    For tsunami assessment, as for seismic design, it is meaningful to evaluate phenomena beyond the design basis. Once the design-basis tsunami height is set, there remains a possibility that tsunami heights will exceed it, owing to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard, estimating the fragility of structures, and performing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of the PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structural and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
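
    The fractile presentation described above (5-, 16-, 50-, 84- and 95-percentile plus mean hazard curves) amounts to taking weighted statistics across the logic-tree branch curves at each tsunami height; a sketch with synthetic, equally weighted branch curves:

        import numpy as np

        heights = np.linspace(0.5, 10.0, 20)            # tsunami height, m
        rng = np.random.default_rng(1)
        n_branches = 100
        # Synthetic branch results: exponentially decaying annual exceedance
        # rates with perturbed scale and slope (stand-ins for real branches)
        scales = rng.lognormal(np.log(0.01), 0.5, n_branches)
        slopes = rng.uniform(0.4, 0.8, n_branches)
        curves = scales[:, None] * np.exp(-slopes[:, None] * heights[None, :])

        mean_curve = curves.mean(axis=0)                # equal weights assumed
        fractiles = np.percentile(curves, [5, 16, 50, 84, 95], axis=0)
        print(mean_curve[0], fractiles[:, 0])           # values at 0.5 m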

  4. Active and passive seismic methods for characterization and monitoring of unstable rock masses: field surveys, laboratory tests and modeling.

    NASA Astrophysics Data System (ADS)

    Colombero, Chiara; Baillet, Laurent; Comina, Cesare; Jongmans, Denis; Vinciguerra, Sergio

    2016-04-01

    Appropriate characterization and monitoring of potentially unstable rock masses may provide a better knowledge of the active processes and help to forecast the evolution to failure. Among the available geophysical methods, active seismic surveys are often suitable to infer the internal structure and the fracturing conditions of the unstable body. For monitoring purposes, although remote-sensing techniques and in-situ geotechnical measurements have been successfully tested on landslides, they may not be suitable for early forecasting of sudden, rapid rockslides. Passive seismic monitoring can help for this purpose. Detection, classification and localization of microseismic events within the prone-to-fall rock mass can provide information about the incipient failure of internal rock bridges. Acceleration to failure can be detected from an increasing microseismic event rate. The latter can be compared with meteorological data to understand the external factors controlling stability. On the other hand, seismic noise recorded on prone-to-fall rock slopes shows that temporal variations in the spectral content and correlation of ambient vibrations can be related to both reversible and irreversible changes within the rock mass. We present the results of the active and passive seismic data acquired at the potentially unstable granitic cliff of Madonna del Sasso (NW Italy). Down-hole tests, surface refraction and cross-hole tomography were carried out for the characterization of the fracturing state of the site. Field surveys were complemented with laboratory determination of physico-mechanical properties on rock samples and measurements of the ultrasonic pulse velocity. This multi-scale approach led to a lithological interpretation of the seismic velocity field obtained at the site and to a systematic correlation of the measured velocities with physical properties (density and porosity) and macroscopic features of the granitic cliff (fracturing, weathering and anisotropy). Continuous

  5. Probabilistic and Scenario Seismic and Liquefaction Hazard Analysis of the Mississippi Embayment Incorporating Nonlinear Site Effects

    NASA Astrophysics Data System (ADS)

    Cramer, C. H.; Dhar, M. S.

    2017-12-01

    The influence of the deep sediment deposits of the Mississippi Embayment (ME) on the propagation of seismic waves is poorly understood and remains a major source of uncertainty for site response analysis. Many researchers have studied the effects of these deposits on the seismic hazard of the area using the information available at the time. In this study, we have used updated and newly available resources for seismic and liquefaction hazard analyses of the ME. We have developed an improved 3D geological model. Additionally, we used surface geological maps from Cupples and Van Arsdale (2013) to prepare liquefaction hazard maps. Both equivalent linear and nonlinear site response codes were used to develop site amplification distributions for use in generating hazard maps. The site amplification distributions are created using the Monte Carlo approach of Cramer et al. (2004, 2006) on a 0.1-degree grid. The 2014 National Seismic Hazard model and attenuation relations (Petersen et al., 2014) are used to prepare seismic hazard maps. Liquefaction hazard maps are then generated using liquefaction probability curves from Holzer (2011) and Cramer et al. (2015). Equivalent linear response (with increased precision and nonlinear behavior restricted with depth) shows hazard for the ME similar to the results of nonlinear analysis (without pore pressure). At short periods nonlinear deamplification dominates the hazard, but at long periods resonance amplification dominates. The liquefaction hazard tends to be high in Holocene and late Pleistocene lowland sediments, even with lowered ground water levels, and low in the Pleistocene loess of the uplands. Considering pore pressure effects in nonlinear site response analysis at a test site on the lowlands shows amplification of ground motion at short periods. PGA estimates from ME liquefaction and MMI observations are in the 0.25 to 0.4 g range. Our estimated M7.5 PGA hazard within 10 km of the fault can exceed this. Ground motion observations from

  6. Hazard Monitoring of Growing Lava Flow Fields Using Seismic Tremor

    NASA Astrophysics Data System (ADS)

    Eibl, E. P. S.; Bean, C. J.; Jónsdottir, I.; Hoskuldsson, A.; Thordarson, T.; Coppola, D.; Witt, T.; Walter, T. R.

    2017-12-01

    An effusive eruption in 2014/15 created an 85 km² lava flow field in a remote location in the Icelandic highlands. The lava flows did not threaten any settlements or paved roads, but they were nevertheless monitored in detail by an interdisciplinary effort. Images from satellites and aircraft, ground-based video monitoring, GPS and seismic recordings allowed the monitoring and reconstruction of a detailed time series of the growing lava flow field. While satellite images and probabilistic modelling of lava flows are quite common tools to monitor the current growth and forecast the future growth direction, here we show that seismic recordings can be of use too. We installed a cluster of seismometers 15 km from the vents and recorded the ground vibrations associated with the eruption. This seismic tremor was not only generated below the vents, but also at the edges of the growing lava flow field, and it indicated the parts of the lava flow field that were most actively growing. Whilst the time resolution is in the range of days for satellites, seismic stations easily sample continuously at 100 Hz and could therefore provide a much better resolution and estimate of the lava flow hazard in real time.

  7. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  8. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  9. Evaluating the Use of Declustering for Induced Seismicity Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Michael, A. J.

    2016-12-01

    The recent dramatic seismicity rate increase in the central and eastern US (CEUS) has motivated the development of seismic hazard assessments for induced seismicity (e.g., Petersen et al., 2016). Standard probabilistic seismic hazard assessment (PSHA) relies fundamentally on the assumption that seismicity is Poissonian (Cornell, BSSA, 1968); therefore, the earthquake catalogs used in PSHA are typically declustered (e.g., Petersen et al., 2014) even though this may remove earthquakes that may cause damage or concern (Petersen et al., 2015; 2016). In some induced earthquake sequences in the CEUS, the standard declustering can remove up to 90% of the sequence, reducing the estimated seismicity rate by a factor of 10 compared to estimates from the complete catalog. In tectonic regions the reduction is often only about a factor of 2. We investigate how three declustering methods treat induced seismicity: the window-based Gardner-Knopoff (GK) algorithm, often used for PSHA (Gardner and Knopoff, BSSA, 1974); the link-based Reasenberg algorithm (Reasenberg, JGR, 1985); and a stochastic declustering method based on a space-time Epidemic-Type Aftershock Sequence model (Ogata, JASA, 1988; Zhuang et al., JASA, 2002). We apply these methods to three catalogs that likely contain some induced seismicity. For the Guy-Greenbrier, AR earthquake swarm from 2010-2013, declustering reduces the seismicity rate by factors of 6-14, depending on the algorithm. In northern Oklahoma and southern Kansas from 2010-2015, the reduction varies from factors of 1.5-20. In the Salton Trough of southern California from 1975-2013, the rate is reduced by factors of 3-20. Stochastic declustering tends to remove the most events, followed by the GK method, while the Reasenberg method removes the fewest. Given that declustering and choice of algorithm have such a large impact on the resulting seismicity rate estimates, we suggest that more accurate hazard assessments may be found using the complete catalog.
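
    For reference, the window-based Gardner-Knopoff scheme can be sketched in a few lines; the window fits below are a commonly cited parameterization of the Gardner and Knopoff (1974) windows and should be treated as assumptions to verify against the original:

        import numpy as np

        def gk_windows(m):
            """Distance (km) and time (days) windows for a magnitude-m mainshock."""
            d_km = 10 ** (0.1238 * m + 0.983)
            t_days = 10 ** (0.032 * m + 2.7389) if m >= 6.5 else 10 ** (0.5409 * m - 0.547)
            return d_km, t_days

        def decluster(t_days, x_km, y_km, mags):
            """Keep the largest events; drop smaller events inside their windows."""
            keep = np.ones(len(mags), bool)
            for i in np.argsort(mags)[::-1]:          # largest mainshocks first
                if not keep[i]:
                    continue
                d_win, t_win = gk_windows(mags[i])
                dist = np.hypot(x_km - x_km[i], y_km - y_km[i])
                dt = np.abs(t_days - t_days[i])
                keep[(dist <= d_win) & (dt <= t_win) & (mags < mags[i])] = False
            return keep

        # Tiny demo: an M5.5 with a smaller event the next day and one much later
        t = np.array([0.0, 1.0, 300.0]); x = np.array([0.0, 5.0, 2.0])
        print(decluster(t, x, np.zeros(3), np.array([5.5, 3.0, 4.0])))
        # -> [ True False  True]: only the next-day M3.0 is declustered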

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC, by Brookhaven National Laboratory. The DCPRA is a full-scope Level I effort, and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.

  11. The Caucasus Seismic Network (CNET): Seismic Structure of the Greater and Lesser Caucasus

    NASA Astrophysics Data System (ADS)

    Sandvol, E. A.; Mackey, K. G.; Nabelek, J.; Yetermishli, G.; Godoladze, T.; Babayan, H.; Malovichko, A.

    2017-12-01

    The Greater Caucasus is a portion of the Alpine-Himalayan mountain belt that has undergone rapid uplift in the past 5 million years, thus serving as a unique natural laboratory to study the early stages of orogenesis. Relatively low-resolution seismic velocity models of this region show contradictory lateral variability. Furthermore, recent waveform modeling of seismograms has clearly demonstrated the presence of deep earthquakes (with a maximum hypocentral depth of 175 km) below the Greater Caucasus. The region has been largely unexplored in terms of detailed uppermost mantle and crustal seismic structure, due in part to disparate data sets that have not yet been merged, as well as key portions being sparsely instrumented. We have established collaborative agreements across the region. Building on these agreements, we recently deployed a major multi-national seismic array across the Greater Caucasus to address fundamental questions about the nature of continental deformation in this poorly understood region. Our seismic array has two components: (1) a grid of stations spanning the entire Caucasus and (2) two seismic transects consisting of stations spaced at distances of less than 10 km that cross the Greater Caucasus. In addition to the temporary stations, we are working to integrate data from the national networks to produce high-resolution images of the seismic structure. Using data from over 106 new seismic stations in Azerbaijan, Armenia, Russia, and Georgia, we hope to gain a better understanding of the recent uplift (~5 Ma) of the Greater Caucasus and the nature of seismogenic deformation in the region.

  12. Seismic Sources for the Territory of Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.

    2011-12-01

    The southern Caucasus is an earthquake-prone region where devastating earthquakes have repeatedly caused significant loss of lives, infrastructure and buildings. The high geodynamic activity of the region, expressed in both seismic and aseismic deformation, is conditioned by the still-ongoing convergence of lithospheric plates and the northward propagation of the Afro-Arabian continental block at a rate of several cm/year. The geometry of tectonic deformation in the region is largely determined by the wedge-shaped rigid Arabian block intensively indented into the relatively mobile Middle East-Caucasian region. Georgia is a partner in the ongoing regional project EMME. The main objective of EMME is the uniform calculation of earthquake hazard to high standards. One approach used in the project is probabilistic seismic hazard assessment. In this approach the first requirement is the definition of seismic source zones. Seismic sources can be either faults or area sources. Seismoactive structures of Georgia are identified mainly on the basis of the correlation between neotectonic structures of the region and earthquakes. The requirements of modern PSHA software regarding fault geometry are very demanding. As our knowledge of active fault geometry is not sufficient, area sources were used. Seismic sources are defined as zones that are characterized by more or less uniform seismicity. Poor knowledge of the processes occurring deep in the Earth is connected with the difficulty of direct measurement. From this point of view, the reliable data obtained from earthquake fault-plane solutions are uniquely valuable for understanding the current tectonic life of the investigated area. There are two methods for the identification of seismic sources. The first is the seismotectonic approach, based on the identification of extensive homogeneous seismic sources (SS) with the definition of the probability of occurrence of the maximum earthquake Mmax. In the second method the identification of seismic sources

  13. Evaluation of seismic testing for quality assurance of lime-stabilized soil.

    DOT National Transportation Integrated Search

    2013-08-01

    This study sought to determine the technical feasibility of using seismic techniques to measure the laboratory and field seismic modulus of lime-stabilized soils (LSS), and to compare/correlate test results from bench-top (free-free resonance) se...

  14. Challenges Ahead for Nuclear Facility Site-Specific Seismic Hazard Assessment in France: The Alternative Energies and the Atomic Energy Commission (CEA) Vision

    NASA Astrophysics Data System (ADS)

    Berge-Thierry, C.; Hollender, F.; Guyonnet-Benaize, C.; Baumont, D.; Ameri, G.; Bollinger, L.

    2017-09-01

    Seismic analysis in the context of nuclear safety in France is currently guided by a purely deterministic approach based on Basic Safety Rule (Règle Fondamentale de Sûreté) RFS 2001-01 for seismic hazard assessment, and on the ASN/2/01 Guide that provides design rules for nuclear civil engineering structures. After the 2011 Tohoku earthquake, nuclear operators worldwide were asked to estimate the ability of their facilities to sustain extreme seismic loads. The French licensees then defined the 'hard core seismic levels', which are higher than those considered for design or re-assessment of the safety of a facility. These were initially established on a deterministic basis, and they have been finally justified through state-of-the-art probabilistic seismic hazard assessments. The appreciation and propagation of uncertainties when assessing seismic hazard in France have changed considerably over the past 15 years. This evolution provided the motivation for the present article, the objectives of which are threefold: (1) to provide a description of the current practices in France to assess seismic hazard in terms of nuclear safety; (2) to discuss and highlight the sources of uncertainties and their treatment; and (3) to use a specific case study to illustrate how extended-source modeling can help to constrain the key assumptions or parameters that impact upon seismic hazard assessment. This article discusses in particular seismic source characterization, strong ground motion prediction, and maximum magnitude constraints, according to the practice of the French Atomic Energy Commission. Owing to the growth of strong-motion databases in terms of the number and quality of the records, their metadata, and the characterization of uncertainty, several recently published empirical ground motion prediction models are eligible for seismic hazard assessment in France. We show that propagation of epistemic and aleatory uncertainties is feasible in a deterministic approach, as in a

  15. Probabilistic evaluation of seismic isolation effect with respect to siting of a fusion reactor facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeda, Masatoshi; Komura, Toshiyuki; Hirotani, Tsutomu

    1995-12-01

    Annual failure probabilities of buildings and equipment were roughly evaluated for two fusion-reactor-like buildings, with and without seismic base isolation, in order to examine the effectiveness of the base isolation system regarding siting issues. The probabilities are calculated considering nonlinearity and rupture of isolators. While the probability of building failure for the two buildings on the same site was almost equal, the functional failure probabilities for equipment showed that the base-isolated building had higher reliability than the non-isolated building. Even if the base-isolated building alone were located in a higher seismic hazard area, it could compete favorably with the ordinary one in reliability of equipment.
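
    Evaluations of this type typically rest on the standard seismic risk integral: the annual failure probability is the hazard curve's occurrence-rate density convolved with a fragility curve. A hedged sketch with invented hazard and capacity numbers, where base isolation enters simply as a higher median capacity:

        import numpy as np
        from scipy import stats

        pga = np.linspace(0.01, 3.0, 600)            # ground-motion level, g
        hazard = 4e-4 * (pga / 0.1) ** -2.2          # toy annual exceedance curve
        rate_density = -np.gradient(hazard, pga)     # occurrence-rate density, -dH/da

        def fragility(a, median, beta=0.4):
            """Lognormal fragility: P(failure | PGA = a)."""
            return stats.norm.cdf(np.log(a / median) / beta)

        # Base isolation modeled (crudely) as a higher median capacity
        for name, med in [("base-isolated", 1.2), ("conventional", 0.6)]:
            p_fail = np.trapz(rate_density * fragility(pga, med), pga)
            print(name, f"{p_fail:.2e} / yr")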

  16. Modelling induced seismicity due to fluid injection

    NASA Astrophysics Data System (ADS)

    Murphy, S.; O'Brien, G. S.; Bean, C. J.; McCloskey, J.; Nalbant, S. S.

    2011-12-01

    Injection of fluid into the subsurface alters the stress in the crust and can induce earthquakes. The science of assessing the risk of induced seismicity from such ventures is still in its infancy despite public concern. We plan to use a fault network model in which stress perturbations due to fluid injection induce earthquakes. We will use this model to investigate the role different operational and geological factors play in increasing seismicity in a fault system due to fluid injection. The model is based on a quasi-dynamic relationship between stress and slip coupled with a rate-and-state friction law. This allows us to model slip on fault interfaces over long periods of time (i.e., years to hundreds of years). With the use of the rate-and-state friction law, the nature of stress release during slipping can be altered through variation of the frictional parameters. Both seismic and aseismic slip can therefore be simulated. In order to add heterogeneity along the fault plane, a fractal variation in the frictional parameters is used. Fluid injection is simulated using the lattice Boltzmann method, whereby pore pressure diffuses throughout a permeable layer from the point of injection. The stress perturbation this causes on the surrounding fault system is calculated using a quasi-static solution for slip dislocation in an elastic half space. From this model we can generate slip histories and seismicity catalogues covering hundreds of years for predefined fault networks near fluid injection sites. Given that rupture is a highly non-linear process, comparison between models with different input parameters (e.g. fault network statistics and injection rates) will be based on system-wide features (such as the Gutenberg-Richter b-values), rather than specific seismic events. Our ultimate aim is that our model produces seismic catalogues similar to those observed over real injection sites. Such validation would pave the way to probabilistic estimation of reactivation risk for
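
    As a much-simplified stand-in for the lattice Boltzmann pressure calculation, the pore-pressure field of a continuous point injector in a homogeneous medium has the closed form p(r, t) ~ (p0 / r) erfc(r / sqrt(4 D t)); the diffusivity and the lumped source-strength constant p0 below are placeholder assumptions:

        import numpy as np
        from scipy.special import erfc

        D = 0.1          # hydraulic diffusivity, m^2/s (assumption)
        p0 = 1.0e6       # lumped source-strength constant (assumption)

        def pore_pressure(r_m, t_s):
            return (p0 / r_m) * erfc(r_m / np.sqrt(4 * D * t_s))

        r = np.array([100.0, 500.0, 1000.0])          # distance from injector, m
        for t_days in (1, 30, 365):
            print(t_days, "d:", pore_pressure(r, t_days * 86400.0))
        # The pressure front expands outward with time, loading ever more faults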

  17. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.
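
    The "near-optimal" benchmark in such tasks is usually the Bayes-optimal estimator; for two Gaussian cues it weights each cue by its inverse variance. A tiny reference computation (not the paper's network):

        import numpy as np

        rng = np.random.default_rng(3)
        s_true = 2.0                      # latent stimulus
        sig1, sig2 = 1.0, 0.5             # cue noise levels known to the ideal observer
        x1 = s_true + sig1 * rng.standard_normal(10_000)
        x2 = s_true + sig2 * rng.standard_normal(10_000)

        w1 = (1 / sig1**2) / (1 / sig1**2 + 1 / sig2**2)
        s_hat = w1 * x1 + (1 - w1) * x2   # posterior mean under a flat prior
        print(s_hat.mean(), s_hat.var())  # variance -> 1/(1/sig1^2 + 1/sig2^2) = 0.2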

  18. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    NASA Technical Reports Server (NTRS)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of tolerances in geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat-plate geometry as well as for a space shuttle main engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  19. Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.P.; Stover, R.L.; Hashimoto, P.S.

    1989-01-01

    Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and earthquake-experience database evaluation methods, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed. 5 figs.

  20. Quantitative modeling of reservoir-triggered seismicity

    NASA Astrophysics Data System (ADS)

    Hainzl, S.; Catalli, F.; Dahm, T.; Heinicke, J.; Woith, H.

    2017-12-01

    Reservoir-triggered seismicity can occur as the crustal response to stresses induced by the poroelastic load of the impounded water volume and by fluid diffusion. Several cases of high correlation have been found in the past decades. However, crustal stresses may be altered by many other processes, such as continuous tectonic stressing and coseismic stress changes. Because reservoir-triggered stresses decay quickly with distance, even tidal or rainfall-triggered stresses might be of similar size at depth. To account for simultaneous stress sources in a physically meaningful way, we apply a seismicity model based on calculated stress changes in the crust and laboratory-derived friction laws. Based on the observed seismicity, the model parameters can be determined by the maximum-likelihood method. The model leads to quantitative predictions of the variations of seismicity rate in space and time, which can be used for hypothesis testing and forecasting. For case studies in Talala (India), Val d'Agri (Italy) and Novy Kostel (Czech Republic), we show the comparison of predicted and observed seismicity, demonstrating the potential and limitations of the approach.
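
    One common way to couple calculated stress changes to laboratory-derived friction laws is the Dieterich (1994) seismicity-rate formulation; for a sudden stress step under constant background stressing it has a closed form, sketched below with illustrative parameters (not necessarily those of the authors' implementation):

        import numpy as np

        r_bg = 10.0       # background event rate, events/yr
        a_sigma = 0.05    # A*sigma from laboratory friction data, MPa (assumption)
        tau_rate = 0.005  # background stressing rate, MPa/yr (assumption)
        dtau = 0.1        # stress step from impoundment or an earthquake, MPa

        ta = a_sigma / tau_rate                 # relaxation time, yr
        t = np.linspace(0.0, 50.0, 6)
        R = r_bg / ((np.exp(-dtau / a_sigma) - 1.0) * np.exp(-t / ta) + 1.0)
        print(ta, R)   # rate jumps by ~exp(dtau/a_sigma), then decays back over ~ta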

  1. A Fusion Model of Seismic and Hydro-Acoustic Propagation for Treaty Monitoring

    NASA Astrophysics Data System (ADS)

    Arora, Nimar; Prior, Mark

    2014-05-01

    We present an extension to NET-VISA (Network Processing Vertically Integrated Seismic Analysis), a probabilistic generative model of the propagation of seismic waves and their detection on a global scale, to incorporate hydro-acoustic data from the IMS (International Monitoring System) network. The new model includes the coupling of seismic waves into the ocean's SOFAR channel, as well as the propagation of hydro-acoustic waves from underwater explosions. The generative model is described in terms of multiple possible hypotheses -- seismic-to-hydro-acoustic, under-water explosion, other noise sources such as whales singing or icebergs breaking up -- that could lead to signal detections. We decompose each hypothesis into conditional probability distributions that are carefully analyzed and calibrated. These distributions include ones for detection probabilities, blockage in the SOFAR channel (including diffraction, refraction, and reflection around obstacles), energy attenuation, and other features of the resulting waveforms. We present a study of the various features that are extracted from the hydro-acoustic waveforms, and their correlations with each other as well as with the source of the energy. Additionally, an inference algorithm is presented that concurrently infers the seismic and under-water events, associates all arrivals (aka triggers), both from seismic and hydro-acoustic stations, with the appropriate event, and labels the path taken by the wave. Finally, our results demonstrate that this fusion of seismic and hydro-acoustic data leads to very good performance. A majority of the under-water events that IDC (International Data Center) analysts built in 2010 are correctly located, and the arrivals that correspond to seismic-to-hydroacoustic coupling, the T phases, are mostly correctly identified. There is no loss in the accuracy of seismic events; in fact, there is a slight overall improvement.
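
    The hypothesis decomposition described above can be miniaturized to a posterior over competing source types for a single detection, given priors and calibrated feature likelihoods; all numbers below are invented for illustration:

        # P(hypothesis) and P(observed features | hypothesis) -- hypothetical values
        priors = {"seismic-to-hydroacoustic": 0.70, "underwater explosion": 0.05,
                  "noise": 0.25}
        likelihoods = {"seismic-to-hydroacoustic": 0.012, "underwater explosion": 0.030,
                       "noise": 0.002}

        evidence = sum(priors[h] * likelihoods[h] for h in priors)
        posterior = {h: priors[h] * likelihoods[h] / evidence for h in priors}
        print(posterior)   # normalized posterior over the competing hypotheses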

  2. Seismic risk assessment for Poiana Uzului (Romania) buttress dam on Uz river

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Toma-Danila, Dragos; Paerele, Cosmin Marian; Emilian Toader, Victorin; Petruta Constantin, Angela; Ghita, Cristian

    2017-04-01

    One of the most important specific requirements for dams' safety is seismic risk assessment. This objective is accomplished by rating the dams into seismic risk classes using the approach of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam site, the structure's vulnerability and the downstream risk characteristics. The maximum expected values of ground motion at the dam site have been obtained using probabilistic seismic hazard assessment approaches. The structural vulnerability was obtained from the dam's characteristics (age, height, water volume), and the downstream risk was assessed using human, economic, touristic, historic and cultural heritage information from the areas that might be flooded in the case of a dam failure. A couple of flooding scenarios have been simulated. The results of the work consist of local and regional seismic information, specific characteristics of the dam, seismic hazard values for different return periods, and risk classes. The final goal of the studies presented in this paper is to provide, in the near future, the local emergency services with warnings of a potential dam failure and ensuing flood as a result of a large earthquake, allowing further public training for evacuation. Acknowledgments This work was partially supported by the Partnership in Priority Areas Program - PNII, under MEN-UEFISCDI, DARING Project no. 69/2014 and the Nucleu Program - PN 16-35, Project no. 03 01 and 01 06.

  3. A new probabilistic seismic hazard assessment for greater Tokyo

    USGS Publications Warehouse

    Stein, R.S.; Toda, S.; Parsons, T.; Grunewald, E.; Blong, R.; Sparks, S.; Shah, H.; Kennedy, J.

    2006-01-01

    Tokyo and its outlying cities are home to one-quarter of Japan's 127 million people. Highly destructive earthquakes struck the capital in 1703, 1855 and 1923, the last of which took 105 000 lives. Fuelled by greater Tokyo's rich seismological record, but challenged by its magnificent complexity, our joint Japanese-US group carried out a new study of the capital's earthquake hazards. We used the prehistoric record of great earthquakes preserved by uplifted marine terraces and tsunami deposits (17 M∼8 shocks in the past 7000 years), a newly digitized dataset of historical shaking (10 000 observations in the past 400 years), the dense modern seismic network (300 000 earthquakes in the past 30 years), and Japan's GeoNet array (150 GPS vectors in the past 10 years) to reinterpret the tectonic structure, identify active faults and their slip rates and estimate their earthquake frequency. We propose that a dislodged fragment of the Pacific plate is jammed between the Pacific, Philippine Sea and Eurasian plates beneath the Kanto plain on which Tokyo sits. We suggest that the Kanto fragment controls much of Tokyo's seismic behaviour for large earthquakes, including the damaging 1855 M∼7.3 Ansei-Edo shock. On the basis of the frequency of earthquakes beneath greater Tokyo, events with magnitude and location similar to the M∼7.3 Ansei-Edo event have a ca 20% likelihood in an average 30 year period. In contrast, our renewal (time-dependent) probability for the great M≥7.9 plate boundary shocks such as struck in 1923 and 1703 is 0.5% for the next 30 years, with a time-averaged 30 year probability of ca 10%. The resulting net likelihood for severe shaking (ca 0.9g peak ground acceleration (PGA)) in Tokyo, Kawasaki and Yokohama for the next 30 years is ca 30%. The long historical record in Kanto also affords a rare opportunity to calculate the probability of shaking in an alternative manner exclusively from intensity observations. This approach permits robust estimates
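
    The contrast between the time-averaged and renewal probabilities can be reproduced back-of-envelope: the Poisson value depends only on the mean recurrence, while the renewal value conditions on the time elapsed since 1923. The recurrence statistics below (mean, coefficient of variation, elapsed time) are simplified assumptions, not the study's fitted values:

        import math
        from scipy import stats

        mean_rec = 220.0   # ~(1923 - 1703) yr, crude mean recurrence (assumption)
        elapsed = 80.0     # years elapsed since 1923 at the study date (rounded)
        window = 30.0

        # Time-averaged (Poisson) 30-yr probability
        p_poisson = 1 - math.exp(-window / mean_rec)

        # Renewal probability with a lognormal recurrence model, cv = 0.25 assumed
        cv = 0.25
        sigma = math.sqrt(math.log(1 + cv ** 2))
        mu = math.log(mean_rec) - 0.5 * sigma ** 2
        F = stats.lognorm(s=sigma, scale=math.exp(mu)).cdf
        p_renewal = (F(elapsed + window) - F(elapsed)) / (1 - F(elapsed))

        # ~12.7% vs ~0.4%: the same order as the reported ca 10% and 0.5%
        print(f"Poisson: {p_poisson:.1%}, renewal: {p_renewal:.2%}")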

  4. A new probabilistic seismic hazard assessment for greater Tokyo.

    PubMed

    Stein, Ross S; Toda, Shinji; Parsons, Tom; Grunewald, Elliot

    2006-08-15

    Tokyo and its outlying cities are home to one-quarter of Japan's 127 million people. Highly destructive earthquakes struck the capital in 1703, 1855 and 1923, the last of which took 105,000 lives. Fuelled by greater Tokyo's rich seismological record, but challenged by its magnificent complexity, our joint Japanese-US group carried out a new study of the capital's earthquake hazards. We used the prehistoric record of great earthquakes preserved by uplifted marine terraces and tsunami deposits (17 M∼8 shocks in the past 7000 years), a newly digitized dataset of historical shaking (10,000 observations in the past 400 years), the dense modern seismic network (300,000 earthquakes in the past 30 years), and Japan's GeoNet array (150 GPS vectors in the past 10 years) to reinterpret the tectonic structure, identify active faults and their slip rates and estimate their earthquake frequency. We propose that a dislodged fragment of the Pacific plate is jammed between the Pacific, Philippine Sea and Eurasian plates beneath the Kanto plain on which Tokyo sits. We suggest that the Kanto fragment controls much of Tokyo's seismic behaviour for large earthquakes, including the damaging 1855 M∼7.3 Ansei-Edo shock. On the basis of the frequency of earthquakes beneath greater Tokyo, events with magnitude and location similar to the M∼7.3 Ansei-Edo event have a ca 20% likelihood in an average 30 year period. In contrast, our renewal (time-dependent) probability for the great M≥7.9 plate boundary shocks such as struck in 1923 and 1703 is 0.5% for the next 30 years, with a time-averaged 30 year probability of ca 10%. The resulting net likelihood for severe shaking (ca 0.9 g peak ground acceleration (PGA)) in Tokyo, Kawasaki and Yokohama for the next 30 years is ca 30%. The long historical record in Kanto also affords a rare opportunity to calculate the probability of shaking in an alternative manner exclusively from intensity observations

  5. Microseismic monitoring of soft-rock landslide: contribution of a 3D velocity model for the location of seismic sources.

    NASA Astrophysics Data System (ADS)

    Provost, Floriane; Malet, Jean-Philippe; Doubre, Cécile; Gance, Julien; Maggi, Alessia; Helmstetter, Agnès

    2015-04-01

    Characterizing the micro-seismic activity of landslides is important for a better understanding of the physical processes controlling landslide behaviour. However, the location of seismic sources on landslides is a challenging task, mostly because of (a) the recording system geometry, (b) the lack of clear P-wave arrivals and clear wave differentiation, and (c) the heterogeneous velocities of the ground. The objective of this work is therefore to test whether the integration of a 3D velocity model in probabilistic seismic source location codes improves the quality of the determination, especially in depth. We studied the clay-rich landslide of Super-Sauze (French Alps). Most of the seismic events (rockfalls, slidequakes, tremors...) are generated in the upper part of the landslide near the main scarp. The seismic recording system is composed of two antennas with four vertical seismometers each, located on the east and west sides of the seismically active part of the landslide. A refraction seismic campaign was conducted in August 2014, and a 3D P-wave model was estimated using the Quasi-Newton tomography inversion algorithm. The shots of the seismic campaign are used as calibration shots to test the performance of the different location methods and to further update the 3D velocity model. Natural seismic events are detected with a semi-automatic technique using a frequency threshold. The first arrivals are picked using a kurtosis-based method and compared to manual picking. Several location methods were finally tested. We compared a non-linear probabilistic method coupled with the 3D P-wave model and a beam-forming method inverted for an apparent velocity. We found that the Quasi-Newton tomography inversion algorithm provides results consistent with the original underlying topography. The velocity ranges from 500 m.s-1 at the surface to 3000 m.s-1 in the bedrock. For the majority of the calibration shots, the use of a 3D velocity model
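
    The location step itself can be sketched as a grid search minimizing arrival-time residuals; for brevity the sketch uses a uniform velocity and noise-free picks, whereas the paper's point is precisely that substituting travel times from the 3D model into the same framework improves the solutions. Geometry and values are invented:

        import numpy as np

        v = 1500.0  # uniform P velocity, m/s (placeholder for the 3D model)
        stations = np.array([[0, 0, 0], [300, 0, 0], [0, 300, 0], [300, 300, 0]], float)
        true_src = np.array([120.0, 200.0, 40.0])
        t_obs = np.linalg.norm(stations - true_src, axis=1) / v  # noise-free picks

        xs = ys = np.arange(0, 301, 10.0)
        zs = np.arange(0, 81, 5.0)
        best, best_rms = None, np.inf
        for xv in xs:
            for yv in ys:
                for zv in zs:
                    t_pred = np.linalg.norm(stations - np.array([xv, yv, zv]), axis=1) / v
                    # demeaning removes the unknown origin time
                    res = (t_obs - t_obs.mean()) - (t_pred - t_pred.mean())
                    rms = np.sqrt(np.mean(res ** 2))
                    if rms < best_rms:
                        best, best_rms = (xv, yv, zv), rms
        print(best)   # recovers (120, 200, 40) in this noise-free example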

  6. Probabilistic tsunami hazard assessment for the Makran region with focus on maximum magnitude assumption

    NASA Astrophysics Data System (ADS)

    Hoechner, Andreas; Babeyko, Andrey Y.; Zamora, Natalia

    2016-06-01

    Despite having been rather seismically quiescent for the last decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunami. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on the variation of the Gutenberg-Richter parameters and especially on the maximum magnitude assumption.
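
    The catalog-generation step can be sketched by inverse-CDF sampling of a doubly truncated Gutenberg-Richter law under competing maximum-magnitude assumptions; the rate and parameters below are illustrative, not the paper's:

        import numpy as np

        rng = np.random.default_rng(42)
        b, mmin = 1.0, 5.0
        rate = 0.2                      # events/yr with M >= mmin (assumption)
        years = 300_000                 # catalog length matching the paper's span

        def sample_truncated_gr(n, mmin, mmax, b, rng):
            """Inverse-CDF sampling of the doubly truncated G-R magnitude law."""
            beta = b * np.log(10)
            u = rng.random(n)
            return mmin - np.log(1 - u * (1 - np.exp(-beta * (mmax - mmin)))) / beta

        for mmax in (8.1, 8.7, 9.2):    # competing maximum-magnitude scenarios
            mags = sample_truncated_gr(rng.poisson(rate * years), mmin, mmax, b, rng)
            print(mmax, mags.size, int((mags >= 8.0).sum()))  # great events per catalog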

  7. Probabilistic tsunami hazard assessment for the Makran region with focus on maximum magnitude assumption

    NASA Astrophysics Data System (ADS)

    Hoechner, A.; Babeyko, A. Y.; Zamora, N.

    2015-09-01

    Despite having been rather seismically quiescent for the last decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunami. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on the variation of the Gutenberg-Richter parameters and especially on the maximum magnitude assumption.

  8. Global Seismic Cross-Correlation Results: Characterizing Repeating Seismic Events

    NASA Astrophysics Data System (ADS)

    Vieceli, R.; Dodge, D. A.; Walter, W. R.

    2016-12-01

    Increases in seismic instrument quality and coverage have led to increased knowledge of earthquakes, but have also revealed the complex and diverse nature of earthquake ruptures. Nonetheless, some earthquakes are sufficiently similar to each other that they produce correlated waveforms. Such repeating events have been used to investigate interplate coupling of subduction zones [e.g. Igarashi, 2010; Yu, 2013], study spatio-temporal changes in slip rate at plate boundaries [e.g. Igarashi et al., 2003], observe variations in seismic wave propagation velocities in the crust [e.g. Schaff and Beroza, 2004; Sawazaki et al., 2015], and assess inner core rotation [e.g. Yu, 2016]. The characterization of repeating events on a global scale remains a very challenging problem. An initial global seismic cross-correlation study used over 310 million waveforms from nearly 3.8 million events recorded between 1970 and 2013 to provide an initial look at globally correlated seismicity [Dodge and Walter, 2015]. In this work, we analyze the spatial and temporal distribution of the most highly correlated event clusters or "multiplets" from the Dodge and Walter [2015] study. We examine how the distributions and characteristics of multiplets are affected by tectonic environment, source-station separation, and frequency band. Preliminary results suggest that the distribution of multiplets does not correspond to the tectonic environment in any obvious way, nor do multiplets always coincide with the occurrence of large earthquakes. Future work will focus on clustering correlated pairs and working to reduce the bias introduced by non-uniform seismic station coverage and data availability. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
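
    The basic multiplet criterion is a maximum normalized cross-correlation above a threshold; a minimal sketch with synthetic waveforms (the threshold and data are illustrative):

        import numpy as np

        def max_norm_xcorr(a, b):
            """Maximum of the normalized cross-correlation over all lags."""
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return np.correlate(a, b, mode="full").max()

        t = np.linspace(0, 1, 500)
        wave = np.exp(-5 * t) * np.sin(2 * np.pi * 8 * t)     # synthetic "event"
        rng = np.random.default_rng(0)
        repeat = np.roll(wave, 30) + 0.05 * rng.standard_normal(t.size)
        unrelated = rng.standard_normal(t.size)

        for name, w in [("repeat", repeat), ("unrelated", unrelated)]:
            cc = max_norm_xcorr(wave, w)
            print(name, f"cc={cc:.2f}", "multiplet" if cc > 0.8 else "-")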

  9. The Seismicity activity toward east of Bogotá D. C., Colombia

    NASA Astrophysics Data System (ADS)

    Chicangana, G.; Vargas, C. A.; Gomez-Capera, A.; Pedraza, P.; Mora-Paez, H.; Salcedo, E.; Caneva, A.

    2013-12-01

    On the eastern flank of the Eastern Cordillera, very close to the Bogotá D.C. metropolitan area, at least five earthquakes of magnitude 5.0 or higher have occurred in the last 450 years. These are confirmed by both historical and instrumental seismicity information. Among these earthquakes, the first in Colombian historical times occurred on March 16th, 1644 and was felt toward the south of Santa Fé de Bogotá. Then, on October 18th, 1743, an earthquake with a currently estimated probabilistic magnitude greater than 6.5 occurred, which left a lasting mark on this region owing to the economic slump and loss of lives that it caused. More recently, the Quetame earthquake (M = 5.9) occurred on May 24th, 2008, destroying the town of Quetame. This last earthquake was registered locally by the Colombian National Seismological Network (RSNC). In this study we analyzed this seismic activity using both historical chronicles with macroseismic estimation data and the instrumental record obtained mainly from the RSNC for the 1993-2012 period, in order to search for the seismogenic sources that produced this seismic activity. With these results we show the tectonic panorama of this region, indicating in this manner the faults that can potentially be seismically active. For this we have considered mainly geomorphologic features associated with fault activity, additionally corroborated with GPS velocity data from the GEORED project of the Colombian Geological Survey.

  10. Rippability Assessment of Weathered Sedimentary Rock Mass using Seismic Refraction Methods

    NASA Astrophysics Data System (ADS)

    Ismail, M. A. M.; Kumar, N. S.; Abidin, M. H. Z.; Madun, A.

    2018-04-01

    Rippability, or ease of excavation, in sedimentary rocks is a significant aspect of the preliminary work of any civil engineering project. A rippability assessment was performed in this study to select an available ripping machine to rip off earth materials using the seismic velocity chart provided by Caterpillar. The research area is located at the proposed construction site for the development of a water reservoir and related infrastructure in Kampus Pauh Putra, Universiti Malaysia Perlis. The research was aimed at obtaining the P-wave seismic velocity (Vp) using a seismic refraction method to produce a 2D tomography model. A 2D seismic model was used to delineate the layers in the velocity profile. The conventional geotechnical method of using a borehole was integrated with the seismic velocity method to provide appropriate correlation. The correlated data can be used to categorize machinery for excavation activities based on the available systematic analysis procedure to predict rock rippability. The seismic velocity profile obtained was used to interpret rock layers within the ranges labelled as rippable, marginal, and non-rippable. Based on the seismic velocity method, the site can be classified as ranging from loose sandstone to moderately weathered rock. Laboratory test results show that the site's rock material falls between low strength and high strength. Results suggest that Caterpillar's smallest ripper, namely the D8R, can successfully excavate the materials, based on the integration of results from the seismic velocity method and laboratory tests.
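
    Operationally, the chart lookup reduces to binning the measured Vp profile into rippability classes; the breakpoints below are placeholders in the spirit of the Caterpillar chart, and real thresholds must be read from the manufacturer's chart for the specific machine and rock type:

        import numpy as np

        def rippability(vp_ms):
            if vp_ms < 1500:       # assumed breakpoint: easily ripped
                return "rippable"
            if vp_ms < 2200:       # assumed breakpoint: marginal zone
                return "marginal"
            return "non-rippable"  # blasting or heavier equipment needed

        profile_vp = np.array([600, 1200, 1800, 2500, 3200])  # Vp with depth, m/s
        print([rippability(v) for v in profile_vp])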

  11. Broadband seismic noise attenuation versus depth at the Albuquerque Seismological Laboratory

    USGS Publications Warehouse

    Hutt, Charles R.; Ringler, Adam; Gee, Lind

    2017-01-01

    Seismic noise induced by atmospheric processes such as wind and pressure changes can be a major contributor to the background noise observed in many seismograph stations, especially those installed at or near the surface. Cultural noise such as vehicle traffic or nearby buildings with air handling equipment also contributes to seismic background noise. Such noise sources fundamentally limit our ability to resolve earthquake-generated signals. Many previous seismic noise versus depth studies focused separately on either high-frequency (>1 Hz) or low-frequency (<0.05 Hz) bands. In this study, we use modern high-quality broadband (BB) and very broadband (VBB) seismometers installed at depths ranging from 1.5 to 188 m at the Albuquerque Seismological Laboratory to evaluate noise attenuation as a function of depth over a broad range of frequencies (0.002–50 Hz). Many modern seismometer deployments use BB or VBB seismometers installed at various depths, depending on the application. These depths range from one-half meter or less in aftershock study deployments, to one or two meters in the Incorporated Research Institutions for Seismology Transportable Array (TA), to a few meters (shallow surface vaults) up to 100 m or more (boreholes) in the permanent observatories of the Global Seismographic Network (GSN). It is important for managers and planners of these and similar arrays and networks of seismograph stations to understand the attenuation of surface-generated noise versus depth so that they can achieve desired performance goals within their budgets as well as their frequency band of focus. The results of this study will assist in decisions regarding BB and VBB seismometer installation depths. In general, we find that greater installation depths are better and seismometer emplacement in hard rock is better than in soil. Attenuation for any given depth varies with frequency. More specifically, we find that the
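
    A crude first-order model of why burial helps: surface-generated noise carried by fundamental-mode Rayleigh waves decays with depth roughly on the scale of a wavelength, so the benefit grows with frequency. The exponential form and the velocity below are simplifying assumptions, not the study's measurements:

        import numpy as np

        c = 800.0                                     # Rayleigh phase velocity, m/s (assumed)
        depths = np.array([1.5, 10.0, 100.0, 188.0])  # depths spanned by the study
        for f in (0.05, 1.0, 10.0):                   # Hz
            k = 2 * np.pi * f / c
            atten_db = 20 * np.log10(np.exp(-k * depths))
            print(f, "Hz:", np.round(atten_db, 1), "dB")
        # Burial barely helps at 0.05 Hz but is worth tens of dB at 10 Hz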

  12. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Abstract Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes' theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
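
    As a concrete illustration of the calculation described above, the sketch below derives match weights from assumed m- and u-probabilities (the probabilities that a field agrees given a true match or a true non-match) and converts a pair's total weight into a posterior match probability via Bayes' theorem on the odds scale. All field names, probabilities, and the prior are hypothetical, not taken from the article.

      import math

      # Hypothetical agreement statistics for two linkage fields:
      # m = P(field agrees | true match), u = P(field agrees | non-match).
      fields = {"surname": (0.95, 0.01), "birth_year": (0.90, 0.05)}

      def field_weight(m, u, agrees):
          # Agreement weight log2(m/u); disagreement weight log2((1-m)/(1-u)).
          return math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))

      def posterior_match_probability(agreements, prior):
          # Total match weight = log2 likelihood ratio summed over fields.
          total = sum(field_weight(*fields[f], a) for f, a in agreements.items())
          # Bayes' theorem on the odds scale: posterior odds = prior odds * 2**total.
          odds = (prior / (1 - prior)) * 2 ** total
          return odds / (1 + odds)

      # Example pair: surname agrees, birth year disagrees; 1 in 1000 pairs match.
      print(posterior_match_probability({"surname": True, "birth_year": False}, 0.001))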

  13. Probabilistic Multi-Hazard Assessment of Dry Cask Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bencturk, Bora; Padgett, Jamie; Uddin, Rizwan

    systems the concrete shall not only provide shielding but also ensure stability of the upright canister, facilitate anchoring, allow ventilation, and provide physical protection against theft, severe weather, and natural (seismic) as well as man-made (blast) events. Given the need to remain functional for 40 years or even longer in the case of interim storage, the concrete overpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. As evidenced by deteriorating concrete bridges, visible degradation of dry storage systems has been reported, especially in highly corrosive maritime environments. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects; hence, creep and shrinkage analyses need to include the effects of chloride penetration, alkali-aggregate reaction, and corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research, perform a comprehensive assessment under combined earthquake loads and aging-induced deterioration, which will also provide data for the development and validation of the probabilistic framework.

  14. Violations of Gutenberg-Richter Relation in Anthropogenic Seismicity

    NASA Astrophysics Data System (ADS)

    Urban, Pawel; Lasocki, Stanislaw; Blascheck, Patrick; do Nascimento, Aderson Farias; Van Giang, Nguyen; Kwiatek, Grzegorz

    2016-05-01

    Anthropogenic seismicity (AS) is the undesired dynamic rockmass response to technological processes. AS environments are shallow; hence, their heterogeneities have an important impact on AS. Moreover, AS is controlled by complex and changeable technological factors. This complicated origin of AS explains why models used for tectonic seismicity may not be suitable for AS. We study here four cases of AS, statistically testing whether the magnitudes follow the Gutenberg-Richter relation. The considered cases include the data from Mponeng gold mine in South Africa, the data observed during stimulation of geothermal well Basel 1 in Switzerland, the data from the Acu water reservoir region in Brazil, and the data from the Song Tranh 2 hydropower plant region in Vietnam. The cases differ in inducing technologies, in the duration of the periods in which they were recorded, and in the ranges of magnitudes. In all four cases the observed frequency-magnitude distributions differ in a statistically significant way from the Gutenberg-Richter relation. Although in all cases the Gutenberg-Richter b value changed in time, this factor turns out not to be responsible for the discovered deviations from the exponential distribution model implied by the Gutenberg-Richter relation. Though the deviations from the Gutenberg-Richter law are not large, they substantially diminish the accuracy of the assessment of seismic hazard parameters. It is demonstrated that the use of non-parametric kernel estimators of magnitude distribution functions significantly improves the accuracy of hazard estimates; these estimators are therefore recommended for probabilistic analyses of seismic hazard caused by AS.
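
    The recommendation in the last sentence can be illustrated with a minimal sketch: fit the Gutenberg-Richter exponential model to a magnitude sample by maximum likelihood and compare it with a non-parametric Gaussian kernel density estimate. The catalog here is synthetic and the values illustrative; this is not the authors' test procedure.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(42)
      m_min = 0.5
      # Synthetic magnitudes: exponential above m_min with beta = b*ln(10), b = 1.
      mags = m_min + rng.exponential(scale=1.0 / np.log(10), size=2000)

      # Parametric Gutenberg-Richter fit: maximum-likelihood beta (Aki, 1965).
      beta_hat = 1.0 / (mags.mean() - m_min)

      # Non-parametric alternative: Gaussian kernel density estimate.
      kde = gaussian_kde(mags)

      m = np.linspace(m_min, mags.max(), 200)
      gr_pdf = beta_hat * np.exp(-beta_hat * (m - m_min))
      # Where the two curves diverge, the exponential model describes the data poorly.
      print(np.max(np.abs(gr_pdf - kde(m))))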

  15. Albuquerque Seismological Laboratory--50 years of global seismology

    USGS Publications Warehouse

    Hutt, C.R.; Peterson, Jon; Gee, Lind; Derr, John; Ringler, Adam; Wilson, David

    2011-01-01

    The U.S. Geological Survey Albuquerque Seismological Laboratory is about 15 miles southeast of Albuquerque on the Pueblo of Isleta, adjacent to Kirtland Air Force Base. The Albuquerque Seismological Laboratory supports the Global Seismographic Network Program and the Advanced National Seismic System through the installation, operation, and maintenance of seismic stations around the world and serves as the premier seismological instrumentation test facility for the U.S. Government.

  16. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 years. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. The resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best

  17. Frequency Distribution of Seismic Intensity in Japan between 1950 and 2009

    NASA Astrophysics Data System (ADS)

    Kato, M.; Kohayakawa, Y.

    2012-12-01

    JMA Seismic Intensity is an index of seismic ground motion that is frequently used and reported in the media. While it is always difficult to represent complex ground motion with one index, the fact that it is widely accepted in society makes the use of JMA Seismic Intensity preferable when seismologists communicate with the public and discuss hazard assessment and risk management. With the introduction of JMA Instrumental Intensity in 1996, the number of seismic intensity observation sites has substantially increased and the spatial coverage has improved vastly. Together with a long history of non-instrumental intensity records, the intensity data represent some aspects of the seismic ground motion in Japan. We investigate characteristics of seismic ground motion between 1950 and 2009 utilizing the JMA Seismic Intensity Database. Specifically, we are interested in the frequency distribution of intensity recordings. Observations of large intensity are rare compared to those of small intensity, and previous studies such as Ikegami [1961] demonstrated that the frequency distribution of observed intensity obeys an exponential law, which is equivalent to the Ishimoto-Iida law [Ishimoto & Iida, 1939]. Such behavior can be used to empirically construct probabilistic seismic hazard maps [e.g., Kawasumi, 1951]. For the recent instrumental intensity data as well as the pre-instrumental data, we are able to confirm that the Ishimoto-Iida law explains the observations. The exponent of the Ishimoto-Iida law, i.e., the slope of the exponential law in a semi-log plot, is approximately 0.5. At stations with long recordings, there is no apparent difference between pre-instrumental and instrumental intensities when the Ishimoto-Iida law is used as a measure. The average number of intensity reports per year and the exponent of the frequency distribution vary regionally, and local seismicity is apparently the controlling factor. The observed number of large-intensity reports is slightly less than extrapolated and
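
    The exponential (Ishimoto-Iida) law quoted above can be checked with a few lines: if n(I) is proportional to 10^(-kI), then log10 of the counts is linear in intensity and the slope gives the exponent k. The counts below are made-up numbers chosen to sit near the k of roughly 0.5 reported in the abstract.

      import numpy as np

      # Hypothetical counts of reports per JMA intensity class at one station.
      intensity = np.array([1, 2, 3, 4, 5])
      counts = np.array([3200, 1050, 310, 110, 30])

      # Semi-log fit: log10(n) = a - k*I, so the exponent is minus the slope.
      slope, intercept = np.polyfit(intensity, np.log10(counts), 1)
      print(f"Ishimoto-Iida exponent k = {-slope:.2f}")  # close to 0.5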

  18. Seismic waves in 3-D: from mantle asymmetries to reliable seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Panza, Giuliano F.; Romanelli, Fabio

    2014-10-01

    A global cross-section of the Earth parallel to the tectonic equator (TE) path, the great circle representing the equator of net lithosphere rotation, shows a difference in shear wave velocities between the western and eastern flanks of the three major oceanic rift basins. The low-velocity layer in the upper asthenosphere, at a depth range of 120 to 200 km, is assumed to represent the decoupling between the lithosphere and the underlying mantle. Along the TE-perturbed (TE-pert) path, a ubiquitous low-velocity zone (LVZ), about 1,000 km wide and 100 km thick, occurs in the asthenosphere. The existence of the TE-pert is a necessary prerequisite for the existence of a continuous global flow within the Earth. Ground-shaking scenarios were constructed using the neo-deterministic, scenario-based method for seismic hazard analysis (NDSHA), using realistic and duly validated synthetic time series, and generating a data bank of several thousands of seismograms that account for source, propagation, and site effects. In accordance with basic self-organized criticality concepts, NDSHA permits the integration of available information provided by the most updated seismological, geological, geophysical, and geotechnical databases for the site of interest, as well as advanced physical modeling techniques, to provide a reliable and robust background for the development of a design basis for cultural heritage and civil infrastructures. Estimates of seismic hazard obtained using the NDSHA and standard probabilistic approaches are compared for the Italian territory, and a case study is discussed. In order to enable a reliable estimation of the ground motion response to an earthquake, three-dimensional velocity models have to be considered, resulting in a new, very efficient, analytical procedure for computing the broadband seismic wave-field in a 3-D anelastic Earth model.

  19. Newberry Seismic Deployment Fieldwork Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Templeton, D C

    2012-03-21

    This report summarizes the seismic deployment of Lawrence Livermore National Laboratory (LLNL) Geotech GS-13 short-period seismometers at the Newberry Enhanced Geothermal System (EGS) Demonstration site located in Central Oregon. This Department of Energy (DOE) demonstration project is managed by AltaRock Energy Inc. AltaRock Energy had previously deployed Geospace GS-11D geophones at the Newberry EGS Demonstration site; however, the quality of the seismic data was somewhat low. The purpose of the LLNL deployment was to install more sensitive sensors that would record higher-quality seismic data for use in future seismic studies, such as ambient noise correlation, matched field processing earthquake detection studies, and general EGS microearthquake studies. For the LLNL deployment, seven three-component seismic stations were installed around the proposed AltaRock Energy stimulation well. The LLNL seismic sensors were connected to AltaRock Energy Güralp CMG-DM24 digitizers, which are powered by AltaRock Energy solar panels and batteries. The deployment took four days in two phases. In phase I, the sites were identified, a cavity approximately 3 feet deep was dug, and a flat concrete pad oriented to true north was made at each site. In phase II, we installed three single-component GS-13 seismometers at each site, quality controlled the data to ensure that each station was recording data properly, and filled in each cavity with native soil.

  20. The use of belief-based probabilistic methods in volcanology: Scientists' views and implications for risk assessments

    NASA Astrophysics Data System (ADS)

    Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

    2012-12-01

    This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and deterministic approaches to seismic hazard analysis.

  1. Challenges in making a seismic hazard map for Alaska and the Aleutians

    USGS Publications Warehouse

    Wesson, R.L.; Boyd, O.S.; Mueller, C.S.; Frankel, A.D.; Freymueller, J.T.

    2008-01-01

    We present a summary of the data and analyses leading to the revision of the time-independent probabilistic seismic hazard maps of Alaska and the Aleutians. These maps represent a revision of existing maps based on newly obtained data, and reflect best current judgments about methodology and approach. They have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States, and will be proposed for adoption in future revisions to the International Building Code. We present example maps for peak ground acceleration, 0.2 s spectral amplitude (SA), and 1.0 s SA at a probability level of 2% in 50 years (annual probability of 0.000404). In this summary, we emphasize issues encountered in preparation of the maps that motivate or require future investigation and research.
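
    The quoted annual probability follows from the usual Poisson assumption: if the probability of exceedance over T years is P, the equivalent annual rate is -ln(1 - P)/T. A two-line check reproduces the number in the abstract.

      import math

      # Poisson model: P = 1 - exp(-rate * T)  =>  rate = -ln(1 - P) / T.
      P, T = 0.02, 50.0
      rate = -math.log(1.0 - P) / T
      print(f"annual probability = {rate:.6f}")        # 0.000404
      print(f"return period = {1.0 / rate:.0f} years")  # about 2475 years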

  2. Advancing internal erosion monitoring using seismic methods in field and laboratory studies

    NASA Astrophysics Data System (ADS)

    Parekh, Minal L.

    embankment surface. Analysis of root-mean-squared amplitude and AE threshold counts indicated activity focused at the toe in locations matching the sand boils. This analysis also compared the various detection methods employed in the 2012 test, establishing a timeline of detection relative to observable behaviors of the structure. The second area of research included designing and fabricating an instrumented laboratory apparatus for investigating active seismic wave propagation through soil samples. This dissertation includes a description of the rigid wall permeameter, instrumentation, control, and acquisition systems, along with descriptions of the custom-fabricated seismic sensors. A series of experiments (saturated sand, saturated sand with a known static anomaly placed near the center of the sample, and saturated sand with a diminishing anomaly near the center of the sample) indicated that shear wave velocity changes reflected changes in the state of stress of the soil. The mean effective stress was influenced by the applied vertical axial load, the frictional interaction between the soil and the permeameter wall, and the degree of preloading. The frictional resistance was sizeable at the sidewall of the permeameter and decreased the mean effective stress with depth. This study also included flow tests to monitor changes in shear wave velocities as the internal erosion process started and developed. Shear wave velocity decreased at voids or lower-density zones in the sample and increased as arching redistributed loads, though the two conditions compete. Finally, the social and political contexts surrounding nondestructive inspection were considered. An analogous approach utilized by the aerospace industry was introduced: a case study comparing the path toward adopting nondestructive tools as standard practices in monitoring aircraft safety. Additional lessons for dam and levee safety management were discussed from a Science, Technology, Engineering, and Policy (STEP
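
    The two passive monitoring metrics named at the start of this abstract, root-mean-squared (RMS) amplitude and acoustic emission (AE) threshold counts, are standard windowed statistics. The sketch below computes both on a synthetic trace; it is a generic illustration, not the dissertation's processing code.

      import numpy as np

      def rms_amplitude(signal, win):
          # Windowed root-mean-squared amplitude of a trace.
          n = len(signal) // win
          x = signal[: n * win].reshape(n, win)
          return np.sqrt(np.mean(x ** 2, axis=1))

      def threshold_counts(signal, win, threshold):
          # Samples per window exceeding a fixed amplitude threshold,
          # a common acoustic-emission activity measure.
          n = len(signal) // win
          x = np.abs(signal[: n * win]).reshape(n, win)
          return np.sum(x > threshold, axis=1)

      rng = np.random.default_rng(0)
      trace = rng.normal(0.0, 1.0, 10_000)
      trace[6000:6200] += rng.normal(0.0, 5.0, 200)  # synthetic burst of activity
      print(rms_amplitude(trace, 1000))        # elevated RMS in the burst window
      print(threshold_counts(trace, 1000, 3.0))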

  3. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention; however, epistemic location uncertainty has so far not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
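
    The core of the framework, as described here, is a Monte-Carlo treatment in which risk items with unknown coordinates are repeatedly sampled within an admissible region and the portfolio loss is recomputed per sample. The toy sketch below uses a made-up ground-motion field, a linear toy vulnerability, and a bounding box in place of a real admissible region (e.g., a postal-code polygon); it only illustrates the loss variability the paper analyzes.

      import numpy as np

      rng = np.random.default_rng(1)

      def hazard_intensity(xy):
          # Hypothetical shaking field decaying with distance from a source at the origin.
          return np.exp(-np.linalg.norm(xy, axis=-1) / 50.0)

      def portfolio_loss(locations, values):
          return np.sum(values * hazard_intensity(locations))

      values = np.full(100, 1.0e6)                  # 100 risk items of equal value
      known = rng.uniform(-100, 100, size=(60, 2))  # 60 items with exact coordinates
      losses = []
      for _ in range(1000):
          # Items with unknown coordinates: sample uniformly within the region.
          unknown = rng.uniform(-100, 100, size=(40, 2))
          losses.append(portfolio_loss(np.vstack([known, unknown]), values))
      losses = np.array(losses)
      print(losses.mean(), losses.std())  # variability due to location uncertainty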

  4. Delineation of tectonic provinces of New York state as a component of seismic-hazard evaluation

    USGS Publications Warehouse

    Fakundiny, R.H.

    2004-01-01

    Seismic-hazard evaluations in the eastern United States must be based on interpretations of the composition and form of Proterozoic basement-rock terranes and overlying Paleozoic strata, and on factors that can cause relative movements among their units, rather than Phanerozoic orogenic structures, which may be independent of modern tectonics. The tectonic-province concept is a major part of both probabilistic and deterministic seismic-hazard evaluations, yet the provinces that have been proposed to date have not attempted to geographically correlate modern earthquakes with regional basement structure. Comparison of basement terrane (megablock) boundaries with the spatial pattern of modern seismicity may lead to a mechanically sound definition of tectonic provinces and, thus, better seismic-hazard evaluation capability than is currently available. Delineation of megablock boundaries will require research on the many factors that affect their structure and movement. This paper discusses and groups these factors into two broad categories: megablock tectonics in relation to seismicity, and regional horizontal-compressive stresses. Megablock tectonics is divided into subcategories of basement, overlying strata, regional lineaments, basement tectonic terranes, earthquake epicenter distribution, and epeirogeny; compressive stresses are divided into pop-ups and the contemporary maximum horizontal-compressive stress field. A list presenting four to nine proposed research topics for each of these categories is given at the end.

  5. Seismic isolation of buildings using composite foundations based on metamaterials

    NASA Astrophysics Data System (ADS)

    Casablanca, O.; Ventura, G.; Garescı, F.; Azzerboni, B.; Chiaia, B.; Chiappini, M.; Finocchio, G.

    2018-05-01

    Metamaterials can be engineered to interact with waves in entirely new ways, finding application on the nanoscale in various fields such as optics and acoustics. In addition, acoustic metamaterials can be used in large-scale experiments for filtering and manipulating seismic waves (seismic metamaterials). Here, we propose seismic isolation based on a device that combines some properties of seismic metamaterials (e.g., periodic mass-in-mass systems) with those of a standard foundation positioned right below the building for isolation purposes. The concepts on which this solution is based are local resonance and a dual-stiffness structure that preserves large rigidity for compression and small rigidity for shear. In other words, this paper introduces a different approach to seismic isolation by using certain principles of seismic metamaterials. The experimental demonstrator tested on the laboratory scale exhibits a spectral bandgap that begins at 4.5 Hz. Within the bandgap, it filters more than 50% of the seismic energy via an internal dissipation process. Our results open a path toward the seismic resilience of buildings and critical infrastructure to shear seismic waves, achieving higher efficiency compared to traditional seismic insulators and passive energy-dissipation systems.
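
    For orientation only: in mass-in-mass lattices of this kind, the bandgap opens near the internal resonator frequency f0 = (1/2π)√(k/m). The stiffness and mass below are invented values chosen so that f0 lands near the 4.5 Hz reported for the demonstrator; they are not the device's parameters.

      import math

      # Local resonance of a mass-in-mass unit cell (illustrative values only).
      m_inner = 10.0    # kg, internal resonator mass (assumed)
      k_inner = 8000.0  # N/m, internal spring stiffness (assumed)
      f0 = math.sqrt(k_inner / m_inner) / (2 * math.pi)
      print(f"bandgap opens near {f0:.1f} Hz")  # ~4.5 Hz with these numbers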

  6. A Seismic Source Model for Central Europe and Italy

    NASA Astrophysics Data System (ADS)

    Nyst, M.; Williams, C.; Onur, T.

    2006-12-01

    We present a seismic source model for Central Europe (Belgium, Germany, Switzerland, and Austria) and Italy, as part of an overall seismic risk and loss modeling project for this region. A separate presentation at this conference discusses the probabilistic seismic hazard and risk assessment (Williams et al., 2006). Where available, we adopt regional consensus models and adjust these to fit our format; otherwise, we develop our own model. Our seismic source model covers the whole region under consideration and consists of the following components: 1. A subduction zone environment in Calabria, SE Italy, with interface events between the Eurasian and African plates and intraslab events within the subducting slab. The subduction zone interface is parameterized as a set of dipping area sources that follow the geometry of the surface of the subducting plate, whereas intraslab events are modeled as plane sources at depth; 2. The main normal faults in the upper crust along the Apennines mountain range, in Calabria and Central Italy. Dipping faults and (sub-)vertical faults are parameterized as dipping plane and line sources, respectively; 3. The Upper and Lower Rhine Graben regime that runs from northern Italy into eastern Belgium, parameterized as a combination of dipping plane and line sources; and finally 4. Background seismicity, parameterized as area sources. The fault model is based on slip rates using characteristic recurrence. The modeling of background and subduction zone seismicity is based on a compilation of several national and regional historic seismic catalogs using a Gutenberg-Richter recurrence model. Merging the catalogs involves the removal of duplicate, spurious, and very old events and the application of a declustering algorithm (Reasenberg, 2000). The resulting catalog contains a little over 6000 events, has an average b-value of -0.9, is complete for moment magnitudes 4.5 and larger, and is used to compute a gridded a-value model (smoothed historical
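
    The b-value mentioned for the merged catalog is conventionally estimated by maximum likelihood. A minimal sketch of Aki's (1965) estimator, applied to a synthetic catalog complete above moment magnitude 4.5, is given below; the numbers are illustrative, not the paper's.

      import numpy as np

      def b_value_mle(mags, m_c):
          # Aki (1965): b = log10(e) / (mean(M) - Mc) for magnitudes M >= Mc.
          m = np.asarray(mags)
          m = m[m >= m_c]
          return np.log10(np.e) / (m.mean() - m_c)

      # Synthetic catalog with a true b of 0.9, complete above Mw 4.5.
      rng = np.random.default_rng(7)
      catalog = 4.5 + rng.exponential(scale=np.log10(np.e) / 0.9, size=6000)
      print(f"b = {b_value_mle(catalog, 4.5):.2f}")  # recovers approximately 0.9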

  7. From Geodetic Imaging of Seismic and Aseismic Fault Slip to Dynamic Modeling of the Seismic Cycle

    NASA Astrophysics Data System (ADS)

    Avouac, Jean-Philippe

    2015-05-01

    Understanding the partitioning of seismic and aseismic fault slip is central to seismotectonics as it ultimately determines the seismic potential of faults. Thanks to advances in tectonic geodesy, it is now possible to develop kinematic models of the spatiotemporal evolution of slip over the seismic cycle and to determine the budget of seismic and aseismic slip. Studies of subduction zones and continental faults have shown that aseismic creep is common and sometimes prevalent within the seismogenic depth range. Interseismic coupling is generally observed to be spatially heterogeneous, defining locked patches of stress accumulation, to be released in future earthquakes or aseismic transients, surrounded by creeping areas. Clay-rich tectonites, high temperature, and elevated pore-fluid pressure seem to be key factors promoting aseismic creep. The generally logarithmic time evolution of afterslip is a distinctive feature of creeping faults that suggests a logarithmic dependency of fault friction on slip rate, as observed in laboratory friction experiments. Most faults can be considered to be paved with interlaced patches where the friction law is either rate-strengthening, inhibiting seismic rupture propagation, or rate-weakening, allowing for earthquake nucleation. The rate-weakening patches act as asperities on which stress builds up in the interseismic period; they might rupture collectively in a variety of ways. The pattern of interseismic coupling can help constrain the return period of the maximum-magnitude earthquake based on the requirement that seismic and aseismic slip sum to match long-term slip. Dynamic models of the seismic cycle based on this conceptual model can be tuned to reproduce geodetic and seismological observations. The promise and pitfalls of using such models to assess seismic hazard are discussed.
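
    The closing requirement that seismic and aseismic slip sum to the long-term rate amounts to a moment budget, which can be sketched in a few lines. Every number below (rigidity, fault area, slip rate, coupling, maximum magnitude) is an assumed illustration, not a value from the paper.

      import math

      mu = 3.0e10          # Pa, crustal rigidity (assumed)
      area = 200e3 * 50e3  # m^2, locked fault area, 200 km x 50 km (assumed)
      slip_rate = 0.04     # m/yr, long-term slip rate (assumed)
      coupling = 0.6       # interseismic coupling fraction from geodesy (assumed)

      def mw_to_m0(mw):
          # Hanks & Kanamori (1979) moment-magnitude relation, M0 in N*m.
          return 10 ** (1.5 * mw + 9.05)

      accumulation_rate = coupling * mu * area * slip_rate  # N*m per year
      mw_max = 8.5
      print(f"return period of Mw {mw_max}: "
            f"{mw_to_m0(mw_max) / accumulation_rate:.0f} years")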

  8. Overview of Seismic Noise and its Relevance to Personnel Detection

    DTIC Science & Technology

    2008-04-01

    production sites. Young et al. (1996) measured seismic noise with seismometers at the surface and within boreholes at three sites, and generated... ERDC/CRREL TR-08-5, Overview of Seismic Noise and its Relevance to Personnel Detection, Lindamae Peck, Cold Regions Research and Engineering Laboratory, April 2008.

  9. Seismic Sources and Recurrence Rates as Adopted by USGS Staff for the Production of the 1982 and 1990 Probabilistic Ground Motion Maps for Alaska and the Conterminous United States

    USGS Publications Warehouse

    Hanson, Stanley L.; Perkins, David M.

    1995-01-01

    The construction of a probabilistic ground-motion hazard map for a region follows a sequence of analyses beginning with the selection of an earthquake catalog and ending with the mapping of calculated probabilistic ground-motion values (Hanson and others, 1992). An integral part of this process is the creation of sources used for the calculation of earthquake recurrence rates and ground motions. These sources consist of areas and lines that are representative of geologic or tectonic features and faults. After the design of the sources, it is necessary to arrange the coordinate points in a particular order compatible with the input format for the SEISRISK-III program (Bender and Perkins, 1987). Source zones are usually modeled as a point-rupture source. Where applicable, linear rupture sources are modeled with articulated lines, representing known faults, or a field of parallel lines, representing a generalized distribution of hypothetical faults. Based on the distribution of earthquakes throughout the individual source zones (or a collection of several sources), earthquake recurrence rates are computed for each of the sources, and minimum and maximum magnitudes are assigned. From 1978 to 1980, several conferences were held by the USGS to solicit information on regions of the United States for the purpose of creating source zones for computation of probabilistic ground motions (Thenhaus, 1983). As a result of these regional meetings and previous work in the Pacific Northwest (Perkins and others, 1980), the California continental shelf (Thenhaus and others, 1980), and the Eastern outer continental shelf (Perkins and others, 1979), a consensus set of source zones was agreed upon and subsequently used to produce a national ground motion hazard map for the United States (Algermissen and others, 1982). In this report and on the accompanying disk we provide a complete list of source areas and line sources as used for the 1982 and later 1990 seismic

  10. Mobile seismic exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dräbenstedt, A., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de; Seyfried, V.; Cao, X.

    2016-06-28

    Laser-Doppler-Vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm. High-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far, because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions in the range below 1 nm/s/√Hz. Thermal displacements and air turbulence have a critical influence on LDV measurements in this low-frequency range, leading to noise levels of several 100 nm/√Hz. Seismic waves are commonly measured with highly sensitive inertial sensors (geophones or Micro Electro-Mechanical Sensors (MEMS)). The development of a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV assembled on the optical table impinges on the ground below the car through the hole. A reference geophone detected residual vibrations on the table. We present the results from the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV as a laser geophone.

  11. Electric resistivity and seismic refraction tomography: a challenging joint underwater survey at Äspö Hard Rock Laboratory

    NASA Astrophysics Data System (ADS)

    Ronczka, Mathias; Hellman, Kristofer; Günther, Thomas; Wisén, Roger; Dahlin, Torleif

    2017-06-01

    Tunnelling below water passages is a challenging task in terms of planning, pre-investigation and construction. Fracture zones in the underlying bedrock lead to low rock quality and thus reduced stability. For natural reasons, they tend to be more frequent at water passages. Ground investigations that provide information on the subsurface are necessary prior to the construction phase, but these can be logistically difficult. Geophysics can help close the gaps between local point information by producing subsurface images. An approach that combines seismic refraction tomography and electrical resistivity tomography (ERT) has been tested at the Äspö Hard Rock Laboratory (HRL). The aim was to detect fracture zones in a well-known but, from a measuring perspective, logistically challenging area. The presented surveys cover a water passage along part of a tunnel that connects surface facilities with an underground test laboratory. The tunnel is approximately 100 m below and 20 m east of the survey line and gives evidence for one major and several minor fracture zones. The geological and general test site conditions, e.g., strong power-line noise from the nearby nuclear power plant, are challenging for geophysical measurements. Co-located seismic and ERT sensor and source positions are used on the 450 m underwater section of the 700 m profile. Because of a large transition zone that appeared in the ERT result and the missing coverage of the seismic data, fracture zones at the southern and northern parts of the underwater passage cannot be detected by separate inversions. Synthetic studies show that significant three-dimensional (3-D) artefacts occur in the ERT model that even exceed the positioning errors of underwater electrodes. The model coverage is closely connected to the resolution and can be used to display the model uncertainty by introducing thresholds to fade out regions of medium and low resolution. A structural coupling cooperative inversion

  12. Absolute earthquake locations using 3-D versus 1-D velocity models below a local seismic network: example from the Pyrenees

    NASA Astrophysics Data System (ADS)

    Theunissen, T.; Chevrot, S.; Sylvander, M.; Monteiller, V.; Calvet, M.; Villaseñor, A.; Benahmed, S.; Pauchet, H.; Grimaud, F.

    2018-03-01

    Local seismic networks are usually designed so that earthquakes are located inside them (primary azimuthal gap << 180°) and close to the seismic stations (0-100 km). With these local or near-regional networks (0°-5°), many seismological observatories still routinely locate earthquakes using 1-D velocity models. Moving towards 3-D location algorithms requires robust 3-D velocity models. This work takes advantage of seismic monitoring spanning more than 30 yr in the Pyrenean region. We investigate the influence of a well-designed 3-D model, with station corrections including basin structures and the geometry of the Mohorovicic discontinuity, on earthquake locations. In the most favourable cases (GAP < 180° and distance to the first station less than 15 km), results using 1-D velocity models are very similar to 3-D results. The horizontal accuracy in the 1-D case can be higher than in the 3-D case if lateral variations in the structure are not properly resolved. Depth is systematically better resolved in the 3-D model, even on the boundaries of the seismic network (GAP > 180° and distance to the first station greater than 15 km). Errors on velocity models and the accuracy of absolute earthquake locations are assessed based on a reference data set made of active seismic sources, quarry blasts and passive temporary experiments. Solutions and uncertainties are estimated using the probabilistic approach of the NonLinLoc (NLLoc) software based on Equal Differential Time. Some updates have been added to NLLoc to better focus on the final solution (outlier exclusion, multiscale grid search, S-phase weighting). Errors in the probabilistic approach are defined to take into account errors on velocity models and on arrival times. The seismicity in the final 3-D catalogue is located with a horizontal uncertainty of about 2.0 ± 1.9 km and a vertical uncertainty of about 3.0 ± 2.0 km.
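
    The Equal Differential Time (EDT) criterion used by NLLoc can be illustrated with a toy grid search in a homogeneous medium: differences of residuals between station pairs are independent of origin time, so the hypocenter can be scanned without solving for it. NLLoc itself works with 3-D travel-time grids and a full posterior PDF; everything below (velocity, geometry) is invented for the illustration.

      import numpy as np

      v_p = 6.0  # km/s, assumed uniform P velocity
      stations = np.array([[0, 0, 0], [30, 5, 0], [10, 40, 0], [-20, 25, 0]], float)
      true_src = np.array([8.0, 12.0, 10.0])
      t_obs = np.linalg.norm(stations - true_src, axis=1) / v_p  # noiseless picks

      def edt_misfit(src):
          t_calc = np.linalg.norm(stations - src, axis=1) / v_p
          r = t_obs - t_calc          # residuals; origin time cancels in differences
          i, j = np.triu_indices(len(r), k=1)
          return np.sum((r[i] - r[j]) ** 2)

      # Coarse grid search over candidate hypocenters (2 km spacing).
      xs = ys = np.arange(-30.0, 41.0, 2.0)
      zs = np.arange(0.0, 31.0, 2.0)
      grid = np.array([[x, y, z] for x in xs for y in ys for z in zs])
      best = grid[np.argmin([edt_misfit(s) for s in grid])]
      print(best)  # recovers approximately [8, 12, 10]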

  13. Two types of seismicity accompanying hydraulic fracturing in Harrison County, Ohio - implications for seismic hazard and seismogenic mechanism

    NASA Astrophysics Data System (ADS)

    Kozlowska, M.; Brudzinski, M.; Friberg, P. A.; Skoumal, R.; Baxter, N. D.; Currie, B.

    2017-12-01

    While induced seismicity in the United States has mainly been attributed to wastewater disposal, eastern Ohio has provided cases of seismicity induced by both hydraulic fracturing (HF) and wastewater disposal. In this study, we investigate five cases of seismicity associated with HF in Harrison County, OH. Because of their temporal and spatial isolation from other injection activities, these cases provide an ideal setting for studying the relationships between high-pressure injection and earthquakes. Our analysis reveals two distinct groups of seismicity. Deeper earthquakes occur in the Precambrian crystalline basement, reach larger magnitudes (M>2), have lower b-values (<1), and continue for weeks following stimulation shutdown. Shallower earthquakes, on the other hand, occur in Paleozoic sedimentary rocks 400 m below HF, are limited to smaller magnitudes (M<1), have higher b-values (>1.5), and lack post-stimulation activity. We seek the physical explanation of the observed difference in earthquake character and hypothesize that the maturity of faults is the main factor determining sequence b-values. Based on published results of laboratory experiments and fault modeling, we interpret the deep seismicity as slip on more mature faults in the older crystalline rocks and the shallow seismicity as slip on immature faults in the younger, lower-viscosity sedimentary rocks. This suggests that HF inducing seismicity on deeper, more mature faults poses a higher seismic hazard. The analysis of water and gas production data from these wells suggests that wells inducing deeper seismicity produced more water than wells with shallow seismicity. This indicates more extensive hydrologic connections outside the target reservoir, which may explain why gas production drops more quickly for wells with deeper seismicity. Despite these indications that hydraulic pressure fluctuations induce seismicity, we also find only 2-3 hours between onset of stimulation of HF wells and seismicity that is

  14. Investigation of Nonlinear Site Response and Seismic Compression from Case History Analysis and Laboratory Testing

    NASA Astrophysics Data System (ADS)

    Yee, Eric

    In this thesis I address a series of issues related to ground failure and ground motions during earthquakes. A major component is the evaluation of cyclic volumetric strain behavior of unsaturated soils, more commonly known as seismic compression, from advanced laboratory testing. Another major component is the application of nonlinear and equivalent-linear ground response analyses to large-strain problems involving highly nonlinear dynamic soil behavior. These two components are merged in the analysis of a truly unique and crucial field case history of nonlinear site response and seismic compression. My first topic concerns dynamic soil testing for relatively small-strain dynamic soil properties such as the volumetric threshold strain, γtv. Such testing is often conducted using specialized devices such as dual-specimen simple shear, as devices configured for large-strain testing produce noisy signals in the small-strain range. Working with a simple shear device originally developed for large-strain testing, I extend its low-strain capabilities by characterizing noisy signals and utilizing several statistical methods to extract meaningful responses in the small-strain range. I utilize linear regression of a transformed variable to estimate the cyclic shear strain from a noisy signal and the confidence interval on its amplitude. I utilize kernel regression with the Nadaraya-Watson estimator and a Gaussian kernel to evaluate the vertical strain response. A practical utilization of these techniques is illustrated by evaluating threshold shear strains for volume change with a procedure that takes into account uncertainties in the measured shear and vertical strains. My second topic concerns the seismic compression characteristics of non-plastic and low-plasticity silty sands with varying fines content (10 ≤ FC ≤ 60%). Simple shear testing was performed on various sand-fines mixtures at a range of modified Proctor relative compaction levels (RC) and degrees-of-saturation (S
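
    The Nadaraya-Watson estimator named in this abstract is a locally weighted mean with kernel weights. A generic sketch on synthetic data follows; it is not the thesis code, and the signal model is invented.

      import numpy as np

      def nadaraya_watson(x_query, x, y, bandwidth):
          # Gaussian-kernel locally weighted mean: weights decay with distance
          # between the query points and the observed x values.
          d = (x_query[:, None] - x[None, :]) / bandwidth
          w = np.exp(-0.5 * d ** 2)
          return (w @ y) / w.sum(axis=1)

      # Synthetic noisy vertical-strain response versus loading cycle.
      rng = np.random.default_rng(3)
      cycles = np.linspace(0.0, 15.0, 300)
      strain = 0.2 * np.log1p(cycles) + rng.normal(0.0, 0.05, cycles.size)
      smooth = nadaraya_watson(np.linspace(0.0, 15.0, 50), cycles, strain, 1.0)
      print(smooth[:5])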

  15. Probabilistic seismic hazard in the San Francisco Bay area based on a simplified viscoelastic cycle model of fault interactions

    USGS Publications Warehouse

    Pollitz, F.F.; Schwartz, D.P.

    2008-01-01

    We construct a viscoelastic cycle model of plate boundary deformation that includes the effect of time-dependent interseismic strain accumulation, coseismic strain release, and viscoelastic relaxation of the substrate beneath the seismogenic crust. For a given fault system, time-averaged stress changes at any point (not on a fault) are constrained to zero; that is, kinematic consistency is enforced for the fault system. The dates of last rupture, mean recurrence times, and the slip distributions of the (assumed) repeating ruptures are key inputs into the viscoelastic cycle model. This simple formulation allows construction of stress evolution at all points in the plate boundary zone for purposes of probabilistic seismic hazard analysis (PSHA). Stress evolution is combined with a Coulomb failure stress threshold at representative points on the fault segments to estimate the times of their respective future ruptures. In our PSHA we consider uncertainties in a four-dimensional parameter space: the rupture periodicities, slip distributions, time of last earthquake (for prehistoric ruptures), and Coulomb failure stress thresholds. We apply this methodology to the San Francisco Bay region using a recently determined fault chronology of area faults. Assuming single-segment rupture scenarios, we find that future rupture probabilities of area faults in the coming decades are the highest for the southern Hayward, Rodgers Creek, and northern Calaveras faults. This conclusion is qualitatively similar to that of the Working Group on California Earthquake Probabilities, but the probabilities derived here are significantly higher. Given that fault rupture probabilities are highly model-dependent, no single model should be used to assess time-dependent rupture probabilities. We suggest that several models, including the present one, be used in a comprehensive PSHA methodology, as was done by the Working Group on California Earthquake Probabilities.

  16. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks which are generally omitted from hazards assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates make them difficult to forecast even on time-scales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations and changes in rate can be detected by applying change point analysis in ETAS transformed time with methods already developed for Poisson processes.
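
    For readers unfamiliar with ETAS, the conditional intensity that underlies the simulations mentioned above is simple to write down: a background rate plus magnitude-scaled, Omori-decaying contributions from all prior events. The sketch below evaluates it for a tiny synthetic catalog with invented parameter values.

      import numpy as np

      def etas_rate(t, event_times, event_mags, mu, K, alpha, c, p, m_c):
          # ETAS conditional intensity (Ogata, 1988): background rate mu plus
          # aftershock terms scaled by exp(alpha*(M - Mc)) and decaying with
          # time after each event following the modified Omori law.
          prior = event_times < t
          dt = t - event_times[prior]
          trig = K * np.exp(alpha * (event_mags[prior] - m_c)) / (dt + c) ** p
          return mu + trig.sum()

      times = np.array([0.0, 1.2, 1.3, 4.5])  # days (synthetic catalog)
      mags = np.array([4.0, 3.0, 3.2, 3.5])
      for t in (2.0, 5.0, 30.0):
          r = etas_rate(t, times, mags, mu=0.1, K=0.02,
                        alpha=1.5, c=0.01, p=1.1, m_c=2.5)
          print(f"rate at t = {t:5.1f} d: {r:.3f} events/day")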

  17. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
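
    The probability-of-failure computation at the heart of such a program can be sketched with a toy limit state g = capacity - demand and two estimators: plain Monte Carlo and a fixed importance-sampling density shifted toward the failure region. This illustrates the underlying idea only; it is not NESSUS's advanced mean value or adaptive importance sampling algorithm, and all distributions are invented.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(11)
      mu_c, sd_c, mu_d, sd_d = 10.0, 1.0, 6.0, 1.0  # capacity and demand (assumed)
      n = 200_000

      # Plain Monte Carlo estimate of P(g < 0).
      g = rng.normal(mu_c, sd_c, n) - rng.normal(mu_d, sd_d, n)
      print("plain MC   :", np.mean(g < 0))

      # Importance sampling: draw demand from a density shifted by +3 and
      # reweight each sample by the ratio of target to sampling density.
      d_samp = rng.normal(mu_d + 3.0, sd_d, n)
      c_samp = rng.normal(mu_c, sd_c, n)
      w = norm.pdf(d_samp, mu_d, sd_d) / norm.pdf(d_samp, mu_d + 3.0, sd_d)
      print("importance :", np.mean((c_samp - d_samp < 0) * w))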

  18. Processing Approaches for DAS-Enabled Continuous Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Dou, S.; Wood, T.; Freifeld, B. M.; Robertson, M.; McDonald, S.; Pevzner, R.; Lindsey, N.; Gelvin, A.; Saari, S.; Morales, A.; Ekblaw, I.; Wagner, A. M.; Ulrich, C.; Daley, T. M.; Ajo Franklin, J. B.

    2017-12-01

    Distributed Acoustic Sensing (DAS) is creating a "field as laboratory" capability for seismic monitoring of subsurface changes. By providing unprecedented spatial and temporal sampling at a relatively low cost, DAS enables field-scale seismic monitoring to have durations and temporal resolutions that are comparable to those of laboratory experiments. Here we report on seismic processing approaches developed during data analyses of three case studies, all using DAS-enabled seismic monitoring with applications ranging from shallow permafrost to deep reservoirs: (1) 10-hour downhole monitoring of cement curing at Otway, Australia; (2) 2-month surface monitoring of controlled permafrost thaw at Fairbanks, Alaska; (3) multi-month downhole and surface monitoring of carbon sequestration at Decatur, Illinois. We emphasize the data management and processing components relevant to DAS-based seismic monitoring, which include scalable approaches to data management, pre-processing, denoising, filtering, and wavefield decomposition. DAS has dramatically increased the data volume to the extent that terabyte-per-day data loads are now typical, straining conventional approaches to data storage and processing. To achieve more efficient use of disk space and network bandwidth, we explore improved file structures and data compression schemes. Because the noise floor of DAS measurements is higher than that of conventional sensors, optimal processing workflows involving advanced denoising, deconvolution (of the source signatures), and stacking are being established to maximize the signal content of DAS data. The resulting workflow of data management and processing could accelerate the broader adoption of DAS for continuous monitoring of critical processes.

  19. Application-driven ground motion prediction equation for seismic hazard assessments in non-cratonic moderate-seismicity areas

    NASA Astrophysics Data System (ADS)

    Bindi, D.; Cotton, F.; Kotha, S. R.; Bosse, C.; Stromeyer, D.; Grünthal, G.

    2017-09-01

    We present a ground motion prediction equation (GMPE) for probabilistic seismic hazard assessments (PSHA) in low-to-moderate seismicity areas, such as Germany. Starting from the NGA-West2 flat-file (Ancheta et al. in Earthquake Spectra 30:989-1005, 2014), we develop a model tailored to the hazard application in terms of data selection and implemented functional form. In light of such hazard application, the GMPE is derived for hypocentral distance (along with the Joyner-Boore one), selecting recordings at sites with vs30 ≥ 360 m/s, distances within 300 km, and magnitudes in the range 3 to 8 (7.4 being the maximum magnitude for the PSHA in the target area). Moreover, the complexity of the considered functional form reflects the availability of information in the target area. The median predictions are compared with those from the NGA-West2 models and with one recent European model, using Sammon's maps constructed for different scenarios. Despite the simplification in the functional form, the assessed epistemic uncertainty in the GMPE median is of the order of those affecting the NGA-West2 models for the magnitude range of interest for the hazard application. On the other hand, the simplification of the functional form led to an increase in the apparent aleatory variability. In conclusion, the GMPE developed in this study is tailored to the needs of applications in low-to-moderate seismicity areas and for short return periods (e.g., 475 years); its application in studies where the hazard involves magnitudes above 7.4 and long return periods is not advised.
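
    The abstract does not give the model's coefficients, but the kind of functional form it discusses can be sketched generically: a magnitude polynomial, a magnitude-dependent geometrical spreading term in hypocentral distance, and anelastic attenuation. The coefficients below are placeholders, not the published GMPE.

      import numpy as np

      def ln_psa(mag, r_hypo, coeffs):
          # Generic GMPE form (natural-log units):
          # ln PSA = c0 + c1*M + c2*M^2 + (c3 + c4*M)*ln(sqrt(R^2 + h^2)) + c5*R
          c0, c1, c2, c3, c4, c5, h = coeffs
          r = np.sqrt(r_hypo ** 2 + h ** 2)
          return c0 + c1 * mag + c2 * mag ** 2 + (c3 + c4 * mag) * np.log(r) + c5 * r

      coeffs = (-4.0, 1.5, -0.08, -2.0, 0.15, -0.003, 6.0)  # illustrative only
      for m in (4.0, 6.0):
          print(m, np.exp(ln_psa(m, np.array([10.0, 50.0, 150.0]), coeffs)))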

  20. Stress Memory in Fluid-Filled Fractures - Insights From Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Dura-Gomez, I.; Talwani, P.

    2007-05-01

    Detailed studies of reservoir- and injection-induced seismicity provide an opportunity to study the characteristics of the fractures associated with fluid-induced seismicity. In 1996, we noted that the first three series of earthquakes with M ≥ 5.0 in the vicinity of the Koyna reservoir occurred only when the reservoir levels had exceeded the previous maxima. In subsequent years, three more similar episodes were noted in the vicinity of the Koyna and the nearby Warna reservoir, without a single repetition in the epicentral location. This behavior was similar to the Kaiser effect observed in the laboratory. A similar behavior has been observed in many cases of injection-induced seismicity. At the Denver arsenal well in the 1960s and the Soultz, France, hot rock site in the 1990s, among others, seismicity only occurred when the differential pressure (the excess of downhole borehole pressure over the ambient natural pressure) reached a threshold value. These threshold values differed for different wells and depths. The seismicity stopped when the differential pressure was lowered below the threshold value. These observations show that the stress memory (associated with the Kaiser effect) observed in the laboratory with small samples is also displayed in nature, where the volume of rocks involved is from hundreds to thousands of cubic kilometers. The fluid-filled seismogenic fractures near the reservoirs and boreholes associated with fluid-induced seismicity seem to behave like a finely tuned, sensitive system that "remembers" the largest stress perturbation it has been subjected to. Here we present these observations of stress memory in fluid-filled fractures associated with induced seismicity and suggest possible causes.

  1. Seismic hazard in the Istanbul metropolitan area: A preliminary re-evaluation

    USGS Publications Warehouse

    Kalkan, E.; Gulkan, Polat; Ozturk, N.Y.; Celebi, M.

    2008-01-01

    In 1999, two destructive earthquakes (M7.4 Kocaeli and M7.2 Duzce) occurred in the northwest of Turkey and resulted in major stress drops on the western segment of the North Anatolian Fault system where it continues under the Marmara Sea. These undersea fault segments were recently explored using bathymetric and reflection surveys. These recent findings helped to reshape the seismotectonic environment of the Marmara basin, which is a perplexing tectonic domain. Based on the newly collected information, the seismic hazard of the Marmara region, particularly the Istanbul metropolitan area and its vicinity, was re-examined using a probabilistic approach. Two seismic source models and alternate recurrence models, combined with various indigenous and foreign attenuation relationships, were adapted within a logic tree formulation to quantify and project the regional exposure on a set of hazard maps. The hazard maps show the peak horizontal ground acceleration and spectral acceleration at 1.0 s. These acceleration levels were computed for 2 and 10% probabilities of exceedance in 50 years.

  2. Multicomponent seismic reservoir characterization of a steam-assisted gravity drainage (SAGD) heavy oil project, Athabasca oil sands, Alberta

    NASA Astrophysics Data System (ADS)

    Schiltz, Kelsey Kristine

    Steam-assisted gravity drainage (SAGD) is an in situ heavy oil recovery method involving the injection of steam in horizontal wells. Time-lapse seismic analysis over a SAGD project in the Athabasca oil sands deposit of Alberta reveals that the SAGD steam chamber has not developed uniformly. Core data confirm the presence of low permeability shale bodies within the reservoir. These shales can act as barriers and baffles to steam and limit production by prohibiting steam from accessing the full extent of the reservoir. Seismic data can be used to identify these shale breaks prior to siting new SAGD well pairs in order to optimize field development. To identify shale breaks in the study area, three types of seismic inversion and a probabilistic neural network prediction were performed. The predictive value of each result was evaluated by comparing the position of interpreted shales with the boundaries of the steam chamber determined through time-lapse analysis. The P-impedance result from post-stack inversion did not contain enough detail to be able to predict the vertical boundaries of the steam chamber but did show some predictive value in a spatial sense. P-impedance from pre-stack inversion exhibited some meaningful correlations with the steam chamber but was misleading in many crucial areas, particularly the lower reservoir. Density estimated through the application of a probabilistic neural network (PNN) trained using both PP and PS attributes identified shales most accurately. The interpreted shales from this result exhibit a strong relationship with the boundaries of the steam chamber, leading to the conclusion that the PNN method can be used to make predictions about steam chamber growth. In this study, reservoir characterization incorporating multicomponent seismic data demonstrated a high predictive value and could be useful in evaluating future well placement.

  3. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon, is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.

  4. Location error uncertainties - an advanced using of probabilistic inverse theory

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2016-04-01

    The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analyzed in many branches of physics, including seismology and oceanology, to name a few. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. While estimating the location of earthquake foci is relatively simple, quantitatively estimating the location accuracy is a challenging task even when the probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling, and a priori uncertainties. In this presentation we address this task for the common situation in which the statistics of observational and/or modeling errors are unknown. This situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we illustrate an approach based on an analysis of Shannon's entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on the uncertainties of the solution found.
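
    As an illustration of the meta-characteristic used above, the Shannon entropy of a gridded a posteriori location PDF can be computed directly; a broader posterior yields a higher entropy, i.e., larger location uncertainty. A minimal sketch (the grid and PDF below are hypothetical, not the Rudna mine data):

    ```python
    import numpy as np

    def shannon_entropy(posterior, cell_volume):
        """Shannon entropy H = -sum p_i ln p_i of a discretized posterior PDF.
        `posterior` holds probability densities on a regular grid; they are
        first converted to cell probabilities and renormalized."""
        p = posterior * cell_volume
        p = p / p.sum()              # guard against discretization error
        p = p[p > 0]                 # 0 * ln 0 -> 0 by convention
        return -np.sum(p * np.log(p))

    # Hypothetical 3-D location posterior on a 25 m grid: a broader PDF
    # gives a higher H, i.e. a less certain hypocenter.
    x, y, z = np.meshgrid(*[np.linspace(-500, 500, 41)] * 3, indexing="ij")
    pdf = np.exp(-(x**2 + y**2 + z**2) / (2 * 100.0**2))
    print(shannon_entropy(pdf, cell_volume=25.0**3))
    ```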

  5. Gas slug ascent through changes in conduit diameter: Laboratory insights into a volcano-seismic source process in low-viscosity magmas

    USGS Publications Warehouse

    James, M.R.; Lane, S.J.; Chouet, B.A.

    2006-01-01

    Seismic signals generated during the flow and degassing of low-viscosity magmas include long-period (LP) and very-long-period (VLP) events, whose sources are often attributed to dynamic fluid processes within the conduit. We present the results of laboratory experiments designed to investigate whether the passage of a gas slug through regions of changing conduit diameter could act as a suitable source mechanism. A vertical, liquid-filled glass tube featuring a concentric diameter change was used to provide canonical insights into potentially deep or shallow seismic sources. As gas slugs ascend the tube, we observe systematic pressure changes varying with slug size, liquid depth, tube diameter, and liquid viscosity. Gas slugs undergoing an abrupt flow pattern change upon entering a section of significantly increased tube diameter induce a transient pressure decrease in and above the flare and an associated pressure increase below it, which stimulates acoustic and inertial resonant oscillations. When the liquid flow is not dominantly controlled by viscosity, net vertical forces on the apparatus are also detected. The net force is a function of the magnitude of the pressure transients generated and the tube geometry, which dictates where, and hence when, the traveling pressure pulses can couple into the tube. In contrast to interpretations of related volcano-seismic data, where a single downward force is assumed to result from an upward acceleration of the center of mass in the conduit, our experiments suggest that significant downward forces can result from the rapid deceleration of relatively small volumes of downward-moving liquid. Copyright 2006 by the American Geophysical Union.

  6. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies across situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. Testing the theories with data from three experimental studies makes evident the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk. When the probabilistic theories are tested against each other, decision field theory provides the best account of the observed behavior.

  7. Seismic assessment of Technical Area V (TA-V).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medrano, Carlos S.

    The Technical Area V (TA-V) Seismic Assessment Report was commissioned as part of the Sandia National Laboratories (SNL) Self-Assessment Requirement per DOE O 414.1, Quality Assurance, for seismic impact on existing facilities at Technical Area V (TA-V). SNL TA-V facilities are located on an existing Uniform Building Code (UBC) Seismic Zone IIB site within the physical boundary of Kirtland Air Force Base (KAFB). The document summarizes the existing facilities with their safety-significant structures, systems, and components, and identifies DOE guidance, the conceptual framework, past assessments, and the present geological and seismic conditions. Building upon the past information and the evolution of the new seismic design criteria, the document discusses the potential impact of the new standards and provides recommendations based upon the current International Building Code (IBC) per DOE O 420.1B, Facility Safety, and DOE G 420.1-2, Guide for the Mitigation of Natural Phenomena Hazards for DOE Nuclear Facilities and Non-Nuclear Facilities.

  8. Advanced Seismic While Drilling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert Radtke; John Fontenot; David Glowka

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical sources, hydraulic sources, air guns, and explosives, by their very nature produce high frequencies. This is counter to the need for long signal transmission through rock. The patent-pending SeismicPULSER{trademark} methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock with the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high-frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to achieve the range required for deep, 15,000+ ft, high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker that could generate the low frequencies required for deep wells. The low-frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER{trademark} drill string tool was designed and manufactured

  9. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that occur in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system, which also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  10. Seismic stability of the survey areas of potential sites for the deep geological repository of the spent nuclear fuel

    NASA Astrophysics Data System (ADS)

    Kaláb, Zdeněk; Šílený, Jan; Lednická, Markéta

    2017-07-01

    This paper deals with the seismic stability of the survey areas of potential sites for the deep geological repository of spent nuclear fuel in the Czech Republic. The basic source of data for historical earthquakes up to 1990 was the seismic website [1-]. The most intense earthquake described in the historical period occurred on September 15, 1590, in the Niederroesterreich region (Austria); its reported intensity is Io = 8-9. The source of the contemporary seismic data for the period from 1991 to the end of 2014 was the website [11]. Based on the databases and a literature review, it may be stated that since 1900 no earthquake exceeding magnitude 5.1 has originated in the territory of the Czech Republic. In order to evaluate seismicity and to assess the impact of seismic effects at the depths of a hypothetical deep geological repository for the next time period, the neo-deterministic method was selected as an extension of the probabilistic method. Each of the seven survey areas was assessed by neo-deterministic evaluation of the seismic wave-field excited by selected individual events, determining the maximum loading. Results of the seismological database studies and of the neo-deterministic analysis of the Čihadlo locality are presented.

  11. INVESTIGATING THE EFFECT OF MICROBIAL GROWTH AND BIOFILM FORMATION ON SEISMIC WAVE PROPAGATION IN SEDIMENT

    EPA Science Inventory

    Previous laboratory investigations have demonstrated that the seismic methods are sensitive to microbially-induced changes in porous media through the generation of biogenic gases and biomineralization. The seismic signatures associated with microbial growth and biofilm formation...

  12. Probabilistic assessment of landslide tsunami hazard for the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Pampell-Manis, A.; Horrillo, J.; Shigihara, Y.; Parambath, L.

    2016-01-01

    The devastating consequences of recent tsunamis affecting Indonesia and Japan have prompted a scientific response to better assess unexpected tsunami hazards. Although much uncertainty exists regarding the recurrence of large-scale tsunami events in the Gulf of Mexico (GoM), geological evidence indicates that a tsunami is possible and would most likely come from a submarine landslide triggered by an earthquake. This study customizes for the GoM a first-order probabilistic landslide tsunami hazard assessment. Monte Carlo Simulation (MCS) is employed to determine landslide configurations based on distributions obtained from observational submarine mass failure (SMF) data. Our MCS approach incorporates a Cholesky decomposition method for correlated landslide size parameters to capture correlations seen in the data as well as uncertainty inherent in these events. Slope stability analyses are performed using landslide and sediment properties and regional seismic loading to determine landslide configurations which fail and produce a tsunami. The probability of each tsunamigenic failure is calculated based on the joint probability of slope failure and probability of the triggering earthquake. We are thus able to estimate sizes and return periods for probabilistic maximum credible landslide scenarios. We find that the Cholesky decomposition approach generates landslide parameter distributions that retain the trends seen in observational data, improving the statistical validity and relevancy of the MCS technique in the context of landslide tsunami hazard assessment. Estimated return periods suggest that probabilistic maximum credible SMF events in the north and northwest GoM have a recurrence of 5000-8000 years, in agreement with age dates of observed deposits.
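
    The Cholesky step can be illustrated concretely: independent standard normals z are mapped to correlated ones via L z, where L L^T is the target correlation matrix, and then to lognormal landslide dimensions. A minimal sketch (the correlation matrix and parameter scales are hypothetical, not the GoM values):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical correlation among log-length, log-width, log-thickness of SMFs
    corr = np.array([[1.0, 0.8, 0.6],
                     [0.8, 1.0, 0.7],
                     [0.6, 0.7, 1.0]])
    L = np.linalg.cholesky(corr)

    # Hypothetical lognormal parameters (mean and sigma of the log), in metres
    mu = np.array([np.log(5000.0), np.log(2000.0), np.log(50.0)])
    sigma = np.array([0.5, 0.5, 0.4])

    z = rng.standard_normal((10000, 3))     # independent N(0,1) draws
    correlated = z @ L.T                    # rows now have covariance L L^T = corr
    samples = np.exp(mu + sigma * correlated)

    # The sampled log-dimensions reproduce the target correlations
    print(np.corrcoef(np.log(samples), rowvar=False).round(2))
    ```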

  13. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites for the surroundings of Katerloch cave, Austria

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Kalmár, János

    2016-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises an important issue for society: how should we react to the natural hazard? Potential damages are huge, and so are the infrastructure costs of addressing these hazards. It is therefore very important to obtain an unbiased view of seismic hazard (and risk). In principle, the best way to test Probabilistic Seismic Hazard Assessments (PSHA) is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle. Such information would be very valuable, even if it concerned only a single site. Long-term information can in principle be gained from intact stalagmites in natural karstic caves. These have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We focus here on a case study from the Katerloch cave close to the city of Graz, Austria. A specially shaped (candle-stick style: tall, slim, and more or less cylindrical) intact and vulnerable stalagmite (IVSTM) in the Katerloch cave was examined in 2013 and 2014. This IVSTM is suitable for estimating an upper limit on the horizontal peak ground acceleration generated by pre-historic earthquakes. For this cave, we have extensive information about ages (e.g., Boch et al., 2006, 2010). The approach used in our study yields significant new constraints on seismic hazard, as the intactness of the stalagmite suggests that tectonic structures close to the Katerloch cave, in particular the Mur-Mürz fault, did not generate very strong paleoearthquakes in the last few thousand years

  14. A model of seismic coda arrivals to suppress spurious events.

    NASA Astrophysics Data System (ADS)

    Arora, N.; Russell, S.

    2012-04-01

    We describe a model of coda arrivals which has been added to NET-VISA (Network processing Vertically Integrated Seismic Analysis), our probabilistic generative model of seismic events, their transmission, and their detection on a global seismic network. The scattered energy that follows a seismic phase arrival tends to deceive typical STA/LTA-based arrival-picking software into believing that a real seismic phase has been detected. These coda arrivals, which tend to follow all seismic phases, cause most network processing software, including NET-VISA, to believe that multiple events have taken place. It is not a simple matter of ignoring closely spaced arrivals, since arrivals from multiple events can indeed overlap. The current practice in NET-VISA of pruning events within a small space-time neighborhood of a larger event works reasonably well, but it may mask real events produced in an aftershock sequence. Our new model allows any seismic arrival, even a coda arrival, to trigger a subsequent coda arrival. The probability of such a triggered arrival depends on the amplitude of the triggering arrival, although real seismic phases are more likely to generate such coda arrivals. Real seismic phases also tend to generate coda arrivals with more strongly correlated parameters, for example azimuth and slowness. However, the SNR (Signal to Noise Ratio) of a coda arrival immediately following a phase arrival tends to be lower because of the nature of the SNR calculation. We have calibrated our model on historical statistics of such triggered arrivals, and our inference accounts for them while searching for the best explanation of seismic events, their association with the arrivals, and the coda arrivals. We have tested our new model on one week of global seismic data spanning March 22, 2009 to March 29, 2009. Our model was trained on two and a half months of data from April 5, 2009 to June 20, 2009. We use the LEB bulletin produced by the IDC (International Data Center) as the ground truth

  15. Analytical Prediction of the Seismic Response of a Reinforced Concrete Containment Vessel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, R.J.; Rashid, Y.R.; Cherry, J.L.

    Under the sponsorship of the Ministry of International Trade and Industry (MITI) of Japan, the Nuclear Power Engineering Corporation (NUPEC) is investigating the seismic behavior of a Reinforced Concrete Containment Vessel (RCCV) through scale-model testing using the high-performance shaking table at the Tadotsu Engineering Laboratory. A series of tests representing design-level seismic ground motions was initially conducted to gather valuable experimental measurements for use in design verification. Additional tests will be conducted with increasing amplifications of the seismic input until a structural failure of the test model occurs. In a cooperative program with NUPEC, the US Nuclear Regulatory Commission (USNRC), through Sandia National Laboratories (SNL), is conducting analytical research on the seismic behavior of RCCV structures. As part of this program, pretest analytical predictions of the model tests are being performed. The dynamic time-history analysis utilizes a highly detailed concrete constitutive model applied to a three-dimensional finite element representation of the test structure. This paper describes the details of the analysis model and provides analysis results.

  16. Geothermal Induced Seismicity National Environmental Policy Act Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, Aaron L; Cook, Jeffrey J; Beckers, Koenraad J

    In 2016, the U.S. Bureau of Land Management (BLM) contracted with the National Renewable Energy Laboratory (NREL) to assist the BLM in developing and building upon tools to better understand and evaluate induced seismicity caused by geothermal projects. This review of NEPA documents for four geothermal injection or EGS projects reveals the variety of approaches to analyzing and mitigating induced seismicity. With the exception of The Geysers, where induced seismicity has been observed and monitored for an extended period of time due to large volumes of water being piped in to recharge the hydrothermal reservoir, induced seismicity caused by geothermal projects is a relatively new area of study. As this review highlights, the level of mitigation required for induced seismic events has varied based on project location, when the review took place, whether the project utilized the International Energy Agency or DOE IS protocols, and the federal agency conducting the review. While the NEPA reviews were relatively consistent regarding seismic monitoring and the historical evaluation of seismic events near the project location, the requirements for public outreach and for mitigation of induced seismic events once stimulation has begun varied considerably among the four projects. Not all of the projects were required to notify specific community groups or local government entities before beginning the project, and only one of the reviews specifically stated that the project proponent would hold meetings with the public to answer questions or address concerns.

  17. Circum-Pacific seismic potential: 1989-1999

    USGS Publications Warehouse

    Nishenko, S.P.

    1991-01-01

    The seismic potential for 96 segments of simple plate boundaries around the circum-Pacific region is presented in terms of the conditional probability for the occurrence of either large or great interplate earthquakes during the next 5, 10, and 20 years (i.e., 1989-1994, 1989-1999 and 1989-2009). This study represents the first probabilistic summary of seismic potential on this scale, and involves the comparison of plate boundary segments that exhibit varying recurrence times, magnitudes, and tectonic regimes. Presenting these data in a probabilistic framework provides a basis for the uniform comparison of seismic hazard between these differing fault segments, as well as accounting for individual variations in recurrence time along a specific fault segment, and uncertainties in the determination of the average recurrence time. The definition of specific segments along simple plate boundaries relies on the mapping of earthquake rupture zones as defined by the aftershock distributions of prior large and great earthquakes, and historic descriptions of felt intensities and damage areas. The 96 segments are chosen to represent areas likely to be ruptured by "characteristic" earthquakes of a specified size or magnitude. The term characteristic implies repeated breakage of a plate boundary segment by large or great earthquakes whose source dimensions are similar from cycle to cycle. This definition does not exclude the possibility that occasionally adjacent characteristic earthquake segments may break together in a single, larger event. Conversely, a segment may also break in a series of smaller ruptures. Estimates of recurrence times and conditional probabilities for characteristic earthquakes along segments of simple plate boundaries are based on 1) the historic and instrumental record of large and great earthquake occurrence; 2) paleoseismic evidence of recurrence from radiometric dating of Holocene features produced by earthquakes; 3) direct calculations of recurrence
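
    The conditional probabilities reported in such studies follow from a renewal model: given an elapsed time t since the last characteristic earthquake, the probability of rupture within the next ΔT years is P = [F(t + ΔT) - F(t)] / [1 - F(t)], where F is the recurrence-time distribution. A minimal sketch with a lognormal recurrence model (parameter values hypothetical, not taken from the study):

    ```python
    from math import log
    from statistics import NormalDist

    def conditional_prob(t_elapsed, dt, t_mean, cov):
        """P(rupture in (t, t+dt] | no rupture by t) for a lognormal
        recurrence-time model with mean t_mean and coefficient of variation cov."""
        sigma = (log(1 + cov**2)) ** 0.5
        mu = log(t_mean) - 0.5 * sigma**2
        F = lambda t: NormalDist().cdf((log(t) - mu) / sigma)
        return (F(t_elapsed + dt) - F(t_elapsed)) / (1 - F(t_elapsed))

    # Hypothetical segment: 150-yr mean recurrence, CoV 0.5, 120 yr elapsed
    for dt in (5, 10, 20):
        print(dt, round(conditional_prob(120, dt, 150, 0.5), 3))
    ```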

  18. Maturity of nearby faults influences seismic hazard from hydraulic fracturing.

    PubMed

    Kozłowska, Maria; Brudzinski, Michael R; Friberg, Paul; Skoumal, Robert J; Baxter, Nicholas D; Currie, Brian S

    2018-02-20

    Understanding the causes of human-induced earthquakes is paramount to reducing societal risk. We investigated five cases of seismicity associated with hydraulic fracturing (HF) in Ohio since 2013 that, because of their isolation from other injection activities, provide an ideal setting for studying the relations between high-pressure injection and earthquakes. Our analysis revealed two distinct groups: (i) deeper earthquakes in the Precambrian basement, with larger magnitudes (M > 2), b-values < 1, and many post-shut-in earthquakes, versus (ii) shallower earthquakes in Paleozoic rocks ∼400 m below HF, with smaller magnitudes (M < 1), b-values > 1.5, and few post-shut-in earthquakes. Based on geologic history, laboratory experiments, and fault modeling, we interpret the deep seismicity as slip on more mature faults in older crystalline rocks and the shallow seismicity as slip on immature faults in younger sedimentary rocks. This suggests that HF inducing deeper seismicity may pose higher seismic hazards. Wells inducing deeper seismicity produced more water than wells with shallow seismicity, indicating more extensive hydrologic connections outside the target formation, consistent with pore pressure diffusion influencing seismicity. However, for both groups, the 2 to 3 h between onset of HF and seismicity is too short for typical fluid pressure diffusion rates across distances of ∼1 km and argues for poroelastic stress transfer also having a primary influence on seismicity.
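
    The b-values that separate the two groups are commonly estimated with the Aki maximum-likelihood formula, b = log10(e) / (<M> - Mc), with Utsu's correction Mc - ΔM/2 for catalogs binned to ΔM. A minimal sketch (the synthetic catalog is illustrative, not the Ohio data):

    ```python
    import numpy as np

    def b_value(mags, m_c, dm=0.0):
        """Aki (1965) maximum-likelihood b-value; for catalogs binned to dm,
        Utsu's correction replaces m_c by m_c - dm/2."""
        m = np.asarray(mags, dtype=float)
        m = m[m >= m_c]
        return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

    # Hypothetical catalog: Gutenberg-Richter magnitudes above Mc = 0.5
    rng = np.random.default_rng(1)
    true_b = 1.5  # e.g. the b > 1.5 shallow group described above
    mags = 0.5 + rng.exponential(scale=np.log10(np.e) / true_b, size=2000)
    print(round(b_value(mags, m_c=0.5), 2))   # recovers ~1.5
    ```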

  19. Maturity of nearby faults influences seismic hazard from hydraulic fracturing

    NASA Astrophysics Data System (ADS)

    Kozłowska, Maria; Brudzinski, Michael R.; Friberg, Paul; Skoumal, Robert J.; Baxter, Nicholas D.; Currie, Brian S.

    2018-02-01

    Understanding the causes of human-induced earthquakes is paramount to reducing societal risk. We investigated five cases of seismicity associated with hydraulic fracturing (HF) in Ohio since 2013 that, because of their isolation from other injection activities, provide an ideal setting for studying the relations between high-pressure injection and earthquakes. Our analysis revealed two distinct groups: (i) deeper earthquakes in the Precambrian basement, with larger magnitudes (M > 2), b-values < 1, and many post-shut-in earthquakes, versus (ii) shallower earthquakes in Paleozoic rocks ∼400 m below HF, with smaller magnitudes (M < 1), b-values > 1.5, and few post-shut-in earthquakes. Based on geologic history, laboratory experiments, and fault modeling, we interpret the deep seismicity as slip on more mature faults in older crystalline rocks and the shallow seismicity as slip on immature faults in younger sedimentary rocks. This suggests that HF inducing deeper seismicity may pose higher seismic hazards. Wells inducing deeper seismicity produced more water than wells with shallow seismicity, indicating more extensive hydrologic connections outside the target formation, consistent with pore pressure diffusion influencing seismicity. However, for both groups, the 2 to 3 h between onset of HF and seismicity is too short for typical fluid pressure diffusion rates across distances of ∼1 km and argues for poroelastic stress transfer also having a primary influence on seismicity.

  20. Students’ difficulties in probabilistic problem-solving

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. These data were analyzed descriptively using the Miles and Huberman steps. The results show that students' difficulties in solving probabilistic problems can be divided into three categories. The first relates to difficulties in understanding the probabilistic problem; the second to choosing and using appropriate strategies for solving the problem; and the third to the computational process in solving the problem. The results suggest that students still have difficulties in solving probabilistic problems, meaning that they are not yet able to apply their knowledge and abilities to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  1. Dating previously balanced rocks in seismically active parts of California and Nevada

    USGS Publications Warehouse

    Bell, J.W.; Brune, J.N.; Liu, T.; Zreda, M.; Yount, J.C.

    1998-01-01

    Precariously balanced boulders that could be knocked down by strong earthquake ground motion are found in some seismically active areas of southern California and Nevada. In this study we used two independent surface-exposure dating techniques - rock-varnish microlamination and cosmogenic 36Cl dating methodologies - to estimate minimum- and maximum-limiting ages, respectively, of the precarious boulders and by inference the elapsed time since the sites were shaken down. The results of the exposure dating indicate that all of the precarious rocks are >10.5 ka and that some may be significantly older. At Victorville and Jacumba, California, these results show that the precarious rocks have not been knocked down for at least 10.5 k.y., a conclusion in apparent conflict with some commonly used probabilistic seismic hazard maps. At Yucca Mountain, Nevada, the ages of the precarious rocks are >10.5 to >27.0 ka, providing an independent measure of the minimum time elapsed since faulting occurred on the Solitario Canyon fault.

  2. Deployment of the Oklahoma borehole seismic experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P.E.; Rock, D.W.

    1989-01-20

    This paper discusses the Oklahoma borehole seismic experiment, currently in operation, set up by members of the Lawrence Livermore National Laboratory Treaty Verification Program and the Oklahoma Geophysical Observatory to determine deep-borehole seismic characteristics in geology typical of large regions of the Soviet Union. We evaluated and logged an existing 772-m-deep borehole on the Observatory site by running caliper, cement bonding, casing inspection, and hole-deviation logs. Two Teledyne Geotech borehole-clamping seismometers were placed at various depths and spacings in the deep borehole; currently, they are deployed at 727 and 730 m. A Teledyne Geotech shallow-borehole seismometer was mounted in a 4.5-m hole, one meter from the deep borehole. The seismometers' system coherency was tested and found to be excellent up to 35 Hz. We have recorded seismic noise, quarry blasts, regional earthquakes, and teleseisms in the present configuration. We will begin a study of seismic noise and attenuation as a function of depth in the near future. 7 refs., 18 figs.

  3. Assessment of the Seismic Risk in the City of Yerevan and its Mitigation by Application of Innovative Seismic Isolation Technologies

    NASA Astrophysics Data System (ADS)

    Melkumyan, Mikayel G.

    2011-03-01

    The precise assessment and analysis of seismic hazard (SHA) is a serious issue, and seismic risk reduction depends considerably on it. It is well known that there are two approaches to seismic hazard analysis, namely deterministic (DSHA) and probabilistic (PSHA). The latter utilizes statistical estimates of earthquake parameters. However, these may not exist for a specific region, and using PSHA it is difficult to take into account local aspects, such as specific regional geology and site effects, with sufficient precision. For this reason, DSHA is preferable in many cases. After the destructive 1988 Spitak earthquake, the SHA of the territory of Armenia was revised and increased. The distribution pattern of seismic risk in Armenia is given. Maximum seismic risk is concentrated in the region of the capital, the city of Yerevan, where 40% of the republic's population resides. We describe the method used for conducting seismic resistance assessments of the existing reinforced concrete (R/C) buildings. Using this assessment, as well as GIS technology, the coefficients characterizing the seismic risk of destruction were calculated for almost all buildings of Yerevan City, and the results of the assessment are presented. It is concluded that there is presently a particularly pressing need to strengthen existing buildings. We then describe non-conventional approaches to upgrading the earthquake resistance of existing multistory R/C frame buildings by means of an Additional Isolated Upper Floor (AIUF), and of existing stone and frame buildings by means of base isolation. In addition, innovative seismic isolation technologies were developed and implemented in Armenia for the construction of new multistory multifunctional buildings; the advantages of these technologies are listed in the paper. It is worth noting that the aforementioned technologies were successfully applied for retrofitting an existing 100-year-old bank building in

  4. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
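
    For context, the deterministic Hu-Washizu functional on which the PHWVP builds treats the displacement u, strain ε, and stress σ as independent fields; a standard linear-elastic form (notation assumed here, not quoted from the paper) is:

    ```latex
    \Pi_{HW}(u,\,\varepsilon,\,\sigma)
      = \int_{\Omega} \Bigl[ \tfrac{1}{2}\,\varepsilon : C : \varepsilon
          + \sigma : \bigl( \nabla_{s} u - \varepsilon \bigr) \Bigr] \, d\Omega
      - \int_{\Omega} b \cdot u \, d\Omega
      - \int_{\Gamma_t} \bar{t} \cdot u \, d\Gamma
    ```

    Requiring stationarity of this functional recovers the constitutive law, the compatibility condition, and equilibrium with the traction boundary condition, which are precisely the conditions the PHWVP treats as probabilistic.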

  5. Gas Reservoir Identification Basing on Deep Learning of Seismic-print Characteristics

    NASA Astrophysics Data System (ADS)

    Cao, J.; Wu, S.; He, X.

    2016-12-01

    Reservoir identification based on seismic data analysis is the core task in oil and gas geophysical exploration. The essence of reservoir identification is to identify the properties of the rock pore fluid. We developed a novel gas reservoir identification method named seismic-print analysis, by imitation of the vocal-print analysis techniques used in speaker identification. The term "seismic-print" refers to characteristics of the seismic waveform that can definitively identify the property of a geological objective, for instance a natural gas reservoir. A seismic-print can be characterized by one or a few parameters, named seismic-print parameters. It has been proven that gas reservoirs concurrently exhibit a negative first-order cepstrum coefficient anomaly and a positive second-order cepstrum coefficient anomaly. The method is valid for sandstone gas reservoirs, carbonate reservoirs, and shale gas reservoirs, and the accuracy rate may reach up to 90%. There are two main problems to deal with in the application of the seismic-print analysis method. One is to identify the "ripple" of a reservoir on the seismogram, and the other is to construct the mapping relationship between the seismic-print and the gas reservoirs. Deep learning, developed in recent years, has the ability to reveal the complex non-linear relationship between attributes and data, and to extract automatically the features of the objective from the data. Thus, deep learning could be used to deal with these two problems. There are many algorithms for deep learning, which can be roughly divided into two categories: Deep Belief Networks (DBNs) and Convolutional Neural Networks (CNNs). A DBN is a probabilistic generative model, which can establish a joint distribution of the observed data and labels. A CNN is a feedforward neural network, which can be used to extract the 2D structural features of the input data. Both DBNs and CNNs can be used to deal with seismic data
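
    The cepstrum coefficients that serve as seismic-print parameters can be computed from a windowed trace in a few lines; a minimal sketch of the real cepstrum (the trace below is a placeholder, not field data):

    ```python
    import numpy as np

    def real_cepstrum(trace, eps=1e-12):
        """Real cepstrum c = IFFT(log |FFT(x)|); c[1] and c[2] are the
        first- and second-order coefficients used as seismic-print parameters."""
        spectrum = np.abs(np.fft.fft(trace)) + eps   # eps avoids log(0)
        return np.real(np.fft.ifft(np.log(spectrum)))

    # Placeholder windowed trace around a target reflection (not field data)
    rng = np.random.default_rng(0)
    window = rng.standard_normal(128)
    c = real_cepstrum(window)
    print(c[1], c[2])   # sign anomalies of these terms flag gas, per the abstract
    ```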

  6. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and the fabrication process, through composite mechanics, to structural components. Comparisons with experimental data are provided to illustrate the selection of probabilistic design allowables, test methods/specimen guidelines, and the identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  7. Interseismic Coupling, Co- and Post-seismic Slip: a Stochastic View on the Northern Chilean Subduction Zone

    NASA Astrophysics Data System (ADS)

    Jolivet, R.; Duputel, Z.; Simons, M.; Jiang, J.; Riel, B. V.; Moore, A. W.; Owen, S. E.

    2017-12-01

    Mapping subsurface fault slip during the different phases of the seismic cycle provides a probe of the mechanical properties and the state of stress along these faults. We focus on the northern Chile megathrust, where first-order estimates of interseismic fault locking suggest little to no overlap between regions slipping seismically and those that are dominantly aseismic. However, published distributions of slip, be they during seismic or aseismic phases, rely on unphysical regularization of the inverse problem, thereby cluttering attempts to quantify the degree of overlap between seismic and aseismic slip. Considering all the implications of aseismic slip for our understanding of the nucleation, propagation, and arrest of seismic ruptures, it is of utmost importance to quantify our confidence in the current description of fault coupling. Here, we take advantage of 20 years of InSAR observations and more than a decade of GPS measurements to derive probabilistic maps of inter-seismic coupling, as well as co-seismic and post-seismic slip, along the northern Chile subduction megathrust. A wide InSAR velocity map is derived using a novel multi-pixel time series analysis method accounting for orbital errors, atmospheric noise, and ground deformation. We use AlTar, a massively parallel Markov chain Monte Carlo algorithm exploiting the acceleration capabilities of Graphics Processing Units, to derive the probability density functions (PDF) of slip. In northern Chile, we find high probabilities for a complete release, by the 2014 Iquique earthquake, of the elastic strain accumulated since the 1877 earthquake, and for the presence of a large, independent, locked asperity left untapped by recent events north of the Mejillones peninsula. We evaluate the probability of overlap between the co-, inter-, and post-seismic slip and consider the potential occurrence of slow, aseismic slip events along this portion of the subduction zone.

  8. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for the aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDF) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The approach shows that probabilistic analysis provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  9. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Sullivan, T. J.

    2012-04-01

    consists of the following four main analysis stages: (i) probabilistic seismic hazard analysis, to give the mean occurrence rate of earthquake events having an intensity greater than a threshold value; (ii) structural analysis, to estimate the global structural response given a certain value of seismic intensity; (iii) damage analysis, in which fragility functions are used to express the probability that a building component exceeds a damage state, as a function of the global structural response; and (iv) loss analysis, in which the overall performance is assessed based on the damage state of all components. This final step gives estimates of the mean annual frequency with which various repair cost levels (or other decision variables) are exceeded. The realisation of this framework does suggest that risk-based seismic design is now possible. However, comparing current code approaches with the proposed PBEE framework, it becomes apparent that mainstream consulting engineers would face a steep learning curve in applying the new procedures in practice. With this in mind, it is proposed that simplified loss-based seismic design procedures are a logical means of helping the engineering profession transition from the largely deterministic seismic design procedures in current codes to more rational risk-based seismic design methodologies. Examples are provided to illustrate the likely benefits of adopting loss-based seismic design approaches in practice.
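
    These four stages chain together in what is commonly written as the PEER framing equation (a standard form, not quoted from the lecture), where IM is the intensity measure, EDP the engineering demand parameter, DM the damage measure, and DV the decision variable:

    ```latex
    \lambda(DV) \;=\; \iiint
        G\bigl(DV \mid DM\bigr)\,
        \bigl|\, dG\bigl(DM \mid EDP\bigr) \bigr|\,
        \bigl|\, dG\bigl(EDP \mid IM\bigr) \bigr|\,
        \bigl|\, d\lambda(IM) \bigr|
    ```

    Each conditional term corresponds to one of the four stages listed above, and λ(DV) is the mean annual frequency of exceeding a decision-variable level such as a repair cost.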

  10. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

    Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but practically different. Maps that have return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years. Several different features stand out when assessing the USGS 2016 seismic hazard model for the central and eastern United States from induced and natural earthquakes. First, the model can be assessed as a forecast over one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because these models are projections from the previous year, implicitly assuming that fluid injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true both for the central and eastern United States as a whole and for the region within it with the highest amount of seismicity, Oklahoma and its surrounding area. The model performed least well in northern Texas, overstating hazard, presumably because lower oil and gas prices and regulatory action reduced the water injection volume

  11. Seismic hazard maps for Haiti

    USGS Publications Warehouse

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.

  12. High-resolution surface wave tomography of the European crust and uppermost mantle from ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Lu, Yang; Stehly, Laurent; Paul, Anne; AlpArray Working Group

    2018-05-01

    Taking advantage of the large number of seismic stations installed in Europe, in particular in the greater Alpine region with the AlpArray experiment, we derive a new high-resolution 3-D shear-wave velocity model of the European crust and uppermost mantle from ambient noise tomography. The correlation of up to four years of continuous vertical-component seismic recordings from 1293 broadband stations (10° W-35° E, 30° N-75° N) provides Rayleigh wave group velocity dispersion data in the period band 5-150 s at more than 0.8 million virtual source-receiver pairs. Two-dimensional Rayleigh wave group velocity maps are estimated using adaptive parameterization to accommodate the strong heterogeneity of path coverage. A probabilistic 3-D shear-wave velocity model, including probability densities for the depth of layer boundaries and S-wave velocity values, is obtained by non-linear Bayesian inversion. A weighted average of the probabilistic model is then used as the starting model for the linear inversion step, providing the final Vs model. The resulting S-wave velocity model and Moho depth are validated by comparison with previous geophysical studies. Although surface-wave tomography is weakly sensitive to layer boundaries, vertical cross-sections through our Vs model and the associated probability of presence of interfaces display striking similarities with reference controlled-source (CSS) and receiver-function sections across the Alpine belt. Our model even provides new structural information such as a ∼8 km Moho jump along the CSS ECORS-CROP profile that was not imaged by reflection data due to poor penetration across a heterogeneous upper crust. Our probabilistic and final shear-wave velocity models have the potential to become new reference models of the European crust, both for crustal structure probing and for geophysical studies including waveform modeling or full waveform inversion.

  13. SEISRISK II; a computer program for seismic hazard estimation

    USGS Publications Warehouse

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability, which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
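
    The Poisson assumption makes the site calculation separable: annual exceedance rates are accumulated over sources and magnitudes, then converted to a probability for the exposure time via P = 1 - exp(-λT). A toy single-source sketch in that spirit (all numbers hypothetical; production codes such as SEISRISK II also integrate over ground-motion variability and rupture geometry):

    ```python
    import numpy as np

    # Hypothetical source: Gutenberg-Richter rates in 0.25-magnitude bins
    mags = np.arange(5.0, 7.5, 0.25)
    a, b = 3.0, 1.0
    rates = 10**(a - b * mags) - 10**(a - b * (mags + 0.25))  # events/yr per bin

    dist_km = 30.0
    def median_pga(m, r):
        """Placeholder attenuation: ln PGA(g) = -3.5 + 0.9*M - 1.2*ln(r + 10)."""
        return np.exp(-3.5 + 0.9 * m - 1.2 * np.log(r + 10.0))

    pga_levels = np.array([0.05, 0.1, 0.2])
    # Annual rate at which each PGA level is exceeded (no aleatory sigma here)
    lam = np.array([rates[median_pga(mags, dist_km) > g].sum() for g in pga_levels])
    p50 = 1.0 - np.exp(-lam * 50.0)   # Poisson: P(exceedance in 50 yr)
    for g, rate, p in zip(pga_levels, lam, p50):
        print(f"PGA>{g:g} g: {rate:.4e}/yr, P(50 yr)={p:.3f}")
    ```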

  14. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership-class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over

  15. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping of transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis, and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. The ZMAP code is open source, written in Matlab, a commercial language from The MathWorks that is widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past 7 years; recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.

  16. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the Idriss and Boulanger (2012) SPT-based and the Boulanger and Idriss (2014) CPT-based procedures in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located
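
    The core of this combination is a double sum over the hazard disaggregation: the annual rate of liquefaction is the sum of incremental ground-motion rates weighted by the conditional probability of liquefaction for each (PGA, magnitude) pair. A schematic sketch (the disaggregation values and the P_liq model are placeholders, not the Dunaharaszti inputs or the Cetin et al. model):

    ```python
    import numpy as np

    # Placeholder disaggregation: incremental annual rates d_lambda[i, j] for
    # PGA level a_i and magnitude m_j (these would come from the PSHA study)
    pga = np.array([0.05, 0.1, 0.2, 0.4])
    mag = np.array([5.0, 6.0, 7.0])
    d_lambda = np.array([[3e-3, 1e-3, 2e-4],
                         [8e-4, 4e-4, 1e-4],
                         [1e-4, 8e-5, 4e-5],
                         [8e-6, 6e-6, 4e-6]])

    def p_liq(a, m):
        """Placeholder conditional probability of liquefaction given PGA and M
        (stands in for, e.g., an SPT- or CPT-based triggering model)."""
        csr_proxy = a * (m / 7.5)          # crude magnitude scaling
        return 1.0 / (1.0 + np.exp(-(np.log(csr_proxy) + 2.0) / 0.5))

    rate = sum(p_liq(a, m) * d_lambda[i, j]
               for i, a in enumerate(pga) for j, m in enumerate(mag))
    print(f"annual rate {rate:.2e}/yr -> return period {1/rate:.0f} yr")
    ```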

  17. Seismic measurements of the internal properties of fault zones

    USGS Publications Warehouse

    Mooney, W.D.; Ginzburg, A.

    1986-01-01

    The internal properties within and adjacent to fault zones are reviewed, principally on the basis of laboratory, borehole, and seismic refraction and reflection data. The deformation of rocks by faulting ranges from intragrain microcracking to severe alteration. Saturated microcracked and mildly fractured rocks do not exhibit a significant reduction in velocity, but borehole measurements show that densely fractured rocks do have significantly reduced velocities, with the amount of reduction generally proportional to the fracture density. Highly fractured rock and thick fault gouge along the creeping portion of the San Andreas fault are evidenced by a pronounced seismic low-velocity zone (LVZ), which is either very thin or absent along locked portions of the fault. Thus there is a correlation between fault slip behavior and seismic velocity structure within the fault zone; high pore pressure within the pronounced LVZ may be conducive to fault creep. Deep seismic reflection data indicate that crustal faults sometimes extend through the entire crust. Models of these data and geologic evidence are consistent with deep faults being composed of highly foliated, seismically anisotropic mylonites. © 1986 Birkhäuser Verlag, Basel.

  18. A new view for the geodynamics of Ecuador: Implication in seismogenic source definition and seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Yepes, Hugo; Audin, Laurence; Alvarado, Alexandra; Beauval, Céline; Aguilar, Jorge; Font, Yvonne; Cotton, Fabrice

    2016-05-01

    A new view of Ecuador's complex geodynamics has been developed in the course of modeling seismic source zones for probabilistic seismic hazard analysis. This study focuses on two aspects of the plates' interaction at a continental scale: (a) age-related differences in rheology between the Farallon and Nazca plates—marked by the Grijalva rifted margin and its inland projection—as they subduct underneath central Ecuador, and (b) the rapidly changing convergence obliquity resulting from the convex shape of the South American northwestern continental margin. Both conditions satisfactorily explain several characteristics of the observed seismicity and of the interseismic coupling. Intermediate-depth seismicity reveals a severe flexure in the Farallon slab as it dips and contorts at depth, giving rise to the El Puyo seismic cluster. The position and geometry of the two slabs below continental Ecuador also correlate with surface expressions observable in the local and regional geology and tectonics. The interseismic coupling is weak and shallow south of the Grijalva rifted margin and increases northward, with a heterogeneous pattern locally associated with the Carnegie ridge subduction. High convergence obliquity is responsible for the northeastward movement of the North Andean Block along localized fault systems. The Cosanga and Pallatanga fault segments of the North Andean Block-South American boundary concentrate most of the seismic moment release in continental Ecuador. Other inner block faults located along the western border of the inter-Andean Depression also show a high rate of moderate-size earthquake production. Finally, a total of 19 seismic source zones were modeled in accordance with the proposed geodynamic and neotectonic scheme.

  19. Probabilistic classifiers with high-dimensional data

    PubMed Central

    Kim, Kyung In; Simon, Richard

    2011-01-01

    For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n, large p classification problems despite their importance in medical decision making. In this paper, we introduce two criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed two extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated or at least not “anticonservative” using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
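
    A calibration (reliability) check of the kind the well-calibratedness criterion formalizes can be sketched as follows: group the cross-validated predicted probabilities into bins and compare each bin's mean prediction with the observed class frequency. The binning and simulated data below are illustrative, not the paper's exact evaluation measures; an "anticonservative" classifier would show observed frequencies pulled toward 0.5 relative to its over-confident predictions.

    ```python
    import numpy as np

    def calibration_table(p_pred, y_true, n_bins=10):
        """(mean predicted, observed frequency, count) per probability bin."""
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        rows = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (p_pred >= lo) & (p_pred < hi)
            if mask.any():
                rows.append((p_pred[mask].mean(), y_true[mask].mean(),
                             int(mask.sum())))
        return rows

    rng = np.random.default_rng(0)
    p = rng.uniform(0.0, 1.0, 2000)
    y = (rng.uniform(size=p.size) < p).astype(float)  # well calibrated by design
    for pred, obs, n in calibration_table(p, y):
        print(f"predicted {pred:.2f}  observed {obs:.2f}  (n={n})")
    ```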

  20. Sequential Data Assimilation for Seismicity: a Proof of Concept

    NASA Astrophysics Data System (ADS)

    van Dinther, Ylona; Fichtner, Andreas; Kuensch, Hansruedi

    2016-04-01

    Our probabilistic forecasting ability and physical understanding of earthquakes are significantly hampered by limited indications of the current and evolving state of stress and strength on faults. This information is typically thought to be beyond our resolution capabilities based on surface data. We show that the state of stress and strength is actually obtainable for settings with one dominant fault. State variables and their uncertainties are obtained using Ensemble Kalman Filtering, a sequential data assimilation technique extensively developed for weather forecasting purposes. Through the least-squares solution of Bayes' theorem, noisy data are for the first time assimilated to update a partial-differential-equation-driven seismic cycle model. This visco-elasto-plastic continuum forward model solves the Navier-Stokes equations with a rate-dependent friction coefficient (van Dinther et al., JGR, 2013). To prove the concept of this weather-to-earthquake forecasting bridge we perform a perfect model test. Synthetic numerical data from a single analogue borehole are assimilated into 20 ensemble models over 14 cycles of analogue earthquakes. Since we know the true state of the numerical data model, a quantitative and qualitative evaluation shows that meaningful information on the stress and strength of the unobserved fault is typically already available once data from a single, shallow borehole are assimilated over part of a seismic cycle. This is possible because the sampled error covariance matrix contains prior information on the physics that relates velocities, stresses, and pressures at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed in such a way that fault coupling can be updated to either inhibit or trigger events. In the subsequent forward propagation step the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the
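
    The analysis step at the core of this approach is the standard stochastic Ensemble Kalman Filter update, sketched below. In the application described here the state vector would hold the forward model's stresses, strengths, velocities, and pressures; the dimensions, observation operator, and noise level in this sketch are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_analysis(X, y, H, obs_var):
        """Stochastic EnKF update. X: (n_state, n_ens) state ensemble;
        y: (n_obs,) observations; H: (n_obs, n_state) observation operator."""
        n_ens = X.shape[1]
        A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
        P = A @ A.T / (n_ens - 1)                      # sample error covariance
        R = obs_var * np.eye(len(y))
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        # Perturb observations so the analysis ensemble keeps the right spread.
        Y = y[:, None] + rng.normal(0.0, np.sqrt(obs_var), (len(y), n_ens))
        return X + K @ (Y - H @ X)

    # Toy demo: a 3-variable state (e.g., fault stress, fault strength,
    # surface velocity) observed through its surface component only.
    X = rng.normal(size=(3, 20))
    H = np.array([[0.0, 0.0, 1.0]])
    Xa = enkf_analysis(X, np.array([0.5]), H, obs_var=0.1)
    ```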

  1. Applications of the seismic hazard model of Italy: from a new building code to the L'Aquila trial against seismologists

    NASA Astrophysics Data System (ADS)

    Meletti, C.

    2013-05-01

    In 2003, a large national project for updating the seismic hazard map and the seismic zoning in Italy started, according to the rules fixed by an Ordinance of the Italian Prime Minister. New input elements for probabilistic seismic hazard assessment were compiled: the earthquake catalogue, the seismogenic zonation, the catalogue completeness, and a set of new attenuation relationships. The map of expected PGA on rock soil conditions with 10% probability of exceedance is the new reference seismic hazard map for Italy (http://zonesismiche.mi.ingv.it). In the following years, maps for nine further probabilities of exceedance, uniform hazard spectra up to 2 s, and the disaggregation of PGA were also released. A comprehensive seismic hazard model that fully describes the seismic hazard in Italy then became available, accessible through a webGIS application (http://esse1-gis.mi.ingv.it/en.php). This detailed information made it possible to change the approach to evaluating the seismic action for design from a zone-dependent approach (Italy previously had 4 seismic zones, each with a single design spectrum) to a site-dependent one: the design spectrum is now defined at each node of a grid of about 11,000 points covering the whole national territory. The new building code became mandatory only after the 6 April 2009 L'Aquila earthquake, the first strong event in Italy after the release of the seismic hazard map. The large number of recordings and the values of the experienced accelerations prompted comparisons between the recorded spectra and the spectra defined in the seismic codes. Even if such comparisons could be robust only after several consecutive 50-year periods of observation, and in a probabilistic approach no single observation can validate or invalidate the hazard estimate, some of the comparisons that can be undertaken between the observed ground motions and the hazard model used for the seismic code have been performed and have shown that the

  2. The application of refraction seismics in alpine permafrost studies

    NASA Astrophysics Data System (ADS)

    Draebing, Daniel

    2017-04-01

    Permafrost studies in alpine environments focus on landslides from permafrost-affected rockwalls, landslide deposits, or periglacial sediment dynamics. Mechanical properties of soils and rocks are influenced by permafrost, and changed strength properties affect these periglacial processes. To assess the effects of permafrost thaw and degradation, monitoring techniques for permafrost distribution and active-layer thaw are required. Seismic wave velocities are sensitive to freezing and, therefore, refraction seismics is a valuable tool for investigating permafrost in alpine environments. In this study, (1) laboratory and field applications of refraction seismics in alpine environments are reviewed and (2) the data are used to quantify the effects of rock properties (e.g. lithology, porosity, anisotropy, saturation) on p-wave velocities. In the next step, (3) the influence of environmental factors is evaluated and conclusions are drawn on permafrost differentiation within alpine periglacial landforms. This study shows that the p-wave velocity increase on freezing is sensitive to porosity and is most pronounced in high-porosity rocks. In low-porosity rocks, the p-wave velocity increase is controlled by an anisotropy decrease due to ice pressure (Draebing and Krautblatter, 2012), which enables active-layer and permafrost differentiation at the rockwall scale (Krautblatter and Draebing, 2014; Draebing et al., 2016). However, discontinuity distribution can result in strong anisotropy effects on seismic velocities, which can impede permafrost differentiation (Phillips et al., 2016). Due to production or deposition history, porosity can show large spatial differences in deposited landforms. Landforms with large boulders, such as rock glaciers and moraines, show the highest p-wave velocity differences between active layer and permafrost, which facilitates differentiation (Draebing, 2016). Saturation with water is essential for the successful application of refraction seismics for permafrost detection and can be controlled at

  3. A probabilistic estimate of maximum acceleration in rock in the contiguous United States

    USGS Publications Warehouse

    Algermissen, Sylvester Theodore; Perkins, David M.

    1976-01-01

    This paper presents a probabilistic estimate of the maximum ground acceleration to be expected from earthquakes occurring in the contiguous United States. It is based primarily upon the historic seismic record, which ranges from very incomplete before 1930 to moderately complete after 1960. Geologic data, primarily the distribution of faults, have been employed only to a minor extent, because most such data have not yet been interpreted with earthquake hazard evaluation in mind. The map provides a preliminary estimate of the relative hazard in various parts of the country. The report provides a method for evaluating the relative importance of the many parameters and assumptions in hazard analysis. The map and methods of evaluation described reflect the current state of understanding and are intended to be useful for engineering purposes in reducing the effects of earthquakes on buildings and other structures. Studies are underway on improved methods for evaluating the relative earthquake hazard of different regions. Comments on this paper are invited to help guide future research and revisions of the accompanying map. The earthquake hazard in the United States has been estimated in a variety of ways since the initial effort by Ulrich (see Roberts and Ulrich, 1950). In general, the earlier maps provided an estimate of the severity of ground shaking or damage, but the frequency of occurrence of the shaking or damage was not given. Ulrich's map showed the distribution of expected damage in terms of no damage (zone 0), minor damage (zone 1), moderate damage (zone 2), and major damage (zone 3). The zones were not defined further and the frequency of occurrence of damage was not suggested. Richter (1959) and Algermissen (1969) estimated the ground motion in terms of maximum Modified Mercalli intensity. Richter used the terms "occasional" and "frequent" to characterize intensity IX shaking, and Algermissen included recurrence curves for various parts of the country in the paper

  4. Model uncertainties of the 2002 update of California seismic hazard maps

    USGS Publications Warehouse

    Cao, T.; Petersen, M.D.; Frankel, A.D.

    2005-01-01

    In this article we present and explore the source and ground-motion model uncertainty and parametric sensitivity for the 2002 update of the California probabilistic seismic hazard maps. Our approach is to implement a Monte Carlo simulation that allows for independent sampling from fault to fault in each simulation. The source-distance dependent characteristics of the uncertainty maps of seismic hazard are explained by the fundamental uncertainty patterns from four basic test cases, in which the uncertainties from one-fault and two-fault systems are studied in detail. The California coefficient of variation (COV, ratio of the standard deviation to the mean) map for peak ground acceleration (10% probability of exceedance in 50 years) shows lower values (0.1-0.15) along the San Andreas fault system and other class A faults than along class B faults (0.2-0.3). High COV values (0.4-0.6) are found around the Garlock, Anacapa-Dume, and Palos Verdes faults in southern California and around the Maacama fault and Cascadia subduction zone in northern California.
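
    The Monte Carlo approach can be sketched compactly: in every realization, draw the source and ground-motion parameters independently for each fault, recompute the hazard, and map COV = standard deviation / mean of the resulting ground-motion values. The one-fault "hazard" function and the parameter distributions below are toy stand-ins for the full PSHA computation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def toy_hazard_pga(slip_rate, max_mag):
        """Hypothetical stand-in for PSHA: PGA (10% in 50 yr) at one site."""
        return 0.1 * slip_rate**0.5 * (max_mag / 7.0) ** 2

    pga = np.empty(1000)
    for k in range(pga.size):
        # Independent draws of the uncertain fault parameters per realization.
        slip_rate = rng.lognormal(mean=np.log(5.0), sigma=0.3)   # mm/yr
        max_mag = rng.normal(7.0, 0.25)
        pga[k] = toy_hazard_pga(slip_rate, max_mag)

    cov = pga.std() / pga.mean()    # the quantity mapped in the article
    print(f"COV = {cov:.2f}")
    ```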

  5. Precursory changes in seismic velocity for the spectrum of earthquake failure modes

    PubMed Central

    Scuderi, M.M.; Marone, C.; Tinti, E.; Di Stefano, G.; Collettini, C.

    2016-01-01

    Temporal changes in seismic velocity during the earthquake cycle have the potential to illuminate physical processes associated with fault weakening and connections between the range of fault slip behaviors, including slow earthquakes, tremor, and low-frequency earthquakes. Laboratory and theoretical studies predict changes in seismic velocity prior to earthquake failure; however, tectonic faults fail in a spectrum of modes and little is known about precursors for those modes. Here we show that precursory changes of wave speed occur in laboratory faults for the complete spectrum of failure modes observed for tectonic faults. We systematically altered the stiffness of the loading system to reproduce the transition from slow to fast stick-slip and monitored ultrasonic wave speed during frictional sliding. We find systematic variations of elastic properties during the seismic cycle for both slow and fast earthquakes, indicating similar physical mechanisms during rupture nucleation. Our data show that accelerated fault creep causes a reduction of seismic velocity and elastic moduli during the preparatory phase preceding failure, which suggests that real-time monitoring of active faults may be a means to detect earthquake precursors. PMID:27597879

  6. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework that also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to produce a comprehensive PTHA for the city of Naples (Italy), including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards with a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the southern Italian shoreline (we also consider the effects of inshore seismic sources and the associated active faults, for which we provide rupture properties), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources, mainly pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are here preliminarily analyzed and combined in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.

  7. Laboratory simulations of fluid-induced seismicity in shallow volcanic faults

    NASA Astrophysics Data System (ADS)

    Fazio, Marco; Benson, Philip; Vinciguerra, Sergio; Meredith, Philip

    2015-04-01

    Seismicity is a key tool used for monitoring fracturing and faulting in and around volcanoes, with a particular emphasis placed on low-frequency (Long Period or Low Frequency, LF) events thought to be due to fluid movement, as compared to Volcano-Tectonic activity driven by pure fracture. To better understand these fundamental processes, this research presents new rock deformation experiments designed to simulate shallow volcano-tectonic pressure/temperature conditions, linking pore fluid flow to the induced seismicity. A particular emphasis is placed on the conditions of pressure and temperature required to stimulate LF activity. Our setup imposes a rapid pore pressure release or "venting" via a small pre-drilled axial conduit to stimulate rapid fluid movement through an established fracture damage zone via a two-stage process. Firstly, experiments are conducted to generate a through-going shear fracture, with pore fluid connectivity to this fracture enhanced via the axial conduit. The shear failure is imaged via AE location with ~mm scale accuracy. The second stage vents pore fluid pressure via an electrical solenoid valve. We find that this second stage is accompanied by a swarm of LF activity akin to Long Period (LP) activity on active volcanoes. We find that a significant change in the dominant frequency of LF events is recorded as pore fluid pressure decreases through, and beyond, the water boiling point, and that the transition between LF and VLF events occurs at the pressure at which the superheated water turns to vapour. In addition, we observe a significant dependence of the recorded LF activity upon the fluid flow rate. Finally, we present new data using low frequency (200 kHz) AE sensors, in conjunction with our standard 1 MHz-central-frequency sensors, which permit us to better constrain LF and VLF events with lower attenuation, and hence an improved characterization of these LF seismic signals. Data are used to forecast the final time of failure via the fracture forecast

  8. Laboratory simulations of fluid-induced seismicity in shallow volcanic faults

    NASA Astrophysics Data System (ADS)

    Fazio, M.; Benson, P. M.; Vinciguerra, S.

    2014-12-01

    Seismicity is a key tool used for monitoring fracturing and faulting in and around volcanoes, with a particular emphasis placed on low-frequency (Long Period or Low Frequency, LF) events thought to be due to fluid movement, as compared to Volcano-Tectonic activity driven by pure fracture. To better understand these fundamental processes, this research presents new rock deformation experiments designed to simulate shallow volcano-tectonic pressure/temperature conditions, linking pore fluid flow to the induced seismicity. A particular emphasis is placed on the conditions of pressure and temperature required to stimulate LF activity. Our setup imposes a rapid pore pressure release or "venting" via a small pre-drilled axial conduit to stimulate rapid fluid movement through an established fracture damage zone via a two-stage process. Firstly, experiments are conducted to generate a through-going shear fracture, with pore fluid connectivity to this fracture enhanced via the axial conduit. The shear failure is imaged via AE location with ~mm scale accuracy. The second stage vents pore fluid pressure via an electrical solenoid valve. We find that this second stage is accompanied by a swarm of LF activity akin to Long Period (LP) activity on active volcanoes. We find that a significant change in the dominant frequency of LF events is recorded as pore fluid pressure decreases through, and beyond, the water boiling point, and that the transition between LF and VLF events occurs at the pressure at which the superheated water turns to vapour. In addition, we observe a significant dependence of the recorded LF activity upon the fluid flow rate. Finally, we present new data using low frequency (200 kHz) AE sensors, in conjunction with our standard 1 MHz-central-frequency sensors, which permit us to better constrain LF and VLF events with lower attenuation, and hence an improved characterization of these LF seismic signals. Data are used to forecast the final time of failure via the fracture forecast

  9. Some comparisons between mining-induced and laboratory earthquakes

    USGS Publications Warehouse

    McGarr, A.

    1994-01-01

    Although laboratory stick-slip friction experiments have long been regarded as analogs to natural crustal earthquakes, the potential use of laboratory results for understanding the earthquake source mechanism has not been fully exploited because of essential difficulties in relating seismographic data to measurements made in the controlled laboratory environment. Mining-induced earthquakes, however, provide a means of calibrating the seismic data in terms of laboratory results because, in contrast to natural earthquakes, the causative forces as well as the hypocentral conditions are known. A comparison of stick-slip friction events in a large granite sample with mining-induced earthquakes in South Africa and Canada indicates both similarities and differences between the two phenomena. The physics of unstable fault slip appears to be largely the same for both types of events. For example, both laboratory and mining-induced earthquakes have very low seismic efficiencies η = τa/τ̄, where τa is the apparent stress and τ̄ is the average stress acting on the fault plane to cause slip; nearly all of the energy released by faulting is consumed in overcoming friction. In more detail, the mining-induced earthquakes differ from the laboratory events in the behavior of η as a function of seismic moment M0. Whereas for the laboratory events η ≈ 0.06 independent of M0, η depends quite strongly on M0 for each set of induced earthquakes, with 0.06 serving, apparently, as an upper bound. It seems most likely that this observed scaling difference is due to variations in slip distribution over the fault plane. In the laboratory, a stick-slip event entails homogeneous slip over a fault of fixed area. For each set of induced earthquakes, the fault area appears to be approximately fixed but the slip is inhomogeneous, due presumably to barriers (zones of no slip) distributed over the fault plane; at constant τ̄, larger
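
    For concreteness, both quantities follow from standard definitions: the apparent stress is τa = μEs/M0 (rigidity times radiated seismic energy over seismic moment) and the seismic efficiency is η = τa/τ̄. A minimal sketch with illustrative numbers, not values from the paper:

    ```python
    mu = 3.0e10      # rigidity, Pa
    E_s = 1.0e6      # radiated seismic energy, J (illustrative)
    M0 = 1.0e12      # seismic moment, N m (illustrative)
    tau_bar = 2.0e6  # average stress acting on the fault to cause slip, Pa

    tau_a = mu * E_s / M0    # apparent stress, Pa
    eta = tau_a / tau_bar    # seismic efficiency; ~0.06 is the observed bound
    print(f"apparent stress = {tau_a:.1e} Pa, efficiency = {eta:.3f}")
    ```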

  10. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions that satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.

  11. Delineating chalk sand distribution of Ekofisk formation using probabilistic neural network (PNN) and stepwise regression (SWR): Case study Danish North Sea field

    NASA Astrophysics Data System (ADS)

    Haris, A.; Nafian, M.; Riyanto, A.

    2017-07-01

    The Danish North Sea fields consist of several formations (Ekofisk, Tor, and Cromer Knoll) ranging in age from the Paleocene to the Miocene. In this study, the integration of seismic and well log data sets is carried out to determine the chalk sand distribution in the Danish North Sea field. The integration of seismic and well log data sets is performed by using seismic inversion analysis and seismic multi-attribute analysis. The seismic inversion algorithm used to derive acoustic impedance (AI) is a model-based technique. The derived AI is then used as an external attribute for the input of the multi-attribute analysis. Moreover, the multi-attribute analysis is used to generate linear and non-linear transformations between seismic attributes and well log properties. In the linear case, the selected transformation is obtained by weighted stepwise linear regression (SWR), while the non-linear case is handled by probabilistic neural networks (PNN). The porosity estimated by the PNN fits the well log data better than the SWR results. This result can be understood since the PNN performs non-linear regression, so that the relationship between the attribute data and the predicted log data can be optimized. The distribution of chalk sand has been successfully identified and characterized by porosity values ranging from 23% up to 30%.
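
    The PNN estimator used in this kind of multi-attribute work is, at its core, a Gaussian-kernel regression: the predicted log property at a seismic sample is a distance-weighted average of the values observed at the wells. A minimal sketch with illustrative, standardized attribute values rather than the study's data:

    ```python
    import numpy as np

    def pnn_predict(x_new, X_train, y_train, sigma=1.0):
        """PNN/GRNN-style kernel regression; attributes are assumed
        standardized to comparable scales."""
        d2 = np.sum((X_train - x_new) ** 2, axis=1)
        w = np.exp(-d2 / sigma**2)
        return np.sum(w * y_train) / np.sum(w)

    # Standardized attributes at well locations (e.g., model-based AI plus an
    # amplitude attribute) and the porosities measured there.
    X_train = np.array([[ 1.2, -0.5],
                        [-0.8,  0.9],
                        [ 0.1,  0.3]])
    y_train = np.array([0.24, 0.30, 0.27])      # porosity fraction
    print(pnn_predict(np.array([0.0, 0.0]), X_train, y_train))
    ```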

  12. Up-to-date Probabilistic Earthquake Hazard Maps for Egypt

    NASA Astrophysics Data System (ADS)

    Gaber, Hanan; El-Hadidy, Mahmoud; Badawy, Ahmed

    2018-04-01

    An up-to-date earthquake hazard analysis has been performed for Egypt using a probabilistic seismic hazard approach. In the current study, we use a complete and homogeneous earthquake catalog covering the time period between 2200 BC and 2015 AD. Three seismotectonic models representing the seismic activity in and around Egypt are used. A logic-tree framework is applied to allow for the epistemic uncertainty in the declustering parameters, minimum magnitude, seismotectonic setting, and ground-motion prediction equations. The hazard analysis is performed on a grid of 0.5° × 0.5° for rock site conditions, in terms of peak ground acceleration (PGA) and spectral acceleration at 0.2-, 0.5-, 1.0- and 2.0-s periods. The hazard is estimated for three return periods (72, 475 and 2475 years), corresponding to 50, 10 and 2% probability of exceedance in 50 years. The uniform hazard spectra for the cities of Cairo, Alexandria, Aswan and Nuwbia are constructed. The hazard maps show that the highest ground acceleration values are expected in the northeastern part of Egypt around the Gulf of Aqaba (PGA up to 0.4 g for a return period of 475 years) and in southern Egypt around the city of Aswan (PGA up to 0.2 g for a return period of 475 years). The Western Desert of Egypt is characterized by the lowest level of hazard (PGA lower than 0.1 g for a return period of 475 years).
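
    The logic-tree mechanics can be sketched in a few lines: each end branch (one choice of declustering, minimum magnitude, seismotectonic model, and GMPE) yields its own hazard curve, and the branch weights combine these into a mean curve from which design values are read. The weights and rates below are illustrative, not the study's values.

    ```python
    import numpy as np

    pga_grid = np.array([0.05, 0.1, 0.2, 0.4])      # g
    branches = [                                    # (weight, exceedance rates)
        (0.5, np.array([2e-2, 8e-3, 2e-3, 3e-4])),
        (0.3, np.array([3e-2, 1e-2, 3e-3, 5e-4])),
        (0.2, np.array([1e-2, 5e-3, 1e-3, 1e-4])),
    ]
    mean_rate = sum(w * r for w, r in branches)     # weighted mean hazard curve

    # PGA with 10% probability of exceedance in 50 yr (475-yr return period):
    target = -np.log(1.0 - 0.10) / 50.0
    pga_475 = np.interp(np.log(target), np.log(mean_rate[::-1]), pga_grid[::-1])
    print(f"PGA(475 yr) ~ {pga_475:.2f} g")
    ```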

  13. Detecting Seismic Events Using a Supervised Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Burks, L.; Forrest, R.; Ray, J.; Young, C.

    2017-12-01

    We explore the use of supervised hidden Markov models (HMMs) to detect seismic events in streaming seismogram data. Current methods for seismic event detection include simple triggering algorithms, such as STA/LTA and the Z-statistic, which can lead to large numbers of false positives that must be investigated by an analyst. The hypothesis of this study is that more advanced detection methods, such as HMMs, may decrease false positives while maintaining accuracy similar to current methods. We train a binary HMM classifier using two weeks of 3-component waveform data from the International Monitoring System (IMS) that was carefully reviewed by an expert analyst to pick all seismic events. Using an ensemble of simple and discrete features, such as the triggering of STA/LTA, the HMM predicts the time at which the transition from noise to signal occurs. Compared to the STA/LTA detection algorithm, the HMM detects more true events, but the false positive rate remains unacceptably high. Future work to potentially decrease the false positive rate may include using continuous features, a Gaussian HMM, and multi-class HMMs to distinguish between types of seismic waves (e.g., P-waves and S-waves). Acknowledgement: Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND No.: SAND2017-8154 A
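
    The detection idea reduces to a two-state (noise/signal) HMM over discrete features, with the decoded state path marking the noise-to-signal transition. A minimal Viterbi sketch in which the observation is simply whether a trigger such as STA/LTA fired in each time step; all probabilities are illustrative, not the trained values.

    ```python
    import numpy as np

    states = ("noise", "signal")
    log_pi = np.log([0.99, 0.01])                # initial state probabilities
    log_A = np.log([[0.995, 0.005],              # state transition probabilities
                    [0.02,  0.98]])
    log_B = np.log([[0.9, 0.1],                  # P(obs | state), obs in {0, 1}
                    [0.3, 0.7]])

    def viterbi(obs):
        T = len(obs)
        delta = np.zeros((T, 2))
        psi = np.zeros((T, 2), dtype=int)
        delta[0] = log_pi + log_B[:, obs[0]]
        for t in range(1, T):
            trans = delta[t - 1][:, None] + log_A
            psi[t] = trans.argmax(axis=0)
            delta[t] = trans.max(axis=0) + log_B[:, obs[t]]
        path = [int(delta[-1].argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(psi[t][path[-1]]))
        return path[::-1]

    obs = [0, 0, 0, 1, 0, 1, 1, 1, 1, 0]         # trigger fired (1) or not (0)
    print([states[s] for s in viterbi(obs)])
    ```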

  14. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  15. Fault healing promotes high-frequency earthquakes in laboratory experiments and on natural faults

    USGS Publications Warehouse

    McLaskey, Gregory C.; Thomas, Amanda M.; Glaser, Steven D.; Nadeau, Robert M.

    2012-01-01

    Faults strengthen or heal with time in stationary contact, and this healing may be an essential ingredient for the generation of earthquakes. In the laboratory, healing is thought to be the result of thermally activated mechanisms that weld together micrometre-sized asperity contacts on the fault surface, but the relationship between laboratory measures of fault healing and the seismically observable properties of earthquakes is at present not well defined. Here we report on laboratory experiments and seismological observations that show how the spectral properties of earthquakes vary as a function of fault healing time. In the laboratory, we find that increased healing causes a disproportionately large amount of high-frequency seismic radiation to be produced during fault rupture. We observe a similar connection between earthquake spectra and recurrence time for repeating earthquake sequences on natural faults. Healing rates depend on pressure, temperature and mineralogy, so the connection between seismicity and healing may help to explain recent observations of large megathrust earthquakes which indicate that energetic, high-frequency seismic radiation originates from locations that are distinct from the geodetically inferred locations of large-amplitude fault slip.

  16. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and thus understanding the earthquake phenomena and their effects at the Earth's surface is an important step toward educating the population in earthquake-affected regions of the country and raising awareness of the earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN - INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing several comprehensive educational materials, and designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. Thus a large amount of such data will be used by students and teachers for educational purposes. For the social objectives, the project represents an effective instrument for informing and creating an awareness of the seismic risk, for experimentation into the efficacy of scientific communication, and for an increase in the direct involvement of schools and the general public. A network of nine seismic stations with SEP seismometers

  17. Seismic measurements of explosions in the Tatum Salt Dome, Mississippi

    USGS Publications Warehouse

    Borcherdt, Roger D.; Healy, J.H.; Jackson, W.H.; Warren, D.R.

    1967-01-01

    Project Sterling provided for the detonation of a nuclear device in the cavity resulting from the Salmon nuclear explosion in the Tatum salt dome in southern Mississippi. It also provided for a high explosive (HE) comparison shot in a nearby drill hole. The purpose of the experiment was to gather information on the seismic decoupling of a nuclear explosion in a cavity by comparing seismic signals from a nuclear shot in the Salmon cavity with seismic signals recorded from Salmon and with seismic signals recorded from a small (about 2 tons) HE shot in the salt dome. Surface seismic measurements were made by the U.S. Geological Survey, the U.S. Coast and Geodetic Survey, and the Air Force Technical Applications Center, with coordination and overall direction by the Lawrence Radiation Laboratory. This report covers only the seismic measurements made by the U.S. Geological Survey. The first objective of this report is to describe the field recording procedures and the data obtained by the U.S. Geological Survey from these events. The second objective is to describe the spectral analyses which have been made on the data and the relative seismic amplitudes which have been determined from these analyses.

  18. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
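
    One iteration of the adjustment can be sketched as follows: draw sensitivity and specificity from the bias ("prior") distributions, back-correct the observed 2x2 counts, and recompute the odds ratio; repeating gives a distribution of adjusted estimates. The sketch uses a nondifferential draw for brevity (the differential case discussed here would draw separate values for cases and controls); all counts and priors are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def corrected_count(exposed, total, se, sp):
        """Invert misclassification: observed = se*E + (1 - sp)*(total - E)."""
        return (exposed - (1.0 - sp) * total) / (se - (1.0 - sp))

    a_obs, n_cases = 120, 300    # exposed cases / all cases (illustrative)
    b_obs, n_ctrls = 80, 300     # exposed controls / all controls

    adjusted_or = []
    for _ in range(5000):
        se = rng.beta(40, 10)    # prior on sensitivity, mean ~0.8
        sp = rng.beta(45, 5)     # prior on specificity, mean ~0.9
        a = corrected_count(a_obs, n_cases, se, sp)
        b = corrected_count(b_obs, n_ctrls, se, sp)
        if 0 < a < n_cases and 0 < b < n_ctrls:   # discard impossible draws
            adjusted_or.append((a / (n_cases - a)) / (b / (n_ctrls - b)))

    print(np.percentile(adjusted_or, [2.5, 50, 97.5]))
    ```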

  19. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
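
    The MPN half of this comparison is easy to make concrete: a tube receiving sample volume v is non-sterile with probability 1 - exp(-c*v) when the true concentration is c, and the MPN is the value of c that maximizes the likelihood of the observed pattern of positive tubes. A minimal sketch with an illustrative three-dilution, five-tube design:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    volumes = np.array([0.1, 0.01, 0.001])   # sample volume per tube, mL
    n_tubes = np.array([5, 5, 5])
    positives = np.array([5, 3, 1])          # positive tubes per dilution

    def neg_log_lik(log_c):
        c = np.exp(log_c)
        p = np.clip(1.0 - np.exp(-c * volumes), 1e-12, 1.0 - 1e-12)
        return -np.sum(positives * np.log(p)
                       + (n_tubes - positives) * np.log(1.0 - p))

    res = minimize_scalar(neg_log_lik, bounds=(np.log(0.01), np.log(1e5)),
                          method="bounded")
    print(f"MPN ~ {np.exp(res.x):.0f} organisms per mL")
    ```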

  20. Modeling Forced Imbibition Processes and the Associated Seismic Attenuation in Heterogeneous Porous Rocks

    NASA Astrophysics Data System (ADS)

    Solazzi, Santiago G.; Guarracino, Luis; Rubino, J. Germán; Müller, Tobias M.; Holliger, Klaus

    2017-11-01

    Quantifying seismic attenuation during laboratory imbibition experiments can provide useful information toward the use of seismic waves for monitoring injection and extraction of fluids in the Earth's crust. However, a deeper understanding of the physical causes producing the observed attenuation is needed for this purpose. In this work, we analyze seismic attenuation due to mesoscopic wave-induced fluid flow (WIFF) produced by realistic fluid distributions representative of imbibition experiments. To do so, we first perform two-phase flow simulations in a heterogeneous rock sample to emulate a forced imbibition experiment. We then select a subsample of the considered rock containing the resulting time-dependent saturation fields and apply a numerical upscaling procedure to compute the associated seismic attenuation. By exploring both saturation distributions and seismic attenuation, we observe that two manifestations of WIFF arise during imbibition experiments: the first one is produced by the compressibility contrast associated with the saturation front, whereas the second one is due to the presence of patches containing very high amounts of water that are located behind the saturation front. We demonstrate that while the former process is expected to play a significant role in the case of high injection rates, which are associated with viscous-dominated imbibition processes, the latter becomes predominant during capillary-dominated processes, that is, for relatively low injection rates. We conclude that this kind of joint numerical analysis constitutes a useful tool for improving our understanding of the physical mechanisms producing seismic attenuation during laboratory imbibition experiments.

  1. Seismic Hazard and risk assessment for Romania -Bulgaria cross-border region

    NASA Astrophysics Data System (ADS)

    Simeonova, Stela; Solakov, Dimcho; Alexandrova, Irena; Vaseva, Elena; Trifonova, Petya; Raykova, Plamena

    2016-04-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic hazard and vulnerability to earthquakes are steadily increasing as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. The assessment of seismic hazard and risk is particularly important, because it provides valuable information for seismic safety and disaster mitigation, and it supports decision making for the benefit of society. Romania and Bulgaria, situated in the Balkan Region as part of the Alpine-Himalayan seismic belt, are characterized by high seismicity and are exposed to a high seismic risk. Over the centuries, both countries have experienced strong earthquakes. The cross-border region encompassing northern Bulgaria and southern Romania is a territory prone to the effects of strong earthquakes. The area is significantly affected by earthquakes that occurred in both countries: on the one hand, the events generated by the Vrancea intermediate-depth seismic source in Romania, and on the other hand, the crustal seismicity originating in the seismic sources Shabla (SHB), Dulovo, and Gorna Orjahovitza (GO) in Bulgaria. The Vrancea seismogenic zone of Romania is a very peculiar seismic source, often described as unique in the world, and it represents a major concern for most of the northern part of Bulgaria as well. In the present study, the seismic hazard for the Romania-Bulgaria cross-border region is assessed on the basis of integrated basic geo-datasets. The hazard results are obtained by applying two alternative approaches - probabilistic and deterministic. The MSK64 intensity (the MSK64 scale is practically equivalent to the new EMS98) is used as the output parameter for the hazard maps. We prefer to use the macroseismic intensity here instead of PGA, because it is directly related to the degree of damage and, moreover, the epicentral intensity is the original

  2. DSOD Procedures for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Howard, J. K.; Fraser, W. A.

    2005-12-01

    DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios of nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to the 2000 IBC's for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. Minimum

  3. Temblor, an App to Transform Seismic Science into Personal Risk Reduction

    NASA Astrophysics Data System (ADS)

    Sevilgen, V.; Jacobson, D. S.; Stein, R. S.; Lotto, G. C.; Sevilgen, S.; Kim, A.

    2016-12-01

    Government agencies and academic researchers provide a rich stream of seismic and engineering data. In addition to rapid earthquake notifications and damage assessments, these form the basis of probabilistic seismic hazard assessments and loss evaluations used by emergency management agencies, practicing engineers and geologists, and the insurance industry. But the data and the assessments that grow out of them are notoriously difficult for the public to comprehend. For example, who but the cognoscenti understands what "2% exceedance probability in 50 years," "0.5 g peak ground acceleration," or "moment-magnitude" mean? Nowhere is this divide more stark than in earthquake insurance. Using proprietary models, insurers calculate the probability of a payout above the deductible for your home policy, but sell the policy as "peace of mind" or "the strength to rebuild." How can a homeowner act in her best financial interests under these circumstances? Temblor (temblor.net) is our attempt to make seismic risk lucid, personal, and actionable. Free and ad-free, Temblor uses the best available public data and methods. Temblor gives you the seismic hazard rank of your location anywhere in the U.S. In its maps, you can see the active faults and recent quakes, and the landslide, liquefaction, tsunami inundation, and flood zones around you. Temblor also displays the Global Earthquake Activity Rate (GEAR) model of Bird et al. (2015). By entering the construction year and square footage for homes within the U.S., you learn the likely cost for seismic damage, and how that cost could be reduced by retrofit or covered by insurance. To give context to this decision, the app compares your seismic risk to other risks homeowners protect themselves against or insure for. Temblor estimates the cost and the most probable financial and safety benefits of a retrofit based on your location, home age and size, so you can decide if the expenditure makes sense. Seeking to make quakes more

  4. Extreme Threshold Failures Within a Heterogeneous Elastic Thin Sheet and the Spatial-Temporal Development of Induced Seismicity Within the Groningen Gas Field

    NASA Astrophysics Data System (ADS)

    Bourne, S. J.; Oates, S. J.

    2017-12-01

    Measurements of the strains and earthquakes induced by fluid extraction from a subsurface reservoir reveal a transient, exponential-like increase in seismicity relative to the volume of fluids extracted. If the frictional strength of these reactivating faults is heterogeneously and randomly distributed, then progressive failures of the weakest fault patches account in a general manner for this initial exponential-like trend. Allowing for the observable elastic and geometric heterogeneity of the reservoir, the spatiotemporal evolution of induced seismicity over 5 years is predictable without significant bias using a statistical physics model of poroelastic reservoir deformations inducing extreme threshold frictional failures of previously inactive faults. This model is used to forecast the temporal and spatial probability density of earthquakes within the Groningen natural gas reservoir, conditional on future gas production plans. Probabilistic seismic hazard and risk assessments based on these forecasts inform the current gas production policy and building strengthening plans.
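
    The qualitative mechanism is simple to reproduce: draw patch strengths from a heterogeneous random distribution, let the induced stress grow with the extracted volume, and count the patches whose thresholds have been exceeded; in the lower tail the count rises faster than linearly, giving the exponential-like onset described above. A toy sketch with an illustrative strength distribution:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    strengths = rng.weibull(2.0, size=200_000)   # heterogeneous patch strengths
    loads = np.linspace(0.0, 0.5, 6)             # stress ~ extracted volume
    events = np.array([(strengths < s).sum() for s in loads])
    print(events)   # cumulative failures accelerate with the induced load
    ```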

  5. High Resolution Vertical Seismic Profile from the Chicxulub IODP/ICDP Expedition 364 Borehole: Wave Speeds and Seismic Reflectivity.

    NASA Astrophysics Data System (ADS)

    Nixon, C.; Kofman, R.; Schmitt, D. R.; Lofi, J.; Gulick, S. P. S.; Christeson, G. L.; Saustrup, S., Sr.; Morgan, J. V.

    2017-12-01

    We acquired a closely-spaced vertical seismic profile (VSP) in the Chicxulub K-Pg Impact Crater drilling program borehole to calibrate the existing surface seismic profiles and provide complementary measurements of in situ seismic wave speeds. Downhole seismic records were obtained at spacings ranging from 1.25 m to 5 m along the borehole from 47.5 m to 1325 mwsf (meters wireline below sea floor) (Fig 1a) using a Sercel Slimwave™ geophone chain (University of Alberta). The seismic source was a 30/30ci Sercel Mini GI airgun (University of Texas), fired a minimum of 5 times per station. Seismic data processing used a combination of a commercial processing package (Schlumberger's VISTA) and Matlab™ codes. The VSP displays detailed reflectivity (Fig. 1a), with the strongest reflection seen at 600 mwsf (280 ms one-way time), geologically corresponding to the sharp contact between the post-impact sediments and the target peak ring rock, thus confirming the pre-drilling interpretations of the seismic profiles. A two-way time trace extracted from the separated up-going wavefield matches the major reflection both in travel time and character. In the granitic rocks that form the peak ring of the Chicxulub impact crater, we observe P-wave velocities of 4000-4500 m/s, which are significantly less than the expected values for granitoids (~6000 m/s) (Fig. 1b). The VSP-measured wave speeds are confirmed by downhole sonic logging and laboratory velocimetry measurements; these data provide additional evidence that the crustal material displaced by the impact experienced a significant amount of damage. Samples and data provided by IODP. Samples can be requested at http://web.iodp.tamu.edu/sdrm after 19 October 2017. Expedition 364 was jointly funded by ECORD, ICDP, and IODP with contributions and logistical support from the Yucatan State Government and UNAM. The downhole seismic chain and wireline system is funded by grants to DRS from the Canada Foundation for Innovation and
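
    The basic reduction from VSP first-break picks to in situ wave speed is interval depth over interval one-way time. A minimal sketch with illustrative depth-time pairs consistent with the ~4000 m/s values reported for the peak-ring granites:

    ```python
    import numpy as np

    depth = np.array([590.0, 600.0, 610.0, 620.0])          # mwsf
    t_oneway = np.array([0.2775, 0.2800, 0.2824, 0.2848])   # s, first breaks

    v_interval = np.diff(depth) / np.diff(t_oneway)         # m/s per interval
    print(v_interval)
    ```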

  6. Estimating the economic impact of seismic activity in Kyrgyzstan

    NASA Astrophysics Data System (ADS)

    Pittore, Massimiliano; Sousa, Luis; Grant, Damian; Fleming, Kevin; Parolai, Stefano; Free, Matthew; Moldobekov, Bolot; Takeuchi, Ko

    2017-04-01

    Estimating the short- and long-term economic impact of large-scale damaging events such as earthquakes, tsunamis or tropical storms is an important component of risk assessment, whose outcomes are routinely used to improve risk awareness, optimize investments in prevention and mitigation actions, and customize insurance and reinsurance rates for specific geographical regions or single countries. Such estimations can be carried out by modeling the whole causal process, from hazard assessment to the estimation of loss for specific categories of assets. This approach allows a precise description of the various physical mechanisms contributing to direct seismic losses. However, it should reflect the underlying epistemic and random uncertainties in all involved components in a meaningful way. Within a project sponsored by the World Bank, a seismic risk study for the Kyrgyz Republic has been conducted, focusing on the assessment of social and economic impacts in terms of direct losses to the residential and public building stocks. Probabilistic estimates based on stochastic event catalogs have been computed and integrated with the simulation of specific earthquake scenarios. Although very few relevant data are available in the region on the economic consequences of past damaging events, the proposed approach sets a benchmark for decision makers and policy holders to better understand the short- and long-term consequences of earthquakes in the region. The presented results confirm the high level of seismic risk of the Kyrgyz Republic territory and outline the most affected regions, thus advocating for significant Disaster Risk Reduction (DRR) measures to be implemented by local decision- and policy-makers.

  7. Frictional melt and seismic slip

    NASA Astrophysics Data System (ADS)

    Nielsen, S.; di Toro, G.; Hirose, T.; Shimamoto, T.

    2008-01-01

    Frictional melt is implied in a variety of processes such as seismic slip, ice skating, and meteorite combustion. A steady state can be reached when melt is continuously produced and extruded from the sliding interface, as shown recently in a number of laboratory rock friction experiments. A thin, low-viscosity, high-temperature melt layer is formed, resulting in low shear resistance. A theoretical solution describing the coupling of shear heating, thermal diffusion, and extrusion is obtained, without imposing a priori the melt thickness. The steady-state shear traction can be approximated at high slip rates by the theoretical form τss = σn^(1/4) (A/√R) √(W/V) under a normal stress σn, slip rate V, and radius of contact area R (A is a dimensional normalizing factor and W is a characteristic rate). Although the model offers a rather simplified view of a complex process, the predictions are compatible with experimental observations. In particular, we consider laboratory simulations of seismic slip on earthquake faults. A series of high-velocity rotary shear experiments on rocks, performed for σn in the range 1-20 MPa and slip rates in the range 0.5-2 m s-1, is confronted with the theoretical model. The behavior is reasonably well reproduced, though the effect of radiation loss taking place in the experiment somewhat alters the data. The scaling of friction with σn, R, and V in the presence of melt suggests that extrapolation of laboratory measures to the real Earth is a highly nonlinear, nontrivial exercise.

  8. A method for producing digital probabilistic seismic landslide hazard maps; an example from the Los Angeles, California, area

    USGS Publications Warehouse

    Jibson, Randall W.; Harp, Edwin L.; Michael, John A.

    1998-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24,000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10-m grid spacing in the ARC/INFO GIS platform. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure.
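
    The core of the dynamic model, Newmark's permanent-deformation analysis, can be sketched in a few lines: the block accumulates sliding velocity only while the ground acceleration exceeds the critical (yield) acceleration of the slope, and the permanent displacement is the time integral of that sliding velocity. A one-directional rigid-block simplification with a synthetic accelerogram and an illustrative critical acceleration:

    ```python
    import numpy as np

    dt = 0.01                                                     # s
    t = np.arange(0.0, 10.0, dt)
    acc = 3.0 * np.exp(-0.3 * t) * np.sin(2.0 * np.pi * 1.5 * t)  # m/s^2
    ac = 1.0              # critical (yield) acceleration, m/s^2

    vel, disp = 0.0, 0.0
    for a in acc:
        if vel > 0.0 or a > ac:                 # sliding, or slip initiates
            vel = max(vel + (a - ac) * dt, 0.0) # block decelerates while a < ac
            disp += vel * dt

    print(f"Newmark displacement ~ {100.0 * disp:.1f} cm")
    ```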

  9. User perception and interpretation of tornado probabilistic hazard information: Comparison of four graphical designs.

    PubMed

    Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans

    2017-11-01

    Effective design for presenting severe weather information is important to reduce the devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by the NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale" and "contour", on users' perception, interpretation, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that accuracy was significantly higher and response time faster when PHI was displayed on the map background, as compared to the radar background, due to better contrast. When displayed on a radar background, the "grayscale" design resulted in higher response accuracy. Possibly due to familiarity, participants reported the "four-color" design as their favorite, and it also yielded the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color-coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving, judging, feeling, and acting approach in processing probabilistic hazard information for tornadoes.

  10. Seismic source parameters of the induced seismicity at The Geysers geothermal area, California, by a generalized inversion approach

    NASA Astrophysics Data System (ADS)

    Picozzi, Matteo; Oth, Adrien; Parolai, Stefano; Bindi, Dino; De Landro, Grazia; Amoroso, Ortensia

    2017-04-01

    The accurate determination of stress drop and seismic efficiency, and of how source parameters scale with earthquake size, is important for seismic hazard assessment of induced seismicity. We propose an improved non-parametric, data-driven strategy suitable for monitoring induced seismicity, which combines the generalized inversion technique with genetic algorithms. In the first step of the analysis, the generalized inversion technique allows for an effective correction of waveforms for the attenuation and site contributions. Then, the retrieved source spectra are inverted by a non-linear, sensitivity-driven inversion scheme that allows accurate estimation of source parameters. We investigate the earthquake source characteristics of 633 induced earthquakes (ML 2-4.5) recorded at The Geysers geothermal field (California) by a dense seismic network (i.e., 32 stations of the Lawrence Berkeley National Laboratory Geysers/Calpine surface seismic network, more than 17,000 velocity records). For most of the events we find non-self-similar behavior, empirical source spectra that require an ω^γ source model with γ > 2 to be well fitted, and small radiation efficiency ηSW. All these findings suggest different dynamic rupture processes for smaller and larger earthquakes, and that the proportion of high-frequency energy radiation and the amount of energy required to overcome friction or to create new fracture surface change with earthquake size. Furthermore, we observe two distinct families of events with peculiar source parameters that, in one case, suggest the reactivation of deep structures linked to the regional tectonics, while in the other support the idea of an important role of steeply dipping faults in fluid pressure diffusion.
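
    A minimal sketch of fitting a generalized omega-gamma source model to a displacement source spectrum (a plain least-squares fit on synthetic data, not the authors' sensitivity-driven inversion):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Generalized source model: S(f) = Omega0 / (1 + (f/fc)**gamma);
    # gamma = 2 recovers the classic Brune spectrum.
    def source_model(f, omega0, fc, gamma):
        return omega0 / (1.0 + (f / fc) ** gamma)

    f = np.logspace(-1, 2, 200)  # frequency [Hz]
    true_spec = source_model(f, omega0=1e-3, fc=5.0, gamma=2.5)
    noise = np.exp(0.1 * np.random.default_rng(1).standard_normal(f.size))
    obs = true_spec * noise      # synthetic "observed" spectrum

    popt, _ = curve_fit(source_model, f, obs, p0=[1e-3, 1.0, 2.0])
    omega0, fc, gamma = popt
    print(f"Omega0={omega0:.2e}, fc={fc:.2f} Hz, gamma={gamma:.2f}")
    ```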

  11. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response of a composite blade to high-velocity impact is probabilistically evaluated. The evaluation focuses on quantifying the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluation are given in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at a 0.999 probability of structural failure and substantial damage tolerance at a 0.01 probability.

  12. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources, and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications.
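
    The following toy registry is not the actual UCVM API; it only illustrates, under assumed names and values, the core idea of one query interface serving interchangeable velocity models:

    ```python
    import numpy as np

    class Layered1DModel:
        """1D background model: properties depend on depth only."""
        def __init__(self, tops_km, vp, vs, rho):
            self.tops_km, self.vp, self.vs, self.rho = tops_km, vp, vs, rho

        def query(self, lon, lat, depth_km):
            # lon/lat are ignored by a 1D model but kept for the common
            # interface that 3D models would also implement.
            i = np.searchsorted(self.tops_km, depth_km, side="right") - 1
            return self.vp[i], self.vs[i], self.rho[i]

    REGISTRY = {
        "1d_background": Layered1DModel(
            tops_km=np.array([0.0, 1.0, 5.0, 20.0]),
            vp=np.array([2.5, 4.5, 6.1, 7.8]),    # km/s
            vs=np.array([1.2, 2.6, 3.5, 4.4]),    # km/s
            rho=np.array([2.1, 2.5, 2.7, 3.2]),   # g/cm^3
        ),
    }

    def query(model_name, lon, lat, depth_km):
        """Uniform entry point, regardless of how the model is stored."""
        return REGISTRY[model_name].query(lon, lat, depth_km)

    print(query("1d_background", -118.2, 34.1, 3.0))  # -> (4.5, 2.6, 2.5)
    ```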

  13. Small Arrays for Seismic Intruder Detections: A Simulation Based Experiment

    NASA Astrophysics Data System (ADS)

    Pitarka, A.

    2014-12-01

    Seismic sensors such as geophones and fiber-optic cables have been increasingly recognized as promising technologies for intelligence surveillance, including intruder detection and perimeter defense systems. Geophone arrays can provide cost-effective intruder detection for protecting assets with large perimeters. A seismic intruder detection system uses one or multiple geophone arrays designed to record seismic signals from footsteps and ground vehicles. Using a series of real-time signal processing algorithms, the system detects, classifies and monitors the intruder's movement. We have carried out numerical experiments to demonstrate the capability of a seismic array to detect moving targets that generate seismic signals. The seismic source is modeled as a vertical force acting on the ground that generates continuous impulsive seismic signals with different predominant frequencies. Frequency-wavenumber analysis of the synthetic array data was used to demonstrate the array's capability to accurately determine the intruder's direction of movement. The performance of the array was also analyzed for detecting two or more objects moving at the same time. One of the drawbacks of a single-array system is its inefficiency at detecting seismic signals deflected by large underground objects. We show simulation results of the effect of an underground concrete block in shielding the seismic signal coming from an intruder. Based on the simulations, we found that multiple small arrays can greatly improve the system's detection capability in the presence of underground structures. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
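
    A minimal time-domain analogue of the frequency-wavenumber processing mentioned above: delay-and-sum beamforming over a slowness grid to recover the direction of an incoming signal (array geometry and waveform are synthetic):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 500.0
    t = np.arange(0.0, 2.0, 1.0 / fs)
    coords = rng.uniform(-20, 20, (8, 2))  # 8 geophones, x/y in meters

    def wavelet(tt):
        return np.exp(-((tt - 1.0) / 0.02) ** 2)

    # Plane wave from back-azimuth 60 deg at 300 m/s (e.g., a surface wave).
    baz_true, c = np.deg2rad(60.0), 300.0
    s_true = np.array([np.sin(baz_true), np.cos(baz_true)]) / c
    data = np.array([wavelet(t - coords[i] @ s_true) for i in range(8)])

    # Scan a slowness grid; the beam with maximum stacked energy wins.
    best_baz, best_pow = None, -1.0
    for baz in np.deg2rad(np.arange(0, 360, 2)):
        for slow in np.linspace(1 / 1000, 1 / 100, 30):
            s = slow * np.array([np.sin(baz), np.cos(baz)])
            beam = np.mean([np.interp(t + coords[i] @ s, t, data[i])
                            for i in range(8)], axis=0)
            power = np.sum(beam ** 2)
            if power > best_pow:
                best_pow, best_baz = power, np.rad2deg(baz)
    print(f"Estimated back-azimuth: {best_baz:.0f} deg (true 60)")
    ```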

  14. The underground seismic array of Gran Sasso (UNDERSEIS), central Italy

    NASA Astrophysics Data System (ADS)

    Scarpa, R.; Muscente, R.; Tronca, F.; Fischione, C.; Rotella, P.; Abril, M.; Alguacil, G.; Martini, M.; de Cesare, W.

    2003-04-01

    Since early May 2002, a small-aperture seismic array has been installed in the underground Physics Laboratories of Gran Sasso, located near seismically active faults of the central Apennines, Italy. The array is presently composed of 21 three-component short-period seismic stations (Mark L4C-3D), with an average station spacing of 90 m and a semi-circular aperture of 400 m × 600 m. It intersects a main seismogenic fault where the presence of slow earthquakes has recently been detected by two wide-band geodetic laser interferometers. The underground Laboratories are shielded by a limestone rock layer 1400 m thick. Each seismometer is linked, through a 24-bit A/D board, to a set of six industrial PCs via serial RS-485. The six PCs transmit data to a server through an Ethernet network. Time synchronization is provided by a master oscillator controlled by an atomic clock. The Earthworm package is used for data selection and transmission. High-quality data have been recorded since May 2002, including local and regional earthquakes; in particular, the 31 October 2002 Molise earthquake (Mw = 5.8) and its aftershocks were recorded at this array. Array techniques such as polarisation and frequency-slowness analyses with the MUSIC algorithm indicate the high performance of this array, as compared to the national seismic network, in identifying the basic source parameters of earthquakes located at distances of a few hundred kilometers.

  15. 50 years of Global Seismic Observations

    NASA Astrophysics Data System (ADS)

    Anderson, K. R.; Butler, R.; Berger, J.; Davis, P.; Derr, J.; Gee, L.; Hutt, C. R.; Leith, W. S.; Park, J. J.

    2007-12-01

    Seismological recordings have been made on Earth in some form for hundreds of years; however, global monitoring of earthquakes only began in the 1890s, when John Milne created 40 seismic observatories to measure the waves from these events. Shortly after the International Geophysical Year (IGY), a concerted effort was made to establish and maintain a more modern, standardized seismic network on the global scale. In the early 1960s, the World-Wide Standardized Seismograph Network (WWSSN) was established through funding from the Advanced Research Projects Agency (ARPA) and was installed and maintained by the USGS's Albuquerque Seismological Laboratory (then a part of the US Coast and Geodetic Survey). This network of identical seismic instruments consisted of 120 stations in 60 countries. Although the network was motivated by nuclear test monitoring, the WWSSN facilitated numerous advances in observational seismology. From the IGY to the present, the network has been upgraded (High-Gain Long-Period Seismograph Network, Seismic Research Observatories, Digital WWSSN, Global Telemetered Seismograph Network, etc.) and expanded (International Deployment of Accelerometers, US National Seismic Network, China Digital Seismograph Network, Joint Seismic Project, etc.), bringing the modern-day Global Seismographic Network (GSN) to its current state of approximately 150 stations. The GSN consists of state-of-the-art very-broadband seismic transducers, continuous power and communications, and ancillary sensors including geodetic, geomagnetic, microbarographic, meteorological and other related instrumentation. Beyond the GSN, the system of global network observatories includes contributions from other international partners (e.g., GEOSCOPE, GEOFON, MEDNET, F-Net, CTBTO), forming an even larger backbone of permanent seismological observatories as a part of the International Federation of Digital Seismograph Networks. 50 years of seismic network operations have provided

  16. Integration of P- and SH-wave high-resolution seismic reflection and micro-gravity techniques to improve interpretation of shallow subsurface structure: New Madrid seismic zone

    USGS Publications Warehouse

    Bexfield, C.E.; McBride, J.H.; Pugin, Andre J.M.; Ravat, D.; Biswas, S.; Nelson, W.J.; Larson, T.H.; Sargent, S.L.; Fillerup, M.A.; Tingey, B.E.; Wald, L.; Northcott, M.L.; South, J.V.; Okure, M.S.; Chandler, M.R.

    2006-01-01

    Shallow high-resolution seismic reflection surveys have traditionally been restricted to either compressional (P) or horizontally polarized shear (SH) waves in order to produce 2-D images of subsurface structure. The northernmost Mississippi embayment and coincident New Madrid seismic zone (NMSZ) provide an ideal laboratory to study the combined use of P- and SH-wave seismic profiles, integrated, where practicable, with micro-gravity data. In this area, the relation between "deeper" deformation of Paleozoic bedrock associated with the formation of the Reelfoot rift and NMSZ seismicity and "shallower" deformation of overlying sediments has remained elusive, but could be revealed using integrated P- and SH-wave reflection. Surface expressions of deformation are almost non-existent in this region, which makes seismic reflection surveying the only means of detecting structures that are possibly pertinent to seismic hazard assessment. Since P- and SH-waves respond differently to rock and fluid properties and travel at dissimilar speeds, the resulting seismic profiles provide complementary views of the subsurface based on different levels of resolution and imaging capability. P-wave profiles acquired in southwestern Illinois and western Kentucky (USA) detect faulting of deep Paleozoic bedrock and Cretaceous reflectors, while coincident SH-wave surveys show that this deformation propagates higher into the overlying Tertiary and Quaternary strata. Forward modeling of micro-gravity data acquired along one of the seismic profiles further supports an interpretation of faulting of bedrock and Cretaceous strata. The integration of the two seismic methods with the micro-gravity method therefore increases the scope for investigating the relation between the older and younger deformation in an area of critical seismic hazard.

  17. Finite-Difference Numerical Simulation of Seismic Gradiometry

    NASA Astrophysics Data System (ADS)

    Aldridge, D. F.; Symons, N. P.; Haney, M. M.

    2006-12-01

    separation. Finally, numerical tests of the "point seismic array" concept are oriented toward understanding its potential and limitations. Sandia National Laboratories is a multiprogram science and engineering facility operated by Sandia Corporation, a Lockheed-Martin company, for the United States Department of Energy under contract DE-AC04-94AL85000.

  18. Scaling of the critical slip distance for seismic faulting with shear strain in fault zones

    USGS Publications Warehouse

    Marone, Chris; Kilgore, Brian D.

    1993-01-01

    Theoretical and experimentally based laws for seismic faulting contain a critical slip distance [1-5], Dc, which is the slip over which strength breaks down during earthquake nucleation. On an earthquake-generating fault, this distance plays a key role in determining the rupture nucleation dimension [6], the amount of premonitory and post-seismic slip [7-10], and the maximum seismic ground acceleration [1,11]. In laboratory friction experiments, Dc has been related to the size of surface contact junctions [2,5,12]; thus, the discrepancy between laboratory measurements of Dc (≈ 10^-5 m) and values obtained from modelling earthquakes (≈ 10^-2 m) has been attributed to differences in roughness between laboratory surfaces and natural faults [5]. This interpretation predicts a dependence of Dc on the particle size of fault gouge [2] (breccia and wear material) but not on shear strain. Here we present experimental results showing that Dc scales with shear strain in simulated fault gouge. Our data suggest a new physical interpretation for the critical slip distance, in which Dc is controlled by the thickness of the zone of localized shear strain. As gouge zones of mature faults are commonly 10^2-10^3 m thick [13-17], whereas laboratory gouge layers are 1-10 mm thick, our data offer an alternative interpretation of the discrepancy between laboratory and field-based estimates of Dc.

  19. Near-real time 3D probabilistic earthquakes locations at Mt. Etna volcano

    NASA Astrophysics Data System (ADS)

    Barberi, G.; D'Agostino, M.; Mostaccio, A.; Patane', D.; Tuve', T.

    2012-04-01

    Automatic procedures for locating earthquakes in quasi-real time must provide a good estimate of the earthquake location within a few seconds after the event is first detected, and are strongly needed for seismic warning systems. The reliability of an automatic location algorithm is influenced by several factors such as errors in picking seismic phases, network geometry, and velocity model uncertainties. On Mt. Etna, the seismic network is managed by INGV and the quasi-real-time earthquake locations are performed using an automatic picking algorithm based on short-term-average to long-term-average ratios (STA/LTA), calculated from an approximate squared envelope function of the seismogram, which furnishes a list of P-wave arrival times, and the location algorithm Hypoellipse with a 1D velocity model. The main purpose of this work is to investigate the performance of a different automatic procedure to improve the quasi-real-time earthquake locations. Since the automatic data processing may be affected by outliers (wrong picks), traditional earthquake location techniques based on a least-squares misfit function (L2-norm) often yield unstable and unreliable solutions. Moreover, on Mt. Etna the 1D model is often unable to represent the complex structure of the volcano (in particular the strong lateral heterogeneities), whereas the increasing accuracy of 3D velocity models at Mt. Etna during recent years allows their use today in routine earthquake locations. Therefore, we selected as reference locations all the events that occurred on Mt. Etna in the last year (2011) and were automatically detected and located by means of the Hypoellipse code. Using this dataset (more than 300 events), we applied a nonlinear probabilistic earthquake location algorithm with the Equal Differential Time (EDT) likelihood function (Font et al., 2004; Lomax, 2005), which is much more robust in the presence of outliers in the data. Successively, by using a probabilistic
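
    A minimal sketch of the EDT idea on a 2D grid with a uniform velocity model (synthetic geometry, not the INGV implementation): summing a Gaussian score over station-pair differential-time residuals keeps a single bad pick from dominating the solution:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    v = 3.5                                  # uniform velocity [km/s]
    stations = rng.uniform(-20, 20, (8, 2))  # station x/y [km]
    src = np.array([4.0, -7.0])              # "true" epicenter

    def travel_time(points, hypo):
        return np.linalg.norm(points - hypo, axis=1) / v

    t_obs = travel_time(stations, src) + 0.05 * rng.standard_normal(8)
    t_obs[0] += 2.0                          # one badly picked outlier

    # Grid search: sum the Gaussian EDT likelihood over all station pairs.
    xs = ys = np.linspace(-20, 20, 81)
    sigma, best, best_like = 0.1, None, -1.0
    for x in xs:
        for y in ys:
            t_cal = travel_time(stations, np.array([x, y]))
            like = 0.0
            for i in range(8):
                for j in range(i + 1, 8):
                    r = (t_obs[i] - t_obs[j]) - (t_cal[i] - t_cal[j])
                    like += np.exp(-r * r / (2 * sigma * sigma))
            if like > best_like:
                best_like, best = like, (x, y)
    print(f"EDT epicenter estimate: {best} (true {tuple(src)})")
    ```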

  20. Approximate Seismic Diffusive Models of Near-Receiver Geology: Applications from Lab Scale to Field

    NASA Astrophysics Data System (ADS)

    King, Thomas; Benson, Philip; De Siena, Luca; Vinciguerra, Sergio

    2017-04-01

    This paper presents a novel and simple method of seismic envelope analysis that can be applied at multiple scales, e.g., field (m to km) and laboratory (mm to cm), and utilises the diffusive approximation of the seismic wavefield (Wegler, 2003). Coefficient values for diffusion and attenuation are obtained from seismic coda energies and describe the rate at which seismic energy is scattered and attenuated in the local medium around a receiver. Values are acquired by performing a linear least-squares inversion of coda energies calculated in successive time windows along a seismic trace. Acoustic emission data were taken from piezoelectric transducers (PZT) with typical resonance frequencies of 1-5 MHz glued around rock samples during laboratory deformation experiments carried out using a servo-controlled triaxial testing machine, in which a shear/damage zone is generated under compression after the nucleation, growth and coalescence of microcracks. Passive field data were collected from conventional geophones during the 2004-2008 eruption of Mount St. Helens volcano (MSH), USA, when a sudden reawakening of the volcanic activity and new dome growth occurred. The laboratory study shows a strong correlation between variations of the coefficients over time and the increase of differential stress as the experiment progresses. The field study links structural variations present in the near-surface geology, including those seen in previous geophysical studies of the area, to these same coefficients. Both studies show a correlation between frequency and structural feature size, i.e., landslide slip-planes and microcracks, with higher frequencies being much more sensitive to smaller-scale features and vice versa.
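
    A minimal sketch of the envelope inversion, under the 3-D diffusion approximation of Wegler (2003), E(t) ~ t^(-3/2) exp(-r^2/(4dt)) exp(-bt), which becomes linear in (1/t, t) after taking logarithms (all numbers below are synthetic, lab-scale stand-ins):

    ```python
    import numpy as np

    r = 0.05                    # source-receiver distance [m] (lab scale)
    d_true, b_true = 5.0, 40.0  # diffusivity [m^2/s], attenuation [1/s]
    t = np.linspace(1e-4, 5e-3, 60)
    E = t**-1.5 * np.exp(-r**2 / (4 * d_true * t)) * np.exp(-b_true * t)
    logE = np.log(E) + 0.05 * np.random.default_rng(7).standard_normal(t.size)

    # Linear least squares: log E + 1.5 log t = c0 - (r^2/(4d))/t - b*t
    y = logE + 1.5 * np.log(t)
    A = np.column_stack([np.ones_like(t), 1.0 / t, t])
    c0, c1, c2 = np.linalg.lstsq(A, y, rcond=None)[0]
    d_est, b_est = -r**2 / (4 * c1), -c2
    print(f"d = {d_est:.2f} m^2/s (true {d_true}), b = {b_est:.1f} 1/s (true {b_true})")
    ```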

  1. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    NASA Astrophysics Data System (ADS)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relies on fusion algorithms to associate individual signal detections into event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground-truth events. In 2011-2012, Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g., Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network in a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the utilization of individual seismic phase detections - in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation; it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in
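
    A naive-Bayes stand-in for the network combination step (not ProbDet itself; the per-station probability traces below are synthetic): per-station conditional probabilities are combined in the log-odds domain, so stations that see nothing still contribute evidence:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    t = np.arange(0.0, 60.0, 0.5)   # time [s]
    prior = 0.01                    # prior probability of an event

    def station_trace(onset):
        """Fake conditional probability trace, bumped at the event onset."""
        p = 0.01 + 0.01 * rng.random(t.size)   # noisy background level
        if onset is not None:
            p += 0.9 * np.exp(-((t - onset) / 2.0) ** 2)
        return np.clip(p, 1e-4, 1 - 1e-4)

    # Three stations see the event (with moveout); one sees nothing.
    traces = [station_trace(30.0), station_trace(31.5),
              station_trace(33.0), station_trace(None)]

    # Independent evidence adds in the log-odds domain.
    prior_lo = np.log(prior / (1 - prior))
    log_odds = prior_lo + sum(np.log(p / (1 - p)) - prior_lo for p in traces)
    posterior = 1.0 / (1.0 + np.exp(-log_odds))
    print(f"Peak posterior {posterior.max():.2f} at t = {t[posterior.argmax()]:.1f} s")
    ```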

  2. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach couples probabilistic composite mechanics with probabilistic finite element structural analysis. Probabilistic composite mechanics is used to describe all the uncertainties inherent in composite material properties, while probabilistic finite element analysis is used to describe the uncertainties associated with the methods used to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability densities and cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  3. 2018 one‐year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Rukstales, Kenneth S.; McNamara, Daniel E.; Williams, Robert A.; Shumway, Allison; Powers, Peter; Earle, Paul; Llenos, Andrea L.; Michael, Andrew J.; Rubinstein, Justin L.; Norbeck, Jack; Cochran, Elizabeth S.

    2018-01-01

    This article describes the U.S. Geological Survey (USGS) 2018 one-year probabilistic seismic hazard forecast for the central and eastern United States from induced and natural earthquakes. For consistency, the updated 2018 forecast is developed using the same probabilistic seismicity-based methodology as applied in the two previous forecasts. Rates of earthquakes of M≥3.0 across the United States grew rapidly between 2008 and 2015 but have steadily declined over the past 3 years, especially in areas of Oklahoma and southern Kansas where fluid injection has decreased. The seismicity pattern in 2017 was complex, with earthquakes more spatially dispersed than in the previous years. Some areas of west-central Oklahoma experienced increased activity rates where industrial activity increased. Earthquake rates in Oklahoma (429 earthquakes of M≥3 and 4 of M≥4), the Raton basin (Colorado/New Mexico border, six earthquakes of M≥3), and the New Madrid seismic zone (11 earthquakes of M≥3) continue to be higher than historical levels. Almost all of these earthquakes occurred within the highest-hazard regions of the 2017 forecast. Even though rates declined over the past 3 years, the short-term hazard for damaging ground shaking across much of Oklahoma remains high due to continuing high rates of smaller earthquakes that are still hundreds of times higher than at any time in the state's history. Fine details and variability between the 2016-2018 forecasts are obscured by significant uncertainties in the input model. These short-term hazard levels are similar to those of active regions in California. During 2017, M≥3 earthquakes also occurred in or near Ohio, West Virginia, Missouri, Kentucky, Tennessee, Arkansas, Illinois, Oklahoma, Kansas, Colorado, New Mexico, Utah, and Wyoming.

  4. Seismic properties of fluid bearing formations in magmatic geothermal systems: can we directly detect geothermal activity with seismic methods?

    NASA Astrophysics Data System (ADS)

    Grab, Melchior; Scott, Samuel; Quintal, Beatriz; Caspari, Eva; Maurer, Hansruedi; Greenhalgh, Stewart

    2016-04-01

    Seismic methods are among the most common techniques for exploring the Earth's subsurface. Seismic properties such as velocities, impedance contrasts and attenuation enable the characterization of the rocks in a geothermal system. The most important goals of geothermal exploration, however, are to describe the enthalpy state of the pore fluids, which act as the main transport medium for geothermal heat, and to detect permeable structures such as fracture networks, which control the movement of these pore fluids in the subsurface. Since the quantities measured with seismic methods are only indirectly related to the fluid state and the rock permeability, the interpretation of seismic datasets is difficult and usually delivers ambiguous results. To help overcome this problem, we use a numerical modeling tool that quantifies the seismic properties of fractured rock formations typically found in magmatic geothermal systems. We incorporate the physics of the pore fluids, ranging from the liquid through the boiling to the vapor state. Furthermore, we consider the hydromechanics of permeable structures at different scales, from small cooling joints to large caldera faults, that are known to be present in volcanic systems. Our modeling techniques simulate oscillatory compressibility and shear tests and yield the P- and S-wave velocities and attenuation factors of fluid-saturated fractured rock volumes. To apply this modeling technique to realistic scenarios, numerous input parameters need to be identified. The properties of the rock matrix and of individual fractures were derived from extensive literature research, including a large number of laboratory-based studies. The geometries of fracture networks were provided by structural geologists from their published studies of outcrops. Finally, the physical properties of the pore fluid, ranging from ambient pressures and temperatures up to supercritical conditions, were taken from the fluid physics

  5. Towards a Multi-Resolution Model of Seismic Risk in Central Asia. Challenge and perspectives

    NASA Astrophysics Data System (ADS)

    Pittore, M.; Wieland, M.; Bindi, D.; Parolai, S.

    2011-12-01

    Assessing seismic risk, defined as the probability of occurrence of economic and social losses as a consequence of an earthquake, at both regional and local scales is a challenging, multi-disciplinary task. In order to provide a reliable estimate, diverse information must be gathered by seismologists, geologists, engineers and civil authorities, and carefully integrated taking into account the different levels of uncertainty. The search for an integrated methodology able to seamlessly describe seismic risk at different spatial scales is challenging, but it opens new application perspectives, particularly in countries that face significant seismic hazard but lack the resources for a standard assessment. Central Asian countries in particular, which exhibit some of the highest seismic hazard in the world, are experiencing steady demographic growth, often accompanied by informal settlement and urban sprawl. A reliable evaluation of how these factors affect seismic risk, together with a realistic assessment of the assets exposed to seismic hazard and their structural vulnerability, is particularly important in order to undertake proper mitigation actions and to react promptly and efficiently to a catastrophic event. New strategies are needed to cope efficiently with the systematic lack of information and with uncertainties. An original approach is presented to assess seismic risk based on the integration of information coming from remote sensing and ground-based panoramic imaging, in situ measurements, expert knowledge and already available data. Efficient sampling strategies based on freely available medium-resolution multi-spectral satellite images are adopted to optimize data collection and validation in a multi-scale approach. Panoramic imaging is also considered a valuable ground-based visual data collection technique, suitable for both manual and automatic analysis. A full-probabilistic framework based on Bayesian networks is proposed to

  6. High-resolution seismicity catalog of Italian peninsula in the period 1981-2015

    NASA Astrophysics Data System (ADS)

    Michele, M.; Latorre, D.; Castello, B.; Di Stefano, R.; Chiaraluce, L.

    2017-12-01

    In order to provide an updated reference catalog of Italian seismicity, the absolute locations of the last 35 years (1981-2015) of seismic activity were computed with a three-dimensional VP and VS velocity model covering the whole Italian territory. The NonLinLoc code (Lomax et al., 2000), which is based on a probabilistic approach, was used to provide a complete and robust description of the uncertainties associated with the locations, corresponding to the hypocentral solutions with the highest probability density. Moreover, the code, which uses a finite-difference approximation of the eikonal equation (Podvin and Lecomte, 1991), can handle strongly contrasted velocity models in the arrival-time computation. To optimize the earthquake locations, we included station corrections in the inverse problem. For each year, the number of available earthquakes depends on both the network detection capability and the occurrence of major seismic sequences. The starting earthquake catalog was based on 2.6 million P and 1.9 million S arrival-time picks for 278,607 selected earthquakes, each recorded by at least 3 seismic stations of the Italian seismic network. Compared to previous catalogs, which consist of hypocentral locations retrieved with linearized location methods, the new catalog shows a very good improvement, as shown by the location parameters assessing the quality of the solutions (i.e., RMS, azimuthal gap, formal errors on the horizontal and vertical components). In addition, we used the distance between the expected and the maximum-likelihood hypocenter locations to establish the unimodal (well-resolved location) or multimodal (poorly resolved location) character of the probability distribution. We used these parameters to classify the resulting locations into four classes (A, B, C and D), considering the simultaneous goodness of the previous parameters. The upper classes (A and B) include 65% of the relocated earthquakes, while the lowest class (D) includes only 7% of the

  7. Characterizing seismic noise in the 2-20 Hz band at a gravitational wave observatory

    NASA Astrophysics Data System (ADS)

    Coward, D.; Turner, J.; Blair, D.; Galybin, K.

    2005-04-01

    We present a study of seismic noise, using an array of seismic sensors, at the Australian International Gravitational Observatory. We show that despite excellent attenuation of 2-20 Hz seismic waves by the soil properties of the site, confirmed by a dedicated experiment, there are important technical issues associated with local sources of vibration originating from within the laboratory buildings. In particular, we identify vibrations from air-filtration equipment propagating throughout the site. We find significant building resonances in the 2-13 Hz band and identify seismic noise originating from regional mine blasts hundreds of kilometers away. All these noise sources increase the performance requirements on vibration isolation in the 2-20 Hz frequency band.

  8. Martian seismicity

    NASA Technical Reports Server (NTRS)

    Phillips, Roger J.; Grimm, Robert E.

    1991-01-01

    The design and ultimate success of network seismology experiments on Mars depend on the present level of Martian seismicity. Volcanic and tectonic landforms observed in imaging experiments show that Mars must have been a seismically active planet in the past, and there is no reason to discount the notion that Mars is seismically active today, albeit at a lower level of activity. Models of present-day Martian seismicity are explored. Depending on the sensitivity and geometry of a seismic network and on the attenuation and scattering properties of the interior, it appears that a reasonable number of Martian seismic events would be detected over the period of a decade. The thermoelastic cooling mechanism as estimated is surely a lower bound, and a more refined estimate would specifically take into account the regional cooling of Tharsis, leading to a higher frequency of seismic events.

  9. Toward predicting clay landslide with ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Larose, E. F.; Mainsant, G.; Carriere, S.; Chambon, G.; Michoud, C.; Jongmans, D.; Jaboyedoff, M.

    2013-12-01

    Clay-rich landslides pose critical problems in risk management worldwide. The most widely proposed mechanism leading to such flow-like movements is the increase in pore-water pressure in the sliding mass, generating partial or complete liquefaction. This solid-to-liquid transition results in a dramatic reduction of mechanical rigidity, which could be detected by monitoring shear-wave velocity variations. The ambient seismic noise correlation technique has been applied to measure the variation in seismic surface-wave velocity in the Pont Bourquin landslide (Swiss Alps). This small but active composite earthslide-earthflow was equipped with continuously recording seismic sensors during spring and summer 2010, and again from fall 2011 on. An earthslide of a few thousand cubic meters was triggered in mid-August 2010, after a rainy period. This article shows that the seismic velocity of the sliding material, measured from daily noise correlograms, decreased continuously and rapidly for several days prior to the catastrophic event. From a spectral analysis of the velocity decrease, it was possible to locate the change at the base of the sliding layer. These results are confirmed by analogous small-scale experiments in the laboratory. They demonstrate that ambient seismic noise can be used to detect rigidity variations before failure and could potentially be used to predict landslides.
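
    A minimal sketch of one common way such velocity variations are measured from daily correlograms, the "stretching" method (waveforms here are synthetic, and this is not necessarily the exact processing used in the study): a relative velocity change dv/v appears as a uniform stretch of the coda in time, found by maximizing correlation with a reference trace:

    ```python
    import numpy as np

    fs = 100.0
    t = np.arange(0.0, 20.0, 1.0 / fs)
    ref = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.1 * t)  # reference correlogram

    dvv_true = -0.01                             # -1% velocity change
    cur = np.interp(t * (1 + dvv_true), t, ref)  # stretched "current" trace

    # Grid search over trial dv/v values.
    trials = np.linspace(-0.03, 0.03, 601)
    cc = [np.corrcoef(np.interp(t * (1 + e), t, ref), cur)[0, 1] for e in trials]
    best = trials[int(np.argmax(cc))]
    print(f"Estimated dv/v = {best:+.3%} (true {dvv_true:+.1%})")
    ```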

  10. Seismic Ecology

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The paper is devoted to research on the influence of seismic actions on industrial and civil buildings and on people. Seismic actions affect people either directly (vibrations, strong shocks during earthquakes) or indirectly through various buildings and constructions, and can be strong (felt by people) or weak (registered only by instruments). A great deal of work has been devoted to the influence of violent seismic actions (first of all earthquakes) on people and various constructions; this work studies weak but long-lasting seismic actions on buildings and people. Seismic oscillations acting on a territory need to be taken into account when constructing buildings in urbanized areas. Besides violent earthquakes, man-made seismic actions can exert an essential influence: explosions, seismic noise emitted by plant facilities and moving transport, and radiation from high-rise buildings and constructions under wind action. Materials on the increase of man-made seismicity in a number of regions of Russia that were previously not seismic are presented in the paper. Along with seismic microzoning maps, maps should be built indicating the variation of the amplitude spectra of seismic noise over days, months and years. Information about the amplitudes and frequencies of oscillations from possible earthquakes and man-made sources in specific regions allows sound design and construction of industrial and civil housing projects. The construction of buildings even in regions that are not seismically dangerous can end in the failure of these buildings, with the heaviest consequences for people, if one of their resonance frequencies coincides with the frequency of oscillations emitted at that location by man-made objects. Practical examples of detailed engineering-seismological investigation of large industrial and civil housing projects in the territory of Siberia (hydro power

  11. Standard penetration test-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    USGS Publications Warehouse

    Cetin, K.O.; Seed, R.B.; Der Kiureghian, A.; Tokimatsu, K.; Harder, L.F.; Kayen, R.E.; Moss, R.E.S.

    2004-01-01

    This paper presents new correlations for assessment of the likelihood of initiation (or triggering) of soil liquefaction. These new correlations eliminate several sources of bias intrinsic to previous, similar correlations, and provide greatly reduced overall uncertainty and variance. Key elements in the development of these new correlations are (1) accumulation of a significantly expanded database of field performance case histories; (2) use of improved knowledge and understanding of factors affecting interpretation of standard penetration test data; (3) incorporation of improved understanding of factors affecting site-specific earthquake ground motions (including directivity effects, site-specific response, etc.); (4) use of improved methods for assessment of the in situ cyclic shear stress ratio; (5) screening of field data case histories on a quality/uncertainty basis; and (6) use of high-order probabilistic tools (Bayesian updating). The resulting relationships not only provide greatly reduced uncertainty, they also help to resolve a number of corollary issues that have long been difficult and controversial, including (1) magnitude-correlated duration weighting factors, (2) adjustments for fines content, and (3) corrections for overburden stress.
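
    For orientation, the demand side that such correlations take as input is the simplified cyclic stress ratio of Seed and Idriss; a minimal sketch with illustrative soil numbers (the depth-reduction factor rd below is the common Liao-Whitman linear approximation, not this paper's refined assessment):

    ```python
    def csr(a_max_g, sigma_v, sigma_v_eff, depth_m):
        """CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd."""
        rd = (1.0 - 0.00765 * depth_m if depth_m <= 9.15
              else 1.174 - 0.0267 * depth_m)
        return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

    # Example: 6 m depth, water table at 2 m, total unit weight ~19 kN/m^3.
    sigma_v = 19.0 * 6.0          # total vertical stress [kPa]
    u = 9.81 * (6.0 - 2.0)        # pore pressure [kPa]
    print(f"CSR = {csr(0.3, sigma_v, sigma_v - u, 6.0):.3f}")
    ```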

  12. Slope Stability Analysis In Seismic Areas Of The Northern Apennines (Italy)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo Presti, D.; Fontana, T.; Marchetti, D.

    2008-07-08

    Several research works have been published on slope stability in northern Tuscany (central Italy), particularly in the seismic areas of Garfagnana and Lunigiana (Lucca and Massa-Carrara districts), aimed at analysing slope stability under static and dynamic conditions and at mapping the landslide hazard. In addition, in situ and laboratory investigations are available for the study area, thanks to the activities undertaken by the Tuscany Seismic Survey. Based on this extensive body of information, the co-seismic stability of a few idealized slope profiles has been analysed by means of the limit equilibrium method (LEM, pseudo-static) and Newmark sliding-block analysis (pseudo-dynamic). The analysis results gave indications about the most appropriate seismic coefficient to be used in pseudo-static analysis after establishing an allowable permanent displacement. These indications are discussed in the light of the Italian and European prescriptions for seismic stability analysis with the pseudo-static approach. The stability conditions obtained from these analyses could be used to define microzonation criteria for the study area.

  13. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  14. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

    A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) are simulated using probabilistic theory to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate the load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for the design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.

  15. Probabilistic finite elements for fracture mechanics

    NASA Technical Reports Server (NTRS)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near-crack-tip singular strain embedded in the element is used. Probabilistic descriptions of the stress intensity factors, such as their expectation, covariance and correlation, are calculated for random load, random material properties and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.

  16. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  17. Seismic risk assessment of Navarre (Northern Spain)

    NASA Astrophysics Data System (ADS)

    Gaspar-Escribano, J. M.; Rivas-Medina, A.; García Rodríguez, M. J.; Benito, B.; Tsige, M.; Martínez-Díaz, J. J.; Murphy, P.

    2009-04-01

    The RISNA project, financed by the Emergency Agency of Navarre (Northern Spain), aims at assessing the seismic risk of the entire region. The final goal of the project is the definition of emergency plans for future earthquakes. With this purpose, four main topics are covered: seismic hazard characterization, geotechnical classification, vulnerability assessment, and damage estimation to structures and exposed population. A geographic information system is used to integrate, analyze and represent all information collected in the different phases of the study. Expected ground motions on rock conditions with a 90% probability of non-exceedance in an exposure time of 50 years are determined following a Probabilistic Seismic Hazard Assessment (PSHA) methodology that includes a logic tree with different ground motion and source zoning models. As the region under study is located on the boundary between Spain and France, an effort is required to collect and homogenise seismological data from different national and regional agencies. A new homogenised seismic catalogue, merging data from Spanish, French, Catalonian and international agencies and establishing correlations between different magnitude scales, is developed. In addition, a new seismic zoning model focused on the study area is proposed. Results show that the highest ground motions on rock conditions are expected in the northeastern part of the region, decreasing southwards. The seismic hazard can be characterized as low to moderate. A geotechnical classification of the entire region is developed based on surface geology, available borehole data and morphotectonic constraints. Frequency-dependent amplification factors, consistent with code values, are proposed. The northern and southern parts of the region are characterized by stiff and soft soils, respectively, with the softest soils located along river valleys. Seismic hazard maps including soil effects are obtained by applying these factors to the seismic hazard maps

  18. Toggling of seismicity by the 1997 Kagoshima earthquake couplet: A demonstration of time-dependent stress transfer

    USGS Publications Warehouse

    Toda, S.; Stein, R.

    2003-01-01

    Two M ≈ 6 well-recorded strike-slip earthquakes struck just 4 km and 48 days apart in Kagoshima prefecture, Japan, in 1997, providing an opportunity to study earthquake interaction. Aftershocks are abundant where the Coulomb stress is calculated to have been increased by the first event, and they abruptly stop where the stress is dropped by the second event. This ability of the main shocks to toggle seismicity on and off argues that static stress changes play a major role in exciting aftershocks, whereas the dynamic Coulomb stresses, which should only promote seismicity, appear to play a secondary role. If true, the net stress changes from a sequence of earthquakes might be expected to govern the subsequent seismicity distribution. However, adding the stress changes from the two Kagoshima events does not fully capture the ensuing seismicity, such as its rate change, temporal decay, or migration away from the ends of the ruptures. We therefore implement a stress transfer model that incorporates rate/state friction, in which seismicity is treated as a sequence of independent nucleation events that are dependent on the fault slip, slip rate, and elapsed time since the last event. The model reproduces the temporal response of seismicity to successive stress changes, including toggling, decay, and aftershock migration. Nevertheless, the match of observed to predicted seismicity is quite imperfect, due perhaps to inadequate knowledge of several model parameters. However, to demonstrate the potential of this approach, we build a probabilistic forecast of larger earthquakes on the expected rate of small aftershocks, taking advantage of the large statistical sample the small shocks afford. Not surprisingly, such probabilities are highly time- and location-dependent: During the first decade after the main shocks, the seismicity rate and the chance of successive large shocks are about an order of magnitude higher than the background rate and are concentrated exclusively in
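
    The temporal behavior described here follows the standard rate/state seismicity response to a Coulomb stress step (Dieterich, 1994); a minimal sketch with illustrative parameters (not the values calibrated in the paper):

    ```python
    import numpy as np

    A_sigma = 0.04           # constitutive parameter A times normal stress [MPa]
    tau_dot = 0.004          # background stressing rate [MPa/yr]
    t_a = A_sigma / tau_dot  # aftershock relaxation time [yr]
    r_bg = 10.0              # background seismicity rate [events/yr]

    def rate(t, d_cff):
        """Seismicity rate at time t [yr] after a stress step d_cff [MPa]."""
        return r_bg / (1.0 + (np.exp(-d_cff / A_sigma) - 1.0) * np.exp(-t / t_a))

    t = np.array([0.01, 0.1, 1.0, 10.0])
    print("rate after +0.1 MPa step:", rate(t, +0.1))  # toggled on, then decays
    print("rate after -0.1 MPa step:", rate(t, -0.1))  # toggled off (stress shadow)
    ```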

  19. Simulation-Based Probabilistic Seismic Hazard Assessment Using System-Level, Physics-Based Models: Assembling Virtual California

    NASA Astrophysics Data System (ADS)

    Rundle, P. B.; Rundle, J. B.; Morein, G.; Donnellan, A.; Turcotte, D.; Klein, W.

    2004-12-01

    results on use of Virtual California for probabilistic earthquake forecasting for several sub-groups of major faults in California. These methods have the advantage that system-level fault interactions are explicitly included, as well as laboratory-based friction laws.

  20. Impact from Magnitude-Rupture Length Uncertainty on Seismic Hazard and Risk

    NASA Astrophysics Data System (ADS)

    Apel, E. V.; Nyst, M.; Kane, D. L.

    2015-12-01

    In probabilistic seismic hazard and risk assessments, seismic sources are typically divided into two groups: fault sources (to model known faults) and background sources (to model unknown faults). In areas like the Central and Eastern United States and Hawaii, the hazard and risk are driven primarily by background sources. Background sources can be modeled as areas, points or pseudo-faults. When background sources are modeled as pseudo-faults, magnitude-length or magnitude-area scaling relationships are required to construct them. However, the uncertainty associated with these relationships is often ignored or discarded in hazard and risk models, particularly when fault sources are the dominant contributor. Conversely, in areas modeled only with background sources these uncertainties are much more significant. In this study we test the impact of using various relationships, and of the resulting epistemic uncertainties, on the seismic hazard and risk in the Central and Eastern United States and Hawaii. It is common to use only one magnitude-length relationship when calculating hazard; however, Stirling et al. (2013) showed that for a given suite of magnitude-rupture length relationships the variability can be quite large. The 2014 US National Seismic Hazard Maps (Petersen et al., 2014) used one magnitude-rupture length relationship (Somerville et al., 2001) in the Central and Eastern United States and did not consider variability in the seismogenic rupture plane width. Here we use a suite of metrics to compare the USGS approach with these variable-uncertainty models to assess (1) the impact on hazard and risk and (2) the epistemic uncertainty associated with the choice of relationship. In areas where the seismic hazard is dominated by larger crustal faults (e.g., New Madrid) the choice of magnitude-rupture length relationship has little impact on the hazard or risk. Away from these regions, however, the choice of relationship is more significant and may approach

  1. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    NASA Astrophysics Data System (ADS)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes, such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may make it possible to formulate cost-effective policies in some specific applications, provided that statistically sound recurrence estimates are used, which is not typically the case for PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure for associating the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full waveform modeling. From the synthetic seismograms, estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and does not supply the frequency of occurrence of the expected ground shaking. A recently enhanced variant of NDSHA that reliably accounts for recurrence has been developed and is applied here to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g., a multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way, a standard NDSHA map of ground shaking is obtained simultaneously with a map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods. This permits a straightforward comparison between NDSHA and PSHA maps.

  2. Carbonate-hosted fault rocks: A review of structural and microstructural characteristics with implications for seismicity in the upper crust

    NASA Astrophysics Data System (ADS)

    Delle Piane, Claudio; Clennell, M. Ben; Keller, Joao V. A.; Giwelli, Ausama; Luzin, Vladimir

    2017-10-01

    The structure, frictional properties and permeability of faults within carbonate rocks exhibit a dynamic interplay that controls both seismicity and the exchange of fluid between different crustal levels. Here we review field and experimental studies focused on the characterization of fault zones in carbonate rocks with the aim of identifying the microstructural indicators of rupture nucleation and seismic slip. We highlight results from experimental research linked to observations on exhumed fault zones in carbonate rocks. From the analysis of these accumulated results we identify the meso- and microstructural deformation styles in carbonate rocks and link them to the lithology of the protolith and to their potential as seismic indicators. Although there has been significant success in the laboratory reproduction of deformation structures observed in the field, the range of slip rates and dynamic friction under which most of the potential seismic indicators are formed in the laboratory urges caution when using them as a diagnostic for seismic slip. We finally outline what we think are key topics for future research that would lead to a more in-depth understanding of the record of seismic slip in carbonate rocks.

  3. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  4. Seismic velocity deviation log: An effective method for evaluating spatial distribution of reservoir pore types

    NASA Astrophysics Data System (ADS)

    Shirmohamadi, Mohamad; Kadkhodaie, Ali; Rahimpour-Bonab, Hossain; Faraji, Mohammad Ali

    2017-04-01

    The velocity deviation log (VDL) is a synthetic log used to determine pore types in reservoir rocks, based on a combination of the sonic log with neutron-density logs. The current study proposes a two-step approach to create a map of porosity and pore types by integrating the results of petrographic studies, well logs and seismic data. In the first step, the velocity deviation log was created from the combination of the sonic log with the neutron-density log. The results allowed negative, zero and positive deviations to be identified from the synthetic velocity log. Negative velocity deviations (below -500 m/s) indicate connected or interconnected pores and fractures, while positive deviations (above +500 m/s) are related to isolated pores. Zero deviations in the range of [-500 m/s, +500 m/s] are in good agreement with intercrystalline porosity and microporosity. The results of petrographic studies were used to validate the main pore type derived from the velocity deviation log. In the next step, the velocity deviation log was estimated from seismic data by using a probabilistic neural network model. For this purpose, the inverted acoustic impedance along with amplitude-based seismic attributes were related to VDL. The methodology is illustrated with a case study from the Hendijan oilfield, northwestern Persian Gulf. The results of this study show that the integration of petrographic studies, well logs and seismic attributes is an effective way to understand the spatial distribution of the main reservoir pore types.
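
    A minimal sketch of the first step, assuming the Wyllie time-average equation for the porosity-predicted velocity and the ±500 m/s cutoffs quoted above; the matrix and fluid velocities are assumed values for a calcite-dominated reservoir.

    ```python
    V_MATRIX = 6400.0  # assumed calcite matrix velocity, m/s
    V_FLUID = 1500.0   # assumed pore-fluid velocity, m/s

    def velocity_deviation(v_sonic, phi):
        """VDL = measured sonic velocity minus the Wyllie time-average prediction."""
        v_wyllie = 1.0 / (phi / V_FLUID + (1.0 - phi) / V_MATRIX)
        return v_sonic - v_wyllie

    def pore_type(dev):
        """Classify with the +/-500 m/s cutoffs used in the study."""
        if dev < -500.0:
            return "connected/interconnected pores or fractures"
        if dev > 500.0:
            return "isolated pores"
        return "intercrystalline porosity / microporosity"

    for v_sonic in (3600.0, 4300.0, 5200.0):
        d = velocity_deviation(v_sonic, phi=0.15)
        print(f"Vp={v_sonic:.0f} m/s: deviation {d:+.0f} m/s -> {pore_type(d)}")
    ```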

  5. Geomechanics-Based Stochastic Analysis of Injection- Induced Seismicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghassemi, Ahmad

    -thermo-poro-mechanical mechanisms associated with injection and utilizing a state-of-the-art stochastic inversion procedure. The approach proposed herein is innovative and significantly improves the existing SBRC technology (e.g., Shapiro et al. 2003) for geothermal reservoirs in several ways. First, the current scope of the SBRC is limited with respect to the physical processes considered and the rock properties used. Usually, the geomechanics analyses within SBRC are limited to pore pressure diffusion in the rock mass, which is modeled using a time-dependent parabolic equation and solved using a finite element algorithm with either a line or a point source. However, water injection induces both poroelastic and thermoelastic stresses in the rock mass, which affect the stress state. In fact, it has been suggested that thermoelastic stresses can play a dominant role in reservoir seismicity (Ghassemi et al., 2007). We include these important effects by using fully coupled poro-thermoelastic constitutive equations for the rock mass, solved using a 3D finite element model with more realistic injection geometries, such as multiple injection/extraction sources (including sources in fractures), and with uncertainty in the material parameters and the in-situ stress distribution, to better reflect the pore pressure and stress distributions. In addition, we developed a 3D stochastic fracture network model to study MEQ generation in fractured rocks. The model was verified using laboratory experiments, and calibrated and applied to the Newberry EGS stimulation. In previous SBRC approaches, the triggering of micro-seismicity is modeled based on the assumption that the prior stochastic criticality model of the rock mass is a valid and adequate description. However, this assumption often does not hold in the field. Thus, we improved upon the current SBRC approach by using the micro-seismic responses to estimate the hydraulic diffusivity as well as the criticality distribution itself within the field. In this way, instead of

  6. The Sacred Mountain of Varallo in Italy: seismic risk assessment by acoustic emission and structural numerical models.

    PubMed

    Carpinteri, Alberto; Lacidogna, Giuseppe; Invernizzi, Stefano; Accornero, Federico

    2013-01-01

    We examine an application of the Acoustic Emission (AE) technique for a probabilistic analysis, in time and space, of earthquakes, in order to preserve the valuable Italian Renaissance architectural complex named "The Sacred Mountain of Varallo." Among the forty-five chapels of the Renaissance complex, the structure of Chapel XVII is of particular concern due to its uncertain structural condition and the level of stress caused by the regional seismicity. Therefore, a lifetime assessment, taking into account the evolution of damage phenomena, is necessary to preserve the reliability and safety of this masterpiece of cultural heritage. Continuous AE monitoring was performed to assess the structural behavior of the Chapel. During the monitoring period, a correlation between peaks of AE activity in the masonry of the "Sacred Mountain of Varallo" and regional seismicity was found. Although the two phenomena take place on very different scales, AE in materials and earthquakes in the Earth's crust belong to the same class of invariance. In addition, an accurate finite element model, built with the DIANA finite element code, is presented to describe the dynamic behavior of the Chapel XVII structure, confirming visual and instrumental inspections of regional seismic effects.

  7. Accuracy and sensitivity analysis on seismic anisotropy parameter estimation

    NASA Astrophysics Data System (ADS)

    Yan, Fuyong; Han, De-Hua

    2018-04-01

    There is significant uncertainty in measuring Thomsen's parameter δ in the laboratory, even though the dimensions and orientations of the rock samples are known. More challenges can be expected when estimating seismic anisotropy parameters from field seismic data. Based on Monte Carlo simulation of a vertical transversely isotropic (VTI) layer-cake model, using a database of laboratory anisotropy measurements from the literature, we apply the commonly used quartic non-hyperbolic reflection moveout equation to estimate the seismic anisotropy parameters and test its accuracy and its sensitivity to the source-receiver offset, vertical interval velocity error and time-picking error. The testing results show that the methodology works perfectly for noise-free synthetic data with short spread lengths. However, the method is extremely sensitive to the time-picking error caused by mild random noise, and it requires the spread length to be greater than the depth of the reflection event. The uncertainties increase rapidly for the deeper layers, and the estimated anisotropy parameters can be very unreliable for a layer with more than five overlying layers. It is possible for an isotropic formation to be misinterpreted as a strongly anisotropic formation. The sensitivity analysis should provide useful guidance on how to group the reflection events and build a suitable geological model for anisotropy parameter inversion.
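
    The flavour of the sensitivity test can be reproduced in miniature: perturb reflection travel times with Gaussian picking noise and recover the NMO velocity of a single VTI layer, for which V_nmo = V_p0*sqrt(1 + 2δ). The layer properties, spread, and noise level below are assumed, and a simple hyperbolic fit stands in for the paper's quartic non-hyperbolic moveout analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    vp0, delta, depth = 3000.0, 0.1, 1000.0    # assumed VTI layer
    v_nmo = vp0 * np.sqrt(1.0 + 2.0 * delta)   # NMO velocity with anisotropy
    t0 = 2.0 * depth / vp0                     # two-way zero-offset time, s

    offsets = np.linspace(100.0, 2000.0, 20)   # spread = 2x target depth
    t_clean = np.sqrt(t0**2 + (offsets / v_nmo) ** 2)  # hyperbolic moveout

    estimates = []
    for _ in range(500):
        t_noisy = t_clean + rng.normal(0.0, 0.002, t_clean.shape)  # 2 ms picks
        slope, _ = np.polyfit(offsets**2, t_noisy**2, 1)  # t^2 = t0^2 + x^2/v^2
        estimates.append(1.0 / np.sqrt(slope))

    est = np.array(estimates)
    print(f"true Vnmo {v_nmo:.0f} m/s; recovered {est.mean():.0f} "
          f"+/- {est.std():.0f} m/s")
    ```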

  8. Reconciling deep seismic refraction and reflection data from the grenvillian-appalachian boundary in western New England

    USGS Publications Warehouse

    Hughes, S.; Luetgert, J.H.; Christensen, N.I.

    1993-01-01

    The Grenvillian-Appalachian boundary is characterized by pervasive mylonitic deformation and retrograde alteration of a suite of imbricated allochthonous and parautochthonous gneisses that were thrust upon the Grenvillian continental margin during the lower Paleozoic. Seismic reflection profiling across this structural boundary zone reveals prominent dipping reflectors interpreted as overthrust basement slices (parautochthons) of the Green Mountain Anticlinorium. In contrast, a seismic refraction study of the Grenvillian-Appalachian boundary reveals a sub-horizontally layered seismic velocity model that is difficult to reconcile with the pronounced sub-vertical structures observed in the Green Mountains. A suite of rock samples was collected from the Green Mountain Anticlinorium and measured at high pressures in the laboratory to determine the seismic properties of these allochthonous and parautochthonous gneisses. The laboratory-measured seismic velocities agree favorably with the modelled velocity structure across the Grenvillian-Appalachian boundary, suggesting that the rock samples are reliable indicators of the rock mass as a whole. Samples of the parautochthonous Grenvillian basement exposed in the Green Mountains have lower velocities, by about 0.5 km/s, than lithologically equivalent units exposed in the eastern Adirondack Highlands. The velocity reduction in the Green Mountain parautochthons can be accounted for by retrograde metamorphic alteration (hydration) of the paragneisses. Seismic anisotropies, ranging from 2 to 12%, in the mylonitized Green Mountain paragneisses may also contribute to the observed lower seismic velocities where the direction of ray propagation is normal to the foliation. The velocity properties of the Green Mountain paragneisses are thus insufficiently different from those of the mantling Appalachian allochthons to permit their resolution by the Ontario-New York-New England seismic refraction profile.

  9. Probabilistic Appraisal of Earthquake Hazard Parameters Deduced from a Bayesian Approach in the Northwest Frontier of the Himalayas

    NASA Astrophysics Data System (ADS)

    Yadav, R. B. S.; Tsapanos, T. M.; Bayrak, Yusuf; Koravos, G. Ch.

    2013-03-01

    A straightforward Bayesian statistic is applied in five broad seismogenic source zones of the northwest frontier of the Himalayas to estimate the earthquake hazard parameters (maximum regional magnitude Mmax, the β value of the G-R relationship, and the seismic activity rate or intensity λ). For this purpose, a reliable earthquake catalogue, homogeneous for Mw ≥ 5.0 and complete for the period 1900 to 2010, is compiled. The Hindukush-Pamir Himalaya zone has been further divided into two seismic zones of shallow (h ≤ 70 km) and intermediate depth (h > 70 km) according to the variation of seismicity with depth in the subduction zone. The earthquake hazard parameters estimated by the Bayesian approach are more stable and reliable, with lower standard deviations, than those from other approaches, but the technique is more time-consuming. In this study, quantiles of the distribution functions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50 and 100 years are calculated with confidence limits for probability levels of 50, 70 and 90% in all seismogenic source zones. The zones with estimated Mmax greater than 8.0 are the Sulaiman-Kirthar ranges, the Hindukush-Pamir Himalaya and the Himalayan Frontal Thrusts belt, marking these as the most seismically hazardous regions in the examined area. The lowest value of Mmax (6.44) was calculated for the Northern Pakistan and Hazara syntaxis zone, which also has the lowest estimated activity rate (0.0023 events/day) of all the zones. The Himalayan Frontal Thrusts belt exhibits the highest expected earthquake magnitude (8.01) in the next 100 years at the 90% probability level, which reveals that this zone is the most vulnerable to the occurrence of a great earthquake. The results obtained in this study are directly useful for probabilistic seismic hazard assessment in the examined region of the Himalaya.
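
    A minimal non-Bayesian counterpart to one of the products quoted above: the Aki (1965) maximum-likelihood b-value and the Poisson probability that a magnitude threshold is exceeded within a future interval. The synthetic catalog, duration, and completeness magnitude are assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    m_min = 5.0  # assumed completeness magnitude
    # Synthetic catalog drawn from a Gutenberg-Richter law with b = 1
    mags = m_min + rng.exponential(scale=1.0 / np.log(10.0), size=400)

    # Aki (1965) maximum-likelihood b-value
    b = np.log10(np.e) / (mags.mean() - m_min)

    years = 110.0                  # assumed catalog duration
    rate_mmin = len(mags) / years  # annual rate of M >= m_min

    def p_exceed(mag, t_years):
        """Poisson probability of at least one M >= mag event in t_years."""
        lam = rate_mmin * 10.0 ** (-b * (mag - m_min))
        return 1.0 - np.exp(-lam * t_years)

    for T in (5, 10, 20, 50, 100):
        print(f"T={T:3d} yr: P(M>=7.0) = {p_exceed(7.0, T):.2f}")
    ```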

  10. Developing an event-tree probabilistic tsunami inundation model for NE Atlantic coasts: Application to case studies

    NASA Astrophysics Data System (ADS)

    Omira, Rachid; Baptista, Maria Ana; Matias, Luis

    2015-04-01

    This study constitutes the first assessment of probabilistic tsunami inundation in the NE Atlantic region using an event-tree approach. It aims to develop a probabilistic tsunami inundation approach for the NE Atlantic coast, with an application to two test sites of the ASTARTE project, Tangier (Morocco) and Sines (Portugal). Only tsunamis of tectonic origin are considered here, taking into account near-, regional- and far-field sources. The multidisciplinary approach proposed here consists of an event-tree method that gathers seismic hazard assessment, tsunami numerical modelling, and statistical methods. It also presents a treatment of the uncertainties related to source location and tidal stage in order to derive the likelihood of tsunami flood occurrence and of the exceedance of a specific near-shore wave height during a given return period. We derive high-resolution probabilistic maximum wave heights and flood distributions for both test sites, Tangier and Sines, considering 100-, 500-, and 1000-year return periods. We find that the probability that a maximum wave height exceeds 1 m somewhere along the Sines coast reaches about 55% for the 100-year return period, and is up to 100% for the 1000-year return period. Along the Tangier coast, the probability of inundation occurrence (flow depth > 0 m) is up to 45% for the 100-year return period and reaches 96% at some near-shore coastal locations for the 500-year return period. Acknowledgements: This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe. Grant 603839, 7th FP (ENV.2013.6.4-3).
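
    The return-period figures above relate to exceedance probabilities through the usual Poisson occurrence assumption; a minimal generic sketch (not the paper's event tree):

    ```python
    import numpy as np

    def p_exceedance(return_period_yr, exposure_yr):
        """P(at least one exceedance in the exposure window), Poisson model."""
        return 1.0 - np.exp(-exposure_yr / return_period_yr)

    for rp in (100.0, 500.0, 1000.0):
        print(f"return period {rp:6.0f} yr: "
              f"P(exceedance in 50 yr) = {p_exceedance(rp, 50.0):.2f}")
    ```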

  11. Revised seismic hazard map for the Kyrgyz Republic

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Ullah, Shahid; Parolai, Stefano; Walker, Richard; Pittore, Massimiliano; Free, Matthew; Fourniadis, Yannis; Villiani, Manuela; Sousa, Luis; Ormukov, Cholponbek; Moldobekov, Bolot; Takeuchi, Ko

    2017-04-01

    final hazard level, this is still not fully accounted for, even if a nation-wide first-order Vs30 model (i.e., from the USGS) is available. Abdrakhmatov, K., Havenith, H.-B., Delvaux, D., Jongsmans, D. and Trefois, P. (2003) Probabilistic PGA and Arias Intensity maps of Kyrgyzstan (Central Asia), Journal of Seismology, 7, 203-220. Ulomov, V.I., The GSHAP Region 7 working group (1999) Seismic hazard of Northern Eurasia, Annali di Geofisica, 42, 1012-1038.

  12. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  13. Probabilistic simple sticker systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings of the language are selected according to probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  14. Reassessment of the Seismicity and seismic hazards of Libya

    NASA Astrophysics Data System (ADS)

    Ben Suleman, A.; Elmeladi, A.

    2009-04-01

    The tectonic evolution of Libya, located at the northern extreme of the African continent, has yielded a complex crustal structure composed of a series of basins and uplifts. The present-day deformation of Libya is the result of the Eurasia-Africa continental collision. At the end of 2005, the Libyan National Seismological Network was established to monitor local, regional and teleseismic activity, as well as to provide high-quality data for research projects both locally and on the regional and global scale. This study discusses the seismicity of Libya using the new data from the Libyan National Seismological Network, with a focus on the seismic hazards. At first glance, the seismic activity map shows dominant trends of seismicity, with most of the activity concentrated along the northern coastal areas. Four major seismic trends are noticeable. The first trend has a NW-SE direction coinciding with the eastern border of the Hun Graben. The second trend, also oriented NW-SE, lies in the offshore area and might be a continuation of the first. The other two trends are located in the western Gulf of Sirt and on the Cyrenaica platform. The rest of the seismicity is diffuse, either offshore or inland, with no good correlation with well-mapped faults. Detailed investigation of Libyan seismicity indicates that Libya has experienced earthquakes of varying magnitudes and that there is definitely a certain amount of seismic risk involved in engineering projects, particularly in the northern regions. Detailed investigation of the distribution of Libyan earthquakes in space and time, along with all other geological considerations, suggests the classification of the country into four seismic zones, with the Hun Graben zone being the most seismically active.

  15. Structural Identification And Seismic Analysis Of An Existing Masonry Building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Del Monte, Emanuele; Galano, Luciano; Ortolani, Barbara

    2008-07-08

    The paper presents the diagnostic investigation and the seismic analysis performed on an ancient masonry building in Florence. The building has historical interest and is subject to conservation restrictions. The investigation involves a preliminary phase of research into the historical documents and a second phase of in situ and laboratory tests to determine the mechanical characteristics of the masonry. This investigation was conceived in order to obtain the 'LC2 Knowledge Level' and to perform non-linear pushover analysis according to the new Italian Standards for the seismic upgrading of existing masonry buildings.

  16. New Evaluation of Seismic Hazard in Central America and La Hispaniola

    NASA Astrophysics Data System (ADS)

    Benito, B.; Camacho, E. I.; Rojas, W.; Climent, A.; Alvarado-Induni, G.; Marroquin, G.; Molina, E.; Talavera, E.; Belizaire, D.; Pierristal, G.; Torres, Y.; Huerfano, V.; Polanco, E.; García, R.; Zevallos, F.

    2013-05-01

    The results of seismic hazard studies carried out for two seismic scenarios, the Central America region (CA) and La Hispaniola island, are presented here. Both follow the Probabilistic Seismic Hazard Assessment (PSHA) methodology and are developed in terms of PGA and SA(T), for T of 0.1, 0.2, 0.5, 1 and 2 s. In both analyses, hybrid zonation models are considered, integrating seismogenic zones and faults where slip-rate and recurrence-time data are available. First, we present a new evaluation of seismic hazard in CA, starting from the results of a previous study by Benito et al. (2011). Some improvements are now included, such as: a catalogue updated to 2011; corrections to the zoning model, in particular for the subduction regime, taking into account the variation of dip in Costa Rica and Panama; and the modeling of some faults as independent units for the hazard estimation. The results allow us to carry out a sensitivity analysis comparing the results obtained with and without faults. In the second part we present the results of the PSHA for La Hispaniola, carried out as part of the cooperative project SISMO-HAITI, supported by UPM and developed in cooperation with ONEV. It started a few months after the 2010 event, in response to a request for help from the Haitian government to UPM. The study was aimed at obtaining results suitable for seismic design purposes and started with the elaboration of a seismic catalogue for La Hispaniola, requiring an exhaustive revision of the data reported by around 30 seismic agencies, in addition to those from the Puerto Rico and Dominican Republic seismic networks. Seismotectonic models for the region were reviewed and a new regional zonation was proposed, taking into account different geophysical data. Attenuation models for subduction and crustal zones were also reviewed and the most suitable were calibrated with data recorded inside the Caribbean plate. As a result of the PSHA, different maps were generated for the quoted parameters

  17. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometric dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height of the blade and the inner radius, outer radius, and thickness of the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program, in conjunction with modules from the probabilistic analysis program NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
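
    The response-surface step can be sketched in a few lines: sample a stand-in deterministic analysis, fit a quadratic polynomial surrogate, and run the Monte Carlo on the surrogate instead of the expensive solver. The stress function, variable ranges, and distributions below are assumed placeholders for the FEA calls PRODAF orchestrates.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def fea_stress(chord, load):
        """Stand-in for a deterministic FEA run (assumed toy response)."""
        return 120.0 * load / chord**2 + 5.0 * chord

    # Sample the design space and fit a quadratic response surface
    chord = rng.uniform(0.8, 1.2, 50)
    load = rng.uniform(0.9, 1.1, 50)
    y = fea_stress(chord, load)

    def basis(c, l):
        return np.column_stack([np.ones_like(c), c, l, c**2, l**2, c * l])

    beta, *_ = np.linalg.lstsq(basis(chord, load), y, rcond=None)

    # Monte Carlo on the cheap surrogate instead of repeated FEA calls
    c_mc = rng.normal(1.0, 0.05, 100_000)
    l_mc = rng.normal(1.0, 0.03, 100_000)
    s = basis(c_mc, l_mc) @ beta
    print(f"stress mean {s.mean():.1f}, std {s.std():.1f}, "
          f"P(s > 140) = {(s > 140.0).mean():.4f}")
    ```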

  18. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and the aggregation of such estimates in groups often yields better results than could have been made…

  19. Frontal and Parietal Contributions to Probabilistic Association Learning

    PubMed Central

    Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke

    2011-01-01

    Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842

  20. High Resolution Near Surface 3D Seismic Experiments: A Carbonate Platform vs. a Siliciclastic Sequence

    NASA Astrophysics Data System (ADS)

    Filippidou, N.; Drijkoningen, G.; Braaksma, H.; Verwer, K.; Kenter, J.

    2005-05-01

    Interest in high-resolution 3D seismic experiments for imaging shallow targets has increased over the past years. Many published case studies show that producing clear seismic images with this non-invasive method is still a challenge. We use two test sites where nearby outcrops are present, so that an accurate geological model can be built and the seismic result validated. The first of these so-called natural field laboratories is located in the Boulonnais (N. France): an Upper Jurassic siliciclastic sequence, the age equivalent of the North Sea source rock. The second is located at Cap Blanc, southwest of the island of Mallorca (Spain), an excellent example of a Miocene prograding reef platform (the Llucmajor Platform) and a textbook analog for carbonate reservoirs. In both cases, the multidisciplinary experiment included the use of multicomponent and quasi- or 3D seismic recordings. The target depth does not exceed 120 m. Vertical and shear portable vibrators were used as sources. In the center of the setups, boreholes were drilled and Vertical Seismic Profiles were shot, along with core and borehole measurements both in situ and in the laboratory. These two geologically different sites, with different seismic stratigraphy, have provided us with exceptionally high-resolution seismic images. In general, the seismic data were processed following standard procedures; a few innovative techniques applied to the Mallorca data, such as rotation of the horizontal components, 3D F-K filtering and the addition of parallel profiles, improved the seismic image. In this paper we discuss the basic differences as seen on the seismic sections. The Boulonnais data present highly continuous reflection patterns of extremely high resolution, which facilitated a high-resolution stratigraphic description. Results from the VSP showed substantial wave energy attenuation. However, the high-fold (330 traces) Mallorca seismic experiment returned a rather discontinuous pattern of possible reflectors

  1. Seismic safety assessment of unreinforced masonry low-rise buildings in Pakistan and its neighbourhood

    NASA Astrophysics Data System (ADS)

    Korkmaz, K. A.

    2009-06-01

    Pakistan and its neighbourhood experience numerous earthquakes, most of which result in damaged or collapsed buildings and loss of life that also affect the economy adversely. On 29 October 2008, an earthquake of magnitude 6.5 occurred in Ziarat, Quetta Region, Pakistan, and was followed by more than 400 aftershocks. Many villages were completely destroyed and more than 200 people died. The previous major event, the 2005 South Asian earthquake (Mw=7.6), occurred in Kashmir, where 80,000 people died. Inadequate building stock is to blame for the degree of disaster, as the majority of the buildings in the region are unreinforced masonry low-rise buildings. In this study, the seismic vulnerability of the region's common unreinforced masonry low-rise buildings was investigated using probabilistic seismic safety assessment. The results of the study showed that unreinforced masonry low-rise buildings experience higher displacements and shear forces, which can be directly related to damage or collapse.

  2. Probabilistic estimation of residential air exchange rates for ...

    EPA Pesticide Factsheets

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER measurements. An algorithm for probabilistically estimating AER was developed, based on the Lawrence Berkeley National Laboratory Infiltration model, utilizing housing characteristics and meteorological data, with adjustment for window-opening behavior. The algorithm was evaluated by comparing modeled and measured AERs in four US cities (Los Angeles, CA; Detroit, MI; Elizabeth, NJ; and Houston, TX), inputting study-specific data. The impact on the modeled AER of using publicly available housing data representative of the region for each city was also assessed. Finally, modeled AERs based on region-specific inputs were compared with those estimated using literature-based distributions. While the modeled AERs were similar in magnitude to the measured AERs, they were consistently lower for all cities except Houston. AERs estimated using region-specific inputs were lower than those using study-specific inputs due to differences in window-opening probabilities. The algorithm produced more spatially and temporally variable AERs compared with literature-based distributions, reflecting within- and between-city differences and helping to reduce error in estimates of air pollutant exposure. Published in the Journal of
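
    A minimal sketch of the core relation in the LBL infiltration model: airflow driven by stack (temperature) and wind pressures through an effective leakage area, expressed as an air exchange rate. The leakage area, house volume, and stack/wind coefficients below are assumed, order-of-magnitude values; the study's algorithm layers housing-stock distributions and window-opening adjustments on top of this.

    ```python
    A_LEAK = 500.0      # effective leakage area, cm^2 (assumed)
    VOLUME = 300.0      # house volume, m^3 (assumed)
    C_STACK = 0.000145  # stack coefficient, (L/s/cm^2)^2 per K (assumed)
    C_WIND = 0.000104   # wind coefficient, (L/s/cm^2)^2 per (m/s)^2 (assumed)

    def air_exchange_rate(delta_t_k, wind_m_s):
        """LBL infiltration model flow, converted to air changes per hour."""
        q_l_per_s = A_LEAK * (C_STACK * abs(delta_t_k)
                              + C_WIND * wind_m_s**2) ** 0.5
        return q_l_per_s / 1000.0 / VOLUME * 3600.0

    print(f"winter (dT=20 K, U=4 m/s): AER = {air_exchange_rate(20.0, 4.0):.2f} 1/h")
    print(f"mild   (dT=5 K,  U=2 m/s): AER = {air_exchange_rate(5.0, 2.0):.2f} 1/h")
    ```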

  3. Uncertainty analysis of depth predictions from seismic reflection data using Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2018-03-01

    Estimating the depths of target horizons from seismic reflection data is an important task in exploration geophysics. To constrain these depths we need a reliable and accurate velocity model. Here, we build an optimum 2D seismic reflection data processing flow, focused on pre-stack deghosting filters and velocity model building, and apply Bayesian methods, including Gaussian process emulation and Bayesian History Matching (BHM), to estimate the uncertainties of the depths of key horizons near borehole DSDP-258, located in the Mentelle Basin, southwest of Australia, and compare the results with the drilled core from that well. Following this strategy, the tie between the modelled and observed depths from the DSDP-258 core was in accordance with the ±2σ posterior credibility intervals, and predictions of the depths to key horizons were made for two new drill sites adjacent to the existing borehole. The probabilistic analysis allowed us to generate multiple realizations of pre-stack depth-migrated images; these can be used directly to better constrain interpretation and identify potential risk at drill sites. The method will be applied to constrain the drilling targets for the upcoming International Ocean Discovery Program (IODP) Leg 369.

  4. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types concur in the definition of the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach allows, in principle, all possible tsunamigenic sources to be considered, from seismic events to slides, asteroid impacts, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an idealized region with realistic characteristics (Neverland).

  5. Applying new seismic analysis techniques to the lunar seismic dataset: New information about the Moon and planetary seismology on the eve of InSight

    NASA Astrophysics Data System (ADS)

    Dimech, J. L.; Weber, R. C.; Knapmeyer-Endrun, B.; Arnold, R.; Savage, M. K.

    2016-12-01

    The field of planetary science is poised for a major advance with the upcoming InSight mission to Mars, due to launch in May 2018. Seismic analysis techniques adapted for use on planetary data are therefore highly relevant to the field. The heart of this project is the application of new seismic analysis techniques to the lunar seismic dataset, to learn more about the Moon's crust and mantle structure, with particular emphasis on 'deep' moonquakes, which occur roughly halfway between the lunar surface and the core and have no surface expression. Techniques proven to work on the Moon might also be beneficial for InSight and future planetary seismology missions, which face similar technical challenges. The techniques include: (1) an event-detection and classification algorithm based on Hidden Markov Models, used to reclassify known moonquakes and look for new ones; Apollo 17 gravimeter and geophone data will also be included in this effort. (2) Measurements of anisotropy in the lunar mantle and crust using shear-wave splitting; preliminary measurements on deep moonquakes using the MFAST program are encouraging, and continued evaluation may reveal new structural information on the Moon's mantle. (3) Probabilistic moonquake locations using NonLinLoc, a non-linear hypocenter location technique, with a version of the codes modified for the Moon's radius; successful application may provide a new catalog of moonquake locations with rigorous uncertainty information, which would be a valuable input into (4) new fault-plane constraints from focal mechanisms, using a novel approach to Bayes' theorem that factors in uncertainties in hypocenter coordinates and S-to-P amplitude ratios. Preliminary results, such as the shear-wave splitting measurements, will be presented and discussed.

  6. Seismic Characterization of the Newberry and Cooper Basin EGS Sites

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.

    2015-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS sites: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation is real or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
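
    MFP exploits full-wavefield coherence across the network; as a much-simplified stand-in, the sketch below shows the single-channel normalized cross-correlation (matched filter) detection that underlies template-based event detection. All signals here are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    fs = 100.0  # sampling rate, Hz (assumed)
    t = np.arange(0.0, 2.0, 1.0 / fs)
    template = np.exp(-5.0 * t) * np.sin(2.0 * np.pi * 8.0 * t)  # toy wavelet

    # Synthetic continuous record: noise with the template buried at t = 60 s
    record = rng.normal(0.0, 1.0, int(600 * fs))
    i0 = int(60 * fs)
    record[i0:i0 + template.size] += 3.0 * template

    def ncc(data, tmpl):
        """Normalized cross-correlation of a template along a record."""
        n = tmpl.size
        tmpl = (tmpl - tmpl.mean()) / tmpl.std()
        out = np.empty(data.size - n + 1)
        for i in range(out.size):
            w = data[i:i + n]
            out[i] = np.dot(tmpl, (w - w.mean()) / w.std()) / n
        return out

    cc = ncc(record, template)
    print(f"peak correlation {cc.max():.2f} at t = {cc.argmax() / fs:.1f} s")
    ```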

  7. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results for a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite elements, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible, and therefore volume modeling is generally required.

  8. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
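
    The weakest-link statistics at the heart of CARES/Life can be shown in a two-parameter Weibull sketch for a uniformly stressed volume; the Weibull modulus, characteristic strength, and volume below are assumed values, and the real code handles multiaxial, transient stress fields and slow crack growth.

    ```python
    import numpy as np

    M_WEIBULL = 10.0  # assumed Weibull modulus
    SIGMA_0 = 350.0   # assumed characteristic strength, MPa (unit volume)
    VOLUME = 2.0      # stressed volume, in multiples of the unit volume

    def failure_probability(stress_mpa):
        """Two-parameter Weibull (weakest-link) failure probability."""
        return 1.0 - np.exp(-VOLUME * (stress_mpa / SIGMA_0) ** M_WEIBULL)

    for s in (200.0, 300.0, 350.0):
        print(f"stress {s:.0f} MPa: Pf = {failure_probability(s):.3f}")
    ```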

  9. Sensitivity of Induced Seismic Sequences to Rate-and-State Frictional Processes

    NASA Astrophysics Data System (ADS)

    Kroll, Kayla A.; Richards-Dinger, Keith B.; Dieterich, James H.

    2017-12-01

    It is well established that subsurface injection of fluids increases pore fluid pressures that may lead to shear failure along a preexisting fault surface. Concern among oil and gas, geothermal, and carbon storage operators has risen dramatically over the past decade due to the increase in the number and magnitude of induced earthquakes. Efforts to mitigate the risk associated with injection-induced earthquakes include modeling of the interaction between fluids and earthquake faults. Here we investigate this relationship with simulations that couple a geomechanical reservoir model and RSQSim, a physics-based earthquake simulator. RSQSim employs rate- and state-dependent friction (RSF) that enables the investigation of the time-dependent nature of earthquake sequences. We explore the effect of two RSF parameters and normal stress on the spatiotemporal characteristics of injection-induced seismicity. We perform >200 simulations to systematically investigate the effect of these model components on the evolution of induced seismicity sequences and compare the spatiotemporal characteristics of our synthetic catalogs to observations of induced earthquakes. We find that the RSF parameters control the ability of seismicity to migrate away from the injection well, the total number and maximum magnitude of induced events. Additionally, the RSF parameters control the occurrence/absence of premonitory events. Lastly, we find that earthquake stress drops can be modulated by the normal stress and/or the RSF parameters. Insight gained from this study can aid in further development of models that address best practice protocols for injection operations, site-specific models of injection-induced earthquakes, and probabilistic hazard and risk assessments.
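
    The rate-and-state formulation that RSQSim draws on can be sketched compactly: the standard friction law with Dieterich aging-law state evolution, integrated here with a simple Euler scheme for an imposed velocity step. All parameter values are illustrative, not those of the study.

    ```python
    import numpy as np

    MU0, V0 = 0.6, 1e-6             # reference friction and slip rate (m/s)
    A, B, D_C = 0.010, 0.015, 1e-5  # illustrative RSF parameters (a < b)

    def friction_response(v_imposed, t_end=20.0, dt=1e-3):
        """Friction under an imposed slip rate (aging law, Euler integration)."""
        theta = D_C / V0  # state variable at steady state for V0
        mu = []
        for _ in np.arange(0.0, t_end, dt):
            theta += dt * (1.0 - v_imposed * theta / D_C)  # aging law
            mu.append(MU0 + A * np.log(v_imposed / V0)
                      + B * np.log(V0 * theta / D_C))
        return np.array(mu)

    mu = friction_response(v_imposed=1e-5)  # factor-of-ten velocity step
    print(f"peak friction {mu.max():.3f}, new steady state {mu[-1]:.3f}")
    # a < b gives velocity weakening: friction ends below its starting value.
    ```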

  10. Sensitivity of Induced Seismic Sequences to Rate-and-State Frictional Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroll, Kayla A.; Richards-Dinger, Keith B.; Dieterich, James H.

    It is well established that subsurface injection of fluids increases pore fluid pressures that may lead to shear failure along a preexisting fault surface. Concern among oil and gas, geothermal, and carbon storage operators has risen dramatically over the past decade due to the increase in the number and magnitude of induced earthquakes. Efforts to mitigate the risk associated with injection-induced earthquakes include modeling of the interaction between fluids and earthquake faults. Here we investigate this relationship with simulations that couple a geomechanical reservoir model and RSQSim, a physics-based earthquake simulator. RSQSim employs rate- and state-dependent friction (RSF) that enables the investigation of the time-dependent nature of earthquake sequences. We explore the effect of two RSF parameters and normal stress on the spatiotemporal characteristics of injection-induced seismicity. We perform >200 simulations to systematically investigate the effect of these model components on the evolution of induced seismicity sequences and compare the spatiotemporal characteristics of our synthetic catalogs to observations of induced earthquakes. We find that the RSF parameters control the ability of seismicity to migrate away from the injection well, the total number and maximum magnitude of induced events. Additionally, the RSF parameters control the occurrence/absence of premonitory events. Finally, we find that earthquake stress drops can be modulated by the normal stress and/or the RSF parameters. Insight gained from this study can aid in further development of models that address best practice protocols for injection operations, site-specific models of injection-induced earthquakes, and probabilistic hazard and risk assessments.

  11. Sensitivity of Induced Seismic Sequences to Rate-and-State Frictional Processes

    DOE PAGES

    Kroll, Kayla A.; Richards-Dinger, Keith B.; Dieterich, James H.

    2017-11-09

    It is well established that subsurface injection of fluids increases pore fluid pressures that may lead to shear failure along a preexisting fault surface. Concern among oil and gas, geothermal, and carbon storage operators has risen dramatically over the past decade due to the increase in the number and magnitude of induced earthquakes. Efforts to mitigate the risk associated with injection-induced earthquakes include modeling of the interaction between fluids and earthquake faults. Here we investigate this relationship with simulations that couple a geomechanical reservoir model and RSQSim, a physics-based earthquake simulator. RSQSim employs rate- and state-dependent friction (RSF) that enables the investigation of the time-dependent nature of earthquake sequences. We explore the effect of two RSF parameters and normal stress on the spatiotemporal characteristics of injection-induced seismicity. We perform >200 simulations to systematically investigate the effect of these model components on the evolution of induced seismicity sequences and compare the spatiotemporal characteristics of our synthetic catalogs to observations of induced earthquakes. We find that the RSF parameters control the ability of seismicity to migrate away from the injection well, the total number and maximum magnitude of induced events. Additionally, the RSF parameters control the occurrence/absence of premonitory events. Finally, we find that earthquake stress drops can be modulated by the normal stress and/or the RSF parameters. Insight gained from this study can aid in further development of models that address best practice protocols for injection operations, site-specific models of injection-induced earthquakes, and probabilistic hazard and risk assessments.

  12. Probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainty. Sample results are given for an elastic-plastic ten-bar structure and for an elastic-plastic plane continuum with a circular hole, subject to cyclic loadings with the yield stress modeled as a random field.
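
    The second-order perturbation idea can be shown on a scalar toy problem: propagate the mean and variance of random inputs through a response function via a Taylor expansion about the mean. Here the response is the axial stress in a bar, sigma = F/A, with assumed load and area statistics; this is a schematic of the method, not the PFEM implementation.

    ```python
    import numpy as np

    # Assumed input statistics for load F (N) and area A (m^2)
    MU_F, VAR_F = 1.0e4, (1.0e3) ** 2
    MU_A, VAR_A = 1.0e-3, (5.0e-5) ** 2

    def g(F, A):
        """Toy response: axial stress in a bar."""
        return F / A

    # Derivatives of g at the mean (only d2g/dA2 matters at second order)
    dg_dF = 1.0 / MU_A
    dg_dA = -MU_F / MU_A**2
    d2g_dA2 = 2.0 * MU_F / MU_A**3

    # Second-order mean, first-order variance (independent inputs)
    mean_stress = g(MU_F, MU_A) + 0.5 * d2g_dA2 * VAR_A
    var_stress = dg_dF**2 * VAR_F + dg_dA**2 * VAR_A

    # Monte Carlo check of the perturbation estimates
    rng = np.random.default_rng(3)
    s = g(rng.normal(MU_F, np.sqrt(VAR_F), 200_000),
          rng.normal(MU_A, np.sqrt(VAR_A), 200_000))
    print(f"perturbation: mean {mean_stress:.3e}, std {np.sqrt(var_stress):.3e}")
    print(f"Monte Carlo:  mean {s.mean():.3e}, std {s.std():.3e}")
    ```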

  13. Seismic hazard estimation of northern Iran using smoothed seismicity

    NASA Astrophysics Data System (ADS)

    Khoshnevis, Naeem; Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Cramer, Chris H.

    2017-07-01

    This article presents a seismic hazard assessment for northern Iran, where a smoothed seismicity approach has been used in combination with an updated seismic catalog and a ground motion prediction equation recently found to yield a good fit to data. We evaluate the hazard over a geographical area including the seismic zones of Azerbaijan, the Alborz Mountain Range, and Kopeh-Dagh, as well as parts of other neighboring seismic zones that fall within our region of interest. In the chosen approach, seismic events are not assigned to specific faults but are assumed to be potential seismogenic sources distributed within regular grid cells. After performing the corresponding magnitude conversions, we decluster both the historical and instrumental seismicity catalogs to obtain earthquake rates based on the number of events within each cell, and smooth the results to account for the uncertainty in the spatial distribution of future earthquakes. Seismicity parameters are computed for each seismic zone separately, and for the entire region of interest as a single uniform seismotectonic region. In the analysis, we consider uncertainties in the ground motion prediction equation and the seismicity parameters, and combine the resulting models using a logic tree. The results are presented in terms of expected peak ground acceleration (PGA) maps and hazard curves at selected locations, considering exceedance probabilities of 2 and 10% in 50 years for rock site conditions. According to our results, the highest levels of hazard are observed west of the North Tabriz and east of the North Alborz faults, where expected PGA values are between about 0.5 and 1 g for 10 and 2% probability of exceedance in 50 years, respectively. We analyze our results in light of similar estimates available in the literature and offer our perspective on the differences observed. We find our results to be helpful in understanding seismic hazard for northern Iran, but recognize that additional efforts are necessary to
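
    The smoothing step has a compact Frankel-style form: counts of declustered epicenters on a grid are convolved with a Gaussian kernel of fixed correlation distance and scaled to annual rates. The synthetic epicenters, grid, smoothing length, and catalog duration below are all assumed.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(4)

    # Synthetic declustered epicenters: a cluster plus diffuse background
    lon = np.concatenate([rng.normal(50.0, 0.3, 300), rng.uniform(48.0, 54.0, 100)])
    lat = np.concatenate([rng.normal(36.0, 0.2, 300), rng.uniform(34.0, 38.0, 100)])

    # Count events on a 0.1-degree grid
    lon_edges = np.arange(48.0, 54.01, 0.1)
    lat_edges = np.arange(34.0, 38.01, 0.1)
    counts, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges])

    # Gaussian smoothing with an assumed 0.5-degree correlation distance
    smoothed = gaussian_filter(counts, sigma=0.5 / 0.1)

    rates = smoothed / 110.0  # assumed catalog duration, years
    print(f"peak smoothed rate: {rates.max():.3f} events/yr per cell")
    ```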

  14. Structural Geology of the Northwestern Portion of Los Alamos National Laboratory, Rio Grande Rift, New Mexico: Implications for Seismic Surface Rupture Potential from TA-3 to TA-55

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, Jamie N.; Lavine, Alexis; WoldeGabriel, Giday; Krier, Donathon

    1999-03-01

    Los Alamos National Laboratory lies at the western boundary of the Rio Grande rift, a major tectonic feature of the North American continent. Three major faults locally constitute the modern rift boundary, and each of these is potentially seismogenic. In this study we have gathered structural geologic data for the northwestern portion of Los Alamos National Laboratory through high-precision geologic mapping, conventional geologic mapping, stratigraphic studies, drilling, petrologic studies, and stereographic aerial photograph analyses. Our study area encompasses TA-55 and TA-3, where the potential for seismic surface rupture is of interest, and is bounded on the north and south by the townsite of Los Alamos and Twomile Canyon, respectively. The study area includes parts of two of the potentially active rift boundary faults--the Pajarito and Rendija Canyon faults--that form a large graben that we name the Diamond Drive graben. The graben embraces the western part of the townsite of Los Alamos, and its southern end is in the TA-3 area, where it is defined by east-southeast-trending cross faults. The cross faults are small, but they accommodate interactions between the two major fault zones and gentle tilting of structural blocks to the north into the graben. North of the Los Alamos townsite, the Rendija Canyon fault is a large normal fault with about 120 feet of down-to-the-west displacement over the last 1.22 million years. South of the Los Alamos townsite, the Rendija Canyon fault splays to the southwest into a broad zone of deformation. The zone of deformation is about 2,000 feet wide where it crosses Los Alamos Canyon and cuts through the Los Alamos County Landfill. Farther southwest, the fault zone is about 3,000 feet wide at the southeastern corner of TA-3 in upper Mortandad Canyon and about 5,000 feet wide in Twomile Canyon. Net down-to-the-west displacement across the entire fault zone over the last 1.22 million years decreases to the south as the fault zone

  15. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  16. Analysis of induced seismicity at The Geysers geothermal field, California

    NASA Astrophysics Data System (ADS)

    Emolo, A.; Maercklin, N.; Matrullo, E.; Orefice, A.; Amoroso, O.; Convertito, V.; Sharma, N.; Zollo, A.

    2012-12-01

    Fluid injection, steam extraction, and reservoir stimulation in geothermal systems lead to induced seismicity. While in rare cases induced events may be large enough to pose a hazard, the microseismicity also provides information on the extent and the space-time varying properties of the reservoir. Therefore, microseismic monitoring is important, both for mitigation of unwanted effects of industrial operations and for continuous assessment of reservoir conditions. Here we analyze induced seismicity at The Geysers geothermal field in California, a vapor-dominated field with the top of the main steam reservoir some 1-3 km below the surface. Commercial exploitation began in the 1960s, and the seismicity increased with increasing field development. We focus our analyses on induced seismicity recorded between August 2007 and October 2011. Our calibrated waveform database contains some 15,000 events with magnitudes between 1.0 and 4.5, recorded by the LBNL Geysers/Calpine surface seismic network. We associated all data with events from the NCEDC earthquake catalog and re-picked first-arrival times. Using selected events with at least 20 high-quality P-wave picks, we determined a minimum 1-D velocity model using VELEST. The well-constrained P-velocity model shows a sharp velocity increase at 1-2 km depth (from 3 to 5 km/s) and then a gradient-like trend down to about 5 km depth, where velocities reach values of 6-7 km/s. The station corrections show coherent, relatively high, positive travel-time delays in the NW zone, indicating strong lateral variation of the P-wave velocities. We determined an average Vp-to-Vs ratio of 1.67, which is consistent with estimates from other authors for the same time period. The events have been relocated in the new model using a non-linear probabilistic method. The seismicity appears spatially diffuse within a 15 x 10 km2 area elongated in the NW-SE direction, and earthquake depths range between 0 and 6 km. As in previous

  17. Observations and modeling of seismic background noise

    USGS Publications Warehouse

    Peterson, Jon R.

    1993-01-01

    The preparation of this report had two purposes. One was to present a catalog of seismic background noise spectra obtained from a worldwide network of seismograph stations. The other purpose was to refine and document models of seismic background noise that have been in use for several years. The second objective was, in fact, the principal reason that this study was initiated and influenced the procedures used in collecting and processing the data. With a single exception, all of the data used in this study were extracted from the digital data archive at the U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL). This archive dates from 1972, when ASL first began deploying digital seismograph systems and collecting and distributing digital data under the sponsorship of the Defense Advanced Research Projects Agency (DARPA). There have been many changes and additions to the global seismograph networks during the past twenty years, but perhaps none as significant as the current deployment of very broadband seismographs by the U.S. Geological Survey (USGS) and the University of California San Diego (UCSD) under the scientific direction of the IRIS consortium. The new data acquisition systems have extended the bandwidth and resolution of seismic recording, and they utilize high-density recording media that permit the continuous recording of broadband data. The data improvements and continuous recording greatly benefit and simplify surveys of seismic background noise. Although there are many other sources of digital data, the ASL archive data were used almost exclusively because of accessibility and because the data systems and their calibration are well documented for the most part. Fortunately, the ASL archive contains high-quality data from other stations in addition to those deployed by the USGS. Included are data from UCSD IRIS/IDA stations, the Regional Seismic Test Network (RSTN) deployed by Sandia National Laboratories (SNL), and the TERRAscope network

  18. Progressive Seismic Failure, Seismic Gap, and Great Seismic Risk across the Densely Populated North China Basin

    NASA Astrophysics Data System (ADS)

    Yin, A.; Yu, X.; Shen, Z.

    2014-12-01

    Although the seismically active North China basin has the most complete written records of pre-instrumentation earthquakes in the world, this information has not been fully utilized for assessing potential earthquake hazards of this densely populated region that hosts ~200 million people. In this study, we use the historical records to document the earthquake migration pattern and the existence of a 180-km seismic gap along the 600-km long right-slip Tangshan-Hejian-Cixian (THC) fault zone that cuts across the North China basin. The newly recognized seismic gap, which is centered at Tianjin with a population of 11 million people and ~120 km from Beijing (22 million people) and Tangshan (7 million people), has not been ruptured in the past 1000 years by M≥6 earthquakes. The seismic migration pattern in the past millennium suggests that the epicenters of major earthquakes have shifted towards this seismic gap along the THC fault, which implies that the 180-km gap could be the site of the next great earthquake with M≈7.6 if it is ruptured by a single event. Alternatively, the seismic gap may be explained by aseismic creeping or seismic strain transfer between active faults.

  19. Time-Independent Annual Seismic Rates, Based on Faults and Smoothed Seismicity, Computed for Seismic Hazard Assessment in Italy

    NASA Astrophysics Data System (ADS)

    Murru, M.; Falcone, G.; Taroni, M.; Console, R.

    2017-12-01

    In 2015 the Italian Department of Civil Protection started a project for upgrading the official Italian seismic hazard map (MPS04), inviting the Italian scientific community to participate in a joint effort for its realization. We participated by providing spatially variable time-independent (Poisson) long-term annual occurrence rates of seismic events on the entire Italian territory, considering cells of 0.1°x0.1°, from M4.5 up to M8.1, in magnitude bins of 0.1 units. Our final model was composed of two different models merged into one ensemble model, each with the same weight: the first was realized with a smoothed seismicity approach, the second using the seismogenic faults. The spatially smoothed seismicity was obtained using the smoothing method introduced by Frankel (1995) applied to the historical and instrumental seismicity. In this approach we adopted a tapered Gutenberg-Richter relation with a b-value fixed to 1 and a corner magnitude estimated from the largest events in the catalogs. For each seismogenic fault provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate (for each 0.1°x0.1° cell) in magnitude bins of 0.1 units, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the same tapered Gutenberg-Richter relation as the smoothed seismicity model. The annual rate for the final model was determined in the following way: if a cell falls within one of the seismic sources, we merged, with equal weight, the rate determined from the seismic moments of the earthquakes generated by the fault and the rate from the smoothed seismicity model; if instead the cell falls outside any seismic source, we used the rate obtained from the spatially smoothed seismicity. Here we present the final results of our study to be used for the new Italian seismic hazard map.
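
    As a rough illustration of the Frankel (1995) smoothing step described above, the following Python sketch spreads epicenter counts over grid nodes with a Gaussian kernel; the helper name, the flat-earth distance approximation, and the 50 km correlation distance are illustrative assumptions, not details taken from the study.

        import numpy as np

        def smoothed_counts(ev_lon, ev_lat, grid_lon, grid_lat, c_km=50.0):
            """Gaussian-kernel smoothing of epicenter counts onto grid nodes,
            in the spirit of Frankel (1995); c_km is the correlation distance."""
            km_per_deg = 111.19
            out = np.zeros(len(grid_lon))
            for j, (glon, glat) in enumerate(zip(grid_lon, grid_lat)):
                # approximate epicentral distances in km (small-area flat earth)
                dx = (ev_lon - glon) * km_per_deg * np.cos(np.radians(glat))
                dy = (ev_lat - glat) * km_per_deg
                out[j] = np.exp(-(dx**2 + dy**2) / c_km**2).sum()
            # crude renormalization so the smoothed field sums to the event count
            return out * (len(ev_lon) / out.sum())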

  20. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  1. The Pollino Seismic Sequence: Activated Graben Structures in a Seismic Gap

    NASA Astrophysics Data System (ADS)

    Rößler, Dirk; Passarelli, Luigi; Govoni, Aladino; Bindi, Dino; Cesca, Simone; Hainzl, Sebastian; Maccaferri, Francesco; Rivalta, Eleonora; Woith, Heiko; Dahm, Torsten

    2015-04-01

    The Mercure Basin (MB) and the Castrovillari Fault (CF) in the Pollino range (Southern Apennines, Italy) represent one of the most prominent seismic gaps in the Italian seismic catalogue, with no M>5.5 earthquakes during the last centuries. In historical times, several swarm-like seismic sequences occurred in the area, including two intense swarms within the past two decades. The most energetic one started in 2010 and was still active in 2014. The seismicity culminated in autumn 2012 with a M=5 event on 25 October. The range hosts a number of opposing normal faults forming a graben-like structure. Their rheology and their interactions are unclear. Current debates include the potential of the MB and the CF to host large earthquakes and the style of deformation. Understanding the seismicity and the behaviour of the faults is necessary to assess the tectonics and the seismic hazard. The GFZ German Research Centre for Geosciences and INGV, Italy, have jointly monitored the ongoing seismicity using a small-aperture seismic array integrated in a temporary seismic network. Based on this installation, we located more than 16,000 local earthquakes that occurred between November 2012 and September 2014. Here we investigate quantitatively all the phases of the seismic sequence starting from January 2010. Event locations along with moment tensor inversions constrain spatially the structures activated by the swarm and the migration pattern of the seismicity. The seismicity forms clusters concentrated within the southern part of the MB and along the Pollino Fault linking the MB and the CF. Most earthquakes are confined to the upper 10 km of the crust in an area of ~15x15 km2. However, sparse seismicity at depths between 15 and 20 km and moderate seismicity further north with deepening hypocenters also exist. In contrast, the CF appears aseismic; only its northern part has experienced micro-seismicity. The spatial distribution is however more complex than the major tectonic structures

  2. From multi-disciplinary monitoring observation to probabilistic eruption forecasting: a Bayesian view

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2011-12-01

    Eruption forecasting estimates the probability of eruption in a specific time-space-magnitude window. The use of probabilities to track the evolution of a phase of unrest is unavoidable for two main reasons: first, eruptions are intrinsically unpredictable in a deterministic sense, and, second, probabilities represent a quantitative tool that can be rationally used by decision-makers (as is usually done in many other fields). The primary information for the probability assessment during a phase of unrest comes from monitoring data of different quantities, such as the seismic activity, ground deformation, geochemical signatures, and so on. Nevertheless, probabilistic forecasts based on monitoring data present two main difficulties. First, many high-risk volcanoes do not have pre-eruptive and unrest monitoring databases, making a probabilistic assessment based on the frequency of past observations impossible. The ongoing WOVOdat project (led by Christopher Newhall) is trying to tackle this limitation by creating a sort of worldwide epidemiological database that may cope with the lack of pre-eruptive and unrest monitoring databases for a specific volcano by using observations of 'analog' volcanoes. Second, the quantity and quality of monitoring data are rapidly increasing at many volcanoes, creating strongly inhomogeneous datasets. In these cases, classical statistical analysis can be performed on high-quality monitoring observations only for (usually too) short periods of time, or alternatively using only a few specific monitoring data streams that are available for longer times (such as the number of earthquakes), therefore neglecting much of the information carried by the most recent kinds of monitoring. Here, we explore a possible strategy to cope with these limitations. In particular, we present a Bayesian strategy that merges different kinds of information. In this approach, all relevant monitoring observations are embedded into a probabilistic scheme through expert opinion
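
    A minimal sketch of the kind of Bayesian merging described above, assuming only a conjugate beta-binomial setup: a prior on the probability of eruption given unrest is updated with hypothetical episode counts from 'analog' volcanoes (all numbers invented for illustration).

        from scipy import stats

        # Hypothetical WOVOdat-style counts: 40 unrest episodes at analog
        # volcanoes, 12 of which culminated in eruption.
        k_erupt, n_unrest = 12, 40

        prior_a, prior_b = 1.0, 1.0          # uniform prior on P(eruption | unrest)
        posterior = stats.beta(prior_a + k_erupt, prior_b + n_unrest - k_erupt)

        print(posterior.mean())              # posterior point forecast (~0.31)
        print(posterior.interval(0.90))      # 90% credible interval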

  3. Seismic sample areas defined from incomplete catalogues: an application to the Italian territory

    NASA Astrophysics Data System (ADS)

    Mulargia, F.; Tinti, S.

    1985-11-01

    The comprehensive understanding of earthquake source physics under real conditions requires the study not of single faults as separate entities but rather of a seismically active region as a whole, accounting for the interaction among different structures. We define as a "seismic sample area" the most convenient region to be used as a natural laboratory for the study of seismic source physics. This coincides with the region where the average large-magnitude seismicity is the highest. To this end, the future distributions of large earthquakes in time and space must be estimated. Using catalog seismicity as an input, the rate of occurrence is not constant but appears generally biased by incompleteness in some parts of the catalog and by possible nonstationarities in seismic activity. We present a statistical procedure which is capable, under a few mild assumptions, of both detecting nonstationarities in seismicity and finding the incomplete parts of a seismic catalog. The procedure is based on Kolmogorov-Smirnov nonparametric statistics and can be applied without a priori assuming the parent distribution of the events. The efficiency of this procedure allows the analysis of small data sets. An application to the Italian territory is presented, using the most recent version of the ENEL seismic catalog. Seismic activity takes place in six well-defined areas, but only five of them have a number of events sufficient for analysis. Barring a few exceptions, seismicity is found stationary throughout the whole catalog span (1000-1980). The eastern Alps region stands out as the best "sample area", with the highest average probability of event occurrence per unit time and area. The final objective of this characterization is to stimulate a program of intensified research.
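
    The core of such a completeness/stationarity check can be sketched with a Kolmogorov-Smirnov test: under a stationary Poisson process, event times in a complete catalog segment are uniformly distributed. The Python sketch below, on a synthetic catalog and not the authors' exact procedure, flags a segment whose early part is under-reported.

        import numpy as np
        from scipy import stats

        def stationarity_test(event_years, t0, t1):
            """KS test of occurrence times against uniformity on [t0, t1],
            the null hypothesis for a stationary, complete catalog segment."""
            u = (np.sort(np.asarray(event_years)) - t0) / (t1 - t0)
            return stats.kstest(u, "uniform")   # small p-value: incomplete or nonstationary

        rng = np.random.default_rng(0)
        times = np.concatenate([rng.uniform(1000, 1900, 20),    # under-reported era
                                rng.uniform(1900, 1980, 160)])  # instrumental era
        print(stationarity_test(times, 1000, 1980))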

  4. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.
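
    The probabilistic half of such a framework can be sketched as a grid posterior over a single fault parameter, with a Gaussian measurement model standing in for the S-parameter forward model; everything below, including the toy forward function and its parameters, is an invented illustration rather than the paper's method.

        import numpy as np

        def grid_posterior(theta_grid, forward, data, noise_sigma, prior=None):
            """Posterior over one fault parameter on a grid: Gaussian likelihood
            of the observed TDR trace under a forward model, times a prior."""
            prior = np.ones_like(theta_grid) if prior is None else prior
            loglik = np.array([-0.5 * np.sum((data - forward(th))**2) / noise_sigma**2
                               for th in theta_grid])
            post = prior * np.exp(loglik - loglik.max())   # stabilize before normalizing
            return post / np.trapz(post, theta_grid)

        t = np.linspace(0.0, 10.0, 200)                            # time axis, ns (invented)
        forward = lambda depth: 0.05 * depth * np.exp(-(t - 4.0)**2)  # toy chafe response
        rng = np.random.default_rng(3)
        data = forward(0.6) + rng.normal(0.0, 0.005, t.size)       # synthetic noisy trace
        grid = np.linspace(0.0, 1.0, 101)
        post = grid_posterior(grid, forward, data, noise_sigma=0.005)
        print(grid[np.argmax(post)])                               # recovers ~0.6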

  5. Seismic attenuation and scattering tomography of rock samples using stochastic wavefields: linking seismology, volcanology, and rock physics.

    NASA Astrophysics Data System (ADS)

    Fazio, Marco; De Siena, Luca; Benson, Phillip

    2016-04-01

    Seismic attenuation and scattering are two attributes that can be linked with porosity and permeability in laboratory experiments. When these quantities are measured using seismic waveforms recorded at lithospheric and volcanic scales, the areas of highest heterogeneity, such as batches of melt and zones of high deformation, produce anomalous values of the measured quantities, the seismic quality factor and the scattering coefficient. When employed as indicators of heterogeneity and absorption in volcanic areas, these anomalous effects become strong indicators of magma accumulation and tectonic boundaries, shaping magmatic chambers and conduit systems. We perform attenuation and scattering measurements and imaging using seismic waveforms produced in laboratory experiments, at frequencies ranging between kHz and MHz. As attenuation and scattering are measured from the shape of the envelopes, disregarding phases, we are able to connect the observations with the micro-fracturing and petrological quantities previously measured on the sample. Connecting the imaging of dry and saturated samples via these novel attributes with the burst of low-period events under increasing saturation and deformation is a challenge. Its solution could plant the seed for better relating attenuation and scattering tomography measurements to the presence of fluids and gas, thereby creating a novel path toward reliable porosity and permeability tomography. In particular for volcanoes, being able to relate attenuation/scattering measurements to low-period micro-seismicity could deliver new data to settle the debate about whether both source and medium can produce seismic resonance.

  6. Documentation for Initial Seismic Hazard Maps for Haiti

    USGS Publications Warehouse

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2010-01-01

    In response to the urgent need for earthquake-hazard information after the tragic disaster caused by the moment magnitude (M) 7.0 January 12, 2010, earthquake, we have constructed initial probabilistic seismic hazard maps for Haiti. These maps are based on the current information we have on fault slip rates and historical and instrumental seismicity. These initial maps will be revised and improved as more data become available. In the short term, more extensive logic trees will be developed to better capture the uncertainty in key parameters. In the longer term, we will incorporate new information on fault parameters and previous large earthquakes obtained from geologic fieldwork. These seismic hazard maps are important for the management of the current crisis and the development of building codes and standards for the rebuilding effort. The boundary between the Caribbean and North American Plates in the Hispaniola region is a complex zone of deformation. The highly oblique ~20 mm/yr convergence between the two plates (DeMets and others, 2000) is partitioned between subduction zones off the northern and southeastern coasts of Hispaniola and strike-slip faults that transect the northern and southern portions of the island. There are also thrust faults within the island that reflect the compressional component of motion caused by the geometry of the plate boundary. We follow the general methodology developed for the 1996 U.S. national seismic hazard maps, as also implemented in the 2002 and 2008 updates. This procedure consists of adding the seismic hazard calculated from crustal faults, subduction zones, and spatially smoothed seismicity for shallow earthquakes and Wadati-Benioff-zone earthquakes. Each one of these source classes will be described below. The lack of information on faults in Haiti requires many assumptions to be made. These assumptions will need to be revisited and reevaluated as more fieldwork and research are accomplished. We made two sets of
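
    The "adding the seismic hazard" step has a simple probabilistic core: annual exceedance rates from independent source classes sum, and each class contributes its occurrence rate times a lognormal ground-motion exceedance probability. The sketch below uses invented scenario numbers purely to show the arithmetic, not values from the Haiti model.

        import numpy as np
        from scipy import stats

        def annual_exceedance(pga_g, sources):
            """Total annual rate of exceeding each PGA level, summed over
            independent sources; each source is (annual_rate, median_pga_g, sigma_ln)."""
            lam = np.zeros_like(pga_g)
            for rate, median, sigma in sources:
                # lognormal ground-motion exceedance probability from a GMPE
                lam += rate * stats.norm.sf(np.log(pga_g / median) / sigma)
            return lam

        pga = np.logspace(-2, 0.3, 50)            # 0.01 g to ~2 g
        sources = [(0.01, 0.35, 0.6),             # crustal fault scenario (illustrative)
                   (0.004, 0.50, 0.7),            # subduction interface scenario (illustrative)
                   (0.05, 0.10, 0.65)]            # smoothed background seismicity (illustrative)
        lam = annual_exceedance(pga, sources)
        p50yr = 1.0 - np.exp(-lam * 50.0)         # Poissonian 50-year exceedance probability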

  7. Probabilistic models of cognition: conceptual foundations.

    PubMed

    Chater, Nick; Tenenbaum, Joshua B; Yuille, Alan

    2006-07-01

    Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, 'sophisticated' probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, explore how the approach relates to studies of explicit probabilistic reasoning, and give a brief overview of the field as it stands today.

  8. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  9. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  10. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
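
    As a toy instance of a numerical routine that "returns uncertainties in its calculations" (a far simpler stand-in than the probabilistic-inference view the authors develop), plain Monte Carlo integration reports a standard error alongside its estimate:

        import numpy as np

        def mc_integrate(f, a, b, n=10_000, seed=0):
            """Monte Carlo estimate of the integral of f on [a, b],
            returned together with its standard error."""
            rng = np.random.default_rng(seed)
            y = (b - a) * f(rng.uniform(a, b, n))
            return y.mean(), y.std(ddof=1) / np.sqrt(n)

        est, err = mc_integrate(np.sin, 0.0, np.pi)   # exact value is 2
        print(f"{est:.4f} +/- {err:.4f}")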

  11. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  12. Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Bryant, Larry

    2014-01-01

    Command File Errors (CFE) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing CFE rates. More recently, probabilistic modeling techniques have been used for more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, composed of the hardware, software, and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root causes and sensitivities of the various components of the mission system and to develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.

  13. Angola Seismicity MAP

    NASA Astrophysics Data System (ADS)

    Neto, F. A. P.; Franca, G.

    2014-12-01

    The purpose of this work was to study and document the natural seismicity of Angola and to establish the first database of seismic data, to facilitate consultation and searches for information on seismic activity in the country. The study was conducted by querying reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work presented by Moreira (1968), who defined six seismogenic zones from macroseismic data. The most important of these is the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone, covering the epicentral Quihita and Iona regions, geologically characterized by a transcontinental tectono-magmatic structure activated in the Mesozoic, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic, and alkaline composition, kimberlites, and carbonatites, strongly marked by intense tectonism and presenting several faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and during the seismic activity at Iona of January 15, 1964, the main shock reached grade VI-VII. The other five zones, which have lower but non-negligible seismicity rates, are: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; the Gago Coutinho zone; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for the instrumental location of epicenters. All the compiled information made possible the creation of the first database of seismic data for Angola and the preparation of the seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic

  14. Statistical physics, seismogenesis, and seismic hazard

    NASA Astrophysics Data System (ADS)

    Main, Ian

    1996-11-01

    generic statistical properties similar to the "universal" behavior seen in a wide variety of critical phenomena, with significant implications for practical problems in probabilistic seismic hazard evaluation. In particular, the notion of self-organized criticality (or near-criticality) gives a scientific rationale for the a priori assumption of "stationarity" used as a first step in the prediction of the future level of hazard. The Gutenberg-Richter law (a power law in energy or seismic moment) is found to apply only within a finite scale range, both in model and natural seismicity. Accordingly, the frequency-magnitude distribution can be generalized to a gamma distribution in energy or seismic moment (a power law, with an exponential tail). This allows extrapolations of the frequency-magnitude distribution and the maximum credible magnitude to be constrained by observed seismic or tectonic moment release rates. The answers to other questions raised are less clear, for example, the effect of the a priori assumption of a Poisson process in a system with strong local interactions, and the impact of zoning a potentially multifractal distribution of epicentres with smooth polygons. The results of some models show premonitory patterns of seismicity which could in principle be used as mainshock precursors. However, there remains no consensus, on both theoretical and practical grounds, on the possibility or otherwise of reliable intermediate-term earthquake prediction.
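
    One common way to write the gamma-type generalization of the frequency-moment distribution mentioned above, in LaTeX form (a representative tapered Gutenberg-Richter parameterization, not necessarily the paper's exact notation), is:

        % Survivor function in seismic moment M: a power law with an
        % exponential tail beyond the corner moment M_c, valid for
        % M at or above the threshold moment M_t.
        \Phi(M) = \left(\frac{M_t}{M}\right)^{\beta}
                  \exp\!\left(\frac{M_t - M}{M_c}\right), \qquad M \ge M_t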

  15. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    NASA Astrophysics Data System (ADS)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision-making problems in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example with real data is also given.

  16. Application of Nonlinear Seismic Soil-Structure Interaction Analysis for Identification of Seismic Margins at Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varma, Amit H.; Seo, Jungil; Coleman, Justin Leigh

    2015-11-01

    Seismic probabilistic risk assessment (SPRA) methods and approaches at nuclear power plants (NPP) were first developed in the 1970s, and aspects of them have matured over time as they were applied and incrementally improved. SPRA provides information on risk and risk insights and allows for some accounting for uncertainty and variability. As a result, SPRA is now used as an important basis for risk-informed decision making for both new and operating NPPs in the US and in an increasing number of countries globally. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismically induced core damage event. However, in some instances the current SPRA approach contains large uncertainties and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility). INL has an advanced SPRA research and development (R&D) activity that will identify areas in the calculation process that contain significant uncertainties. One current area of focus is the use of nonlinear soil-structure interaction (NLSSI) analysis methods to accurately capture: 1) nonlinear soil behavior and 2) gapping and sliding between the NPP and soil. The goal of this study is to compare numerical NLSSI analysis results with recorded earthquake ground motions at Fukushima Daiichi (Great Tohoku Earthquake) and evaluate the sources of nonlinearity contributing to the observed reduction in peak acceleration. Comparisons are made using recorded data in the free-field (soil column with no structural influence) and recorded data on the NPP basemat (in-structure response). Results presented in this study should identify areas of focus for future R&D activities with the goal of minimizing uncertainty in SPRA calculations. This is not a validation activity since there are too many sources of uncertainty that a numerical analysis

  17. Implementation of NGA-West2 ground motion models in the 2014 U.S. National Seismic Hazard Maps

    USGS Publications Warehouse

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Harmsen, Stephen C.; Frankel, Arthur D.

    2014-01-01

    The U.S. National Seismic Hazard Maps (NSHMs) have been an important component of seismic design regulations in the United States for the past several decades. These maps present earthquake ground shaking intensities at specified probabilities of being exceeded over a 50-year time period. The previous version of the NSHMs was developed in 2008; during 2012 and 2013, scientists at the U.S. Geological Survey have been updating the maps based on their assessment of the “best available science,” resulting in the 2014 NSHMs. The update includes modifications to the seismic source models and the ground motion models (GMMs) for sites across the conterminous United States. This paper focuses on updates in the Western United States (WUS) due to the use of new GMMs for shallow crustal earthquakes in active tectonic regions developed by the Next Generation Attenuation (NGA-West2) project. Individual GMMs, their weighted combination, and their impact on the hazard maps relative to 2008 are discussed. In general, the combined effects of lower medians and increased standard deviations in the new GMMs have caused only small changes, within 5–20%, in the probabilistic ground motions for most sites across the WUS compared to the 2008 NSHMs.

  18. The Sacred Mountain of Varallo in Italy: Seismic Risk Assessment by Acoustic Emission and Structural Numerical Models

    PubMed Central

    Carpinteri, Alberto; Invernizzi, Stefano; Accornero, Federico

    2013-01-01

    We examine an application of the Acoustic Emission (AE) technique for a probabilistic analysis in time and space of earthquakes, in order to preserve the valuable Italian Renaissance Architectural Complex named "The Sacred Mountain of Varallo." Among the forty-five chapels of the Renaissance Complex, the structure of Chapel XVII is of particular concern due to its uncertain structural condition and due to the level of stress caused by the regional seismicity. Therefore, lifetime assessment, taking into account the evolution of damage phenomena, is necessary to preserve the reliability and safety of this masterpiece of cultural heritage. A continuous AE monitoring was performed to assess the structural behavior of the Chapel. During the monitoring period, a correlation between peaks of AE activity in the masonry of the "Sacred Mountain of Varallo" and regional seismicity was found. Although the two phenomena, AE in materials and earthquakes in the Earth's crust, take place on very different scales, they belong to the same class of invariance. In addition, an accurate finite element model, performed with the DIANA finite element code, is presented to describe the dynamic behavior of the Chapel XVII structure, confirming visual and instrumental inspections of regional seismic effects. PMID:24381511

  19. Exploring the Differences Between the European (SHARE) and the Reference Italian Seismic Hazard Models

    NASA Astrophysics Data System (ADS)

    Visini, F.; Meletti, C.; D'Amico, V.; Rovida, A.; Stucchi, M.

    2014-12-01

    The recent release of the probabilistic seismic hazard assessment (PSHA) model for Europe by the SHARE project (Giardini et al., 2013, www.share-eu.org) raises questions about the comparison between its results for Italy and the official Italian seismic hazard model (MPS04; Stucchi et al., 2011) adopted by the building code. The goal of such a comparison is to identify the main input elements that produce the differences between the two models. It is worthwhile to remark that each PSHA is realized with the data and knowledge available at the time of its release. Therefore, even if a new model provides estimates significantly different from the previous ones, that does not mean that the old models are wrong, but rather that current knowledge has substantially changed and improved. Looking at the hazard maps with 10% probability of exceedance in 50 years (adopted as the standard input in the Italian building code), the SHARE model shows increased expected values with respect to the MPS04 model, up to 70% for PGA. However, looking in detail at all output parameters of both models, we observe a different behaviour for other spectral accelerations. In fact, for spectral periods greater than 0.3 s, the current reference PSHA for Italy proposes higher values than the SHARE model over many large areas. This observation suggests that this behaviour may not be due to a different definition of seismic sources and relevant seismicity rates; it seems mainly to result from the adoption of recent ground-motion prediction equations (GMPEs) that estimate, with respect to old GMPEs, higher values for PGA and for accelerations at periods below 0.3 s and lower values at longer periods. Another important set of tests consisted of separately analysing the PSHA results obtained with the three source models adopted in SHARE (i.e., area sources, fault sources with background, and a refined smoothed seismicity model), whereas MPS04 only uses area sources. Results seem to confirm the

  20. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  1. Seismic Noise Analysis and Reduction through Utilization of Collocated Seismic and Atmospheric Sensors at the GRO Chile Seismic Network

    NASA Astrophysics Data System (ADS)

    Farrell, M. E.; Russo, R. M.

    2013-12-01

    The installation of Earthscope Transportable Array-style geophysical observatories in Chile expands open-data seismic recording capabilities in the southern hemisphere by nearly 30%, and has nearly tripled the number of seismic stations providing freely available data in southern South America. Through the use of collocated seismic and atmospheric sensors at these stations we are able to analyze how local atmospheric conditions generate seismic noise, which can degrade data in seismic frequency bands at stations in the 'roaring forties' (S latitudes). Seismic vaults that are climate-controlled and insulated from the local environment are now employed throughout the world in an attempt to isolate seismometers from as many noise sources as possible. However, this is an expensive solution that is neither practical nor possible for all seismic deployments; moreover, the increasing number and scope of temporary seismic deployments have resulted in the collection and archiving of terabytes of seismic data that are affected to some degree by natural seismic noise sources such as wind and atmospheric pressure changes. Changing air pressure can result in a depression and subsequent rebound of Earth's surface, which generates low-frequency noise in seismic frequency bands, and even moderate winds can apply enough force to ground-coupled structures or to the surface above the seismometers themselves, resulting in significant noise. The 10 stations of the permanent Geophysical Reporting Observatories (GRO Chile), jointly installed during 2011-12 by IRIS and the Chilean Servicio Sismológico, include instrumentation in addition to the standard three seismic components. These stations, spaced approximately 300 km apart along the length of the country, continuously record a variety of atmospheric data including infrasound, air pressure, wind speed, and wind direction. The collocated seismic and atmospheric sensors at each station allow us to analyze both datasets together, to

  2. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.

  3. Technology Innovation for the CTBT, the National Laboratory Contribution

    NASA Astrophysics Data System (ADS)

    Goldstein, W. H.

    2016-12-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its Protocol are the result of a long history of scientific engagement and international technical collaboration. The U.S. Department of Energy National Laboratories have been conducting nuclear explosive test-ban research for over 50 years and have made significant contributions to this legacy. Recent examples include the RSTT (regional seismic travel time) computer code and the Smart Sampler—both of these products are the result of collaborations among Livermore, Sandia, Los Alamos, and Pacific Northwest National Laboratories. The RSTT code enables fast and accurate seismic event locations using regional data. This code solves the long-standing problem of using teleseismic and regional seismic data together to locate events. The Smart Sampler is designed for use in On-site Inspections to sample soil gases to look for noble gas fission products from a potential underground nuclear explosive test. The Smart Sampler solves the long-standing problem of collecting soil gases without contaminating the sample with gases from the atmosphere by operating only during atmospheric low-pressure events. Both these products are being evaluated by the Preparatory Commission for the CTBT Organization and the international community. In addition to R&D, the National Laboratories provide experts to support U.S. policy makers in ongoing discussions such as CTBT Working Group B, which sets policy for the development of the CTBT monitoring and verification regime.

  4. Analysis of mean seismic ground motion and its uncertainty based on the UCERF3 geologic slip rate model with uncertainty for California

    USGS Publications Warehouse

    Zeng, Yuehua

    2018-01-01

    The Uniform California Earthquake Rupture Forecast v.3 (UCERF3) model (Field et al., 2014) considers epistemic uncertainty in fault‐slip rate via the inclusion of multiple rate models based on geologic and/or geodetic data. However, these slip rates are commonly clustered about their mean value and do not reflect the broader distribution of possible rates and associated probabilities. Here, we consider both a double‐truncated 2σ Gaussian and a boxcar distribution of slip rates and use a Monte Carlo simulation to sample the entire range of the distribution for California fault‐slip rates. We compute the seismic hazard following the methodology and logic‐tree branch weights applied to the 2014 national seismic hazard model (NSHM) for the western U.S. region (Petersen et al., 2014, 2015). By applying a new approach developed in this study to the probabilistic seismic hazard analysis (PSHA) using precomputed rates of exceedance from each fault as a Green’s function, we reduce the computer time by about 10^5‐fold and apply it to the mean PSHA estimates with 1000 Monte Carlo samples of fault‐slip rates to compare with results calculated using only the mean or preferred slip rates. The difference in the mean probabilistic peak ground motion corresponding to a 2% in 50‐yr probability of exceedance is less than 1% on average over all of California for both the Gaussian and boxcar probability distributions for slip‐rate uncertainty but reaches about 18% in areas near faults compared with that calculated using the mean or preferred slip rates. The average uncertainties in 1σ peak ground‐motion level are 5.5% and 7.3% of the mean with the relative maximum uncertainties of 53% and 63% for the Gaussian and boxcar probability density function (PDF), respectively.
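
    A compressed sketch of the sampling scheme described above, with invented numbers throughout: draw per-fault slip rates from either a 2-sigma-truncated Gaussian or a boxcar, then reuse precomputed per-fault exceedance rates (the "Green's function" shortcut) so each Monte Carlo sample costs only a matrix multiply.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        mean = np.array([5.0, 1.0, 2.5])       # preferred slip rates, mm/yr (invented)
        sig = np.array([1.0, 0.3, 0.6])        # 1-sigma uncertainties (invented)
        n = 1000

        # Truncated 2-sigma Gaussian vs. boxcar over the same +/-2-sigma range
        gauss = stats.truncnorm.rvs(-2, 2, loc=mean, scale=sig,
                                    size=(n, 3), random_state=rng)
        boxcar = rng.uniform(mean - 2*sig, mean + 2*sig, size=(n, 3))

        # Precomputed exceedance rate per unit slip rate at one site, per fault
        # (hypothetical values standing in for the precomputed Green's functions).
        greens = np.array([2e-4, 5e-5, 1.2e-4])
        lam_gauss, lam_box = gauss @ greens, boxcar @ greens
        print(lam_gauss.std() / lam_gauss.mean(), lam_box.std() / lam_box.mean())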

  5. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  6. Global/local methods for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.

    1993-01-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a local more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.
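
    The Most Probable Point at the heart of this method can be sketched in a few lines: in standard normal space it is the minimum-norm point on the limit-state surface g(u) = 0, and beta = ||u*|| gives the first-order failure probability Phi(-beta). The limit state below is an invented linear example, not one from NESSUS.

        import numpy as np
        from scipy import optimize, stats

        def form_mpp(g, ndim):
            """First-order reliability sketch: find the MPP by minimizing
            ||u||^2 subject to g(u) = 0 in standard normal space."""
            res = optimize.minimize(lambda u: u @ u, np.full(ndim, 0.1),
                                    constraints=[{"type": "eq", "fun": g}])
            beta = np.linalg.norm(res.x)
            return res.x, beta, stats.norm.cdf(-beta)

        g = lambda u: 2.0*u[0] + u[1] - 3.0     # invented linear limit state
        mpp, beta, pf = form_mpp(g, 2)          # beta = 3/sqrt(5) ~ 1.342, pf ~ 0.09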

  7. Global/local methods for probabilistic structural analysis

    NASA Astrophysics Data System (ADS)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a local more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  8. Performance-based seismic assessment of skewed bridges with and without considering soil-foundation interaction effects for various site classes

    NASA Astrophysics Data System (ADS)

    Ghotbi, Abdoul R.

    2014-09-01

    The seismic behavior of skewed bridges has not been well studied compared to straight bridges. Skewed bridges have shown extensive damage, especially due to deck rotation, shear key failure, abutment unseating, and column-bent drift. This research therefore aims to study the behavior of skewed and straight highway overpass bridges, both with and without taking into account the effects of Soil-Structure Interaction (SSI), under near-fault ground motions. Due to the several sources of uncertainty associated with the ground motions, soil, and structure, a probabilistic approach is needed. Thus, a probabilistic methodology similar to the one developed by the Pacific Earthquake Engineering Research Center (PEER) has been utilized to assess the probability of damage due to various levels of shaking, using appropriate intensity measures with minimum dispersion. The probabilistic analyses were performed for various bridge configurations and site conditions, including sand ranging from loose to dense and clay ranging from soft to stiff, in order to evaluate the effects. The results showed a considerable susceptibility of skewed bridges to deck rotation and shear key displacement. It was also found that SSI reduced the damage probability for various demands compared to the fixed-base model without SSI. However, deck rotation for all soil types, and also abutment unseating for very loose sand and soft clay, showed an increase in damage probability compared to the fixed-base model. The damage probability for various demands was also found to decrease with increasing soil strength for both sandy and clayey sites. With respect to variations in the skew angle, an increase in skew angle increased the amplitude of the seismic response for various demands. Deck rotation was very sensitive to the increase in the skew angle; therefore, as the skew angle increased, the deck rotation responded accordingly
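
    The PEER-style damage probabilities referred to above are typically expressed as lognormal fragility functions; the sketch below evaluates one for a deck-rotation damage state, with the median capacity and dispersion invented for illustration.

        import numpy as np
        from scipy import stats

        def fragility(im, theta, beta):
            """P(damage state exceeded | intensity measure im) with median
            capacity theta and lognormal dispersion beta."""
            return stats.norm.cdf(np.log(im / theta) / beta)

        pga = np.linspace(0.05, 1.5, 30)                  # intensity-measure grid, g
        p_rot = fragility(pga, theta=0.55, beta=0.45)     # invented parameters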

  9. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process, and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties and laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatter predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness, a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  10. A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

    DOE PAGES

    Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; ...

    2015-10-01

    Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.
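
    A minimal sketch of the node-fitness step described above, with a hypothetical travel-time table and a simple Gaussian fitness kernel standing in for the production algorithm's conditional probabilities:

        import numpy as np

        def best_node(node_tt, picks, window=2.0):
            """node_tt[i, j]: predicted P travel time from grid node i to station j;
            picks[j]: observed arrival time. Fitness at a node sums a Gaussian
            kernel of origin-time residuals across stations."""
            fitness = np.zeros(node_tt.shape[0])
            for i, tt in enumerate(node_tt):
                origin = picks - tt                      # implied origin time per station
                resid = origin - np.median(origin)
                in_win = np.abs(resid) <= window         # ignore arrivals outside the window
                fitness[i] = np.exp(-0.5 * (resid[in_win] / (window / 2.0))**2).sum()
            return int(np.argmax(fitness)), fitness.max()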

  11. Transparent Seismic Mitigation for Community Resilience

    NASA Astrophysics Data System (ADS)

    Poland, C. D.; Pekelnicky, R.

    2008-12-01

    Healthy communities continuously grow by leveraging their intellectual capital to drive economic development while protecting their cultural heritage. Success depends in part on the support of a healthy built environment that is rooted in contemporary urban planning, sustainability, and disaster resilience. Planners and policy makers are deeply concerned with all aspects of their communities, including seismic safety. Their reluctance to implement the latest plans for achieving seismic safety is rooted in a misunderstanding of the hazard they face and the risk it poses to their built environment. Probabilistic lingo and public debate about how big the "big one" will be drive them to resort to their own experience and intuition. There is a fundamental lack of transparency related to what is expected to happen, and it is partially blocking the policy changes that are needed. The solution: craft the message in broad-based, usable terms that name the hazard, define performance, and establish a set of performance goals that represent the resiliency needed to drive a community's natural ability to rebound from a major seismic event. By using transparent goals and measures with an intuitive vocabulary for both performance and hazard, earthquake professionals, working with the San Francisco Urban Planning and Research Association (SPUR), have defined a level of resiliency that needs to be achieved by the City of San Francisco to assure that its response to an event will be manageable and full recovery achievable within three years. Five performance measures for buildings and three for lifeline systems have been defined. Each declares whether people will be safe inside, whether the building will be repairable, and whether it will be usable during repairs. Lifeline systems are further defined in terms of the time intervals needed to restore 90%, 95%, and full service. These transparent categories are used in conjunction with the expected earthquake level to describe

  12. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

    Correct probabilistic seismic hazard and risk maps are the first step for advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach developed in the framework of the EMME (Earthquake Model of the Middle East region) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground motion data. Because of this, an initial hybrid empirical ground motion model was developed for PGA and SA at selected periods, and these ground motion models have been used in the probabilistic seismic hazard assessment. The resulting seismic hazard maps show that there were gaps in previous seismic hazard assessments and that the present normative seismic hazard map needs careful recalculation.

  13. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which use similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider once migration uncertainty is included, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
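
    The effect described can be illustrated with a toy Monte Carlo projection: treating net migration as a random rate rather than a fixed one widens the population prediction interval while the central projection barely moves. All rates below are invented for illustration and are unrelated to the authors' fitted Bayesian model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims, n_years, pop0 = 10_000, 35, 50.0  # initial population in millions

def project(stochastic_migration):
    """Multiplicative growth projection with invented illustrative rates."""
    pop = np.full(n_sims, pop0)
    for _ in range(n_years):
        growth = rng.normal(0.002, 0.003, n_sims)  # fertility-mortality balance
        mig = rng.normal(0.003, 0.004, n_sims) if stochastic_migration else 0.003
        pop *= 1.0 + growth + mig
    return pop

for label, flag in [("deterministic migration", False),
                    ("stochastic migration", True)]:
    p = project(flag)
    lo, hi = np.percentile(p, [2.5, 97.5])
    print(f"{label}: 95% interval width = {hi - lo:.1f} million")
```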

  14. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. Recently, new technologies and improved data availability have helped many scientists understand where and why earthquakes happen, the physics of earthquakes, and related questions, and they have begun to appreciate the role of uncertainty in seismic hazard analysis. However, a significant problem remains: how to handle the existing uncertainty, since the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, through regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlap occurs not only at the border between two neighboring classes but also among three or more classes. Although the analysis starts by classifying sites in geological terms, the resulting site coefficients are not cleanly separated by class. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California data in the conventional way. The standard deviations that express the variation within each site class, obtained by fuzzy set theory and by the classical approach, are then compared. The results show that, when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
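
    As a concrete illustration of the membership-function idea, the sketch below gives a site partial membership in neighboring Vs30-based site classes instead of forcing a crisp choice at a class boundary. The trapezoidal shapes and class breakpoints are illustrative (loosely NEHRP-like), not the functions used in the study.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Fuzzy membership: rises a->b, flat b->c, falls c->d."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def site_class_memberships(vs30):
    """Partial memberships in illustrative Vs30 site classes (m/s)."""
    return {
        "D (stiff soil)":      trapezoid(vs30, 150.0, 180.0, 330.0, 400.0),
        "C (dense/soft rock)": trapezoid(vs30, 330.0, 400.0, 700.0, 800.0),
        "B (rock)":            trapezoid(vs30, 700.0, 800.0, 1400.0, 1600.0),
    }

# A site near a crisp class boundary gets partial membership in both
# neighboring classes, so the boundary ambiguity noted above never forces
# an all-or-nothing choice.
print(site_class_memberships(370.0))  # ~0.43 in class D, ~0.57 in class C
```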

  15. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao; Cloetingh, Sierd; Negendank, Jörg

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the Earth’s surface, or in boreholes close to the seismic sources, allows for the detection and location of brittle failure processes at a sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.

  16. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  17. The influence of maximum magnitude on seismic-hazard estimates in the Central and Eastern United States

    USGS Publications Warehouse

    Mueller, C.S.

    2010-01-01

    I analyze the sensitivity of seismic-hazard estimates in the central and eastern United States (CEUS) to maximum magnitude (mmax) by exercising the U.S. Geological Survey (USGS) probabilistic hazard model with several mmax alternatives. Seismicity-based sources control the hazard in most of the CEUS, but data seldom provide an objective basis for estimating mmax. The USGS uses preferred mmax values of moment magnitude 7.0 and 7.5 for the CEUS craton and extended margin, respectively, derived from data in stable continental regions worldwide. Other approaches, for example analysis of local seismicity or judgment about a source's seismogenic potential, often lead to much smaller mmax. Alternative models span the mmax ranges from the 1980s Electric Power Research Institute/Seismicity Owners Group (EPRI/SOG) analysis. Results are presented as hazard ratios relative to the USGS national seismic hazard maps. One alternative model specifies mmax equal to moment magnitude 5.0 and 5.5 for the craton and margin, respectively, similar to EPRI/SOG for some sources. For 2% probability of exceedance in 50 years (about 0.0004 annual probability), the strong mmax truncation produces hazard ratios equal to 0.35-0.60 for 0.2-sec spectral acceleration, and 0.15-0.35 for 1.0-sec spectral acceleration. Hazard-controlling earthquakes interact with mmax in complex ways. There is a relatively weak dependence on probability level: hazard ratios increase 0-15% for 0.002 annual exceedance probability and decrease 5-25% for 0.00001 annual exceedance probability. Although differences at some sites are tempered when faults are added, mmax clearly accounts for some of the discrepancies that are seen in comparisons between USGS-based and EPRI/SOG-based hazard results.
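
    The truncation experiment can be sketched with a toy single-source model: hold the rate of events at the minimum magnitude fixed, truncate the Gutenberg-Richter distribution at two alternative mmax values, and take the ratio of the resulting exceedance rates. All coefficients below are placeholders, not the USGS hazard model.

```python
import numpy as np
from scipy import stats

def exceedance_rate(a_g, m_max, rate=0.05, b=1.0, m_min=4.5,
                    r_km=30.0, sigma_ln=0.6, n_m=400):
    """Annual rate of PGA > a_g for one source with truncated G-R magnitudes."""
    beta = b * np.log(10.0)
    m = np.linspace(m_min, m_max, n_m)
    dm = m[1] - m[0]
    # Truncated Gutenberg-Richter density; the a-value (rate of M >= m_min)
    # is held fixed across the two mmax alternatives.
    f_m = (beta * np.exp(-beta * (m - m_min))
           / (1.0 - np.exp(-beta * (m_max - m_min))))
    ln_med = -1.0 + 0.6 * m - 1.2 * np.log(r_km + 10.0)  # toy GMPE
    p_exc = 1.0 - stats.norm.cdf(np.log(a_g), ln_med, sigma_ln)
    return rate * np.sum(f_m * p_exc) * dm

# Hazard ratio: strongly truncated (mmax 5.5) vs. reference (mmax 7.5) model.
for a in (0.1, 0.3):
    ratio = exceedance_rate(a, 5.5) / exceedance_rate(a, 7.5)
    print(f"PGA {a:.1f} g: hazard ratio = {ratio:.2f}")
```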

  18. Gas and seismicity within the Istanbul seismic gap.

    PubMed

    Géli, L; Henry, P; Grall, C; Tary, J-B; Lomax, A; Batsi, E; Riboulot, V; Cros, E; Gürbüz, C; Işık, S E; Sengör, A M C; Le Pichon, X; Ruffine, L; Dupré, S; Thomas, Y; Kalafat, D; Bayrakci, G; Coutellier, Q; Regnier, T; Westbrook, G; Saritas, H; Çifçi, G; Çağatay, M N; Özeren, M S; Görür, N; Tryon, M; Bohnhoff, M; Gasperini, L; Klingelhoefer, F; Scalabrin, C; Augustin, J-M; Embriaco, D; Marinaro, G; Frugoni, F; Monna, S; Etiope, G; Favali, P; Bécel, A

    2018-05-01

    Understanding micro-seismicity is a critical question for earthquake hazard assessment. Since the devastating earthquakes of Izmit and Duzce in 1999, the seismicity along the submerged section of the North Anatolian Fault within the Sea of Marmara (comprising the "Istanbul seismic gap") has been extensively studied in order to infer its mechanical behaviour (creeping vs locked). So far, the seismicity has been interpreted only as tectonically driven, although the Main Marmara Fault (MMF) is known to strike across multiple hydrocarbon gas sources. Here, we show that a large number of the aftershocks that followed the M 5.1 earthquake of July 25th, 2011 in the western Sea of Marmara occurred within a zone of gas overpressuring in the 1.5-5 km depth range, from where pressurized gas is expected to migrate along the MMF, up to the surface sediment layers. Hence, gas-related processes should also be considered for a complete interpretation of the micro-seismicity (~M < 3) within the Istanbul offshore domain.

  19. Seismic anisotropy of the Archean crust in the Minnesota River Valley, Superior Province

    NASA Astrophysics Data System (ADS)

    Ferré, Eric C.; Gébelin, Aude; Conder, James A.; Christensen, Nik; Wood, Justin D.; Teyssier, Christian

    2014-03-01

    The Minnesota River Valley (MRV) subprovince is a well-exposed example of late Archean lithosphere. Its high-grade gneisses display a subhorizontal layering, most likely extending down to the crust-mantle boundary. The strong linear fabric of the gneisses results from high-temperature plastic flow during collage-related contraction. Seismic anisotropies measured up to 1 GPa in the laboratory, and seismic anisotropies calculated through forward modeling, indicate ΔVP ~5-6% and ΔVS ~3%. The MRV crust exhibits a strong macroscopic layering and foliation, and relatively strong seismic anisotropies at the hand-specimen scale. Yet the horizontal attitude of these structures precludes any substantial contribution of the MRV crust to shear wave splitting (SWS) for vertically propagating shear waves such as SKS. The origin of the regionally low seismic anisotropy must therefore lie in the upper mantle. A horizontally layered mantle underneath the United States interior could explain the observed low SWS.
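
    For reference, anisotropy percentages like those quoted are conventionally computed as the velocity contrast normalized by the mean velocity, ΔV = 200 (Vmax − Vmin) / (Vmax + Vmin). The toy velocities below are chosen only to land in the reported ranges and are not measurements from the paper.

```python
def anisotropy_pct(v_max, v_min):
    """Anisotropy as percent velocity contrast normalized by the mean."""
    return 200.0 * (v_max - v_min) / (v_max + v_min)

# Illustrative velocities in km/s (assumed, not from the study):
print(anisotropy_pct(6.45, 6.08))  # P wave: ~5.9%, within the ~5-6% range
print(anisotropy_pct(3.65, 3.54))  # S wave: ~3.1%, near the ~3% value
```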

  20. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.