Sample records for seismic analysis code

  1. Estimation of the behavior factor of existing RC-MRF buildings

    NASA Astrophysics Data System (ADS)

    Vona, Marco; Mastroberti, Monica

    2018-01-01

In recent years, several research groups have studied a new generation of analysis methods for the seismic response assessment of existing buildings. Nevertheless, many important developments are still needed in order to define more reliable and effective assessment procedures. Moreover, for existing buildings it should be highlighted that, due to the low knowledge level, linear elastic analysis is the only analysis method allowed. The same codes (such as NTC2008 and EC8) consider linear dynamic analysis with a behavior factor as the reference method for the evaluation of seismic demand. This type of analysis is based on a linear-elastic structural model subjected to a design spectrum, obtained by reducing the elastic spectrum through a behavior factor. The behavior factor (reduction factor, or q factor in some codes) is used to reduce the elastic spectrum ordinates, or the forces obtained from a linear analysis, in order to take into account the nonlinear structural capacity. Behavior factors should be defined based on the several parameters that influence the nonlinear seismic capacity, such as material mechanical characteristics, structural system, irregularity and design procedures. In practical applications, there is still an evident lack of detailed rules and accurate behavior factor values adequate for existing buildings. In this work, investigations of the seismic capacity of the main existing RC-MRF building types have been carried out. In order to make a correct evaluation of the seismic force demand, actual behavior factor values coherent with a force-based seismic safety assessment procedure have been proposed and compared with the values reported in the Italian seismic code, NTC08.
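The spectrum reduction that the behavior factor performs can be sketched in a few lines; the spectral ordinates and q value below are hypothetical illustrations, not values prescribed by NTC08 or EC8:

```python
# Illustrative sketch: reducing an elastic response spectrum by a behavior
# factor q, as in force-based design. All numbers are hypothetical.

def design_spectrum(elastic_sa, q):
    """Divide elastic spectral accelerations Sa(T) by the behavior factor q."""
    if q < 1.0:
        raise ValueError("behavior factor q must be >= 1")
    return [sa / q for sa in elastic_sa]

# Elastic spectral accelerations (g) at a few periods (hypothetical values)
elastic = [0.60, 0.55, 0.40, 0.25]
design = design_spectrum(elastic, q=3.0)
print(design)  # each ordinate reduced to one third of the elastic value
```

In force-based design the same factor may equivalently divide the member forces obtained from a linear analysis; the arithmetic is identical.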

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sesigur, Haluk; Cili, Feridun

Seismic isolation is an effective design strategy to mitigate seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Goekcen Airport in Istanbul, Turkey. A seismic isolation system with the isolation layer arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, and isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has steel vertical trusses on the facades and RC H-shaped columns along the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. In order to verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has a base shear approximately one quarter that of the non-isolated structure.

  3. Study on comparison of special moment frame steel structure (SMF) and base isolation special moment frame steel structure (BI-SMF) in Indonesia

    NASA Astrophysics Data System (ADS)

    Setiawan, Jody; Nakazawa, Shoji

    2017-10-01

This paper discusses the comparison of the seismic response behavior, seismic performance and seismic loss function of a conventional special moment frame steel structure (SMF) and a special moment frame steel structure with base isolation (BI-SMF). The validation of the proposed simplified estimation method for the maximum deformation of the base isolation system, which uses the equivalent linearization method, and the validation of the design shear force of the superstructure are investigated from the results of nonlinear dynamic response analysis. In recent years, the construction of steel office buildings with seismic isolation systems has been proceeding even in Indonesia, where the risk of earthquakes is high. Although a design code for seismic isolation structures has been proposed, there is no actual construction example of a special moment frame steel structure with base isolation. Therefore, in this research, the SMF and BI-SMF buildings are designed according to the Indonesian Building Code and assumed to be built in Padang City, Indonesia. The base isolation system consists of high damping rubber bearings. Dynamic eigenvalue analysis and nonlinear dynamic response analysis are carried out to obtain the dynamic characteristics and seismic performance. In addition, the seismic loss function is obtained from damage state probabilities and repair costs. For the response analysis, simulated ground accelerations that have the phases of recorded seismic waves (El Centro NS, El Centro EW, Kobe NS and Kobe EW) and are adapted to the response spectrum prescribed by the Indonesian design code are used.
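The equivalent linearization idea mentioned above can be illustrated with a minimal fixed-point iteration; the bilinear isolator parameters, damping model and demand spectrum below are hypothetical, not those of the BI-SMF design:

```python
import math

# Sketch of an equivalent-linearization iteration for the maximum
# displacement of a base isolation layer. All parameters (bilinear isolator,
# demand spectrum, damping model) are hypothetical illustrations.

G = 9.81  # m/s^2

def damping_reduction(xi_pct):
    """Spectrum reduction factor for equivalent viscous damping (percent)."""
    return math.sqrt(10.0 / (5.0 + xi_pct))

def spectral_displacement(t_eff, xi_pct, sa1=0.6):
    """Demand Sd from Sa(T) = sa1/T [g] (constant-velocity branch)."""
    sa = sa1 / t_eff * damping_reduction(xi_pct) * G
    return sa * (t_eff / (2.0 * math.pi)) ** 2

def isolator_displacement(mass, qd, kd, d0=0.1, tol=1e-6, max_iter=200):
    """Fixed-point iteration: stiffness -> period -> damping -> displacement."""
    d = d0
    for _ in range(max_iter):
        k_eff = qd / d + kd                    # secant stiffness of bilinear loop
        t_eff = 2.0 * math.pi * math.sqrt(mass / k_eff)
        xi = min(30.0, 200.0 * qd / (math.pi * k_eff * d))  # equiv. damping, %
        d_new = spectral_displacement(t_eff, xi)
        if abs(d_new - d) < tol:
            return d_new
        d = d_new
    return d

# 500 t superstructure on lead-rubber-type bearings (hypothetical values)
d_max = isolator_displacement(mass=500e3, qd=100e3, kd=2.0e6)
print(round(d_max, 3), "m")
```

The nonlinear dynamic analyses in the paper serve precisely to check how well such a linearized estimate tracks the true peak isolator deformation.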

  4. Effect of URM infills on seismic vulnerability of Indian code designed RC frame buildings

    NASA Astrophysics Data System (ADS)

    Haldar, Putul; Singh, Yogendra; Paul, D. K.

    2012-03-01

Unreinforced Masonry (URM) is the most common partitioning material in framed buildings in India and many other countries. Although it is well known that under lateral loading the behavior and modes of failure of frame buildings change significantly due to infill-frame interaction, the general design practice is to treat infills as nonstructural elements, and their stiffness, strength and interaction with the frame are often ignored, primarily because of difficulties in simulation and the lack of modeling guidelines in design codes. The Indian Standard, like many other national codes, does not provide explicit insight into the anticipated performance and associated vulnerability of infilled frames. This paper presents an analytical study of the seismic performance and fragility of Indian code-designed RC frame buildings with and without URM infills. Infills are modeled as diagonal struts as per the ASCE 41 guidelines, and various modes of failure are considered. The HAZUS methodology, along with nonlinear static analysis, is used to compare the seismic vulnerability of bare and infilled frames. The comparative study suggests that URM infills result in a significant increase in the seismic vulnerability of RC frames, and their effect needs to be properly incorporated in design codes.
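The diagonal-strut idealization reduces each infill panel to a compression strut whose width follows a Mainstone-type expression of the form adopted in FEMA 356/ASCE 41; a sketch with hypothetical frame and infill properties (units N and mm; coefficients should be verified against the standard before any real use):

```python
import math

# Equivalent diagonal-strut width for a URM infill panel, Mainstone-type
# expression (FEMA 356/ASCE 41 family). All property values are hypothetical.

def strut_width(E_frame, I_col, h_col, E_infill, t_inf, h_inf, L_inf):
    theta = math.atan(h_inf / L_inf)              # strut inclination
    r_inf = math.hypot(h_inf, L_inf)              # diagonal length of the panel
    lam1 = (E_infill * t_inf * math.sin(2.0 * theta)
            / (4.0 * E_frame * I_col * h_inf)) ** 0.25
    return 0.175 * (lam1 * h_col) ** (-0.4) * r_inf

# Hypothetical RC frame (400 mm square columns) with a 230 mm masonry panel
a = strut_width(E_frame=25000.0, I_col=2.133e9, h_col=3000.0,
                E_infill=4000.0, t_inf=230.0, h_inf=2700.0, L_inf=4000.0)
print(round(a, 1), "mm")  # strut width, roughly a tenth of the diagonal
```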

  5. Design, analysis, and seismic performance of a hypothetical seismically isolated bridge on legacy highway.

    DOT National Transportation Integrated Search

    2011-01-01

The need to maintain the functionality of critical transportation lifelines after a large seismic event motivates the strategy to design certain bridges for performance standards beyond the minimum required by bridge design codes. To design a bri...

  6. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Sullivan, T. J.

    2012-04-01

The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit within the seismic design standards currently in place around the world is the notion that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set, which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. Since the early nineties, the seismic engineering community has come to recognise numerous fundamental shortcomings of such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that the hysteretic properties of a structure do not affect its seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specific seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings.
The PBEE framework consists of the following four main analysis stages: (i) probabilistic seismic hazard analysis to give the mean occurrence rate of earthquake events having an intensity greater than a threshold value, (ii) structural analysis to estimate the global structural response, given a certain value of seismic intensity, (iii) damage analysis, in which fragility functions are used to express the probability that a building component exceeds a damage state, as a function of the global structural response, (iv) loss analysis, in which the overall performance is assessed based on the damage state of all components. This final step gives estimates of the mean annual frequency with which various repair cost levels (or other decision variables) are exceeded. The realisation of this framework does suggest that risk-based seismic design is now possible. However, comparing current code approaches with the proposed PBEE framework, it becomes apparent that mainstream consulting engineers would have to go through a massive learning curve in order to apply the new procedures in practice. With this in mind, it is proposed that simplified loss-based seismic design procedures are a logical means of helping the engineering profession transition from what are largely deterministic seismic design procedures in current codes, to more rational risk-based seismic design methodologies. Examples are provided to illustrate the likely benefits of adopting loss-based seismic design approaches in practice.
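Stages (i)-(iii) of the framework can be condensed into a short numerical sketch: a hazard curve is combined with a lognormal fragility to give the mean annual frequency of exceeding a damage state. All parameter values here are hypothetical:

```python
import math

# Minimal PBEE-style sketch: combine a hazard curve lambda(IM) with a
# lognormal fragility P(DS|IM) to obtain the mean annual frequency of
# exceeding a damage state. All parameters are hypothetical.

def hazard(im, k0=1e-4, k=2.5):
    """Mean annual rate of exceeding intensity im (power-law hazard curve)."""
    return k0 * im ** (-k)

def fragility(im, median=0.6, beta=0.5):
    """Lognormal probability of exceeding the damage state given IM = im."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

def maf_damage(im_grid):
    """lambda_DS = sum of P(DS|im) * |d lambda| over the IM grid."""
    lam = 0.0
    for lo, hi in zip(im_grid[:-1], im_grid[1:]):
        mid = 0.5 * (lo + hi)
        lam += fragility(mid) * (hazard(lo) - hazard(hi))
    return lam

grid = [0.05 * i for i in range(1, 81)]   # IM from 0.05 to 4.0 g
print(maf_damage(grid))                   # mean annual frequency of the damage state
```

Stage (iv) repeats the same integration with repair-cost consequences attached to each damage state.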

  7. Improved Simplified Methods for Effective Seismic Analysis and Design of Isolated and Damped Bridges in Western and Eastern North America

    NASA Astrophysics Data System (ADS)

    Koval, Viacheslav

The seismic design provisions of the CSA-S6 Canadian Highway Bridge Design Code and the AASHTO LRFD Seismic Bridge Design Specifications have been developed primarily based on historical earthquake events that have occurred along the west coast of North America. For the design of seismic isolation systems, these codes include simplified analysis and design methods. The appropriateness and range of application of these methods are investigated in this thesis through extensive parametric nonlinear time history analyses. It was found that the existing design guidelines need adjustment to better capture the expected nonlinear response of isolated bridges. For isolated bridges located in eastern North America, new damping coefficients are proposed. The applicability limits of the code-based simplified methods have been redefined to ensure that the modified method leads to conservative results and that a wider range of seismically isolated bridges can be covered by the method. The possibility of further improving current simplified code methods was also examined. By transforming the quantity of allocated energy into a displacement contribution, an idealized analytical solution is proposed as a new simplified design method. This method realistically reflects the effects of ground-motion and system design parameters, including the effects of a drifted oscillation center. The proposed method is therefore more appropriate than existing simplified methods and is applicable to isolation systems exhibiting a wider range of properties. A multi-level-hazard performance matrix has been adopted by different seismic provisions worldwide and will be incorporated into the new edition of the Canadian CSA-S6-14 Bridge Design Code. However, the combined effect and optimal use of isolation and supplemental damping devices in bridges have not yet been fully exploited to achieve enhanced performance under different levels of seismic hazard.
A novel Dual-Level Seismic Protection (DLSP) concept is proposed and developed in this thesis, which permits optimum seismic performance to be achieved with combined isolation and supplemental damping devices in bridges. This concept is shown to represent an attractive design approach both for the upgrade of existing seismically deficient bridges and for the design of new isolated bridges.
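The role the damping coefficient plays in such simplified code methods can be sketched as follows; the power-law fit and the numbers used are generic illustrations, not the coefficients proposed in the thesis or tabulated in CSA-S6/AASHTO:

```python
# Sketch of a simplified-method displacement check for an isolated bridge:
# the 5%-damped elastic spectral displacement is divided by a damping
# coefficient B. The power-law fit to tabulated B values and all inputs
# are illustrative only.

def damping_coefficient(xi_pct):
    """Approximate power-law fit to tabulated B factors (B = 1 at 5% damping)."""
    return (xi_pct / 5.0) ** 0.3

def isolated_displacement(sd_elastic_5pct, xi_pct):
    """Reduce the 5%-damped spectral displacement by the coefficient B."""
    return sd_elastic_5pct / damping_coefficient(xi_pct)

# Displacement demand (mm) for increasing effective damping of the isolators
for xi in (5.0, 10.0, 20.0, 30.0):
    print(xi, round(damping_coefficient(xi), 2),
          round(isolated_displacement(200.0, xi), 1))
```

Revising such coefficients for eastern North American ground motions, whose spectra are richer in high frequencies, is exactly the kind of adjustment the thesis argues for.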

  8. Crustal Fracturing Field and Presence of Fluid as Revealed by Seismic Anisotropy

    NASA Astrophysics Data System (ADS)

    Pastori, M.; Piccinini, D.; de Gori, P.; Margheriti, L.; Barchi, M. R.; di Bucci, D.

    2010-12-01

In the last three years, we developed, tested and improved an automatic analysis code (Anisomat+) to calculate the shear wave splitting parameters: fast polarization direction (φ) and delay time (δt). The code is a set of MatLab scripts able to retrieve crustal anisotropy parameters from three-component seismic recordings of local earthquakes using the horizontal-component cross-correlation method. The analysis procedure consists of choosing an appropriate frequency range, one that best highlights the signal containing the shear waves, and a time window on the seismogram centered on the S arrival (the window contains at least one cycle of the S wave). The code was compared with two other automatic analysis codes (SPY and SHEBA) and tested on three Italian areas (Val d'Agri, the Tiber Valley and the surroundings of L'Aquila) along the Apennine mountains. For each region we used the anisotropic parameters resulting from the automatic computation as a tool to determine the fracture field geometries connected with the active stress field. We compare the temporal variations of the anisotropic parameters to the evolution of the vp/vs ratio for the same seismicity. The anisotropic fast directions are used to define the active stress field (EDA model), finding general consistency between fast directions and the main stress indicators (focal mechanisms and borehole break-outs). The magnitude of the delay time is used to define the fracture field intensity, finding higher values in the volume where micro-seismicity occurs. Furthermore, we studied temporal variations of the anisotropic parameters and the vp/vs ratio in order to assess whether fluids play an important role in the earthquake generation process. The close association between variations of the anisotropic and vp/vs parameters and seismicity rate changes supports the hypothesis that the background seismicity is influenced by fluctuations of pore fluid pressure in the rocks.
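The horizontal-component cross-correlation method at the core of such codes can be sketched as a grid search over trial fast azimuths and delays; this simplified single-window illustration is not the actual Anisomat+ implementation:

```python
import math

# Sketch of shear wave splitting measurement by horizontal-component
# cross-correlation: rotate N/E into trial fast/slow axes and grid-search
# the azimuth and lag that maximize the correlation. Simplified illustration.

def correlate(a, b, lag):
    """Normalized cross-correlation of a with b shifted by `lag` samples."""
    n = len(a)
    num = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
    den = math.sqrt(sum(x * x for x in a) * sum(x * x for x in b))
    return num / den

def splitting(north, east, max_lag=10):
    """Grid-search fast azimuth (deg) and delay (samples) maximizing |corr|."""
    best = (0.0, 0, 0)                        # (|corr|, lag, phi)
    for phi in range(0, 90, 5):
        c, s = math.cos(math.radians(phi)), math.sin(math.radians(phi))
        fast = [c * n + s * e for n, e in zip(north, east)]
        slow = [-s * n + c * e for n, e in zip(north, east)]
        for lag in range(-max_lag, max_lag + 1):
            r = abs(correlate(fast, slow, lag))
            if r > best[0]:
                best = (r, lag, phi)
    return best[2], best[1]                   # fast direction, delay in samples

# Synthetic check: fast wave polarized at 30 deg, slow wave delayed 5 samples
pulse = [0.0] * 40
pulse[2:5] = [1.0, 2.0, 1.0]
delayed = [0.0] * 5 + pulse[:-5]
c30, s30 = math.cos(math.radians(30)), math.sin(math.radians(30))
north = [c30 * f - s30 * g for f, g in zip(pulse, delayed)]
east = [s30 * f + c30 * g for f, g in zip(pulse, delayed)]
print(splitting(north, east))  # → (30, 5)
```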

  9. Preliminary consideration on the seismic actions recorded during the 2016 Central Italy seismic sequence

    NASA Astrophysics Data System (ADS)

    Carlo Ponzo, Felice; Ditommaso, Rocco; Nigro, Antonella; Nigro, Domenico S.; Iacovino, Chiara

    2017-04-01

After the Mw 6.0 mainshock of August 24, 2016 at 03:36 a.m. (local time), with the epicenter located between the towns of Accumoli (province of Rieti), Amatrice (province of Rieti) and Arquata del Tronto (province of Ascoli Piceno), several activities were started in order to perform preliminary evaluations of the characteristics of the recent seismic sequence in the areas affected by the earthquake. Ambient vibration acquisitions have been performed using two three-directional velocimetric synchronized stations, with a natural frequency of 0.5 Hz and a digitizer resolution of 24 bit. The activities are continuing after the events of the seismic sequence of October 26 and October 30, 2016. In this paper, in order to compare recorded values with code provisions in terms of peak (PGA, PGV and PGD), spectral and integral (Housner Intensity) seismic parameters, several preliminary analyses have been performed on accelerometric time histories acquired by three near-fault stations of the RAN (Italian Accelerometric Network): Amatrice (station code AMT), Norcia (station code NRC) and Castelsantangelo sul Nera (station code CNE). Several comparisons between the elastic response spectra derived from the accelerometric recordings and the elastic demand spectra provided by the Italian seismic code (NTC 2008) have been performed. Preliminary results from these analyses highlight several apparent differences between the experimental data and the conventional code provisions. The ongoing seismic sequence appears compatible with the historical seismicity in terms of integral parameters, but not in terms of peak and spectral values. It therefore seems appropriate to reconsider the need to revise the simplified design approach based on conventional spectral values.
Acknowledgements: This study was partially funded by the Italian Department of Civil Protection within the project DPC-RELUIS 2016 - RS4 ''Seismic observatory of structures and health monitoring'' and by the "Centre of Integrated Geomorphology for the Mediterranean Area - CGIAM" within the Framework Agreement with the University of Basilicata "Study, Research and Experimentation in the Field of Analysis and Monitoring of Seismic Vulnerability of Strategic and Relevant Buildings for the purposes of Civil Protection and Development of Innovative Strategies of Seismic Reinforcement".
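The Housner Intensity used above as an integral parameter is the integral of the 5%-damped pseudo-velocity spectrum between 0.1 s and 2.5 s; a sketch with a simple explicit SDOF integrator and a synthetic accelerogram, for illustration only:

```python
import math

# Sketch of the Housner Intensity: integrate PSV(T, 5%) from 0.1 s to 2.5 s.
# The time integrator is a simple explicit central-difference scheme and the
# input accelerogram is synthetic; both are illustrations, not the paper's data.

def sdof_peak_disp(accel, dt, period, xi=0.05):
    """Peak relative displacement of an elastic SDOF oscillator (unit mass)."""
    w = 2.0 * math.pi / period
    c, k = 2.0 * xi * w, w * w
    u_prev = u = 0.0
    peak = 0.0
    for ag in accel:
        u_next = 2.0 * u - u_prev + dt * dt * (-ag - c * (u - u_prev) / dt - k * u)
        peak = max(peak, abs(u_next))
        u_prev, u = u, u_next
    return peak

def housner_intensity(accel, dt, t_lo=0.1, t_hi=2.5, n=25):
    """Trapezoidal integral of PSV(T, 5%) over [t_lo, t_hi]."""
    periods = [t_lo + (t_hi - t_lo) * i / (n - 1) for i in range(n)]
    psv = [2.0 * math.pi / t * sdof_peak_disp(accel, dt, t) for t in periods]
    step = (t_hi - t_lo) / (n - 1)
    return step * (sum(psv) - 0.5 * (psv[0] + psv[-1]))

# Synthetic 2 Hz decaying-sine accelerogram (m/s^2), 10 s at dt = 0.01 s
dt = 0.01
accel = [math.sin(2.0 * math.pi * 2.0 * i * dt) * math.exp(-0.5 * i * dt)
         for i in range(1000)]
print(round(housner_intensity(accel, dt), 3), "m")
```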

  10. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    NASA Astrophysics Data System (ADS)

    Bari, Md. S.; Das, T.

    2013-09-01

The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The after-effects of an earthquake are more severe in an underdeveloped and densely populated country like ours than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subject to earthquake ground motions, in order to minimize the risk to life for all structures. A revision of BNBC 1993 is under way to bring it up to date with other international building codes. This paper aims at comparing the various provisions for seismic analysis given in the building codes of different countries. This comparison gives an idea of where the country stands in terms of safety against earthquakes. Primarily, various seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Later, both the 1993 and 2010 editions of BNBC have been compared graphically with building codes of other countries, such as the National Building Code of India 2005 (NBC-India 2005) and the American Society of Civil Engineers 7-05 (ASCE 7-05). The base shear/weight ratios have been plotted against the height of the building. The investigation in this paper reveals that BNBC 1993 has the least base shear among all the codes. Factored base shear values of BNBC 2010 are found to have increased significantly from those of BNBC 1993 for low-rise buildings (≤20 m). Despite the revision, BNBC 2010 (draft) still suggests lower base shear values than the Indian and American codes. The increase in the factor of safety against earthquakes imposed by the proposed BNBC 2010 code, through its higher base shear values, is nevertheless appreciable.
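The base shear/weight comparison amounts to plotting a seismic coefficient Cs against building height for each code; a generic sketch in which the Cs expression, the approximate-period formula and both parameter sets are hypothetical stand-ins, not the actual BNBC/NBC-India/ASCE values:

```python
# Sketch of a code-to-code base shear comparison: Cs = Z*I*Sa(T)/R with an
# approximate fundamental period T = Ct*h^x. The spectrum shape and both
# parameter sets below are hypothetical stand-ins for real code values.

def period(h, ct=0.0466, x=0.9):
    """Approximate fundamental period (s) from building height h (m)."""
    return ct * h ** x

def cs(h, z, i, r, sa1=1.0, sa_plateau=2.5):
    """Base shear coefficient V/W: plateau plus constant-velocity branch."""
    t = period(h)
    sa = min(sa_plateau, sa1 / t)
    return z * i * sa / r

# Two hypothetical "codes" compared over a range of building heights
for h in (10, 20, 40, 80):
    print(h, round(cs(h, z=0.2, i=1.0, r=5.0), 4),
          round(cs(h, z=0.24, i=1.0, r=3.0), 4))
```

Plotting the two columns against height reproduces the kind of base shear/weight comparison curves the paper describes.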

  11. A Proposal of Monitoring and Forecasting Method for Crustal Activity in and around Japan with 3-dimensional Heterogeneous Medium Using a Large-scale High-fidelity Finite Element Simulation

    NASA Astrophysics Data System (ADS)

    Hori, T.; Agata, R.; Ichimura, T.; Fujita, K.; Yamaguchi, T.; Takahashi, N.

    2017-12-01

Although continuous dense surface deformation data can now be obtained on land and partly on the sea floor, the obtained data are not fully utilized for monitoring and forecasting of crustal activity, such as the spatio-temporal variation in slip velocity on the plate interface including earthquakes, seismic wave propagation, and crustal deformation. To construct a system for monitoring and forecasting, it is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) a calculation code for crustal deformation and seismic wave propagation using (1), and (3) inverse analysis or data assimilation codes for both structure and fault slip using (1) and (2). To accomplish this, it is at least necessary to develop highly reliable large-scale simulation codes to calculate crustal deformation and seismic wave propagation for 3D heterogeneous structure. An unstructured FE non-linear seismic wave simulation code has been developed, which achieved physics-based urban earthquake simulation with 1.08 T DOF x 6.6 K time-steps. A high-fidelity FEM simulation code with a mesh generator has also been developed to calculate crustal deformation in and around Japan, with complicated surface topography and subducting plate geometry, at 1 km mesh resolution. The crustal deformation code has been further improved, achieving 2.05 T DOF with 45 m resolution on the plate interface. This high-resolution analysis enables computation of the change of stress acting on the plate interface. Further, for inverse analyses, a waveform inversion code for modeling 3D crustal structure has been developed, and the high-fidelity FEM code has been extended with an adjoint method for estimating fault slip and asthenosphere viscosity. Hence, we have large-scale simulation and analysis tools for monitoring.
We are developing methods for forecasting the slip velocity variation on the plate interface. Although the prototype is for an elastic half-space model, we are applying it to the 3D heterogeneous structure with the high-fidelity FE model. Furthermore, the large-scale simulation codes for monitoring are being implemented on GPU clusters, and the analysis tools are being developed to include other functions such as the examination of model errors.

  12. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media, based on the finite difference method at local to regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for the absorbing boundary condition. A hybrid-style parallel programming model using OpenMP and the Message Passing Interface (MPI) is adopted for efficient computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations, such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documentation in a public repository.
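The velocity-stress staggered-grid finite difference scheme that such codes extend to 3D viscoelastic media can be illustrated in 1D (purely elastic, no attenuation or PML; grid and material values are arbitrary):

```python
# Tiny 1D velocity-stress staggered-grid finite-difference sketch: the
# elastic core of the scheme families that OpenSWPC implements in 2D/3D
# with viscoelasticity and PML. All values are arbitrary illustrations.

n, dx, dt, nt = 200, 10.0, 0.001, 250      # grid points, spacing (m), step (s), steps
rho = 2500.0                               # density (kg/m^3)
mu = rho * 3000.0 ** 2                     # shear modulus for vs = 3 km/s
v = [0.0] * n                              # particle velocity at integer nodes
s = [0.0] * (n - 1)                        # shear stress at half nodes
v[n // 2] = 1.0                            # initial velocity impulse at the center
# Courant number vs*dt/dx = 0.3 < 1, so the explicit leapfrog is stable
for _ in range(nt):
    for i in range(n - 1):                 # update stress from the velocity gradient
        s[i] += dt * mu * (v[i + 1] - v[i]) / dx
    for i in range(1, n - 1):              # update velocity from the stress gradient
        v[i] += dt * (s[i] - s[i - 1]) / (rho * dx)
print(max(abs(x) for x in v))              # the impulse has split and propagated
```

The production code adds higher-order stencils, memory variables for the Zener attenuation model, and PML absorbing layers on top of this basic update cycle.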

  13. Seismic Safety Of Simple Masonry Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guadagnuolo, Mariateresa; Faella, Giuseppe

    2008-07-08

Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings, explicit safety verifications are not compulsory if specific code rules are fulfilled: it is assumed that their fulfilment ensures a suitable seismic behaviour and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at the site. Ideally, a wide percentage of the buildings deemed simple by codes should also satisfy the numerical safety verification, so that no confusion or uncertainty arises for the designers who must use the codes. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings with different geometries are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications on the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the obtained results can provide a contribution toward improving the seismic code requirements.

  14. Comparative Study on Code-based Linear Evaluation of an Existing RC Building Damaged during 1998 Adana-Ceyhan Earthquake

    NASA Astrophysics Data System (ADS)

    Toprak, A. Emre; Gülay, F. Gülten; Ruge, Peter

    2008-07-01

Determination of the seismic performance of existing buildings has become one of the key topics in structural analysis after recent earthquakes (e.g. the Izmit and Duzce Earthquakes in 1999, the Kobe Earthquake in 1995 and the Northridge Earthquake in 1994). Considering the need for precise assessment tools to determine the seismic performance level, most earthquake-prone countries try to include performance-based assessment in their seismic codes. The Turkish Earthquake Code 2007 (TEC'07), which was put into effect in March 2007, likewise introduced linear and non-linear assessment procedures to be applied prior to building retrofitting. In this paper, a comparative study is performed on the code-based seismic assessment of RC buildings with linear static methods of analysis, for a selected existing RC building. The basic principles of the seismic performance evaluation procedures for existing RC buildings according to Eurocode 8 and TEC'07 are outlined and compared. The procedures are then applied to a real case-study building which was exposed to the 1998 Adana-Ceyhan Earthquake in Turkey, a seismic action of Ms = 6.3 with a maximum ground acceleration of 0.28 g. It is a six-storey RC residential building with a total height of 14.65 m, composed of orthogonal frames, symmetrical in the y direction, without any significant structural irregularities. The rectangular plan dimensions are 16.40 m × 7.80 m (127.92 m²), with five spans in the x and two spans in the y direction. It was reported that the building had been moderately damaged during the 1998 earthquake, and the authorities suggested retrofitting by adding shear walls to the system. The computations show that the linear methods of analysis using either Eurocode 8 or TEC'07 independently produce similar performance levels of collapse for the critical storey of the structure.
The computed base shear value according to Eurocode 8 is much higher than that required by the Turkish Earthquake Code, although the selected ground conditions represent the same characteristics. The main reason is that the ordinate of the horizontal elastic response spectrum in Eurocode 8 is increased by the soil factor. In the TEC'07 force-based linear assessment, the seismic demands at cross-sections are checked against residual moment capacities, whereas the chord rotations of primary ductile elements must be checked for the Eurocode 8 safety verifications. On the other hand, the demand curvatures from the linear methods of analysis of Eurocode 8 and TEC'07 are almost identical.

  15. GISMO: A MATLAB toolbox for seismic research, monitoring, & education

    NASA Astrophysics Data System (ADS)

    Thompson, G.; Reyes, C. G.; Kempler, L. A.

    2017-12-01

GISMO is an open-source MATLAB toolbox which provides an object-oriented framework to build workflows and applications that read, process, visualize and write seismic waveform, catalog and instrument response data. GISMO can retrieve data from a variety of sources (e.g. FDSN web services, Earthworm/Winston servers) and data formats (SAC, Seisan, etc.), and it can handle waveform data that crosses file boundaries. All of this alleviates one of the most time-consuming tasks for scientists developing their own codes. GISMO simplifies seismic data analysis by providing a common interface for your data, regardless of its source. Several common plots are built into GISMO, such as record section plots, spectrograms, depth-time sections, event counts per unit time, energy release per unit time, etc. Other visualizations include map views and cross-sections of hypocentral data. Several common processing methods are also included, such as an extensive set of tools for correlation analysis. Support is being added to interface GISMO with ObsPy. GISMO encourages community development of an integrated set of codes and accompanying documentation, eliminating the need for seismologists to "reinvent the wheel". By sharing code, the consistency and repeatability of results can be enhanced. GISMO is hosted on GitHub with documentation both within the source code and in the project wiki. GISMO has been used at the University of South Florida and the University of Alaska Fairbanks in graduate-level courses including Seismic Data Analysis, Time Series Analysis and Computational Seismology. GISMO has also been tailored to interface with the common seismic monitoring software and data formats used by volcano observatories in the US and elsewhere. As an example, toolbox training was delivered to researchers at INETER (Nicaragua). Applications built on GISMO include IceWeb (e.g.
web-based spectrograms), which has been used by Alaska Volcano Observatory since 1998 and became the prototype for the USGS Pensive system.

  16. A proposal of monitoring and forecasting system for crustal activity in and around Japan using a large-scale high-fidelity finite element simulation codes

    NASA Astrophysics Data System (ADS)

    Hori, Takane; Ichimura, Tsuyoshi; Takahashi, Narumi

    2017-04-01

Here we propose a system for monitoring and forecasting of crustal activity, such as the spatio-temporal variation in slip velocity on the plate interface including earthquakes, seismic wave propagation, and crustal deformation. Although we can obtain continuous dense surface deformation data on land and partly on the sea floor, the obtained data are not fully utilized for monitoring and forecasting. It is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) a calculation code for crustal deformation and seismic wave propagation using (1), and (3) inverse analysis or data assimilation codes for both structure and fault slip using (1) and (2). To accomplish this, it is at least necessary to develop highly reliable large-scale simulation codes to calculate crustal deformation and seismic wave propagation for 3D heterogeneous structure. Ichimura et al. (2015, SC15) have developed an unstructured FE non-linear seismic wave simulation code, which achieved physics-based urban earthquake simulation with 1.08 T DOF x 6.6 K time-steps. Ichimura et al. (2013, GJI) developed a high-fidelity FEM simulation code with a mesh generator to calculate crustal deformation in and around Japan, with complicated surface topography and subducting plate geometry, at 1 km mesh resolution. Fujita et al. (2016, SC16) improved the code for crustal deformation and achieved 2.05 T DOF with 45 m resolution on the plate interface. This high-resolution analysis enables computation of the change of stress acting on the plate interface. Further, for inverse analyses, Errol et al. (2012, BSSA) developed a waveform inversion code for modeling 3D crustal structure, and Agata et al. (2015, AGU Fall Meeting) improved the high-fidelity FEM code to apply an adjoint method for estimating fault slip and asthenosphere viscosity.
Hence, we have large-scale simulation and analysis tools for monitoring. Furthermore, we are developing methods for forecasting the slip velocity variation on the plate interface. The basic concept is given in Hori et al. (2014, Oceanography), which introduces an ensemble-based sequential data assimilation procedure. Although the prototype described there is for an elastic half-space model, we are applying it to a 3D heterogeneous structure with the high-fidelity FE model.

  17. Sensitivity analysis of tall buildings in Semarang, Indonesia due to fault earthquakes with maximum 7 Mw

    NASA Astrophysics Data System (ADS)

    Partono, Windu; Pardoyo, Bambang; Atmanto, Indrastono Dwi; Azizah, Lisa; Chintami, Rouli Dian

    2017-11-01

    Fault is one of the most dangerous earthquake sources and can cause building failure. Many buildings collapsed in the Yogyakarta (2006) and Pidie (2016) fault-source earthquakes, which had maximum magnitude 6.4 Mw. Following the research conducted by the Team for Revision of Seismic Hazard Maps of Indonesia 2010 and 2016, the Lasem, Demak and Semarang faults are the three closest earthquake sources surrounding Semarang. The ground motion from those three earthquake sources should be taken into account for structural design and evaluation. Most tall buildings in Semarang, with a minimum height of 40 meters, were designed and constructed following the 2002 and 2012 Indonesian Seismic Codes. This paper presents the results of a sensitivity analysis, with emphasis on the prediction of deformation and inter-story drift of existing tall buildings within the city subjected to fault earthquakes. The analysis was performed by conducting dynamic structural analysis of 8 (eight) tall buildings using modified acceleration time histories. The modified acceleration time histories were calculated for three fault earthquakes with magnitudes from 6 Mw to 7 Mw; they were used because of inadequate recorded time-history data for those three fault sources. The sensitivity of a building to earthquakes can be assessed by comparing the surface response spectra calculated using the seismic code with the surface response spectra calculated from the acceleration time histories of a specific earthquake event. If the surface response spectra calculated using the seismic code are greater than those calculated from the acceleration time histories, the structure should be stable enough to resist the earthquake force.
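    The pass/fail comparison described above can be sketched as an element-wise check between the two surface response spectra (a minimal illustration with a hypothetical period grid and spectral values, not the study's actual data):

```python
import numpy as np

# Hypothetical period grid (s) and spectral accelerations (g)
periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0])
sa_code = np.array([0.60, 0.75, 0.55, 0.30, 0.15])   # design spectrum from the seismic code
sa_event = np.array([0.45, 0.70, 0.50, 0.28, 0.12])  # spectrum from modified time histories

# Periods at which the event spectrum exceeds the code spectrum
deficient = periods[sa_code < sa_event]

# The building is judged adequate if the code spectrum envelops
# the event-specific spectrum at every period of interest.
adequate = deficient.size == 0
print(adequate)  # True for these hypothetical values
```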

  18. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic design loads for engineering projects is probabilistic seismic hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J. (Instituto de Ingeniería, UNAM, Mexico). CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes sources at great distance from the calculation) allow the program to balance precision and efficiency during hazard calculation. 
Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of magnitude-frequency distributions (MFDs): a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in seismicity input parameters such as the maximum magnitude value. CRISIS offers a set of built-in GMPEs, as well as the possibility of defining new ones by providing information in a tabular format. Our study shows that in the case of the Ajaristkali HPP study area, a significant contribution to seismic hazard comes from local sources with quite low Mmax values; thus the two attenuation laws give quite different PGA and SA values.
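    The truncated exponential Gutenberg-Richter model mentioned above gives the annual rate of events at or above a magnitude m, truncated at Mmax. A minimal sketch (the a, b, Mmin and Mmax values are hypothetical, not the parameters used in the study):

```python
import math

def gr_rate(m, a=4.0, b=1.0, m_min=4.0, m_max=7.5):
    """Annual rate of events with magnitude >= m under a truncated
    exponential Gutenberg-Richter model (hypothetical a, b, Mmin, Mmax)."""
    if m >= m_max:
        return 0.0
    beta = b * math.log(10)
    n_min = 10 ** (a - b * m_min)  # rate of events with magnitude >= m_min
    num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
    den = 1.0 - math.exp(-beta * (m_max - m_min))
    return n_min * num / den

print(round(gr_rate(6.0), 4))  # annual rate of M >= 6.0 events: 0.0097
```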

  19. Comparative Study on Code-based Linear Evaluation of an Existing RC Building Damaged during 1998 Adana-Ceyhan Earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toprak, A. Emre; Guelay, F. Guelten; Ruge, Peter

    2008-07-08

    Determination of the seismic performance of existing buildings has become one of the key concepts in structural analysis after recent earthquakes (i.e. the Izmit and Duzce Earthquakes in 1999, the Kobe Earthquake in 1995 and the Northridge Earthquake in 1994). Considering the need for precise assessment tools to determine seismic performance level, most earthquake-prone countries try to include performance-based assessment in their seismic codes. Recently, the Turkish Earthquake Code 2007 (TEC'07), which was put into effect in March 2007, also introduced linear and non-linear assessment procedures to be applied prior to building retrofitting. In this paper, a comparative study is performed on the code-based seismic assessment of RC buildings with linear static methods of analysis, selecting an existing RC building. The basic principles of the seismic performance evaluation procedures for existing RC buildings according to Eurocode 8 and TEC'07 are outlined and compared. The procedure is then applied to a real case-study building that was exposed to the 1998 Adana-Ceyhan Earthquake in Turkey, a seismic action of Ms = 6.3 with a maximum ground acceleration of 0.28 g. It is a six-storey RC residential building with a total height of 14.65 m, composed of orthogonal frames, symmetrical in the y direction, and it does not have any significant structural irregularities. The rectangular plan dimensions are 16.40 m x 7.80 m = 127.92 m², with five spans in the x and two spans in the y direction. It was reported that the building had been moderately damaged during the 1998 earthquake, and a retrofitting process, adding shear walls to the system, was suggested by the authorities. The computations show that linear methods of analysis using either Eurocode 8 or TEC'07 independently produce similar performance levels (collapse) for the critical storey of the structure. 
The computed base shear value according to Eurocode 8 is much higher than the requirement of the Turkish Earthquake Code, although the selected ground conditions represent the same characteristics. The main reason is that the ordinate of the horizontal elastic response spectrum for Eurocode 8 is increased by the soil factor. In the TEC'07 force-based linear assessment, the seismic demands at cross-sections are checked against residual moment capacities, whereas the chord rotations of primary ductile elements must be checked for the Eurocode safety verifications. On the other hand, the demand curvatures from the linear methods of analysis of Eurocode 8 and TEC'07 are very similar.

  20. Fundamental period of Italian reinforced concrete buildings: comparison between numerical, experimental and Italian code simplified values

    NASA Astrophysics Data System (ADS)

    Ditommaso, Rocco; Carlo Ponzo, Felice; Auletta, Gianluca; Iacovino, Chiara; Nigro, Antonella

    2015-04-01

    The aim of this study is a comparison among the fundamental periods of reinforced concrete buildings evaluated using the simplified approach proposed by the Italian seismic code (NTC 2008), numerical models, and real values retrieved from an experimental campaign performed on several buildings located in the Basilicata region (Italy). With the intention of proposing simplified relationships to evaluate the fundamental period of reinforced concrete buildings, scientists and engineers have performed several numerical and experimental campaigns on different structures all around the world to calibrate different kinds of formulas. Most formulas retrieved from both numerical and experimental analyses provide vibration periods smaller than those suggested by the Italian seismic code. However, it is well known that the fundamental period of a structure plays a key role in the correct evaluation of the spectral acceleration for seismic static analyses. Generally, simplified approaches impose safety factors greater than those related to in-depth nonlinear analyses, with the aim of covering possible unexpected uncertainties. Using the simplified formula proposed by the Italian seismic code, the fundamental period is considerably higher than the fundamental periods experimentally evaluated on real structures, with the consequence that the spectral acceleration adopted in seismic static analysis may differ significantly from the real spectral acceleration. This approach could produce a decrease in the safety factors obtained using linear and nonlinear seismic static analyses. Finally, the authors suggest a possible update of the Italian seismic code formula for the simplified estimation of the fundamental period of vibration of existing RC buildings, taking into account both elastic and inelastic structural behaviour and the interaction between structural and non-structural elements. 
Acknowledgements This study was partially funded by the Italian Civil Protection Department within the project DPC-RELUIS 2014 - RS4 ''Seismic observatory of structures and health monitoring''. References R. Ditommaso, M. Vona, M. R. Gallipoli and M. Mucciarelli (2013). Evaluation and considerations about fundamental periods of damaged reinforced concrete buildings. Nat. Hazards Earth Syst. Sci., 13, 1903-1912, 2013. www.nat-hazards-earth-syst-sci.net/13/1903/2013. doi:10.5194/nhess-13-1903-2013
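    The simplified code-type estimate discussed above takes the form T1 = C1 · H^(3/4). A minimal sketch, assuming the C1 = 0.075 coefficient commonly used for RC moment-resisting frames in NTC 2008 / EC8-type formulas (the building height below is hypothetical):

```python
def fundamental_period_rc(height_m, c1=0.075):
    """Simplified code-type estimate T1 = C1 * H**(3/4), with
    C1 = 0.075 for RC moment-resisting frames (NTC 2008 / EC8 form)."""
    return c1 * height_m ** 0.75

# Hypothetical 24 m tall RC frame building
print(round(fundamental_period_rc(24.0), 3))  # ~0.813 s
```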

  1. Probabilistic Seismic Hazard Assessment for Iraq

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997; an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years out of date; b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design; and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.
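    The probability levels cited above map to return periods through the usual Poisson occurrence assumption, T_R = -t / ln(1 - p). A minimal sketch (the Poisson assumption is standard in PSHA but is an illustration here, not part of the original study):

```python
import math

def return_period(prob, years):
    """Return period implied by a Poisson exceedance probability over `years`."""
    return years / -math.log(1.0 - prob)

# 10% in 50 years -> ~475-year return period; 2% in 50 years -> ~2475 years
print(round(return_period(0.10, 50)))  # 475
print(round(return_period(0.02, 50)))  # 2475
```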

  2. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade ('domino') effects. The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
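    The crossing of fragility curves with PSHA outcomes described above can be sketched as a discrete convolution of a lognormal fragility function with annual probabilities of ground-motion levels. All numerical values below are hypothetical, not those of the study:

```python
import math

def fragility(pga_g, median=0.35, beta=0.4):
    """Lognormal fragility curve: P(failure | PGA). The median and
    log-standard deviation are hypothetical values for a steel tank."""
    return 0.5 * (1.0 + math.erf(math.log(pga_g / median) / (beta * math.sqrt(2))))

# Hypothetical hazard: annual probabilities of observing each PGA level (g)
pga = [0.1, 0.2, 0.3, 0.4, 0.5]
p_im = [2e-2, 8e-3, 3e-3, 1e-3, 4e-4]

# Total annual failure probability: sum of P(fail | IM) * P(IM)
p_fail = sum(fragility(a) * p for a, p in zip(pga, p_im))
print(f"{p_fail:.2e}")
```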

  3. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...

  4. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...

  5. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...

  6. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...

  7. On-line Data Transmission, as Part of the Seismic Evaluation Process in the Buildings Field

    NASA Astrophysics Data System (ADS)

    Sorin Dragomir, Claudiu; Dobre, Daniela; Craifaleanu, Iolanda; Georgescu, Emil-Sever

    2017-12-01

    The thorough analytical modelling of seismic actions, of the structural system and of the foundation soil is essential for a proper dynamic analysis of a building. However, the models used should be validated, whenever possible, against results obtained from experimental investigations, building instrumentation and monitoring of vibrations generated by various seismic or non-seismic sources. In Romania, the permanent seismic instrumentation and monitoring of buildings is part of a special follow-up activity, performed in accordance with the P130/1999 code for the time monitoring of building behaviour and with the seismic design code P100-2013. By using state-of-the-art equipment (GeoSIG and Kinemetrics digital accelerographs) in the seismic network of the National Institute for Research and Development URBAN-INCERC, the instrumented buildings can be monitored remotely, with recorded data sent to authorities or to research institutes in the field by a real-time data transmission system. The obtained records are processed, computing the Fourier amplitude spectra and the response spectra, and the modal parameters of the buildings are determined. The paper presents some of the most important results of the institute in the field of building monitoring, focusing on significant instrumented buildings located in different parts of the country. In addition, maps with data received from seismic stations after two recent Vrancea (Romania) earthquakes, showing the spatial distribution of ground accelerations, are presented, together with a comparative analysis performed with reference to previous studies in the literature.
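    The Fourier amplitude spectrum computation mentioned above can be sketched as follows (a minimal illustration on a synthetic record; the sampling rate and signal are hypothetical):

```python
import numpy as np

# Hypothetical recorded acceleration: 100 Hz sampling, 20 s of data,
# a pure 2 Hz tone standing in for a building's dominant response
fs = 100.0
t = np.arange(0, 20, 1 / fs)
accel = np.sin(2 * np.pi * 2.0 * t)

# One-sided Fourier amplitude spectrum
spec = np.abs(np.fft.rfft(accel)) / accel.size
freqs = np.fft.rfftfreq(accel.size, 1 / fs)

# The spectral peak recovers the dominant frequency
print(round(float(freqs[np.argmax(spec)]), 1))  # 2.0 Hz
```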

  8. Structural vibration passive control and economic analysis of a high-rise building in Beijing

    NASA Astrophysics Data System (ADS)

    Chen, Yongqi; Cao, Tiezhu; Ma, Liangzhe; Luo, Chaoying

    2009-12-01

    Performance analysis of the Pangu Plaza under earthquake and wind loads is described in this paper. The plaza is a 39-story steel high-rise building, 191 m high, located in Beijing close to the 2008 Olympic main stadium. It has both fluid viscous dampers (FVDs) and buckling-restrained braces, also known as unbonded braces (BRB or UBB), installed. An iterative design and analysis procedure was adopted for optimization. Results from the seismic response analysis in the horizontal and vertical directions show that the FVDs are highly effective in reducing the response of both the main structure and the secondary system. A comparative analysis of structural seismic performance and economic impact was conducted for the traditional methods, i.e., increased size of steel columns and beams and/or an increased number of seismic braces, versus the use of FVDs. Both the structural response and the economic analysis show that using FVDs to absorb seismic energy not only satisfies the Chinese seismic design code for a "rare" earthquake, but is also the most economical way to improve seismic performance, both for the one-time direct investment and for long-term maintenance.

  9. Evaluation of seismic design spectrum based on UHS implementing fourth-generation seismic hazard maps of Canada

    NASA Astrophysics Data System (ADS)

    Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.

    2016-12-01

    Two recent developments have come to the forefront with reference to updating the seismic design provisions of codes: (1) the publication of new seismic hazard maps for Canada by the Geological Survey of Canada (GSC), and (2) the emergence of a new spectral format outdating the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity and improved seismic hazard modeling techniques. The new maps are therefore more accurate and need to be incorporated into the Canadian Highway Bridge Design Code (CHBDC) for its next edition, as has been done for its building counterpart, the National Building Code of Canada (NBCC); in fact, the code writers expressed similar intentions in the commentary of CHBDC 2006. During their updating processes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5% in 50 years, respectively. This study investigates five sets of hazard maps corresponding to 2%, 5% and 10% probability of exceedance in 50 years developed by the GSC. To obtain a sound statistical inference, 389 Canadian cities were selected. This study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).

  10. Seismic hazard assessment in the Catania and Siracusa urban areas (Italy) through different approaches

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria

    2010-05-01

    Seismic hazard assessment (SHA) can be performed using either deterministic or probabilistic approaches. In the present study a probabilistic analysis was carried out for the towns of Catania and Siracusa using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell, 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard through the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted for the Esteva-Cornell procedure. According to current international conventions for PSHA (SSHAC, 1997), a logic tree approach was followed to consider and reduce the epistemic uncertainties for both the seismotectonic and site methods. The SASHA code handles intensity data taking into account the macroseismic information of past earthquakes. The CRISIS2007 code needs, as input elements, a seismic catalogue tested for completeness, a seismogenic zonation and ground motion prediction equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to defining source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above-mentioned approaches allowed us to obtain useful elements to define the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. It is indeed observed that when the site intensity data are used, the town of Catania shows hazard values higher than those found for Siracusa, for each considered return period. On the contrary, when the Esteva-Cornell method is used, the Siracusa urban area shows higher hazard than Catania for return periods greater than one hundred years. 
The higher hazard observed through the site approach for the Catania area can be interpreted in terms of the greater damage historically observed in this town and its smaller distance from the seismogenic structures. On the other hand, the higher level of hazard found for Siracusa through the Esteva-Cornell approach could be a consequence of the features of this method, which spreads the intensities over a wide area. However, in SHA the use of a combined approach is recommended for a mutual validation of the obtained results, and any choice between the two approaches is strictly linked to the knowledge of the local seismotectonic features. References: Albarello D. and Mucciarelli M.; 2002: Seismic hazard estimates using ill-defined macroseismic data at site. Pure Appl. Geophys., 159, 1289-1304. Cornell C.A.; 1968: Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58(5), 1583-1606. D'Amico V. and Albarello D.; 2007: Codice per il calcolo della pericolosità sismica da dati di sito (freeware). Progetto DPC-INGV S1, http://esse1.mi.ingv.it/d12.html Esteva L.; 1967: Criterios para la construcción de espectros para diseño sísmico. Proceedings of XII Jornadas Sudamericanas de Ingeniería Estructural y III Simposio Panamericano de Estructuras, Caracas, 1967. Published later in Boletín del Instituto de Materiales y Modelos Estructurales, Universidad Central de Venezuela, No. 19. Ordaz M., Aguilar A. and Arboleda J.; 2007: CRISIS2007, Program for computing seismic hazard. Version 5.4, Mexico City: UNAM. SSHAC (Senior Seismic Hazard Analysis Committee); 1997: Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts. NUREG/CR-6372.

  11. An analytical study on excitation of nuclear-coupled thermal-hydraulic instability due to seismically induced resonance in BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirano, Masashi

    1997-07-01

    This paper describes the results of a scoping study on seismically induced resonance of nuclear-coupled thermal-hydraulic instability in BWRs, conducted using TRAC-BF1 within the framework of a point kinetics model. The analysis shows that a reactivity insertion could occur, accompanied by an in-surge of coolant into the core resulting from the excitation of the nuclear-coupled instability by the external acceleration. To analyze this phenomenon in more detail, it is necessary to couple a thermal-hydraulic code with a three-dimensional nuclear kinetics code.

  12. Re-evaluation and updating of the seismic hazard of Lebanon

    NASA Astrophysics Data System (ADS)

    Huijer, Carla; Harajli, Mohamed; Sadek, Salah

    2016-01-01

    This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system for the seismic hazard of Lebanon and the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT is located in close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activity that affect Lebanon were integrated, along with all newly established characteristics, into an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10% probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 s and 1 s peak spectral acceleration contours, based on 2% probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.

  13. Research on Influencing Factors and Generalized Power of Synthetic Artificial Seismic Wave

    NASA Astrophysics Data System (ADS)

    Jiang, Yanpei

    2018-05-01

    In this paper, based on the trigonometric series method, the author adopts different envelope functions and the acceleration design spectrum in the Seismic Code for Urban Bridge Design to simulate seismic acceleration time histories that meet engineering accuracy requirements by modifying and iterating an initial wave. Spectral analysis is carried out to find the distribution law of the frequency content of the energy of the seismic time history and to determine the main factors that affect the acceleration amplitude spectrum and the energy spectral density. The generalized power formula of the seismic time history is derived from the discrete energy integral formula, and the author studies the changing characteristics of the generalized power of the seismic time history under different envelope functions. Examples are analyzed to illustrate that generalized power can measure the seismic performance of bridges.
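    The trigonometric series method with an envelope function can be sketched as follows. This is a minimal illustration: the envelope shape, frequency band and flat amplitude spectrum are hypothetical simplifications, and no spectrum-matching iteration is performed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Time axis and a simple rise-plateau-decay envelope (hypothetical parameters)
t = np.linspace(0, 20, 2001)
envelope = np.minimum(1.0, t / 2.0) * np.exp(-np.maximum(0.0, t - 10.0) / 4.0)

# Trigonometric series: sum of cosines with random phases
freqs = np.linspace(0.5, 20.0, 100)   # Hz, hypothetical frequency band
amps = np.ones_like(freqs)            # flat amplitude spectrum for illustration
phases = rng.uniform(0, 2 * np.pi, freqs.size)
stationary = (amps * np.cos(2 * np.pi * np.outer(t, freqs) + phases)).sum(axis=1)

# Non-stationary artificial accelerogram: envelope times the stationary series
accel = envelope * stationary
print(accel.shape)
```

In the method the abstract describes, the amplitudes would then be corrected iteratively until the response spectrum of `accel` matches the code design spectrum within the required tolerance.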

  14. Revision of seismic design codes corresponding to building damages in the ``5.12'' Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yayong

    2010-06-01

    A large number of buildings were seriously damaged or collapsed in the “5.12” Wenchuan earthquake. Based on field surveys and studies of damage to different types of buildings, seismic design codes have been updated. This paper briefly summarizes some of the major revisions that have been incorporated into the “Standard for classification of seismic protection of building constructions GB50223-2008” and “Code for Seismic Design of Buildings GB50011-2001.” The definition of seismic fortification class for buildings has been revisited, and as a result, the seismic classifications for schools, hospitals and other buildings that hold large populations such as evacuation shelters and information centers have been upgraded in the GB50223-2008 Code. The main aspects of the revised GB50011-2001 code include: (a) modification of the seismic intensity specified for the Provinces of Sichuan, Shanxi and Gansu; (b) basic conceptual design for retaining walls and building foundations in mountainous areas; (c) regularity of building configuration; (d) integration of masonry structures and pre-cast RC floors; (e) requirements for calculating and detailing stair shafts; and (f) limiting the use of single-bay RC frame structures. Some significant examples of damage in the epicenter areas are provided as a reference in the discussion on the consequences of collapse, the importance of duplicate structural systems, and the integration of RC and masonry structures.

  15. Joint Inversion of Vp, Vs, and Resistivity at SAFOD

    NASA Astrophysics Data System (ADS)

    Bennington, N. L.; Zhang, H.; Thurber, C. H.; Bedrosian, P. A.

    2010-12-01

    Seismic and resistivity models at SAFOD have been derived from separate inversions that show significant spatial similarity between the main model features. Previous work [Zhang et al., 2009] used cluster analysis to make lithologic inferences from trends in the seismic and resistivity models. We have taken this one step further by developing a joint inversion scheme that uses the cross-gradient penalty function to achieve structurally similar Vp, Vs, and resistivity images that adequately fit the seismic and magnetotelluric (MT) data without forcing model similarity where none exists. The new inversion code, tomoDDMT, merges the seismic inversion code tomoDD [Zhang and Thurber, 2003] and the MT inversion code Occam2DMT [Constable et al., 1987; deGroot-Hedlin and Constable, 1990]. We are exploring the utility of the cross-gradient penalty function in improving models of fault-zone structure at SAFOD on the San Andreas Fault in the Parkfield, California area. Two different sets of end-member starting models are being tested. One set consists of the separately inverted Vp, Vs, and resistivity models. The other set consists of simple, geologically based block models developed from borehole information at the SAFOD drill site and a simplified version of features seen in geophysical models at Parkfield. For both starting models, our preliminary results indicate that the inversion produces a converging solution, with resistivity, seismic, and cross-gradient misfits decreasing over successive iterations. We also compare the jointly inverted Vp, Vs, and resistivity models to borehole information from SAFOD to provide a "ground truth" comparison.
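    The cross-gradient function underlying the penalty is the cross product of the model gradients, t = ∇m1 × ∇m2, which vanishes wherever the two models are structurally similar (parallel gradients). A minimal sketch on 2-D grids, with hypothetical test models (not the SAFOD models):

```python
import numpy as np

def cross_gradient(m1, m2):
    """z-component of grad(m1) x grad(m2) on a 2-D grid; zero where
    the two models' gradients are parallel (structural similarity)."""
    g1y, g1x = np.gradient(m1)
    g2y, g2x = np.gradient(m2)
    return g1x * g2y - g1y * g2x

# Two structurally identical models with different physical property values:
# the cross-gradient vanishes everywhere
y, x = np.mgrid[0:20, 0:20]
m1 = np.sin(0.3 * x) + 0.1 * y
m2 = 2.0 * m1 + 5.0
print(np.allclose(cross_gradient(m1, m2), 0.0))  # True
```

A joint inversion adds the squared norm of this quantity to the data misfits, so the optimizer drives it toward zero only where the data permit.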

  16. Rapid earthquake detection through GPU-Based template matching

    NASA Astrophysics Data System (ADS)

    Mu, Dawei; Lee, En-Jui; Chen, Po

    2017-12-01

    The template-matching algorithm (TMA) has been widely adopted for improving the reliability of earthquake detection. The TMA is based on calculating the normalized cross-correlation coefficient (NCC) between a collection of selected template waveforms and the continuous waveform recordings of seismic instruments. In realistic applications, the computational cost of the TMA is much higher than that of traditional techniques. In this study, we provide an analysis of the TMA and show how the GPU architecture provides an almost ideal environment for accelerating the TMA and NCC-based pattern recognition algorithms in general. So far, our best-performing GPU code has achieved a speedup factor of more than 800 with respect to a common sequential CPU code. We demonstrate the performance of our GPU code using seismic waveform recordings from the ML 6.6 Meinong earthquake sequence in Taiwan.
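    The NCC computation at the core of the TMA can be sketched as follows (a plain sequential CPU version for illustration; the synthetic template and trace are hypothetical, and a production GPU code would parallelize the sliding-window loop):

```python
import numpy as np

def ncc(template, trace):
    """Normalized cross-correlation of a template against a continuous trace."""
    n = template.size
    t = (template - template.mean()) / (template.std() * n)
    out = np.empty(trace.size - n + 1)
    for i in range(out.size):
        w = trace[i:i + n]
        out[i] = np.sum(t * (w - w.mean()) / (w.std() + 1e-12))
    return out

# Synthetic example: a windowed template buried in noise at sample 300
rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 8 * np.pi, 200)) * np.hanning(200)
trace = 0.1 * rng.standard_normal(1000)
trace[300:500] += template

cc = ncc(template, trace)
print(int(np.argmax(cc)))  # detection peak at (or within one sample of) 300
```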

  17. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...

  18. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...

  19. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...

  20. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...

  1. A comparison of IBC with 1997 UBC for modal response spectrum analysis in standard-occupancy buildings

    NASA Astrophysics Data System (ADS)

    Nahhas, Tariq M.

    2011-03-01

    This paper presents a comparison of the seismic forces generated from a Modal Response Spectrum Analysis (MRSA) by applying the provisions of two building codes, the 1997 Uniform Building Code (UBC) and the 2000-2009 International Building Code (IBC), to the most common ordinary residential buildings of standard occupancy. Considering the IBC as the state-of-the-art benchmark code, the primary concern is the safety of buildings designed using the UBC as compared to those designed using the IBC. A sample of four buildings with different layouts and heights was used for this comparison. Each of these buildings was assumed to be located at four different geographical sample locations, arbitrarily selected to represent various earthquake zones on a seismic map of the USA, and was subjected to code-compliant response spectrum analyses for all sample locations and for five different soil types at each location. Response spectrum analysis was performed using the ETABS software package. For all the cases investigated, the UBC was found to be significantly more conservative than the IBC. The UBC design response spectra have higher spectral accelerations, and as a result, the response spectrum analysis yielded much higher base shears and member moments than the IBC. The conclusion is that ordinary office and residential buildings designed using the 1997 UBC are overdesigned and therefore remain safe even under the IBC provisions.
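For context, in an MRSA the peak responses of the individual modes (each read off the code design spectrum at the modal period) are combined statistically, commonly by the square-root-of-sum-of-squares (SRSS) rule; a minimal sketch with hypothetical modal base shears:

```python
import numpy as np

def srss(modal_responses):
    """Square-root-of-sum-of-squares combination of modal peak responses.
    Appropriate when modal periods are well separated; closely spaced
    modes call for CQC instead."""
    r = np.asarray(modal_responses, dtype=float)
    return float(np.sqrt(np.sum(r * r)))

# Hypothetical peak base shears (kN) from the first three modes of a frame.
modal_shears = [850.0, 240.0, 90.0]
print(srss(modal_shears))
```

The combined design base shear is dominated by the first mode but is not a simple sum, since modal peaks do not occur simultaneously.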

  2. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it

    2014-10-06

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the Emilia-Romagna (Italy) seismic events of May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly normalized by the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with collapse is much lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial in reducing the seismic vulnerability of such historical structures.

  3. A proposal for seismic evaluation index of mid-rise existing RC buildings in Afghanistan

    NASA Astrophysics Data System (ADS)

    Naqi, Ahmad; Saito, Taiki

    2017-10-01

    Mid-rise RC buildings have been rising steadily in Kabul and across Afghanistan since 2001 due to rapid population growth. To protect the safety of residents, the Afghan Structure Code was issued in 2012, but buildings constructed before 2012 fail to conform to its requirements. In Japan, new rules and laws for the seismic design of buildings were issued in 1981, and severe earthquake damage was observed in buildings designed before 1981. Hence, the Standard for Seismic Evaluation of RC Buildings, published in 1977, has been widely used in Japan to evaluate the seismic capacity of existing buildings designed before 1981. A similar problem now exists in Afghanistan; therefore, this research examined the seismic capacity of six RC buildings in Kabul built before 2012 by applying the seismic screening procedure of the Japanese standard. Among the three screening procedures of differing rigor, the least detailed one, the first level of screening, was applied. The study finds an average seismic index (IS-average = 0.21) for the target buildings. The results were then compared with those of the more accurate seismic evaluation procedures of the Capacity Spectrum Method (CSM) and Time History Analysis (THA). The CSM and THA results show poor seismic performance: the target buildings are unable to satisfy the safety design limit (1/100) on the maximum story drift. The target buildings were then improved by installing RC shear walls, the seismic indices of the retrofitted buildings were recalculated, and the maximum story drifts were analyzed by CSM and THA. Comparing the seismic indices with the CSM and THA results, buildings with a seismic index larger than IS-average = 0.4 are able to satisfy the safety design limit. Finally, to screen existing buildings and minimize earthquake damage, a judgment seismic index (IS-Judgment = 0.5) for the first level of screening is proposed.

  4. Comprehensive Final Report for the Marine Seismic System Program

    DTIC Science & Technology

    1985-08-01

    Executive summary: From 1981 through 1983, the Defense Advanced Research Projects Agency funded the National Science... Per Mr. J. A. Ballard, NORDA/Code 360. Analysis of Ambient Seismic Noise Recorded by Downhole and Ocean-Bottom Seismometers on Deep Sea Drilling Project Leg 78B, Richard G...

  5. Predicting the seismic performance of typical R/C healthcare facilities: emphasis on hospitals

    NASA Astrophysics Data System (ADS)

    Bilgin, Huseyin; Frangu, Idlir

    2017-09-01

    Reinforced concrete (RC) buildings constitute an important part of the current building stock in earthquake-prone countries such as Albania. The seismic response of structures during a severe earthquake plays a vital role in the extent of structural damage and the resulting injuries and losses. In this context, this study evaluates the expected performance of a five-story RC healthcare facility, representative of common practice in Albania, designed according to older codes. The design was based on the code requirements used in this region during the mid-1980s. Non-linear static and dynamic time history analyses were conducted on the structural model using the Zeus NL computer program. The dynamic time history analysis was conducted with a set of ground motions from real earthquakes. The building responses were estimated at the global level. FEMA 356 criteria were used to predict the seismic performance of the building. The structural response measures, such as the capacity curve and inter-story drift under the set of ground motions, and the pushover analysis results were compared, and a detailed seismic performance assessment was carried out. The main aim of this study is to consider the application and methodology for the earthquake performance assessment of existing buildings. The seismic performance of the structural model varied significantly under different ground motions. Results indicate that the case-study building exhibits inadequate seismic performance under different seismic excitations. In addition, reasons for the poor performance of the building are discussed.

  6. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in the Earth sciences. Here, we give a description, from a user's and a programmer's perspective, of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset, without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).

  7. Two applications of time reversal mirrors: seismic radio and seismic radar.

    PubMed

    Hanafy, Sherif M; Schuster, Gerard T

    2011-10-01

    Two seismic applications of time reversal mirrors (TRMs) are introduced and tested with field experiments. The first is sending, receiving, and decoding coded messages, similar to a radio except that seismic waves are used. The second is, similar to radar surveillance, detecting and tracking moving objects in a remote area, including determining an object's speed of movement. Both applications require the prior recording of calibration Green's functions in the area of interest. This reference Green's function is used as a codebook to decrypt the coded message in the first application and as a moving sensor in the second. Field tests show that seismic radar can detect the moving coordinates (x(t), y(t), z(t)) of a person running through a calibration site. This information also allows for a calculation of his velocity as a function of location. Results with the seismic radio are successful in seismically detecting and decoding coded pulses produced by a hammer. Both seismic radio and radar are highly robust in high-noise environments due to the super-stacking property of TRMs. © 2011 Acoustical Society of America
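A toy sketch of the seismic-radio idea, assuming a synthetic calibration Green's function and a simple on-off pulse code (not the authors' field setup); the recorded Green's function acts as the matched filter that decodes the message:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical calibration Green's function: a short, decaying scattered wavetrain.
g = rng.standard_normal(100) * np.exp(-np.arange(100) / 20.0)

# Encode "1" bits as a source pulse convolved with g, one bit every 300 samples.
bits = [1, 0, 1, 1, 0, 1]
trace = np.zeros(300 * len(bits) + len(g))
for i, b in enumerate(bits):
    if b:
        trace[300 * i:300 * i + len(g)] += g
trace += 0.2 * rng.standard_normal(len(trace))      # ambient noise

# Decode: correlate with the calibration Green's function (time reversal),
# then threshold the correlation at each bit slot.
corr = np.correlate(trace, g, mode="full")[len(g) - 1:]
decoded = [1 if corr[300 * i] > 0.5 * np.dot(g, g) else 0 for i in range(len(bits))]
print(decoded)
```

The correlation collapses the scattered wavetrain back into a sharp peak at each "1" slot, which is the super-stacking property the abstract refers to.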

  8. Development of battering ram vibrator system

    NASA Astrophysics Data System (ADS)

    Sun, F.; Chen, Z.; Lin, J.; Tong, X.

    2012-12-01

    This paper investigates a battering-ram vibrator system. An electric motor controls the hydraulic system of the battering ram, enabling precise control of the ram. After analyzing pseudorandom coding, codes "0" and "1" are mapped to the rest and shake states of the battering ram, so that the vibrator can emit the same pseudorandom code. In field tests, comparing the reference trace and single-shot records, the pseudorandom coding mode achieved a ratio of seismic wavelet to correlation interference of about 68 dB, whereas the conventional mode achieved only 27.9 dB. The battering-ram vibrator system thus suppresses the correlation interference arising from the single shaking frequency of the ram and improves the signal-to-noise ratio of the seismic data, which can guide the application of battering-ram vibrators in metal mine exploration and high-resolution seismic exploration.
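The advantage of pseudorandom coding can be illustrated numerically: a random ±1 chip sequence has a sharp autocorrelation peak with low sidelobes, whereas a single-frequency source correlates strongly with itself at many lags, which is the correlation interference. A hedged sketch with a synthetic code:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical pseudorandom source code: +/-1 chips ("1" = shake, "0" = rest).
code = rng.choice([-1.0, 1.0], size=1024)

# Autocorrelation of the code: unit peak at zero lag, low sidelobes.
ac = np.correlate(code, code, mode="full") / len(code)
zero_lag = ac[len(code) - 1]
sidelobe = np.abs(np.delete(ac, len(code) - 1)).max()
print(zero_lag, sidelobe)

# A single-frequency signal of the same length correlates with itself almost
# as strongly at lags of one period -- the source of correlation interference.
tone = np.sin(2 * np.pi * 0.05 * np.arange(1024))
ac_tone = np.correlate(tone, tone, mode="full") / np.dot(tone, tone)
print(np.abs(np.delete(ac_tone, len(tone) - 1)).max())
```

The sidelobe suppression of the coded source is what raises the wavelet-to-interference ratio reported in the abstract.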

  9. A proposal of monitoring and forecasting system for crustal activity in and around Japan using a large-scale high-fidelity finite element simulation codes

    NASA Astrophysics Data System (ADS)

    Hori, T.; Ichimura, T.

    2015-12-01

    Here we propose a system for monitoring and forecasting of crustal activity, especially great interplate earthquake generation and its preparation processes in subduction zones. Basically, we model great earthquake generation as frictional instability on the subducting plate boundary, so the spatio-temporal variation in slip velocity on the plate interface should be monitored and forecasted. Although we can obtain continuous, dense surface-deformation data on land and partly at the sea bottom, these data are not fully utilized for monitoring and forecasting. It is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) a calculation code for crustal deformation and seismic wave propagation using (1), and (3) an inverse analysis or data assimilation code for both structure and fault slip using (1) & (2). To accomplish this, it is at least necessary to develop a highly reliable large-scale simulation code to calculate crustal deformation and seismic wave propagation for 3D heterogeneous structure. Ichimura et al. (2014, SC14) developed an unstructured FE non-linear seismic wave simulation code, which achieved physics-based urban earthquake simulation enhanced by 10.7 BlnDOF x 30 K time-steps. Ichimura et al. (2013, GJI) developed a high-fidelity FEM simulation code with a mesh generator to calculate crustal deformation in and around Japan with complicated surface topography and subducting plate geometry at 1 km mesh. Further, for inverse analyses, Errol et al. (2012, BSSA) developed a waveform inversion code for modeling 3D crustal structure, and Agata et al. (2015, this meeting) improved the high-fidelity FEM code to apply an adjoint method for estimating fault slip and asthenosphere viscosity. Hence, we have large-scale simulation and analysis tools for monitoring.
Furthermore, we are developing methods for forecasting the slip-velocity variation on the plate interface. The basic concept is given in Hori et al. (2014, Oceanography), introducing an ensemble-based sequential data assimilation procedure. Although the prototype described there is for an elastic half-space model, we will apply it to a 3D heterogeneous structure with the high-fidelity FE model.
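A minimal sketch of one analysis step of the kind of ensemble-based sequential data assimilation (a stochastic ensemble Kalman filter) such a forecasting procedure builds on; the two-component state and observation operator here are toy assumptions, not the authors' system:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_err_std, rng):
    """One stochastic EnKF analysis step.
    ensemble: (n_members, n_state); obs: (n_obs,); obs_operator: (n_obs, n_state)."""
    H, X = obs_operator, ensemble
    Y = X @ H.T                                   # predicted observations
    x_mean, y_mean = X.mean(0), Y.mean(0)
    Xp, Yp = X - x_mean, Y - y_mean
    n = X.shape[0]
    Pxy = Xp.T @ Yp / (n - 1)                     # state-obs covariance
    Pyy = Yp.T @ Yp / (n - 1) + obs_err_std**2 * np.eye(len(obs))
    K = Pxy @ np.linalg.inv(Pyy)                  # Kalman gain
    perturbed = obs + obs_err_std * rng.standard_normal((n, len(obs)))
    return X + (perturbed - Y) @ K.T

rng = np.random.default_rng(4)
truth = np.array([1.0, -2.0])
H = np.array([[1.0, 0.0]])                        # observe first component only
ens = rng.standard_normal((200, 2)) * 2.0         # broad prior ensemble
obs = H @ truth + 0.1 * rng.standard_normal(1)
post = enkf_update(ens, obs, H, 0.1, rng)
print(post[:, 0].mean())                          # pulled toward truth[0] = 1.0
```

In the envisioned system the state would be slip velocity on the plate interface and the observation operator the high-fidelity FE prediction of surface deformation.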

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design and utilization, in full 3D finite-difference modeling, and in statistical characterization of geological heterogeneity. Such capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site.
A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.

  11. Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators

    NASA Astrophysics Data System (ADS)

    Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.

    2015-12-01

    Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients for improving the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited by high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted by additional hardware accelerators, e.g., AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both the CUDA and OpenCL languages within the source code package. Thus, seismic wave simulations are now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usage.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaSalle, F.R.; Golbeg, P.R.; Chenault, D.M.

    For reactor and nuclear facilities, both Title 10, Code of Federal Regulations, Part 50, and US Department of Energy Order 6430.1A require assessments of the interaction of non-Safety Class 1 piping and equipment with Safety Class 1 piping and equipment during a seismic event to maintain the safety function. The safety class systems of nuclear reactors or nuclear facilities are designed to the applicable American Society of Mechanical Engineers standards and Seismic Category 1 criteria that require rigorous analysis, construction, and quality assurance. Because non-safety class systems are generally designed to lesser standards and seismic criteria, they may become missiles during a safe shutdown earthquake. The resistance of piping, tubing, and equipment to seismically generated missiles is addressed in the paper. Gross plastic and local penetration failures are considered with applicable test verification. Missile types and seismic zones of influence are discussed. Field qualification data are also developed for missile evaluation.

  13. Site characterization of the national seismic network of Italy

    NASA Astrophysics Data System (ADS)

    Bordoni, Paola; Pacor, Francesca; Cultrera, Giovanna; Casale, Paolo; Cara, Fabrizio; Di Giulio, Giuseppe; Famiani, Daniela; Ladina, Chiara; PIschiutta, Marta; Quintiliani, Matteo

    2017-04-01

    The national seismic network of Italy (Rete Sismica Nazionale, RSN), run by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), consists of more than 400 seismic stations connected in real time to the institute's data center in order to locate earthquakes for civil defense purposes. A critical issue in the performance of a network is the characterization of site conditions at the recording stations. Recently INGV has started addressing this subject through the revision of all available geological and geophysical data, the acquisition of new information by means of ad-hoc field measurements, and the analysis of seismic waveforms. The main effort is towards building a database, integrated with the other INGV infrastructures, designed to archive homogeneous parameters across the seismic network useful for a complete site characterization, including housing, geological, seismological and geotechnical features as well as the site class according to the European and Italian building codes. Here we present the ongoing INGV activities.

  14. A New Code for Calculating Post-seismic Displacements as Well as Geoid and Gravity Changes on a Layered Visco-Elastic Spherical Earth

    NASA Astrophysics Data System (ADS)

    Gao, Shanghua; Fu, Guangyu; Liu, Tai; Zhang, Guoqing

    2017-03-01

    Tanaka et al. (Geophys J Int 164:273-289, 2006; Geophys J Int 170:1031-1052, 2007) proposed the spherical dislocation theory (SDT) in a spherically symmetric, self-gravitating visco-elastic earth model. However, to date there have been no reports of easily adopted, widely used software that utilizes Tanaka's theory. In this study we introduce a new code to compute post-seismic deformations (PSD), including displacements as well as geoid and gravity changes, caused by a seismic source at any position. This new code is based on the above-mentioned SDT. The code consists of two parts. The first part is the numerical frame of the dislocation Green's function (DGF), which contains a set of two-dimensional discrete numerical frames of DGFs on a spherically symmetric earth model. The second part is an integration function, which performs bi-quadratic spline interpolation operations on the frame of DGFs. The inputs are the information on the seismic fault model and on the observation points. After the user prepares the inputs in a file with the given format, the code automatically computes the PSD. As an example, we use the new code to calculate the co-seismic displacements caused by the Tohoku-Oki Mw 9.0 earthquake. We compare the result with observations and with the result from a fully elastic SDT, and we find that the root-mean-square error between the calculated and observed results is 7.4 cm. This verifies the suitability of our new code. Finally, we discuss several issues that require attention when using the code, which should be helpful for users.
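As a simplified stand-in for the code's bi-quadratic spline interpolation of the DGF frame, bilinear interpolation on a regular grid of precomputed values illustrates the integration step; the grid and field below are hypothetical, and bilinear interpolation is exact for the linear test field used:

```python
import numpy as np

def bilinear(grid, xs, ys, x, y):
    """Bilinear interpolation of precomputed Green's-function values stored
    on a regular (xs, ys) grid -- a lower-order stand-in for the
    bi-quadratic spline interpolation performed by the actual code."""
    i = np.searchsorted(xs, x) - 1      # cell indices containing (x, y)
    j = np.searchsorted(ys, y) - 1
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * grid[i, j] + tx * (1 - ty) * grid[i + 1, j]
            + (1 - tx) * ty * grid[i, j + 1] + tx * ty * grid[i + 1, j + 1])

xs = np.linspace(0.0, 10.0, 11)
ys = np.linspace(0.0, 10.0, 11)
gx, gy = np.meshgrid(xs, ys, indexing="ij")
grid = 2.0 * gx + 3.0 * gy              # a linear field, recovered exactly
print(bilinear(grid, xs, ys, 4.3, 7.6))
```

Precomputing the DGF frame once and interpolating it for arbitrary fault/observer geometry is what makes the evaluation cheap for users.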

  15. Towards Simulating a Realistic Planetary Seismic Wavefield: The Contribution of the Megaregolith and Low-Velocity Waveguides

    NASA Technical Reports Server (NTRS)

    Schmerr, Nicholas C.; Weber, Renee C.; Lin, Pei-Ying Patty; Thorne, Michael Scott; Garnero, Ed J.

    2011-01-01

    Lunar seismograms are distinctly different from their terrestrial counterparts. The Apollo lunar seismometers recorded moonquakes without distinct P- or S-wave arrivals; instead, waves arrive as a diffuse coda that decays over several hours, making the identification of body waves difficult. The unusual character of the lunar seismic wavefield is generally tied to properties of the megaregolith: it consists of highly fractured and broken crustal rock, the result of extensive bombardment of the Moon. The megaregolith extends several kilometers into the lunar crust, possibly into the mantle in some regions, and is covered by a thin coating of fine-scale dust. These materials possess very low seismic velocities that strongly scatter the seismic wavefield at high frequencies. Directly modeling the effects of the megaregolith to simulate an accurate lunar seismic wavefield is a challenging computational problem, owing to the inherent 3-D nature of the problem and the high frequencies (greater than 1 Hz) required. Here we focus on modeling the long-duration coda, studying the effects of the low velocities found in the megaregolith. We produce synthetic seismograms using 1-D slowness integration methodologies, GEMINI and reflectivity, and a 3-D Cartesian finite difference code, the Wave Propagation Program, to study the effect of thin layers of low velocity on the surface of a planet. These codes allow us to generate seismograms with dominant frequencies of approximately 1 Hz. For background lunar seismic structure we explore several models, including the recent model of Weber et al., Science, 2011. We also investigate variations in megaregolithic thickness, velocity, attenuation, and seismogram frequency content. Our results are compared to the Apollo seismic dataset, using both a cross-correlation technique and an integrated envelope approach to investigate coda decay.
We find our new high-frequency results strongly support the hypothesis that the long duration of the lunar seismic coda is generated by the presence of the low-velocity megaregolith, and that the diffuse arrivals are a combination of scattered energy and multiple reverberations within this layer. The 3-D modeling indicates the extreme surface topography of the Moon adds only a small contribution to scattering effects, though local geology may play a larger role. We also study the effects of the megaregolith on core-reflected and converted phases and other body waves. Our analysis indicates that detection of core-interacting arrivals with a polarization filter technique is robust and lends the possibility of detecting other body waves from the Moon.
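A coda-decay measurement of the envelope kind mentioned above can be sketched, under the assumption of an exponentially decaying coda envelope, by fitting the log of a running-RMS envelope of a synthetic trace (illustrative only, not the authors' processing):

```python
import numpy as np

def coda_decay_time(trace, dt, window):
    """Estimate the coda e-folding decay time (s) by fitting a line to the
    log of a running-RMS envelope (a simple proxy for a Hilbert envelope)."""
    n = int(window / dt)
    rms = np.sqrt(np.convolve(trace ** 2, np.ones(n) / n, mode="valid"))
    t_env = np.arange(len(rms)) * dt
    slope, _ = np.polyfit(t_env, np.log(rms), 1)
    return -1.0 / slope

# Synthetic coda: noise with an exponential envelope, decay time 600 s.
rng = np.random.default_rng(3)
dt = 0.5
t = np.arange(0, 3600, dt)
trace = rng.standard_normal(len(t)) * np.exp(-t / 600.0)
print(coda_decay_time(trace, dt, window=50.0))  # recovers roughly 600 s
```

Lunar codas decay over hours rather than seconds, so the fitted decay time is one compact number by which synthetic and Apollo seismograms can be compared.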

  16. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    NASA Astrophysics Data System (ADS)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size from 16 to 32 in the 1980s, and 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. Nevertheless, there have been thousands of downloads a year of the package, either as source code or as binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.
For the past thirty-plus years, SAC files have contained a fixed-length header. Time- and distance-related values are stored in single precision, which has become a problem with the increase in desired precision for data compared to thirty years ago. A future goal is to address this precision problem, but in a backward-compatible manner. We would also like to transition SAC to a more open-source license.
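The single-precision limitation can be demonstrated directly: near one day past a reference time, the spacing between adjacent float32 values is about 8 ms, so sub-millisecond timing is unrepresentable (a minimal numeric illustration, not the actual SAC header layout):

```python
import numpy as np

# A large time offset (seconds past a file reference time) stored in 32-bit
# floats, as in classic fixed-header formats, cannot hold sub-ms precision.
t = 86400.0 + 0.0001                          # one day plus 0.1 ms
lost = float(np.float32(t) - np.float32(86400.0))   # the 0.1 ms vanishes
kept = float(np.float64(t) - 86400.0)               # double precision keeps it
print(lost, kept)
print(np.spacing(np.float32(86400.0)))        # float32 step size near 86400 s
```

This is why modern sample rates and timing accuracy push legacy single-precision headers toward a backward-compatible double-precision extension.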

  17. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parisi, Carlo; Prescott, Steve; Ma, Zhegang

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of a SAPHIRE code PRA model for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit-surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  18. A simplified method in comparison with comprehensive interaction incremental dynamic analysis to assess seismic performance of jacket-type offshore platforms

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Ajamy, A.; Asgarian, B.

    2015-12-01

    The primary goal of seismic reassessment procedures in oil platform codes is to determine the reliability of a platform under extreme earthquake loading. Therefore, in this paper, a simplified method is proposed to assess the seismic performance of existing jacket-type offshore platforms (JTOPs) in regimes ranging from near-elastic to global collapse. The simplified method exploits the good agreement between the static pushover (SPO) curve and the summarized comprehensive interaction incremental dynamic analysis (CI-IDA) curve of the platform. Although the CI-IDA method offers better understanding and better modelling of the phenomenon, it is a time-consuming and challenging task. To overcome these challenges, the simplified procedure, a fast and accurate approach, is introduced based on SPO analysis. An existing JTOP in the Persian Gulf is then presented to illustrate the procedure, and finally a comparison is made between the simplified method and the CI-IDA results. The simplified method is informative and practical for current engineering purposes: it is able to predict seismic performance from elasticity to global dynamic instability with reasonable accuracy and little computational effort.

  19. Various Approaches to Forward and Inverse Wide-Angle Seismic Modelling Tested on Data from DOBRE-4 Experiment

    NASA Astrophysics Data System (ADS)

    Janik, Tomasz; Środa, Piotr; Czuba, Wojciech; Lysynchuk, Dmytro

    2016-12-01

    The interpretation of seismic refraction and wide-angle reflection data usually involves the creation of a velocity model based on inverse or forward modelling of the travel times of crustal and mantle phases using the ray-theory approach. Modelling codes differ in terms of model parameterization, data used for modelling, regularization of the result, etc. It is helpful to know the capabilities, advantages and limitations of the code used compared to others. This work compares some popular 2D seismic modelling codes using the dataset collected along the seismic wide-angle profile DOBRE-4, where quite uncommon reflected phases were observed in the wavefield. The 505 km long profile was realized in southern Ukraine in 2009, using 13 shot points and 230 recording stations. The most striking feature of the data is a pair of PmP phases with different reduced times (7.5-11 s) and different apparent velocities, intersecting each other; they are interpreted as reflections from strongly dipping Moho segments with opposite dips. The modelling proceeded in two steps. In the previous work by Starostenko et al. (2013), a trial-and-error forward model based on refracted and reflected phases (SEIS83 code) was published; its interesting feature is the high-amplitude (8-17 km) variability of the Moho depth in the form of downward and upward bends. This model is compared with results from other seismic inversion methods: the first-arrival tomography package FAST; the JIVE3D code, which can also use later refracted arrivals and reflections; and the forward and inversion code RAYINVR, using both refracted and reflected phases. Modelling with all the codes tested showed substantial variability of the Moho depth along the DOBRE-4 profile. However, the SEIS83 and RAYINVR packages seem to give the most coincident results.

  20. Spectral-Element Seismic Wave Propagation Codes for both Forward Modeling in Complex Media and Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.

    2015-12-01

    We present the SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, high-performance numerical wave solvers simulating seismic wave propagation for local-, regional-, and global-scale applications. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds and density, as well as 3D attenuation (Q) models, topography and fluid-solid coupling, are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version, which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS, and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.

  1. Effects of Irregular Bridge Columns and Feasibility of Seismic Regularity

    NASA Astrophysics Data System (ADS)

    Thomas, Abey E.

    2018-05-01

    Unequal column heights are one of the main irregularities in bridge design, particularly when negotiating steep valleys, and make bridges vulnerable to seismic action. The desirable behaviour of bridge columns under seismic loading is that they perform in a regular fashion, i.e. the capacity of each column is utilized evenly. However, this type of behaviour is often missing when the column heights are unequal along the length of the bridge, forcing the short columns to bear the maximum lateral load. In the present study, the effects of unequal column height on the global seismic performance of bridges are studied using pushover analysis. Codes such as CalTrans (Engineering service center, earthquake engineering branch, 2013) and EC-8 (EN 1998-2: design of structures for earthquake resistance. Part 2: bridges, European Committee for Standardization, Brussels, 2005) suggest a seismic regularity criterion for achieving a regular seismic performance level at all the bridge columns. The feasibility of adopting these seismic regularity criteria, along with those proposed in the literature, is assessed in the present study for bridges designed as per the Indian Standards.

  2. Uncertainty analysis in seismic tomography

    NASA Astrophysics Data System (ADS)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field from seismic travel-time tomography depends on several factors, such as regularization, inversion path, and model parameterization. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric, which shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code, with data from geo-engineering and industrial-scale investigations collected by our team from IG PAS.

  3. Documentation for the Southeast Asia seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark; Harmsen, Stephen; Mueller, Charles; Haller, Kathleen; Dewey, James; Luco, Nicolas; Crone, Anthony; Lidke, David; Rukstales, Kenneth

    2007-01-01

    The U.S. Geological Survey (USGS) Southeast Asia Seismic Hazard Project originated in response to the 26 December 2004 Sumatra earthquake (M9.2) and the resulting tsunami that caused significant casualties and economic losses in Indonesia, Thailand, Malaysia, India, Sri Lanka, and the Maldives. During the course of this project, several great earthquakes ruptured subduction zones along the southern coast of Indonesia (fig. 1) causing additional structural damage and casualties in nearby communities. Future structural damage and societal losses from large earthquakes can be mitigated by providing an advance warning of tsunamis and introducing seismic hazard provisions in building codes that allow buildings and structures to withstand strong ground shaking associated with anticipated earthquakes. The Southeast Asia Seismic Hazard Project was funded through a United States Agency for International Development (USAID)—Indian Ocean Tsunami Warning System to develop seismic hazard maps that would assist engineers in designing buildings that will resist earthquake strong ground shaking. An important objective of this project was to discuss regional hazard issues with building code officials, scientists, and engineers in Thailand, Malaysia, and Indonesia. The code communities have been receptive to these discussions and are considering updating the Thailand and Indonesia building codes to incorporate new information (for example, see notes from Professor Panitan Lukkunaprasit, Chulalongkorn University in Appendix A).

  4. The amplitude effects of sedimentary basins on through-passing surface waves

    NASA Astrophysics Data System (ADS)

    Feng, L.; Ritzwoller, M. H.; Pasyanos, M.

    2016-12-01

    Understanding the effect of sedimentary basins on through-passing surface waves is essential in many aspects of seismology, including the estimation of the magnitude of natural and anthropogenic events, the study of the attenuation properties of Earth's interior, and the analysis of ground motion as part of seismic hazard assessment. In particular, knowledge of the physical causes of amplitude variations is important in the application of the Ms:mb discriminant of nuclear monitoring. Our work addresses two principal questions, both in the period range between 10 s and 20 s. The first question is: In what respects can surface wave propagation through 3D structures be simulated as 2D membrane waves? This question is motivated by our belief that surface wave amplitude effects downstream from sedimentary basins result predominantly from elastic focusing and defocusing, which we understand as analogous to the effect of a lens. To the extent that this understanding is correct, 2D membrane waves will approximately capture the amplitude effects of focusing and defocusing. We address this question by applying the 3D simulation code SW4 (a node-based finite-difference code for 3D seismic wave simulation) and the 2D code SPECFEM2D (a spectral element code for 2D seismic wave simulation). Our results show that for surface waves propagating downstream from 3D sedimentary basins, amplitude effects are mostly caused by elastic focusing and defocusing, which is modeled accurately as a 2D effect. However, if the epicentral distance is small, higher modes may contaminate the fundamental mode, which may result in large errors in the 2D membrane wave approximation.
The second question is: Are observations of amplitude variations across East Asia following North Korean nuclear tests consistent with simulations of amplitude variations caused by elastic focusing/defocusing through a crustal reference model of China (Shen et al., A seismic reference model for the crust and uppermost mantle beneath China from surface wave dispersion, Geophys. J. Int., 206(2), 2015)? We simulate surface wave propagation across Eastern Asia with SES3D (a spectral element code for 3D seismic wave simulation) and observe significant amplitude variations caused by focusing and defocusing with a magnitude that is consistent with the observations.

  5. Nonlinear analysis of r.c. framed buildings retrofitted with elastomeric and friction bearings under near-fault earthquakes

    NASA Astrophysics Data System (ADS)

    Mazza, Mirko

    2015-12-01

    Reinforced concrete (r.c.) framed buildings designed in compliance with inadequate seismic classifications and code provisions are in many cases highly vulnerable and need to be retrofitted. To this end, the insertion of a base isolation system allows a considerable reduction of the seismic loads transmitted to the superstructure. However, strong near-fault ground motions, which are characterised by long-duration horizontal pulses, may amplify the inelastic response of the superstructure and induce failure of the isolation system. These considerations point out the importance of checking the effectiveness of different isolation systems for retrofitting an r.c. framed structure. For this purpose, a numerical investigation is carried out with reference to a six-storey r.c. framed building which, originally designed as a fixed-base structure in compliance with the previous Italian code (DM96) for a medium-risk seismic zone, has to be retrofitted by the insertion of a base isolation system to attain the performance levels imposed by the current Italian code (NTC08) in a high-risk seismic zone. Besides the (fixed-base) original structure, three cases of base isolation are studied: elastomeric bearings acting alone (high-damping laminated rubber bearings, HDLRBs); an in-parallel combination of elastomeric and friction bearings (HDLRBs and steel-PTFE sliding bearings, SBs); and friction bearings acting alone (friction pendulum bearings, FPBs). The nonlinear analysis of the fixed-base and base-isolated structures subjected to horizontal components of near-fault ground motions is performed to check plastic conditions at the potential critical (end) sections of the girders and columns as well as critical conditions of the isolation systems.
Unexpectedly high values of ductility demand are highlighted at the lower floors of all the base-isolated structures, while re-centring problems of the base isolation systems under near-fault earthquakes are expected for friction bearings acting alone (i.e. FPBs) or in combination with HDLRBs (i.e. SBs).

  6. GSAC - Generic Seismic Application Computing

    NASA Astrophysics Data System (ADS)

    Herrmann, R. B.; Ammon, C. J.; Koper, K. D.

    2004-12-01

    With the success of the IRIS data management center, the use of large data sets in seismological research has become common. Such data sets, and especially the significantly larger data sets expected from EarthScope, present challenges for analysis with existing tools developed over the last 30 years. For much of the community, the primary format for data analysis is the Seismic Analysis Code (SAC) format developed by Lawrence Livermore National Laboratory. Although somewhat restrictive in meta-data storage, the simplicity and stability of the format have established it as an important component of seismological research. Tools for working with SAC files fall into two categories: custom research-quality processing codes, and shared display and processing tools such as SAC2000 and MatSeis, which were developed primarily for the needs of individual seismic research groups. While the current graphics display and platform dependence of SAC2000 may be resolved if the source code is released, the code complexity and the lack of large-data-set analysis features, or even of introductory tutorials, could preclude code improvements and the development of expertise in its use. We believe that there is a place for new, especially open-source, tools. The GSAC effort is an approach that focuses on ease of use, computational speed, transportability, rapid addition of new features, and openness, so that new and advanced students, researchers and instructors can quickly browse and process large data sets. We highlight several approaches toward data processing under this model. gsac, part of the Computer Programs in Seismology 3.30 distribution, has much of the functionality of SAC2000 and works on UNIX/LINUX/MacOS-X/Windows (CYGWIN). It is programmed in C from scratch, is small, fast, and easy to maintain and extend, and, being command line based, is easily included within shell processing scripts.
PySAC is a set of Python functions that allow easy access to SAC files and enable efficient manipulation of SAC files under a variety of operating systems. PySAC has proven to be valuable in organizing large data sets. An array processing package includes standard beamforming algorithms and a search based method for inference of slowness vectors. The search results can be visualized using GMT scripts output by the C programs, and the resulting snapshots can be combined into an animation of the time evolution of the 2D slowness field.
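    The beamforming and slowness search mentioned above can be sketched as a delay-and-sum over a slowness grid (a minimal, generic NumPy implementation under assumed conventions, not the package's actual code):

```python
import numpy as np

def beam_power(data, coords, dt, sx, sy):
    """Delay-and-sum beam power for a trial slowness (sx, sy) in s/km.
    data: (nsta, nsamp) traces; coords: (nsta, 2) station offsets in km."""
    nsta, nsamp = data.shape
    stack = np.zeros(nsamp)
    for i in range(nsta):
        delay = coords[i, 0] * sx + coords[i, 1] * sy  # plane-wave delay (s)
        stack += np.roll(data[i], -int(round(delay / dt)))  # align trace
    stack /= nsta
    return float(np.sum(stack ** 2))  # power of the stacked beam

def slowness_search(data, coords, dt, smax=0.5, ns=41):
    """Grid search returning the (sx, sy) that maximizes beam power."""
    trial = np.linspace(-smax, smax, ns)
    best = max((beam_power(data, coords, dt, sx, sy), sx, sy)
               for sx in trial for sy in trial)
    return best[1], best[2]
```

    For a synthetic plane wave crossing a small array, the grid search recovers the true slowness vector at the grid resolution.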

  7. Comparison of the sand liquefaction estimated based on codes and practical earthquake damage phenomena

    NASA Astrophysics Data System (ADS)

    Fang, Yi; Huang, Yahong

    2017-12-01

    Sand liquefaction estimation based on codes is an important part of geotechnical design. However, the result sometimes fails to conform to the practical earthquake damage. Based on the damage from the Tangshan earthquake and engineering geological conditions, three typical sites were chosen. The sand liquefaction probability was evaluated at the three sites using the method in the Code for Seismic Design of Buildings, and the results were compared with the sand liquefaction phenomena observed in the earthquake. The results show that the difference between code-based sand liquefaction estimates and the practical earthquake damage is mainly attributed to two aspects. The primary reasons include the disparity between the seismic fortification intensity and the actual seismic oscillation, changes in the groundwater level, the thickness of the overlying non-liquefied soil layer, local site effects, and personal error. Meanwhile, although the judgment methods in the codes are broadly applicable, the limitations of the basic data and qualitative anomalies in the judgment formulas also contribute to the difference.

  8. Evaluating Seismic Site Effects at Cultural Heritage Sites in the Mediterranean Area

    NASA Astrophysics Data System (ADS)

    Imposa, S.; D'Amico, S.; Panzera, F.; Lombardo, G.; Grassi, S.; Betti, M.; Muscat, R.

    2017-12-01

    The present study concerns integrated geophysical surveys and numerical simulations aimed at evaluating the seismic vulnerability of cultural heritage sites. Non-invasive analyses targeted at characterizing local site effects as well as the dynamic properties of the structures were performed. Data were collected at several locations in the Maltese Archipelago (central Mediterranean) and in some historical buildings located in Catania (Sicily). In particular, passive seismic techniques and H/V data were used to derive 1D velocity models and amplification functions. The dynamic properties of a building are usually described through its natural frequency and damping ratio. The latter is important in seismic design since it allows one to evaluate the ability of a structure to dissipate vibration energy during an earthquake. The fundamental frequency of each investigated structure was obtained using ambient vibrations recorded by two or more sensors monitoring the motion at different locations in the building. Accordingly, the fundamental periods of several Maltese watchtowers and some historical buildings of Catania were obtained by computing the ratio between the amplitudes of the Fourier spectra of the horizontal (longitudinal and transverse) components recorded on the top and ground floors. Using the ANSYS code, a modal analysis was performed to evaluate the first 50 vibration modes, with the aim of checking the activation of the modal masses and assessing the seismic vulnerability of the towers. The STRATA code was instead adopted for the Catania heritage buildings, using as reference earthquakes the moderate to strong shocks that struck south-eastern Sicily. In most of the investigated buildings it was not possible to identify a single natural frequency, but rather several oscillation modes. These results appear linked to the structural complexity of the edifices, their irregular plan shapes and the presence of adjacent structures.
H/V measurements outside the buildings were used to determine the predominant frequencies of the soil and to highlight potential site-to-structure resonance. These findings can provide useful clues for additional engineering investigations aimed at reducing seismic risk, highlighting the important role that structural complexity and local seismic response play in building damage.
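    The spectral-ratio processing described above can be sketched in a few lines (a generic single-window implementation; real processing adds windowing, smoothing and averaging over many windows):

```python
import numpy as np

def hv_ratio(north, east, vertical, dt):
    """Frequencies and H/V amplitude spectral ratio for one time window.
    The horizontal spectrum is the quadratic mean of the two components."""
    n = len(vertical)
    freqs = np.fft.rfftfreq(n, dt)
    spec_n = np.abs(np.fft.rfft(north))
    spec_e = np.abs(np.fft.rfft(east))
    spec_z = np.abs(np.fft.rfft(vertical))
    h = np.sqrt((spec_n ** 2 + spec_e ** 2) / 2.0)  # combined horizontal
    return freqs, h / np.maximum(spec_z, 1e-20)     # guard against 0/0
```

    The predominant soil frequency is then read off as the peak of the smoothed ratio; comparing it with a building's fundamental frequency flags potential site-to-structure resonance.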

  9. ADAPTION OF NONSTANDARD PIPING COMPONENTS INTO PRESENT DAY SEISMIC CODES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. T. Clark; M. J. Russell; R. E. Spears

    2009-07-01

    With spiraling energy demand and flat energy supply, there is a need to extend the life of older nuclear reactors. This sometimes requires that existing systems be evaluated against present-day seismic codes. Older reactors built in the 1960s and early 1970s often used fabricated piping components that were code compliant during their initial construction period, but are outside the standard parameters of present-day piping codes. Several approaches are available to the analyst in evaluating these non-standard components against modern codes. The simplest approach is to use the flexibility factors and stress indices for similar standard components, with the assumption that the non-standard component's flexibility factors and stress indices will be very similar. This approach can require significant engineering judgment. A more rational approach, available in Section III of the ASME Boiler and Pressure Vessel Code and the subject of this paper, involves calculation of flexibility factors using finite element analysis of the non-standard component. Such analysis allows modeling of geometric and material nonlinearities. Flexibility factors based on these analyses are sensitive to the load magnitudes used in their calculation, which need to be consistent with those produced by the linear system analyses where the flexibility factors are applied. This can lead to iteration, since the magnitude of the loads produced by the linear system analysis depends on the magnitude of the flexibility factors. After the loading applied to the non-standard component finite element model has been matched to the loads produced by the associated linear system model, the component finite element model can then be used to evaluate the performance of the component under those loads with the nonlinear analysis provisions of the Code, should the load levels lead to calculated stresses in excess of allowable stresses.
This paper details the application of component-level finite element modeling to account for geometric and material nonlinear component behavior in a linear elastic piping system model. Note that this technique can also be applied to the analysis of B31 piping systems.
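    The load/flexibility iteration described in the abstract can be illustrated with a toy fixed-point loop (the two functions below are hypothetical stand-ins for the linear system analysis and the component finite element analysis, not ASME procedures):

```python
def system_load(k):
    """Hypothetical linear system analysis: the load on the component
    decreases as its flexibility factor k increases (load shedding)."""
    return 100.0 / (1.0 + 0.5 * k)

def component_flexibility(load):
    """Hypothetical nonlinear FE result: flexibility grows with load."""
    return 1.0 + 0.01 * load

def iterate_flexibility(tol=1e-8, max_iter=100):
    """Iterate until the flexibility factor and system load are consistent."""
    k = 1.0  # initial flexibility-factor guess
    for _ in range(max_iter):
        load = system_load(k)
        k_new = component_flexibility(load)
        if abs(k_new - k) < tol:
            return k_new, load
        k = k_new
    raise RuntimeError("flexibility iteration did not converge")
```

    The loop converges quickly here because the coupled response is a contraction; in practice each "function call" is a full analysis run, which is why the iteration count matters.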

  10. Outstanding challenges in the seismological study of volcanic processes: Results from recent U.S. and European community-wide discussion workshops

    NASA Astrophysics Data System (ADS)

    Roman, D. C.; Rodgers, M.; Mather, T. A.; Power, J. A.; Pyle, D. M.

    2014-12-01

    Observations of volcanically induced seismicity are essential for eruption forecasting and for real-time and near-real-time warnings of hazardous volcanic activity. Studies of volcanic seismicity and of seismic wave propagation also provide critical understanding of subsurface magmatic systems and the physical processes associated with magma genesis, transport, and eruption. However, despite significant advances in recent years, our ability to successfully forecast volcanic eruptions and fully understand subsurface volcanic processes is limited by our current understanding of the source processes of volcano-seismic events, the effects of volcanic structures on seismic wave propagation, limited data, and even the non-standardized terminology used to describe seismic waveforms. Progress in volcano seismology is further hampered by inconsistent data formats and standards, a lack of state-of-the-art hardware and professional technical staff, and a lack of widely adopted analysis techniques and software. Addressing these challenges will not only advance scientific understanding of volcanoes, but will also lead to more accurate forecasts and warnings of hazardous volcanic eruptions that would ultimately save lives and property worldwide. Two recent workshops, held in Anchorage, Alaska, and Oxford, UK, represent important steps towards developing a relationship among members of the academic community and government agencies, focused around a shared, long-term vision for volcano seismology. Recommendations arising from the two workshops fall into six categories: 1) ongoing and enhanced community-wide discussions; 2) data and code curation and dissemination; 3) code development; 4) development of resources for more comprehensive data mining; 5) enhanced strategic seismic data collection; and 6) enhanced integration of multiple datasets (including seismicity) to understand all states of volcano activity through space and time.
As presented sequentially above, these steps can be regarded as a road map for galvanizing and strengthening the volcano seismological community to drive new scientific and technical progress over the next 5-10 years.

  11. Probabilistic seismic hazard zonation for the Cuban building code update

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Llanes-Buron, C.

    2013-05-01

    A probabilistic seismic hazard assessment has been performed in response to a revision and update of the Cuban building code (NC-46-99) for earthquake-resistant building construction. The hazard assessment has been done according to the standard probabilistic approach (Cornell, 1968), importing the procedures adopted by other nations dealing with the problem of revising and updating their national building codes. Problems of earthquake catalogue treatment, attenuation of peak and spectral ground acceleration, and seismic source definition have been rigorously analyzed, and a logic-tree approach was used to represent the inevitable uncertainties encountered throughout the seismic hazard estimation process. The seismic zonation proposed here consists of a map showing the spectral acceleration values for short (0.2 s) and long (1.0 s) periods on rock conditions with a 1642-year return period, which is considered the maximum credible earthquake (ASCE 07-05). In addition, three other design levels are proposed: a severe earthquake with an 808-year return period, an ordinary earthquake with a 475-year return period, and a minimum earthquake with a 225-year return period. The proposed seismic zonation fulfils international standards (IBC-ICC) as well as current worldwide practice in this field.
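    Under the usual Poisson occurrence assumption, design levels like those above map to return periods via T = -t / ln(1 - P), where P is the exceedance probability over an exposure time t. A short sketch (the percentage/exposure pairings below are inferred for illustration, not stated in the record):

```python
import math

def return_period(p_exceed, t_years):
    """Return period for exceedance probability p_exceed over t_years,
    assuming Poisson (memoryless) earthquake occurrence."""
    return -t_years / math.log(1.0 - p_exceed)

# 10% in 50 years -> ~475-year return period (the "ordinary earthquake")
# 3%  in 50 years -> ~1642-year return period
```

    This is the same conversion used in most modern hazard codes; the familiar 475-year level corresponds to a 10% chance of exceedance in a 50-year design life.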

  12. Liquid Rocket Booster (LRB) for the Space Transportation System (STS) systems study. Appendix E: Pressure-fed booster test bed for the liquid rocket booster study

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The stress analysis/structural design of the Pressure-Fed Booster Engine Test Bed using the existing F-1 Test Facility Test Stand at Huntsville, Alabama is described. The analysis has been coded and set up for solution on NASTRAN. A separate stress program was established to take the NASTRAN output and perform stress checks on the members. Joint checks and other necessary additional checks were performed by hand. The notes include a brief description of other programs which assist in reproducing and reviewing the NASTRAN results. The redesign of the test stand members and the stress analysis were performed per the A.I.S.C. Code. Loads on the stand consist of the loaded run tanks; wind loads; seismic loads; live loads consisting of snow and ice; live and dead loads of steel; and the loaded pressurant bottle. In combining loads, wind loads and seismic loads were each combined with full live loads. Wind and seismic loads were not combined with each other. No one-third increase in allowables was taken for the environmental loads except at decks 147 and 214, where the increase was used when considering the stay rods, brackets and stay beams. Wind and seismic loads were considered from each of the four coordinate directions (i.e. N, S, E, W) to give eight basic conditions. The analysis was run with the pressurant tank mounted at level 125. One seismic condition was also run with the tank mounted at levels 169 and 214. No failures were noted with mounting at level 169, but extensive deck failure occurred with mounting at level 214 (the load sets used are included on the tape, but no detailed results are included in the package). Decking support beams at levels 147 and 214 are not included in the model. The stress program thus does not reduce strut lengths to the length between support beams (the struts are attached to the beams at intersection points) and gives stress ratios larger than one for some of the struts. The affected members were therefore checked by hand.

  13. The Analysis of North Korea's Nuclear Tests by Turkish National Data Center

    NASA Astrophysics Data System (ADS)

    Semin, K.; Meral Ozel, N.; Destici, T. C.; Necmioglu, O.; Kocak, S.

    2013-12-01

    The Democratic People's Republic of Korea (DPRK) announced the conduct of a third underground nuclear test on 12 February 2013 in the northeastern part of the country, where the previous tests of 2006 and 2009 were also conducted. The latest nuclear test is the best-detected nuclear event by the global seismic networks, and magnitude estimates show that each new test increased in size compared with the previous one. As the Turkish NDC (National Data Center), we have analyzed the 2013 and 2009 nuclear tests using seismic data from International Monitoring System (IMS) stations obtained through the International Data Center (IDC) located in Vienna. Discrimination analysis was performed based on the mb:Ms magnitude ratio and spectral analysis. We also applied array-based waveform cross-correlation to show the similarity of the nuclear tests and to obtain precise arrival time measurements for relative location estimates, and performed basic infrasound analysis using two IMS infrasound stations for the 2013 event. Seismic analyses were performed using software such as Geotool, EP (Event Processor from NORSAR) and the Seismic Analysis Code (SAC), and the infrasound data were analyzed using PMCC from CEA-France. The IMS network is operating under the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The CTBTO verification system is under continuous development, making use of state-of-the-art technologies and methodologies.

  14. Site-specific seismic ground motion analyses for transportation infrastructure in the New Madrid seismic zone.

    DOT National Transportation Integrated Search

    2012-11-01

    Generic, code-based design procedures cannot account for the anticipated short-period attenuation and long-period amplification of earthquake ground motions in the deep, soft sediments of the Mississippi Embayment within the New Madrid Seismic Zone (...

  15. GrowYourIC: an open access Python code to facilitate comparison between kinematic models of inner core evolution and seismic observations

    NASA Astrophysics Data System (ADS)

    Lasbleis, M.; Day, E. A.; Waszek, L.

    2017-12-01

    The complex nature of inner core structure has been well established from seismic studies, with heterogeneities at various length scales, both radially and laterally. Despite this, no geodynamic model has successfully explained all of the observed seismic features. To facilitate comparisons between seismic observations and geodynamic models of inner core growth, we have developed a new, open access Python tool - GrowYourIC - that allows users to compare models of inner core structure. The code allows users to simulate different evolution models of the inner core, with user-defined rates of inner core growth, translation and rotation. Once users have "grown" an inner core with their preferred parameters, they can then explore the effect of "their" inner core's evolution on the relative age and growth rate in different regions of the inner core. The code will convert these parameters into seismic properties using either built-in mineral physics models or user-supplied ones that calculate these seismic properties with the users' own preferred mineralogical models. The 3D model of isotropic inner core properties can then be used to calculate the predicted seismic travel-time anomalies for a random, or user-specified, set of seismic ray paths through the inner core. A real dataset of inner core body-wave differential travel times is included for the purpose of comparing user-generated models of inner core growth to actual observed travel-time anomalies in the top 100 km of the inner core. Here, we explore some of the possibilities of our code. We investigate the effect of the limited illumination of the inner core by seismic waves on the robustness of kinematic model interpretation. We test the impact on seismic differential travel-time observations of several kinematic models of inner core growth: fast lateral translation; slow differential growth; and inner core super-rotation.
    We find that a model of inner core evolution incorporating both differential growth and slow super-rotation is able to recreate some of the more intricate details of the seismic observations. Specifically, we are able to "grow" an inner core that has an asymmetric shift in isotropic hemisphere boundaries with increasing depth in the inner core.
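The kinematic book-keeping behind such models can be sketched in a few lines. This is a hypothetical illustration, not the GrowYourIC API: it assumes a sqrt(t) growth law and steady super-rotation, both of which are user-set parameters in the real tool.

```python
# Hypothetical sketch of the kinematics in an inner-core growth model
# (NOT the GrowYourIC API): material at radius r was added when the
# growing inner core first reached r.  A sqrt(t) growth law,
# r(t) = R_IC * sqrt(t / T_IC), is assumed here.

R_IC = 1221.0   # present inner-core radius, km
T_IC = 1.0e9    # assumed inner-core age, yr (illustrative)

def crystallisation_time(r):
    """Time (yr after nucleation) at which radius r was reached."""
    return T_IC * (r / R_IC) ** 2

def rotated_longitude(lon0_deg, r, omega_deg_per_yr):
    """Present-day longitude of material under steady super-rotation."""
    age = T_IC - crystallisation_time(r)   # time spent rotating
    return (lon0_deg + omega_deg_per_yr * age) % 360.0

# Material halfway to the centre crystallised at a quarter of the
# inner core's age, so deeper material has rotated for longer.
t_mid = crystallisation_time(R_IC / 2)   # == T_IC / 4
```

Mapping such ages and displaced longitudes to seismic velocities is then delegated to a mineral-physics model, as the abstract describes.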

  16. Comparison of fundamental natural period of masonry and reinforced concrete buildings retrieved from experimental campaigns performed in Italy, Greece and Spain

    NASA Astrophysics Data System (ADS)

    Nigro, Antonella; Ponzo, Felice C.; Ditommaso, Rocco; Auletta, Gianluca; Iacovino, Chiara; Nigro, Domenico S.; Soupios, Pantelis; García-Fernández, Mariano; Jimenez, Maria-Jose

    2017-04-01

    The aim of this study is the experimental estimation of the dynamic characteristics of existing buildings and the comparison of the fundamental natural periods of masonry and reinforced concrete buildings located in Basilicata (Italy), Madrid (Spain) and Crete (Greece). In recent years, several experimental campaigns on different kinds of structures have been performed worldwide with the aim of proposing simplified relationships to evaluate the fundamental period of buildings. Most of the formulas retrieved from experimental analyses provide vibration periods smaller than those suggested by the Italian Seismic Code (NTC2008) and the European Seismic Code (EC8). The fundamental period of a structure plays a key role in the correct estimation of the spectral acceleration for seismic static analyses and in detecting possible resonance phenomena with the foundation soil. Usually, simplified approaches dictate the use of safety factors greater than those related to in-depth dynamic linear and nonlinear analyses, with the aim of covering unexpected uncertainties. The fundamental period calculated with the simplified formulas given by both NTC 2008 and EC8 is higher than the fundamental period measured on the investigated structures in Italy, Spain and Greece. The consequence is that the spectral acceleration adopted in seismic static analysis may differ significantly from the real spectral acceleration, which could produce a decrease in the safety factors obtained using linear seismic static analyses. Based on numerical and experimental results, and in order to confirm the results proposed in this work, the authors suggest increasing the number of numerical and experimental tests, also considering the effects of non-structural components and soil during small, medium and strong earthquakes.
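For reference, the simplified code formula at issue is the EC8/NTC-type height-based estimate T1 = Ct * H^(3/4). The sketch below uses the EC8 coefficient values and is illustrative only:

```python
# EC8/NTC-type simplified period estimate referred to above:
# T1 = Ct * H^(3/4), with H the building height in metres.  The Ct
# values below are the EC8 ones (0.075 for RC moment frames, 0.050
# for masonry and "all other" structures).

def fundamental_period(height_m, ct=0.075):
    """Simplified fundamental period (s) from building height (m)."""
    return ct * height_m ** 0.75

# A 21 m (roughly 7-storey) building:
t_rc = fundamental_period(21.0, ct=0.075)       # ~0.74 s
t_masonry = fundamental_period(21.0, ct=0.050)  # ~0.49 s
```

Because measured periods are systematically shorter than these estimates, the spectral ordinate read off the design spectrum at T1 can be unconservative, which is the paper's central point.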
Acknowledgements This study was partially funded by the Italian Department of Civil Protection within the project DPC-RELUIS 2016 - RS4 ''Seismic observatory of structures and health monitoring'' and by the "Centre of Integrated Geomorphology for the Mediterranean Area - CGIAM" within the Framework Agreement with the University of Basilicata "Study, Research and Experimentation in the Field of Analysis and Monitoring of Seismic Vulnerability of Strategic and Relevant Buildings for the purposes of Civil Protection and Development of Innovative Strategies of Seismic Reinforcement".

  17. Seismic assessment of Technical Area V (TA-V).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medrano, Carlos S.

    The Technical Area V (TA-V) Seismic Assessment Report was commissioned as part of Sandia National Laboratories (SNL) Self Assessment Requirement per DOE O 414.1, Quality Assurance, for seismic impact on existing facilities at Technical Area-V (TA-V). SNL TA-V facilities are located on an existing Uniform Building Code (UBC) Seismic Zone IIB Site within the physical boundary of Kirtland Air Force Base (KAFB). The document delineates a summary of the existing facilities with their safety-significant structures, systems and components, identifies DOE Guidance, conceptual framework, past assessments and the present geological and seismic conditions. Building upon the past information and the evolution of the new seismic design criteria, the document discusses the potential impact of the new standards and provides recommendations based upon the current International Building Code (IBC) per DOE O 420.1B, Facility Safety and DOE G 420.1-2, Guide for the Mitigation of Natural Phenomena Hazards for DOE Nuclear Facilities and Non-Nuclear Facilities.

  18. Assessment of seismic design response factors of concrete wall buildings

    NASA Astrophysics Data System (ADS)

    Mwafy, Aman

    2011-03-01

    To verify the seismic design response factors of high-rise buildings, five reference structures, varying in height from 20 to 60 stories, were selected and designed according to modern design codes to represent a wide range of concrete wall structures. Verified fiber-based analytical models for inelastic simulation were developed, considering the geometric nonlinearity and material inelasticity of the structural members. The ground motion uncertainty was accounted for by employing 20 earthquake records representing two seismic scenarios, consistent with the latest understanding of the tectonic setting and seismicity of the selected reference region (UAE). A large number of Inelastic Pushover Analyses (IPAs) and Incremental Dynamic Collapse Analyses (IDCAs) were performed for the reference structures to estimate the seismic design response factors. It is concluded that the factors adopted by the design code are adequately conservative. The results of this systematic assessment of seismic design response factors apply to a wide variety of contemporary concrete wall buildings with various characteristics.
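The response factor assessed in such studies is conventionally decomposed into overstrength and ductility components, both readable off an idealized pushover curve. The sketch below uses the classical Newmark-Hall ductility reduction and is a generic illustration, not the paper's fiber-model IPA/IDCA pipeline:

```python
import math

# Conventional decomposition R = R_mu * Omega: ductility reduction
# times overstrength, both obtained from idealized pushover results.

def overstrength(v_yield, v_design):
    """Omega: idealized yield strength over design base shear."""
    return v_yield / v_design

def ductility_reduction(mu, period, tc=0.5):
    """Newmark-Hall: R_mu = sqrt(2*mu - 1) for short periods,
    R_mu = mu (equal-displacement rule) beyond roughly Tc."""
    return mu if period >= tc else math.sqrt(2.0 * mu - 1.0)

def response_factor(v_yield, v_design, mu, period):
    return overstrength(v_yield, v_design) * ductility_reduction(mu, period)

# Long-period wall building with Omega = 1.5 and ductility mu = 4:
r = response_factor(v_yield=3000.0, v_design=2000.0, mu=4.0, period=2.0)
# r = 1.5 * 4.0 = 6.0
```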

  19. Recommendations for the establishment of the seismic code of Haiti

    NASA Astrophysics Data System (ADS)

    Pierristal, G.; Benito, B.; Cervera, J.; Belizaire, D.

    2013-05-01

    Haiti, because of its seismicity associated with the plate boundary and the several faults that cross the island of Hispaniola (shared by Haiti and the Dominican Republic), has been affected in the past by major earthquakes (e.g., 1771, 1842), which caused loss of life, considerable structural damage and collapses, and sometimes the destruction of entire cities. The recent earthquake of January 12, 2010 was the most destructive earthquake any country has experienced in modern times when the number of people killed is measured against the country's population (Cavallo et al. 2010). It is clear that the major causes of these losses were the population's lack of awareness about earthquakes, the absence of a seismic code, and the lack of quality control of buildings. In this paper, we propose some recommendations for the establishment of a seismic code for Haiti in order to decrease the physical and social impacts of future earthquakes. First, we present a theoretical part covering the concepts and fundamental elements needed to establish a seismic code: a description of the methodology for seismic hazard assessment, presentation of the results in terms of acceleration maps for the whole country (at rock sites) and Uniform Hazard Spectra (UHS) in the cities, and the criteria for soil classification and amplification factors accounting for site effects, equivalent forces, etc. Then, we include a practical part with calculations and comparisons of five seismic codes from different countries (Eurocode 8, Spain, Canada, United States and Dominican Republic), in order to establish criteria for the proposals for Haiti. Using the results of Benito et al. (presented in this session, S10), we compare the UHS in different cities of Haiti with the response spectra derived from applying the spectral shapes given by the aforementioned codes.
    Furthermore, the classifications of soils and buildings have also been analyzed and contrasted with local data in order to propose the most suitable classification for Haiti. Finally, we propose a methodology for force estimation, providing values for the relevant coefficients. References: - EN 1998-1:2004 (E): Eurocode 8, Design of structures for earthquake resistance, Part 1 (General rules, seismic actions and rules for buildings), 2004. - MTPTC (2011): Règles de calcul intérimaires pour les bâtiments en Haïti, Ministère des Travaux Publics, Transports et Communications, February 2011, Haiti. - NBCC 2005: National Building Code of Canada, Vol. 1, National Research Council of Canada, 2005. - NCSE-02: Norma de construcción sismorresistente de España, BOE No. 244, 11 October 2002. - NEHRP (2009): Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, FEMA P-750, February 2009, Part 1 (Provisions) and Part 2 (Commentary). - R-001 (2011): Reglamento para el análisis y diseño sísmico de estructuras de República Dominicana, Decreto No. 201-11, Ministerio de Obras Públicas y Comunicaciones.

  20. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats, including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great-circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, or picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra, as well as spectrograms, can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms, along with output of high-resolution PDF files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and as a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of the trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license, and both source and executables are available through the IRIS SeisCode repository.
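The theoretical arrivals mentioned above come from integrated tau-p code and radial Earth models. As a self-contained stand-in, the textbook two-layer direct-wave/head-wave case illustrates the kind of travel-time curves involved (this is not SeismicCanvas code; the velocities and layer thickness are illustrative):

```python
import math

# Two-layer travel-time curves: direct wave vs refracted head wave
# for a layer of thickness h (velocity v1) over a faster half-space
# (velocity v2 > v1).

def direct_time(x, v1):
    """Direct arrival at offset x in the top layer."""
    return x / v1

def head_wave_time(x, v1, v2, h):
    """Head-wave (refracted) arrival time at offset x."""
    return x / v2 + 2.0 * h * math.sqrt(1.0 / v1**2 - 1.0 / v2**2)

def crossover_distance(v1, v2, h):
    """Offset beyond which the head wave arrives first."""
    return 2.0 * h * math.sqrt((v2 + v1) / (v2 - v1))

# At 100 km offset with v1 = 3, v2 = 5 km/s and h = 10 km, the head
# wave leads; the crossover is at 40 km.
t_d = direct_time(100.0, 3.0)                  # ~33.3 s
t_h = head_wave_time(100.0, 3.0, 5.0, 10.0)    # ~25.3 s
```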

  1. Seismic Response Analysis of an Unanchored Steel Tank under Horizontal Excitation

    NASA Astrophysics Data System (ADS)

    Rulin, Zhang; Xudong, Cheng; Youhai, Guan

    2017-06-01

    The seismic performance of liquid storage tanks affects the safety of people's lives and property. A 3-D finite element method (FEM) model of a storage tank is established which accounts for the liquid-solid coupling effect. The displacement and stress distributions along the tank wall are then studied under the El Centro earthquake record. Results show that large-amplitude, long-period sloshing appears on the liquid surface. Elephant-foot deformation occurs near the tank bottom, where the maximum hoop and axial stresses appear. The maximum axial compressive stress is very close to the allowable critical stress calculated by the design code, so local buckling failure may occur. This research can provide a reference for the seismic design of storage tanks.
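The order of magnitude of the wall stresses involved can be sanity-checked with the thin-wall hoop-stress formula under hydrostatic pressure. The tank dimensions below are illustrative and are not taken from the paper's model:

```python
# Thin-wall hoop stress under hydrostatic pressure:
# sigma_theta = p * r / t, with p = rho * g * depth.

RHO, G = 1000.0, 9.81     # water density (kg/m^3), gravity (m/s^2)

def hoop_stress(radius_m, thickness_m, depth_m):
    """Hoop stress (Pa) in a thin-walled shell at a given liquid depth."""
    p = RHO * G * depth_m
    return p * radius_m / thickness_m

# 20 m radius, 10 mm wall, 15 m below the liquid surface:
sigma = hoop_stress(20.0, 0.010, 15.0)   # ~2.9e8 Pa, i.e. ~294 MPa
```

The static hoop stress alone already approaches typical steel yield levels, which is why the seismically induced axial compression near the base can push the wall toward elephant-foot buckling.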

  2. 49 CFR 41.117 - Buildings built with Federal assistance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... architect's authenticated verifications of seismic design codes, standards, and practices used in the design... financial assistance, after July 14, 1993 must be designed and constructed in accord with seismic standards... of compliance with the seismic design and construction requirements of this part is required prior to...

  3. Is 3D true non-linear traveltime tomography reasonable?

    NASA Astrophysics Data System (ADS)

    Herrero, A.; Virieux, J.

    2003-04-01

    Data sets requiring 3D analysis tools, whether in seismic exploration (onshore and offshore experiments) or in natural seismicity (microseismicity surveys or post-event measurements), are more and more numerous. Classical linearized tomographies, and also earthquake location codes, need an accurate 3D background velocity model. However, if the medium is complex and a priori information is not available, a 1D analysis cannot provide an adequate background velocity image. Moreover, the design of acquisition layouts is often intrinsically 3D, which makes even 2D approaches difficult, especially in natural seismicity cases. The solution thus relies on a true non-linear 3D approach, which allows the model space to be explored and an optimal velocity image to be identified. The problem then becomes practical, and its feasibility depends on the available computing resources (memory and time). In this presentation, we show that tackling a 3D traveltime tomography problem with an extensive non-linear approach, combining fast travel time estimators based on level-set methods with optimisation techniques such as a multiscale strategy, is feasible. Moreover, because the management of inhomogeneous inversion parameters is more straightforward in a non-linear approach, we describe how to perform a joint non-linear inversion for the seismic velocities and the source locations.
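A simple stand-in for the fast travel-time estimators mentioned above is a Dijkstra shortest-path solver on a slowness grid. The authors use level-set/fast-marching methods; this sketch only illustrates the forward problem that any such tomography must solve repeatedly:

```python
import heapq

def travel_times(slowness, src, dx=1.0):
    """First-arrival times from src to every node of a 2-D slowness
    grid (list of lists), using a 4-connected Dijkstra graph."""
    ny, nx = len(slowness), len(slowness[0])
    t = [[float("inf")] * nx for _ in range(ny)]
    si, sj = src
    t[si][sj] = 0.0
    heap = [(0.0, si, sj)]
    while heap:
        ti, i, j = heapq.heappop(heap)
        if ti > t[i][j]:
            continue  # stale queue entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx:
                # segment time = average slowness * segment length
                seg = 0.5 * (slowness[i][j] + slowness[ni][nj]) * dx
                if ti + seg < t[ni][nj]:
                    t[ni][nj] = ti + seg
                    heapq.heappush(heap, (t[ni][nj], ni, nj))
    return t

grid = [[1.0] * 5 for _ in range(5)]
times = travel_times(grid, src=(0, 0))
# Uniform unit slowness: four unit segments to reach node (0, 4).
```

A non-linear inversion would wrap a solver like this in a loop over candidate velocity models, which is exactly why fast forward solvers make the extensive search feasible.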

  4. Analysis of the impact of large scale seismic retrofitting strategies through the application of a vulnerability-based approach on traditional masonry buildings

    NASA Astrophysics Data System (ADS)

    Ferreira, Tiago Miguel; Maio, Rui; Vicente, Romeu

    2017-04-01

    A building's capacity to maintain minimum structural safety levels during natural disasters such as earthquakes is recognisably one of the aspects that most influence urban resilience. Public investment in risk mitigation strategies is therefore fundamental, not only to promote social and urban resilience, but also to limit the consequent material, human and environmental losses. Despite growing awareness of this issue, a vast number of traditional masonry buildings spread throughout many old European city centres still lack adequate seismic resistance and therefore require urgent retrofitting interventions, both to reduce their seismic vulnerability and to cope with the increased seismic requirements of recent code standards. This paper aims to contribute to mitigating the social and economic impacts of earthquake damage scenarios through a vulnerability-based comparative analysis of some of the most popular retrofitting techniques applied after the 1998 Azores earthquake. The influence of each technique, both individually and globally, is studied using a seismic vulnerability index methodology integrated into a GIS tool; damage and loss scenarios are constructed and critically discussed. Finally, the economic balance resulting from the implementation of these techniques is also examined.
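Vulnerability-index methodologies of this family typically map a macroseismic intensity and a vulnerability index to a mean damage grade. The sketch below uses a commonly quoted Lagomarsino/Giovinazzi-type curve; treat the coefficients as assumptions to be checked against the paper before any reuse:

```python
import math

# Macroseismic vulnerability-index damage model (illustrative
# coefficients of the commonly quoted form).

def mean_damage_grade(intensity, v_index, q=2.3):
    """Mean damage grade muD in [0, 5] for macroseismic intensity I
    and vulnerability index V (roughly 0..1)."""
    return 2.5 * (1.0 + math.tanh((intensity + 6.25 * v_index - 13.1) / q))

# Retrofitting lowers V and hence the expected damage grade:
mu_before = mean_damage_grade(8.0, 0.45)   # unretrofitted
mu_after = mean_damage_grade(8.0, 0.35)    # retrofitted (lower V)
```

Comparing such damage grades before and after each retrofit, building by building in a GIS, is the essence of the scenario comparison described above.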

  5. A seismic data compression system using subband coding

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.; Pollara, F.

    1995-01-01

    This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
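The three-stage scheme can be illustrated with a one-level Haar subband split, uniform quantization, and synthesis. This is a toy version: the actual system uses longer filter banks and an adaptive arithmetic entropy coder for the quantizer indices.

```python
import math

# Stage 1: decorrelation (one-level Haar split into low/high bands).
# Stage 2: uniform quantization (the controlled loss).
# Stage 3 (entropy coding of the indices) is omitted here.

SQRT2 = math.sqrt(2.0)

def haar_analysis(x):
    """Split x (even length) into approximation and detail bands."""
    lo = [(a + b) / SQRT2 for a, b in zip(x[0::2], x[1::2])]
    hi = [(a - b) / SQRT2 for a, b in zip(x[0::2], x[1::2])]
    return lo, hi

def haar_synthesis(lo, hi):
    """Perfectly reconstructs x when the bands are unquantized."""
    x = []
    for l, h in zip(lo, hi):
        x.append((l + h) / SQRT2)
        x.append((l - h) / SQRT2)
    return x

def quantize(band, step):
    return [round(v / step) for v in band]    # indices to entropy-code

def dequantize(idx, step):
    return [i * step for i in idx]

x = [4.0, 4.0, 5.0, 1.0]
lo, hi = haar_analysis(x)
x_hat = haar_synthesis(dequantize(quantize(lo, 0.5), 0.5),
                       dequantize(quantize(hi, 0.5), 0.5))
# Reconstruction error per sample is bounded by the quantizer step.
```

Block-by-block adaptivity, as in the article, would simply rerun the quantizer and entropy coder with parameters fitted to each block.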

  6. New "Risk-Targeted" Seismic Maps Introduced into Building Codes

    USGS Publications Warehouse

    Luco, Nicholas; Garrett, B.; Hayes, J.

    2012-01-01

    Throughout most municipalities of the United States, structural engineers design new buildings using the U.S.-focused International Building Code (IBC). Updated editions of the IBC are published every 3 years. The latest edition (2012) contains new "risk-targeted maximum considered earthquake" (MCER) ground motion maps, which are enabling engineers to incorporate a more consistent and better defined level of seismic safety into their building designs.

  7. Effect of Response Reduction Factor on Peak Floor Acceleration Demand in Mid-Rise RC Buildings

    NASA Astrophysics Data System (ADS)

    Surana, Mitesh; Singh, Yogendra; Lang, Dominik H.

    2017-06-01

    Estimation of Peak Floor Acceleration (PFA) demand along the height of a building is crucial for the seismic safety of nonstructural components. The effect of the level of inelasticity, controlled by the response reduction factor (strength ratio), is studied using incremental dynamic analysis. A total of 1120 nonlinear dynamic analyses, using a suite of 30 recorded ground motion time histories, are performed on mid-rise reinforced-concrete (RC) moment-resisting frame buildings covering a wide range in terms of their periods of vibration. The obtained PFA demands are compared with some of the major national seismic design and retrofit codes (IS 1893 draft version, ASCE 41, EN 1998, and NZS 1170.4). It is observed that the PFA demand at the building's roof level decreases with increasing period of vibration as well as with strength ratio. However, current seismic building codes do not account for these effects thereby producing very conservative estimates of PFA demands. Based on the identified parameters affecting the PFA demand, a model to obtain the PFA distribution along the height of a building is proposed. The proposed model is validated with spectrum-compatible time history analyses of the considered buildings with different strength ratios.
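For comparison, the code-type PFA profiles referred to above are typically linear in height, e.g. the ASCE 7-style 1 + 2z/h amplification. A minimal sketch:

```python
# Code-type linear peak-floor-acceleration profile (ASCE 7-style
# 1 + 2z/h amplification), given as a reference point for the
# period- and strength-dependent behavior found in the study.

def pfa_profile(pga_g, z, h):
    """Peak floor acceleration (g) at height z in a building of height h."""
    return pga_g * (1.0 + 2.0 * z / h)

# Roof level (z = h): three times the ground acceleration, independent
# of period and strength ratio, which is the conservatism noted above.
roof = pfa_profile(0.3, 30.0, 30.0)   # 0.9 g
```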

  8. Applications of the seismic hazard model of Italy: from a new building code to the L'Aquila trial against seismologists

    NASA Astrophysics Data System (ADS)

    Meletti, C.

    2013-05-01

    In 2003, a large national project for updating the seismic hazard map and seismic zoning in Italy was started, following the rules fixed by an Ordinance of the Italian Prime Minister. New input elements for probabilistic seismic hazard assessment were compiled: the earthquake catalogue, the seismogenic zonation, the catalogue completeness, and a set of new attenuation relationships. The map of expected PGA on rock with 10% probability of exceedance is the new reference seismic hazard map for Italy (http://zonesismiche.mi.ingv.it). Subsequently, nine further probabilities of exceedance, the uniform hazard spectra up to 2 seconds, and the disaggregation of the PGA were also released. A comprehensive seismic hazard model that fully describes the seismic hazard in Italy was then available, accessible through a webGIS application (http://esse1-gis.mi.ingv.it/en.php). This detailed information made it possible to change the approach to evaluating the design seismic action: from a zone-dependent approach (Italy previously had 4 seismic zones, each with a single design spectrum) to a site-dependent approach, in which the design spectrum is defined at each site of a grid of about 11000 points covering the whole national territory. The new building code became mandatory only after the 6 April 2009 L'Aquila earthquake, the first strong event in Italy after the release of the seismic hazard map.
    The large number of recordings and the values of the experienced accelerations prompted comparisons between the recorded spectra and the spectra defined in the seismic codes. Even though such comparisons could be robust only after several consecutive 50-year periods of observation, and in a probabilistic approach no single observation can validate or invalidate the hazard estimate, some comparisons between the observed ground motions and the hazard model used for the seismic code have been performed. They show that the assumptions and modeling choices made in the Italian hazard study are in line with the observations, considering different return periods, the soil conditions at the recording stations, and the uncertainties of the model. A further application of the Italian seismic hazard model is the identification of buildings and factories struck by the 2012 Emilia (Italy) earthquakes to be investigated in order to determine whether they were still safe. The law states that no safety check is needed if the construction experienced a shaking greater than 70% of the design acceleration expected at the site without abandoning elastic behavior. The ground motion values are evaluated from the available shakemaps (http://shakemap.rm.ingv.it), and the design accelerations are derived from the Building Code, which is based on the reference Italian seismic hazard model. Finally, the national seismic hazard model was one of the most debated elements during the L'Aquila trial against the seismologists, experts of the Civil Protection Department, who were sentenced to six years in prison on charges of manslaughter because, according to the judge, they underestimated the risk in the region, giving a wrong message to the people before the strong 2009 L'Aquila earthquake.

  9. Comment on "How can seismic hazard around the New Madrid seismic zone be similar to that in California?" by Arthur Frankel

    USGS Publications Warehouse

    Wang, Z.; Shi, B.; Kiefer, J.D.

    2005-01-01

    PSHA is the method used most to assess seismic hazards for input into various aspects of public and financial policy. For example, PSHA was used by the U.S. Geological Survey to develop the National Seismic Hazard Maps (Frankel et al., 1996, 2002). These maps are the basis for many national, state, and local seismic safety regulations and design standards, such as the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, the International Building Code, and the International Residential Code. Adoption and implementation of these regulations and design standards would have significant impacts on many communities in the New Madrid area, including Memphis, Tennessee and Paducah, Kentucky. Although "mitigating risks to society from earthquakes involves economic and policy issues" (Stein, 2004), seismic hazard assessment is the basis. Seismologists should provide the best information on seismic hazards and communicate them to users and policy makers. There is a lack of effort in communicating the uncertainties in seismic hazard assessment in the central U.S., however. Use of 10%, 5%, and 2% PE in 50 years causes confusion in communicating seismic hazard assessment. It would be easy to discuss and understand the design ground motions if the true meaning of the ground motion derived from PSHA were presented, i.e., the ground motion with the estimated uncertainty or the associated confidence level.
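The 10%, 5%, and 2% probability-of-exceedance figures map to mean return periods under the usual Poisson assumption, which is one way to make these often-confused numbers concrete:

```python
import math

# "P probability of exceedance in t years" to a mean return period,
# assuming Poisson occurrence: P = 1 - exp(-t/T)  =>  T = -t / ln(1 - P).

def return_period(p_exceed, t_years=50.0):
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10)))   # 475  -> the "475-year" maps
print(round(return_period(0.05)))   # 975
print(round(return_period(0.02)))   # 2475 -> basis of the 2%-in-50-yr maps
```

Stating the return period together with its uncertainty, as the comment advocates, is more informative than quoting the exceedance probability alone.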

  10. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    NASA Astrophysics Data System (ADS)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

    Improving the resolution of tomographic images is crucial to answering important questions about the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities to further improve the current state of knowledge. In recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are two main choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL was adopted mainly by AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and to generate optimized source code for both CUDA and OpenCL, running simulations on either type of hardware accelerator. We show applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usages.
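The BOAST approach can be shown in miniature: one abstract kernel body rendered to either CUDA or OpenCL source text. This is purely illustrative; BOAST's real DSL, autotuning, and code generator are far richer.

```python
# Minimal template-based kernel generation: the same abstract body is
# rendered with CUDA or OpenCL qualifiers and thread-index expressions.

QUALIFIERS = {
    "cuda": {"kernel": "__global__ void", "mem": "",
             "gid": "blockIdx.x * blockDim.x + threadIdx.x"},
    "opencl": {"kernel": "__kernel void", "mem": "__global ",
               "gid": "get_global_id(0)"},
}

TEMPLATE = """{kernel} axpy(float a, {mem}const float *x, {mem}float *y, int n) {{
    int i = {gid};
    if (i < n) y[i] += a * x[i];
}}"""

def generate(target):
    """Render the kernel template for 'cuda' or 'opencl'."""
    return TEMPLATE.format(**QUALIFIERS[target])
```

The generated strings would then be compiled by the respective runtime (nvcc/NVRTC or the OpenCL driver), keeping a single maintained kernel description for both back-ends.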

  11. 7 CFR 4274.337 - Other regulatory requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... with the seismic provisions of one of the following model building codes or the latest edition of that...) Uniform Building Code; (ii) 1993 Building Officials and Code Administrators International, Inc. (BOCA) National Building Code; or (iii) 1992 Amendments to the Southern Building Code Congress International...

  12. 7 CFR 4274.337 - Other regulatory requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... with the seismic provisions of one of the following model building codes or the latest edition of that...) Uniform Building Code; (ii) 1993 Building Officials and Code Administrators International, Inc. (BOCA) National Building Code; or (iii) 1992 Amendments to the Southern Building Code Congress International...

  13. 7 CFR 4274.337 - Other regulatory requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... with the seismic provisions of one of the following model building codes or the latest edition of that...) Uniform Building Code; (ii) 1993 Building Officials and Code Administrators International, Inc. (BOCA) National Building Code; or (iii) 1992 Amendments to the Southern Building Code Congress International...

  14. Adaptive neuro-fuzzy inference systems for semi-automatic discrimination between seismic events: a study in Tehran region

    NASA Astrophysics Data System (ADS)

    Vasheghani Farahani, Jamileh; Zare, Mehdi; Lucas, Caro

    2012-04-01

    This article presents an adaptive neuro-fuzzy inference system (ANFIS) for the classification of low-magnitude seismic events reported in Iran by the network of the Tehran Disaster Mitigation and Management Organization (TDMMO). ANFIS classifiers were used to detect seismic events using six inputs that characterize the events. Neuro-fuzzy coding was applied using the six extracted features as ANFIS inputs. Two types of events were defined: weak earthquakes and mining blasts. The data comprised 748 events (6289 signals), ranging in magnitude from 1.1 to 4.6, recorded at 13 seismic stations between 2004 and 2009; 223 earthquakes with M ≤ 2.2 are included in this database. Data sets from the south, east, and southeast of the city of Tehran were used to evaluate the best short-period seismic discriminants. Features such as the origin time of the event, source-to-station distance, latitude and longitude of the epicenter, magnitude, and spectral content (corner frequency fc of the Pg wave) were used as inputs, increasing the rate of correct classification and decreasing the confusion rate between weak earthquakes and quarry blasts. The performance of the ANFIS model was evaluated for training and classification accuracy. The results confirm that the proposed ANFIS model has good potential for discriminating seismic events.
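The Sugeno-type inference at the core of an ANFIS can be sketched as a firing-strength-weighted average of rule consequents. This is illustrative only: a trained ANFIS learns the membership and consequent parameters from data and uses all six features, not the single toy feature shown here.

```python
import math

# Zeroth-order Sugeno fuzzy inference with Gaussian memberships.

def gauss(x, c, s):
    """Gaussian membership function with center c and width s."""
    return math.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno(x, rules):
    """rules: (center, sigma, consequent) triples.  The output is the
    firing-strength-weighted average of the rule consequents."""
    w = [gauss(x, c, s) for c, s, _ in rules]
    return sum(wi * out for wi, (_, _, out) in zip(w, rules)) / sum(w)

# Two toy rules on one feature: 'blast-like' (0) vs 'earthquake-like' (1).
rules = [(0.0, 1.0, 0.0), (4.0, 1.0, 1.0)]
label = "earthquake" if sugeno(3.5, rules) > 0.5 else "blast"
```

In the adaptive (neuro-fuzzy) step, the centers, widths, and consequents are tuned by gradient descent and least squares against labeled events.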

  15. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  16. How a Country-Wide Seismological Network Can Improve Understanding of Seismicity and Seismic Hazard -- The Example of Bhutan

    NASA Astrophysics Data System (ADS)

    Hetényi, G.; Diehl, T.; Singer, J.; Kissling, E. H.; Clinton, J. F.; Wiemer, S.

    2015-12-01

    The Eastern Himalayas are home to a seemingly complex seismo-tectonic evolution. The rate of instrumental seismicity is lower than the average along the orogen, there is no record of large historical events, but both paleoseismology and GPS studies point to potentially large (M>8) earthquakes. Due to the lack of a permanent seismic monitoring system in the area, our current level of understanding is inappropriate to create a reliable quantitative seismic hazard model for the region. Existing maps are based on questionable hypotheses and show major inconsistencies when compared to each other. Here we present results on national and regional scales from a 38-station broadband seismological network we operated for almost 2 years in the Kingdom of Bhutan. A thorough, state-of-the-art analysis of local and regional earthquakes builds a comprehensive catalogue that reveals significantly (2-to-3 orders of magnitude) more events than detected from global networks. The seismotectonic analysis reveals new patterns of seismic activity as well as striking differences over relatively short distances within the Himalayas, only partly explained by surface observations such as geology. We compare a priori and a posteriori (BMC) magnitude of completeness maps and show that our network was able to detect all felt events during its operation. Some of these events could be felt at surprisingly large distances. Based on our experiment and experience, we draft the pillars on which a permanent seismological observatory for Bhutan could be constructed. Such a continuous monitoring system of seismic activity could then lead to a reliable quantitative seismic hazard model for Bhutan and surrounding regions, and serve as a base to improve building codes and general preparedness.
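Completeness and b-value analyses of the kind underlying the BMC maps mentioned above often start from the Aki/Utsu maximum-likelihood estimator. A minimal sketch (not the authors' code; the toy catalogue is invented):

```python
import math

# Aki/Utsu maximum-likelihood b-value above a completeness magnitude
# Mc, with the standard half-bin correction for binned magnitudes.

def b_value(mags, mc, dm=0.1):
    """b = log10(e) / (mean(M >= Mc) - (Mc - dm/2)) for bin width dm."""
    above = [m for m in mags if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Toy catalogue: mean magnitude above Mc = 1.0 is 1.5, so b ~ 0.79
# with 0.1-magnitude bins; events below Mc are ignored.
b = b_value([0.4, 1.0, 1.2, 1.8, 2.0], mc=1.0)
```

Mapping Mc across the network footprint, as the a priori/a posteriori comparison above does, amounts to repeating this estimate on spatially binned subsets of the catalogue.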

  17. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

Open software is essential for reproducible scientific exchange. R packages provide a platform for developing seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) include RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These cover signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.
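One of the signal-processing staples covered by such packages is envelope computation via the analytic signal (a building block of Hilbert-Huang style analysis). A minimal sketch of that operation, written here in Python rather than R purely for illustration:

```python
import numpy as np

def envelope(x):
    """Instantaneous amplitude of a real trace via the FFT-based analytic signal."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)          # spectral weights that zero out negative frequencies
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0      # Nyquist bin kept once for even-length records
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)
    return np.abs(analytic)
```

For a pure tone spanning an integer number of cycles, the envelope is flat at the tone's amplitude, which makes a quick sanity check easy.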

  18. The 2012 Ferrara seismic sequence: Regional crustal structure, earthquake sources, and seismic hazard

    NASA Astrophysics Data System (ADS)

    Malagnini, Luca; Herrmann, Robert B.; Munafò, Irene; Buttinelli, Mauro; Anselmi, Mario; Akinci, Aybige; Boschi, E.

    2012-10-01

    Inadequate seismic design codes can be dangerous, particularly when they underestimate the true hazard. In this study we use data from a sequence of moderate-sized earthquakes in northeast Italy to validate and test a regional wave propagation model which, in turn, is used to understand some weaknesses of the current design spectra. Our velocity model, while regionalized and somewhat ad hoc, is consistent with geophysical observations and the local geology. In the 0.02-0.1 Hz band, this model is validated by using it to calculate moment tensor solutions of 20 earthquakes (5.6 ≥ MW ≥ 3.2) in the 2012 Ferrara, Italy, seismic sequence. The seismic spectra observed for the relatively small main shock significantly exceeded the design spectra to be used in the area for critical structures. Observations and synthetics reveal that the ground motions are dominated by long-duration surface waves, which, apparently, the design codes do not adequately anticipate. In light of our results, the present seismic hazard assessment in the entire Pianura Padana, including the city of Milan, needs to be re-evaluated.

  19. Flexible Software Architecture for Visualization and Seismic Data Analysis

    NASA Astrophysics Data System (ADS)

    Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.

    2007-12-01

Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem with any particular seismic analysis application: tuning its commands, windows, and hot-key combinations to specific waveforms and to their familiar working environment. The ability to modify the user interface independently of the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities, such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for the development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can run the software, including laptops that can be used in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases over the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.

  20. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. 
For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
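The final step of any PSHA pipeline, combining per-rupture exceedance probabilities with forecast rates into a hazard curve, can be sketched compactly. CyberShake derives those probabilities from simulated seismograms; here, purely for illustration, each rupture is instead given a lognormal intensity distribution, and all rates and values below are invented:

```python
from math import erf, log, sqrt

# Hypothetical mini rupture forecast: annual rate, median intensity (g), and
# lognormal dispersion per rupture -- illustrative numbers, not CyberShake data.
RUPTURES = [
    {"rate": 0.05,  "median_im": 0.10, "sigma_ln": 0.6},
    {"rate": 0.01,  "median_im": 0.30, "sigma_ln": 0.6},
    {"rate": 0.002, "median_im": 0.55, "sigma_ln": 0.6},
]

def prob_exceed(x, median, sigma_ln):
    """P(IM > x) for a lognormally distributed intensity measure."""
    z = (log(x) - log(median)) / sigma_ln
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

def hazard_curve(im_levels, ruptures=RUPTURES):
    """Annual exceedance rate at each IM level: sum_i rate_i * P_i(IM > x)."""
    return [sum(r["rate"] * prob_exceed(x, r["median_im"], r["sigma_ln"])
                for r in ruptures)
            for x in im_levels]

levels = [0.05, 0.1, 0.2, 0.4, 0.8]
curve = hazard_curve(levels)
```

The curve is monotonically decreasing in intensity and bounded above by the total forecast rate, which is what a hazard map extracts per site at a fixed probability level.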

  1. Development of Maximum Considered Earthquake Ground Motion Maps

    USGS Publications Warehouse

    Leyendecker, E.V.; Hunt, R.J.; Frankel, A.D.; Rukstales, K.S.

    2000-01-01

    The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings use a design procedure that is based on spectral response acceleration rather than the traditional peak ground acceleration, peak ground velocity, or zone factors. The spectral response accelerations are obtained from maps prepared following the recommendations of the Building Seismic Safety Council's (BSSC) Seismic Design Procedures Group (SDPG). The SDPG-recommended maps, the Maximum Considered Earthquake (MCE) Ground Motion Maps, are based on the U.S. Geological Survey (USGS) probabilistic hazard maps with additional modifications incorporating deterministic ground motions in selected areas and the application of engineering judgement. The MCE ground motion maps included with the 1997 NEHRP Provisions also serve as the basis for the ground motion maps used in the seismic design portions of the 2000 International Building Code and the 2000 International Residential Code. Additionally the design maps prepared for the 1997 NEHRP Provisions, combined with selected USGS probabilistic maps, are used with the 1997 NEHRP Guidelines for the Seismic Rehabilitation of Buildings.

  2. Scaled accelerographs for design of structures in Quetta, Baluchistan, Pakistan

    NASA Astrophysics Data System (ADS)

    Bhatti, Abdul Qadir

    2016-12-01

Structural design for seismic excitation is usually based on peak values of forces and deformations over the duration of the earthquake. Determining these peak values requires dynamic analysis, either response history analysis (RHA), also called time history analysis, or response spectrum analysis (RSA), both of which depend on ground motion severity. In the past, PGA has been used to describe ground motion severity, because the seismic force on a rigid body is proportional to the ground acceleration. However, it has been pointed out that the single highest peak of an accelerogram is a very unreliable description of the accelerogram as a whole. In this study, we consider 0.2- and 1-s spectral accelerations. Seismic loading is defined in terms of a design spectrum and time histories, leading to the two methods of dynamic analysis. A design spectrum for Quetta is constructed incorporating the parameters of ASCE 7-05/IBC 2006/2009, as used in modern codes and regulations worldwide such as IBC 2006/2009, ASCE 7-05, ATC-40, FEMA-356 and others. A suite of time histories representing the design earthquake is also prepared, a helpful tool for carrying out time history dynamic analysis of structures in Quetta.
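The spectral accelerations mentioned above come from a response spectrum: the peak response of damped single-degree-of-freedom oscillators over a range of periods. A teaching sketch of that computation (not the code used in the study), assuming the pseudo-acceleration definition and explicit central-difference integration:

```python
import numpy as np

def response_spectrum(acc, dt, periods, zeta=0.05):
    """Pseudo-acceleration response spectrum of a ground acceleration record,
    via explicit central-difference integration of u'' + 2*zeta*w*u' + w^2*u = -ag.
    A teaching sketch; requires dt well below the shortest period."""
    sa = []
    for T in periods:
        w = 2.0 * np.pi / T
        lhs = 1.0 / dt**2 + zeta * w / dt          # coefficient of u_next
        u_prev, u, peak = 0.0, 0.0, 0.0
        for ag in acc:
            u_next = (-ag
                      + (2.0 / dt**2 - w**2) * u
                      - (1.0 / dt**2 - zeta * w / dt) * u_prev) / lhs
            peak = max(peak, abs(u_next))
            u_prev, u = u, u_next
        sa.append(w**2 * peak)                     # pseudo-acceleration w^2 * u_max
    return np.array(sa)
```

At resonance with a 5%-damped oscillator, a long harmonic input is amplified by roughly 1/(2*zeta) = 10, a convenient sanity check on the integrator.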

  3. Effectiveness of damped braces to mitigate seismic torsional response of unsymmetric-plan buildings

    NASA Astrophysics Data System (ADS)

    Mazza, Fabio; Pedace, Emilia; Favero, Francesco Del

    2017-02-01

    The seismic retrofitting of unsymmetric-plan reinforced concrete (r.c.) framed buildings can be carried out by the incorporation of damped braces (DBs). Yet most of the proposals to mitigate the seismic response of asymmetric framed buildings by DBs rest on the hypothesis of elastic (linear) structural response. The aim of the present work is to evaluate the effectiveness and reliability of a Displacement-Based Design procedure of hysteretic damped braces (HYDBs) based on the nonlinear behavior of the frame members, which adopts the extended N2 method considered by Eurocode 8 to evaluate the higher mode torsional effects. The Town Hall of Spilinga (Italy), a framed structure with an L-shaped plan built at the beginning of the 1960s, is supposed to be retrofitted with HYDBs to attain performance levels imposed by the Italian seismic code (NTC08) in a high-risk zone. Ten structural solutions are compared by considering two in-plan distributions of the HYDBs, to eliminate (elastic) torsional effects, and different design values of the frame ductility combined with a constant design value of the damper ductility. A computer code for the nonlinear dynamic analysis of r.c. spatial framed structures is adopted to evaluate the critical incident angle of bidirectional earthquakes. Beams and columns are simulated with a lumped plasticity model, including flat surface modeling of the axial load-biaxial bending moment elastic domain at the end sections, while a bilinear law is used to idealize the behavior of the HYDBs. Damage index domains are adopted to estimate the directions of least seismic capacity, considering artificial earthquakes whose response spectra match those adopted by NTC08 at serviceability and ultimate limit states.

  4. Extreme magnitude earthquakes and their economical impact: The Mexico City case

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Mario, C.

    2005-12-01

The consequences (estimated by the human and economic losses) of the recent occurrence (worldwide) of extreme magnitude (for the region under consideration) earthquakes, such as the 19 09 1985 event in Mexico (Richter magnitude Ms 8.1, moment magnitude Mw 8.01) or the 26 12 2004 event in Indonesia (Ms 9.4, Mw 9.3), stress the importance of performing seismic hazard analyses that specifically incorporate this possibility. Herewith, we present and apply a methodology, based on plausible extreme seismic scenarios and the computation of their associated synthetic accelerograms, to estimate the seismic hazard on Mexico City (MC) stiff and compressible surficial soils. The uncertainties about the characteristics of the potential finite seismic sources, as well as those related to the dynamic properties of MC compressible soils, are taken into account. The economic consequences (i.e., the seismic risk = seismic hazard x economic cost) implicit in the seismic coefficients proposed in MC seismic codes before (1976) and after the 1985 earthquake (2004) are analyzed. Based on the latter and on an acceptable risk criterion, a maximum seismic coefficient (MSC) of 1.4g (g = 9.81 m/s2) of the elastic acceleration design spectra (5 percent damping), which has a probability of exceedance of 2.4 x 10^-4, seems appropriate for analyzing the seismic behavior of infrastructure located on MC compressible soils, if extreme Mw 8.5 subduction thrust mechanism earthquakes (similar to the one that occurred on 19 09 1985, with an observed equivalent MSC of 1g) occur in the next 50 years.
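If the quoted exceedance probability of 2.4 x 10^-4 is read as an annual rate (an interpretation added here for illustration; the abstract does not state it), the chance of at least one exceedance in the 50-year window follows from the standard Poisson occurrence model:

```python
from math import exp

def prob_in_window(annual_rate, years=50.0):
    """Poisson probability of at least one exceedance in a `years`-long window."""
    return 1.0 - exp(-annual_rate * years)

p50 = prob_in_window(2.4e-4)   # 50-year exceedance probability, about 1.2%
```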

  5. Mechanical Design and Analysis of LCLS II 2 K Cold Box

    NASA Astrophysics Data System (ADS)

    Yang, S.; Dixon, K.; Laverdure, N.; Rath, D.; Bevins, M.; Bai, H.; Kaminski, S.; Ravindranath, V.

    2017-12-01

    The mechanical design and analysis of the LCLS II 2 K cold box are presented. Its feature and functionality are discussed. ASME B31.3 was used to design its internal piping, and compliance of the piping code was ensured through flexibility analysis. The 2 K cold box was analyzed using ANSYS 17.2; the requirements of the applicable codes—ASME Section VIII Division 2 and ASCE 7-10—were satisfied. Seismic load was explicitly considered in both analyses.

  6. Structural and seismic analyses of waste facility reinforced concrete storage vaults

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C.Y.

    1995-07-01

Facility 317 of Argonne National Laboratory consists of several reinforced concrete waste storage vaults designed and constructed in the late 1940s through the early 1960s. In this paper, structural analyses of these concrete vaults subjected to various natural hazards are described, emphasizing the northwest shallow vault. The natural phenomenon hazards considered include both earthquakes and tornados. Because these vaults are deeply embedded in the soil, the SASSI (System Analysis of Soil-Structure Interaction) code was utilized for the seismic calculations. The ultimate strength method was used to analyze the reinforced concrete structures. In all studies, moment and shear strengths at critical locations of the storage vaults were evaluated. Results of the structural analyses show that almost all the waste storage vaults meet the code requirements according to ACI 349-85. These vaults also satisfy the performance goal such that confinement of hazardous materials is maintained and functioning of the facility is not interrupted.

  7. Application of a moment tensor inversion code developed for mining-induced seismicity to fracture monitoring of civil engineering materials

    NASA Astrophysics Data System (ADS)

    Linzer, Lindsay; Mhamdi, Lassaad; Schumacher, Thomas

    2015-01-01

    A moment tensor inversion (MTI) code originally developed to compute source mechanisms from mining-induced seismicity data is now being used in the laboratory in a civil engineering research environment. Quantitative seismology methods designed for geological environments are being tested with the aim of developing techniques to assess and monitor fracture processes in structural concrete members such as bridge girders. In this paper, we highlight aspects of the MTI_Toolbox programme that make it applicable to performing inversions on acoustic emission (AE) data recorded by networks of uniaxial sensors. The influence of the configuration of a seismic network on the conditioning of the least-squares system and subsequent moment tensor results for a real, 3-D network are compared to a hypothetical 2-D version of the same network. This comparative analysis is undertaken for different cases: for networks consisting entirely of triaxial or uniaxial sensors; for both P and S-waves, and for P-waves only. The aim is to guide the optimal design of sensor configurations where only uniaxial sensors can be installed. Finally, the findings of recent laboratory experiments where the MTI_Toolbox has been applied to a concrete beam test are presented and discussed.

  8. Seismic fragility analysis of typical pre-1990 bridges due to near- and far-field ground motions

    NASA Astrophysics Data System (ADS)

    Mosleh, Araliya; Razzaghi, Mehran S.; Jara, José; Varum, Humberto

    2016-03-01

Bridge damage during past earthquakes has caused severe physical and economic impacts on transportation systems. Many of the existing bridges in earthquake-prone areas are pre-1990 bridges designed to out-of-date regulatory codes. The strong motions occurring every year in different parts of the world demonstrate the vulnerability of these structures. Nonlinear dynamic time history analyses were conducted to assess the seismic vulnerability of typical pre-1990 bridges. A family of existing concrete bridges representative of the most common bridges in the highway system in Iran is studied. The seismic demand consists of a set of far-field and near-field strong motions used to evaluate the likelihood of exceeding the seismic capacity of these bridges. The peak ground accelerations (PGAs) were scaled and applied incrementally to the 3D models to evaluate the seismic performance of the bridges. The superstructure was assumed to remain elastic, and the nonlinear behavior in piers was modeled by assigning plastic hinges in the columns. In this study, the displacement ductility and the PGA are selected as the seismic performance indicator and intensity measure, respectively. The results show that pre-1990 bridges subjected to near-fault ground motions reach minor and moderate damage states.
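The incremental-scaling procedure lends itself to a simple fragility estimate: record, for each ground motion, the PGA at which a damage-state ductility limit is first exceeded, then fit a lognormal fragility curve to those values. A sketch under those assumptions (the numbers below are invented, and this is one common estimator, not necessarily the study's):

```python
import numpy as np
from math import erf, log, sqrt

def fragility_from_ida(pga_at_damage):
    """Fit a lognormal fragility curve from the PGA levels (g) at which each
    incrementally scaled record first exceeded a damage-state ductility limit."""
    ln = np.log(np.asarray(pga_at_damage, dtype=float))
    median, beta = float(np.exp(ln.mean())), float(ln.std())

    def p_damage(pga):
        """Probability of reaching the damage state at a given PGA (g)."""
        return 0.5 * (1.0 + erf(log(pga / median) / (beta * sqrt(2.0))))

    return median, beta, p_damage

# Invented example: PGAs (g) at first exceedance for five scaled records.
median, beta, p = fragility_from_ida([0.18, 0.25, 0.31, 0.40, 0.52])
```

By construction the fitted curve passes through 50% at the geometric-mean PGA.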

  9. Experimental investigation of damage behavior of RC frame members including non-seismically designed columns

    NASA Astrophysics Data System (ADS)

    Chen, Linzhi; Lu, Xilin; Jiang, Huanjun; Zheng, Jianbo

    2009-06-01

Reinforced concrete (RC) frame structures are one of the most commonly used structural systems, and their seismic performance is largely determined by the performance of columns and beams. This paper describes horizontal cyclic loading tests of ten column and three beam specimens, some designed according to the current seismic design code and others according to the early non-seismic Chinese design code, aiming to characterize the behavior of the damaged or collapsed RC frame structures observed during the Wenchuan earthquake. The effects of axial load ratio, shear span ratio, and transverse and longitudinal reinforcement ratio on hysteresis behavior, ductility and damage progression were incorporated in the experimental study. Test results indicate that the non-seismically designed columns show premature shear failure, and exhibit larger maximum residual crack widths and more concrete spalling than the seismically designed columns. In addition, the longitudinal steel reinforcement bars were severely buckled. The axial load ratio and shear span ratio proved to be the most important factors affecting ductility, crack opening width and closing ability, while the longitudinal reinforcement ratio had only a minor effect on column ductility but more influence on beam ductility. Finally, the transverse reinforcement ratio did not influence the maximum residual crack width or closing ability of the seismically designed columns.

  10. U.S. Seismic Design Maps Web Application

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Fee, J.

    2015-12-01

The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions. It is the primary method by which design engineers across the country obtain ground motion parameters for multiple building codes when designing new buildings and other structures. Users specify the design code of interest, location, and other parameters to obtain the necessary ground motion information, consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted so that they can be directly included in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) is available that allows other developers to integrate this application's results into larger applications for additional processing. Development on the application has proceeded in an iterative manner, working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process; it is now used to produce over 1800 reports daily. Recent efforts have made the application a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published to the open-source community on GitHub. Open-sourcing the code facilitates the incorporation of user feedback and new features, ensuring the application's continued success.

  11. Enhancement of long period components of recorded and synthetic ground motions using InSAR

    USGS Publications Warehouse

    Abell, J.A.; Carlos de la Llera, J.; Wicks, C.W.

    2011-01-01

Tall buildings and flexible structures require a better characterization of long period ground motion spectra than the one provided by current seismic building codes. Motivated by that, a methodology is proposed and tested to improve recorded and synthetic ground motions which are consistent with the observed co-seismic displacement field obtained from interferometric synthetic aperture radar (InSAR) analysis of image data for the Tocopilla 2007 earthquake (Mw=7.7) in Northern Chile. A methodology is proposed to correct the observed motions such that, after double integration, they are coherent with the local value of the residual displacement. Synthetic records are generated by using a stochastic finite-fault model coupled with a long period pulse to capture the long period fling effect. It is observed that the proposed co-seismic correction yields records with more accurate long-period spectral components as compared with regular correction schemes such as acausal filtering. These signals provide an estimate for the velocity and displacement spectra, which are essential for tall-building design. Furthermore, hints are provided as to the shape of long-period spectra for seismic zones prone to large co-seismic displacements such as the Nazca-South American zone. © 2011 Elsevier Ltd.
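The core requirement, that the doubly integrated accelerogram end at the InSAR residual displacement, can be illustrated with a deliberately simple baseline scheme: add a constant acceleration offset over the record's tail, sized so the final displacement matches the target. The paper's actual correction and pulse model are more elaborate; this is a sketch of the constraint only, with `t1_frac` an invented parameter:

```python
import numpy as np

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral with the first sample anchored at zero."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1])) * dt
    return out

def coseismic_correct(acc, dt, d_residual, t1_frac=0.5):
    """Add a constant acceleration offset c over [t1, T] so the doubly
    integrated record ends at the InSAR residual displacement d_residual.
    Adding c from t1 onward shifts the final displacement by c*(T-t1)^2/2."""
    t_total = dt * (len(acc) - 1)
    disp_end = cumtrapz(cumtrapz(acc, dt), dt)[-1]
    t1 = t1_frac * t_total
    c = 2.0 * (d_residual - disp_end) / (t_total - t1) ** 2
    corrected = acc.copy()
    corrected[int(t1 / dt):] += c
    return corrected
```

Double-integrating the corrected record then lands on the residual displacement to within discretization error.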

  12. Lateral testing of glued laminated timber tudor arch

    Treesearch

    Douglas R. Rammer; Philip Line

    2016-01-01

    Glued laminated timber Tudor arches have been in wide use in the United States since the 1930s, but detailed knowledge related to seismic design in modern U.S. building codes is lacking. FEMA P-695 (P-695) is a methodology to determine seismic performance factors for a seismic force resisting system. A limited P-695 study for glued laminated timber arch structures...

  13. Seismic performance evaluation of an historical concrete deck arch bridge using survey and drawing of the damages, in situ tests, dynamic identification and pushover analysis

    NASA Astrophysics Data System (ADS)

    Bergamo, Otello; Russo, Eleonora; Lodolo, Fabio

    2017-07-01

The paper describes the performance evaluation of a retrofitted historical multi-span reinforced concrete (RC) deck arch bridge analyzed with in situ tests, dynamic identification and FEM analysis. The peculiarity of this case study lies in the structural typology of the "San Felice" bridge, a historical concrete arch bridge built in the early 20th century, a rather uncommon feature in Italy. The preservation and retrofit of historic cultural heritage and infrastructure has been carefully addressed in the international codes governing seismic response. A complete survey of the bridge was carried out prior to sketching a drawing of the existing bridge. Subsequently, the study consists of four steps: material investigation and dynamic vibration tests, FEM analysis and calibration, retrofit assessment, and pushover analysis. The aim is to define an innovative approach to calibrating the FEM analysis through modern experimental investigations capable of taking structural deterioration into account, and to offer an appropriate and cost-effective retrofitting strategy.

  14. The Modularized Software Package ASKI - Full Waveform Inversion Based on Waveform Sensitivity Kernels Utilizing External Seismic Wave Propagation Codes

    NASA Astrophysics Data System (ADS)

    Schumacher, F.; Friederich, W.

    2015-12-01

We present the modularized software package ASKI, a flexible and extendable toolbox for seismic full waveform inversion (FWI) as well as sensitivity or resolution analysis operating on the sensitivity matrix. It utilizes established wave propagation codes for solving the forward problem and offers an alternative to the monolithic, inflexible and hard-to-modify codes that have typically been written for solving inverse problems. It is available under the GPL at www.rub.de/aski. The Gauss-Newton FWI method for 3D-heterogeneous elastic earth models is based on waveform sensitivity kernels and can be applied to inverse problems at various spatial scales in both Cartesian and spherical geometries. The kernels are derived in the frequency domain from Born scattering theory as the Fréchet derivatives of linearized full waveform data functionals, quantifying the influence of elastic earth model parameters on particular waveform data values. As an important innovation, we keep two independent spatial descriptions of the earth model: one for solving the forward problem and one representing the inverted model updates. Thereby we account for the independent needs of spatial model resolution of the forward and inverse problems, respectively. Due to pre-integration of the kernels over the (in general much coarser) inversion grid, storage requirements for the sensitivity kernels are dramatically reduced. ASKI can be flexibly extended to other forward codes by providing it with specific interface routines that contain knowledge about forward-code-specific file formats and auxiliary information provided by the new forward code. To sustain flexibility, the ASKI tools communicate via file output/input, so large storage capacities need to be conveniently accessible. 
Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion.
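With the sensitivity matrix in hand, one Gauss-Newton model update reduces to a damped normal-equations solve. A generic sketch of that step (not ASKI's implementation; the damping form is a common regularization choice):

```python
import numpy as np

def gauss_newton_update(G, residuals, damping=1e-2):
    """One damped Gauss-Newton model update from a sensitivity (Frechet)
    matrix G: solve (G^T G + lambda * I) dm = G^T dd for the update dm."""
    n = G.shape[1]
    lhs = G.T @ G + damping * np.eye(n)   # damped normal equations
    return np.linalg.solve(lhs, G.T @ residuals)
```

Because the matrix is stored explicitly, the same object supports resolution analysis (e.g., inspecting G^T G) without re-running the forward solver.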

  15. Estimation of empirical site amplification factors in Taiwan

    NASA Astrophysics Data System (ADS)

    Chung, Chi-Hsuan; Wen, Kuo-Liang; Kuo, Chun-Hsiang

    2017-04-01

Much infrastructure has been built in Taiwan's metropolises in recent years, increasing population density and urbanization in those areas. Taiwan is located at a plate boundary where high seismicity is driven by active tectonics. The 1999 Chi-Chi earthquake (Mw 7.6) caused more than 2000 fatalities, and the 2016 Meinong earthquake (Mw 6.5) caused 117 fatalities in Tainan City as well as damage to hundreds of buildings. These cases illustrate the seismic vulnerability of urban areas. Among improvements to seismic hazard analysis and seismic design, accounting for seismic site amplification under different site conditions is an important issue. This study used selected and processed strong motion records observed by the TSMIP network. The site conditions, characterized by Vs30, were investigated at most stations (Kuo et al. 2012; Kuo et al. 2016). Since strong motion records and site conditions are both available, we are able to use the data to analyze site amplification of seismic waves at different periods. The result may serve as a reference for future revisions of seismic design codes to reduce potential seismic hazards and losses. We adopted the strong motion and site database of the SSHAC (Senior Seismic Hazard Analysis Committee) Level 3 project in Taiwan and selected significant crustal and subduction events of magnitude larger than six for analysis. The amplification factors of PGA, PGV, PGD, and spectral acceleration at 0.3, 1.0, and 3.0 seconds were evaluated using the processed strong motions. Following the recommendation of the SSHAC Level 3 project, the site condition Vs30 = 760 m/s is taken as the reference rock site in this study; in practice, stations with Vs30 between 600 m/s and 900 m/s were used as reference rock sites. 
For each event, we find a reference rock site and other sites within a certain distance (region dependent) to calculate site amplification of ground motions. Relationships between site amplification factors and Vs30 are then derived for strong motions by regression analysis. Soil nonlinearity (a decrease in amplification) has to be considered at soft soil sites during strong shaking. We also discuss amplification factors in terms of different intensities where data are available.
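A regression of amplification against Vs30 is commonly cast as log-linear in the velocity ratio to the reference site. A minimal sketch of that fit, assuming this functional form (the study's actual regression model is not specified in the abstract):

```python
import numpy as np

def fit_site_amplification(vs30, amp, vref=760.0):
    """Log-linear regression ln(amp) = a + b * ln(Vs30 / Vref); returns (a, b).
    b < 0 means softer sites (lower Vs30) amplify ground motion more."""
    x = np.log(np.asarray(vs30, dtype=float) / vref)
    y = np.log(np.asarray(amp, dtype=float))
    b, a = np.polyfit(x, y, 1)   # slope, intercept in log space
    return a, b
```

On synthetic data generated with a known exponent, the fit recovers that exponent exactly, which makes the convention (slope on the velocity ratio) easy to verify.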

  16. Swept Impact Seismic Technique (SIST)

    USGS Publications Warehouse

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time, providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with far fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data, especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
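The sweep-and-decode idea can be demonstrated end-to-end on synthetic data: fire impacts at a linearly increasing rate, record the superposed noisy responses, then recover the reflection by shifting the record to each impact time and stacking. A toy sketch (all parameters invented; a real survey decodes recorded impact times, not computed ones):

```python
import numpy as np

fs = 1000                      # sampling rate (Hz)
n = 4 * fs                     # 4 s of impact firing
t = np.arange(n) / fs

# Impact times from a linearly increasing impact rate r(t) = r0 + k*t:
# an impact fires whenever the integrated rate crosses an integer.
r0, k = 5.0, 10.0              # invented sweep: 5 -> 45 impacts/s over 4 s
phase = r0 * t + 0.5 * k * t ** 2
impact_idx = np.searchsorted(phase, np.arange(1, int(phase[-1]) + 1))

# Toy earth response: a single reflection arriving 0.25 s after each impact.
win = fs // 2                  # 0.5 s decode window
delay = fs // 4
record = np.zeros(n + win)
record[impact_idx + delay] += 1.0
record += 0.3 * np.random.default_rng(0).standard_normal(record.size)

# Shift-and-stack decoding: align the record on each impact time and average.
# Incoherent cross terms and noise average down; the reflection stacks up.
stack = np.mean([record[i:i + win] for i in impact_idx], axis=0)
```

The stacked trace peaks at the reflection delay, while the sweeping (non-repeating) impact intervals keep correlation noise from stacking coherently at any other lag.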

  17. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology-especially to those aspects with great impact on public policy-statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal of enhancing the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized around six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  18. Subband Coding Methods for Seismic Data Compression

    NASA Technical Reports Server (NTRS)

    Kiely, A.; Pollara, F.

    1995-01-01

    This paper presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The compression technique described could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
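    The progressive-transmission idea can be illustrated with a minimal one-level filter bank. This sketch uses a Haar analysis/synthesis pair, not the filters from the paper: the coarse subband alone yields a low-resolution waveform for a first look, and transmitting the detail subband later refines it to exact reconstruction.

```python
import numpy as np

def haar_analysis(x):
    """Split an even-length signal into coarse and detail subbands."""
    x = np.asarray(x, dtype=float)
    low = (x[0::2] + x[1::2]) / np.sqrt(2.0)    # coarse approximation
    high = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail / refinement
    return low, high

def haar_synthesis(low, high):
    """Rebuild the signal; pass zeros for `high` to get the coarse view."""
    x = np.empty(2 * len(low))
    x[0::2] = (low + high) / np.sqrt(2.0)
    x[1::2] = (low - high) / np.sqrt(2.0)
    return x

trace = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0])
low, high = haar_analysis(trace)
coarse = haar_synthesis(low, np.zeros_like(high))  # first, cheap view
exact = haar_synthesis(low, high)                  # after refinement
```

    A seismologist could inspect `coarse` with minimal channel usage and request `high` only where a waveform merits closer study.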

  19. Risk Informed Design and Analysis Criteria for Nuclear Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salmon, Michael W.

    2015-06-17

    Target performance can be achieved by defining the design basis ground motion from the results of a probabilistic seismic hazard assessment and introducing known levels of conservatism in the design above the DBE. ASCE 4, ASCE 43, and DOE-STD-1020 define the DBE at an annual exceedance probability of 4×10⁻⁴ and introduce only slight levels of conservatism in response. These standards assume that code capacities target about a 98% non-exceedance probability (NEP). There is a need for a uniform target (98% NEP) for code developers (ACI, AISC, etc.) to aim for. In considering strengthening options, one must also weigh cost against the risk reduction achieved.
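    As a quick sanity check on the hazard level cited above, an annual exceedance probability of 4×10⁻⁴ corresponds to a mean return period of 2,500 years:

```python
annual_p = 4e-4                   # annual exceedance probability of the DBE
return_period = 1.0 / annual_p    # mean return period in years
print(return_period)              # 2500.0
```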

  20. Seismic Vulnerability and Performance Level of confined brick walls

    NASA Astrophysics Data System (ADS)

    Ghalehnovi, M.; Rahdar, H. A.

    2008-07-01

    Interest among engineers and designers in displacement- and behavior-based design methods (performance-based design) has grown, given the importance of designing structures to resist dynamic loads such as earthquakes and the difficulty of predicting the nonlinear behavior of elements arising from the nonlinear properties of construction materials. Because masonry materials are economical, easy to work with, and widely available, masonry construction has increased enormously in villages, towns and cities. On the other hand, since Iran lies on the Alpide earthquake belt, the seismic behavior and vulnerability of these kinds of structures must be studied. Environmental, economic, social and cultural factors, together with the available construction materials, have produced many different structural types. In this study, confined brick walls were modeled in software and subjected to dynamic analysis using accelerograms appropriate to the local geological conditions, in order to investigate their seismic vulnerability and performance level. The results of this analysis appear satisfactory when compared with the values in ATC-40, the FEMA documents and Standard 2800 of Iran.

  1. Societal and observational problems in earthquake risk assessments and their delivery to those most at risk

    NASA Astrophysics Data System (ADS)

    Bilham, Roger

    2013-01-01

    Losses from earthquakes continue to rise despite increasingly sophisticated methods to estimate seismic risk throughout the world. This article discusses five specific reasons why this should be. Loss of life is most pronounced in the developing nations where three factors - poverty, corruption and ignorance - conspire to reduce the effective application of seismic resistant codes. A fourth reason is that in many developing nations the application of seismic resistant construction is inadvertently restricted to wealthy, or civil segments of the community, and is either unobtainable or irrelevant to the most vulnerable segment of the public — the owner/occupiers of substandard dwellings. A fifth flaw in current seismic hazard studies is that sophisticated methodologies to evaluate risk are inappropriate in regions where strain rates are low, and where historical data are short compared to the return time of damaging earthquakes. The scientific community has remained largely unaware of the importance of these impediments to the development and application of appropriate seismic resistant code, and is ill-equipped to address them.

  2. Incorporation of Dynamic SSI Effects in the Design Response Spectra

    NASA Astrophysics Data System (ADS)

    Manjula, N. K.; Pillai, T. M. Madhavan; Nagarajan, Praveen; Reshma, K. K.

    2018-05-01

    Many studies in the past on dynamic soil-structure interaction have revealed both the detrimental and the advantageous effects of soil flexibility. Based on such studies, the design response spectra of international seismic codes are being improved worldwide. The improvements required for the short-period range of the design response spectrum in the Indian seismic code (IS 1893:2002) are presented in this paper. As the recent code revision has not incorporated short-period amplification, the proposals given in this paper apply equally to the latest code (IS 1893:2016). Analyses of single-degree-of-freedom systems are performed to predict the required improvements. The proposed modifications to the constant-acceleration portion of the spectra are evaluated with respect to the current design spectra in Eurocode 8.
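    For context, the Eurocode 8 horizontal elastic response spectrum referenced above can be sketched as below; its TB-TC plateau is the constant-acceleration branch under discussion. The parameter values in the example (ag, S, TB, TC, TD) are placeholders, not the paper's:

```python
def ec8_elastic_sa(T, ag, S, TB, TC, TD, eta=1.0):
    """Eurocode 8 Type 1 horizontal elastic spectrum (sketch).
    T in s, ag = design ground acceleration, S = soil factor,
    eta = damping correction (1.0 for 5% damping)."""
    if T <= TB:                       # rising branch
        return ag * S * (1.0 + T / TB * (eta * 2.5 - 1.0))
    if T <= TC:                       # constant-acceleration plateau
        return ag * S * eta * 2.5
    if T <= TD:                       # constant-velocity branch
        return ag * S * eta * 2.5 * TC / T
    return ag * S * eta * 2.5 * TC * TD / T**2  # constant-displacement

# Placeholder parameters for illustration only:
sa_zero = ec8_elastic_sa(0.0, 0.25, 1.2, 0.15, 0.5, 2.0)   # = ag * S
sa_plateau = ec8_elastic_sa(0.3, 0.25, 1.2, 0.15, 0.5, 2.0)  # 2.5 * ag * S
```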

  3. Recent Impacts on Mars: Cluster Properties and Seismic Signal Predictions

    NASA Astrophysics Data System (ADS)

    Daubar, Ingrid Justine; Schmerr, Nicholas; Banks, Maria; Marusiak, Angela; Golombek, Matthew P.

    2016-10-01

    Impacts are a key source of seismic waves that are a primary constraint on the formation, evolution, and dynamics of planetary objects. Geophysical missions such as InSight (Banerdt et al., 2013) will monitor seismic signals from internal and external sources. New martian craters have been identified in orbital images (Malin et al., 2006; Daubar et al., 2013). Seismically detecting such impacts and subsequently imaging the resulting craters will provide extremely accurate epicenters and source crater sizes, enabling calibration of seismic velocities, the efficiency of impact-seismic coupling, and retrieval of detailed regional and local internal structure. To investigate recent impact-induced seismicity on Mars, we have assessed ~100 new, dated impact sites. In approximately half of new impacts, the bolide partially disintegrates in the atmosphere, forming multiple craters in a cluster. We incorporate the resulting, more complex, seismic effects in our model. To characterize the variation between sites, we focus on clustered impacts. We report statistics of craters within clusters: diameters, morphometry indicating subsurface layering, strewn-field azimuths indicating impact direction, and dispersion within clusters indicating combined effects of bolide strength and elevation of breakup. Measured parameters are converted to seismic predictions for impact sources using a scaling law relating crater diameter to the momentum and source duration, calibrated for impacts recorded by Apollo (Lognonne et al., 2009). We use plausible ranges for target properties, bolide densities, and impact velocities to bound the seismic moment. The expected seismic sources are modeled in the near field using a 3-D wave propagation code (Petersson et al., 2010) and in the far field using a 1-D wave propagation code (Friederich et al., 1995), for a martian seismic model.
Thus we calculate the amplitudes of seismic phases at varying distances, which can be used to evaluate the detectability of body and surface wave phases created by different sizes and types of impacts all over Mars.

  4. EMERALD: A Flexible Framework for Managing Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2010-12-01

    The seismological community is challenged by the vast quantity of new broadband seismic data provided by large-scale seismic arrays such as EarthScope’s USArray. While this bonanza of new data enables transformative scientific studies of the Earth’s interior, it also illuminates limitations in the methods used to prepare and preprocess those data. At a recent seismic data processing focus group workshop, many participants expressed the need for better systems to minimize the time and tedium spent on data preparation in order to increase the efficiency of scientific research. Another challenge related to data from all large-scale transportable seismic experiments is that there currently exists no system for discovering and tracking changes in station metadata. This critical information, such as station location, sensor orientation, instrument response, and clock timing data, may change over the life of an experiment and/or be subject to post-experiment correction. Yet nearly all researchers utilize metadata acquired with the downloaded data, even though subsequent metadata updates might alter or invalidate results produced with older metadata. A third long-standing issue for the seismic community is the lack of easily exchangeable seismic processing codes. This problem stems directly from the storage of seismic data as individual time series files, and the history of each researcher developing his or her preferred data file naming convention and directory organization. Because most processing codes rely on the underlying data organization structure, such codes are not easily exchanged between investigators. To address these issues, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The goal of the EMERALD project is to provide seismic researchers with a unified, user-friendly, extensible system for managing seismic event data, thereby increasing the efficiency of scientific enquiry. 
EMERALD stores seismic data and metadata in a state-of-the-art open source relational database (PostgreSQL), and can, on a timed basis or on demand, download the most recent metadata, compare it with previously acquired values, and alert the user to changes. The backend relational database is capable of easily storing and managing many millions of records. The extensible, plug-in architecture of the EMERALD system allows any researcher to contribute new visualization and processing methods written in any of 12 programming languages, and a central Internet-enabled repository for such methods provides users with the opportunity to download, use, and modify new processing methods on demand. EMERALD includes data acquisition tools allowing direct importation of seismic data, and also imports data from a number of existing seismic file formats. Pre-processed clean sets of data can be exported as standard sac files with user-defined file naming and directory organization, for use with existing processing codes. The EMERALD system incorporates existing acquisition and processing tools, including SOD, TauP, GMT, and FISSURES/DHI, making much of the functionality of those tools available in a unified system with a user-friendly web browser interface. EMERALD is now in beta test. See emerald.asu.edu or contact john.d.west@asu.edu for more details.
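    The metadata change-detection workflow described above can be sketched as a simple comparison of previously stored station metadata against freshly downloaded values. The station codes and field names here are illustrative, not the EMERALD schema:

```python
def metadata_changes(stored, fresh):
    """Report per-station metadata fields whose freshly downloaded
    values differ from the previously stored ones, so the user can
    be alerted that older results may need rechecking."""
    changes = {}
    for station, new_fields in fresh.items():
        old_fields = stored.get(station, {})
        diff = {k: (old_fields.get(k), v)
                for k, v in new_fields.items()
                if old_fields.get(k) != v}
        if diff:
            changes[station] = diff
    return changes

# Hypothetical example: a sensor orientation was corrected post-experiment.
stored = {"TA.X01A": {"azimuth": 0.0, "latitude": 34.1}}
fresh = {"TA.X01A": {"azimuth": 3.5, "latitude": 34.1}}
changes = metadata_changes(stored, fresh)
# changes == {'TA.X01A': {'azimuth': (0.0, 3.5)}}
```

    In EMERALD this comparison would run against the PostgreSQL backend on a timed basis or on demand; the sketch shows only the diff logic.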

  5. Code for Calculating Regional Seismic Travel Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballard, Sanford; Hipp, James; Barker, Glenn

    The RSTT software computes predictions of the travel time of seismic energy traveling from a source to a receiver through 2.5D models of the seismic velocity distribution within the Earth. The two primary applications for the RSTT library are tomographic inversion studies and seismic event location calculations. In tomographic inversion studies, a seismologist begins with a number of source-receiver travel time observations and an initial starting model of the velocity distribution within the Earth. A forward travel time calculator, such as the RSTT library, is used to compute predictions of each observed travel time, and all of the residuals (observed minus predicted travel time) are calculated. The Earth model is then modified in some systematic way with the goal of minimizing the residuals. The Earth model obtained in this way is assumed to be a better model than the starting model if it has lower residuals. The other major application for the RSTT library is seismic event location. Given an Earth model, an initial estimate of the location of a seismic event, and some number of observations of seismic travel time thought to have originated from that event, location codes systematically modify the estimate of the location of the event with the goal of minimizing the difference between the observed and predicted travel times. The second application, seismic event location, is routinely implemented by the military as part of its effort to monitor the Earth for nuclear tests conducted by foreign countries.
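    The predict-subtract-adjust loop of the tomographic application can be sketched with a stub forward calculator. The straight-ray, uniform-velocity model below is a stand-in for the RSTT 2.5-D calculator, and the grid-search update is purely illustrative:

```python
import math

def straight_ray(model, src, rcv):
    """Stub forward calculator: straight-ray travel time through a
    uniform-velocity model (the real RSTT library traces rays
    through 2.5-D Earth models)."""
    return math.dist(src, rcv) / model["velocity"]

def residuals(observed, predict, model, pairs):
    """Observed-minus-predicted travel times for all source-receiver pairs."""
    return [t_obs - predict(model, s, r)
            for (s, r), t_obs in zip(pairs, observed)]

# One inversion step: scan candidate velocities and keep the one
# minimizing the summed squared residual.
pairs = [((0.0, 0.0), (30.0, 40.0)), ((0.0, 0.0), (60.0, 80.0))]
observed = [50.0 / 8.0, 100.0 / 8.0]   # synthetic data, true velocity 8 km/s
best = min([6.0, 7.0, 8.0, 9.0],
           key=lambda v: sum(r * r for r in residuals(
               observed, straight_ray, {"velocity": v}, pairs)))
```

    Real tomography updates a spatially varying model via linearized inversion rather than a scan; only the residual bookkeeping carries over.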

  6. Seismic hazard map of North and Central America and the Caribbean

    USGS Publications Warehouse

    Shedlock, K.M.

    1999-01-01

    Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local government, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of North and Central America and the Caribbean is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful regional seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of North and Central America and the Caribbean depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one-to-two story buildings). The highest seismic hazard values in the region generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes.
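    The "10% chance of exceedance in 50 years" criterion maps, under the usual Poisson assumption, to a mean return period of about 475 years:

```python
import math

p, t = 0.10, 50.0                  # exceedance probability, time window (years)
T_return = -t / math.log(1.0 - p)  # Poisson occurrence: p = 1 - exp(-t/T)
print(round(T_return))             # 475
```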

  7. Non-Double-Couple Component Analysis of Induced Microearthquakes in the Val D'Agri Basin (Italy)

    NASA Astrophysics Data System (ADS)

    Roselli, P.; Improta, L.; Saccorotti, G.

    2017-12-01

    In recent years it has become accepted that earthquake sources can attain significant Non-Double-Couple (NDC) components. Among the driving factors of deviation from pure double-couple (DC) mechanisms are the opening/closing of fracture networks and the activation of pre-existing faults by pore fluid pressure perturbations. This observation makes the thorough analysis of source mechanisms of key importance for understanding seismicity induced by withdrawal/injection in geothermal and hydrocarbon reservoirs, as well as seismicity induced by water reservoirs. In addition to the DC component, the seismic moment tensor can be decomposed into isotropic (ISO) and compensated linear vector dipole (CLVD) components. In this study we performed a careful analysis of the seismic moment tensors of induced microseismicity recorded in the Val d'Agri (Southern Apennines, Italy), focusing our attention on the NDC component. The Val d'Agri is a Quaternary extensional basin that hosts the largest onshore European oil field and a water reservoir (Pertusillo Lake impoundment) characterized by severe seasonal level oscillations. Our input data set includes swarm-type induced micro-seismicity recorded between 2005 and 2006 by a high-performance network and accurately located by a reservoir-scale local earthquake tomography. We analyze two different seismicity clusters: (i) a swarm of 69 earthquakes with 0.3 ≤ ML ≤ 1.8 induced by a wastewater disposal well of the oilfield during the initial daily injection tests (10 days); (ii) 526 earthquakes with -0.2 ≤ ML ≤ 2.7 induced by seasonal volume changes of the artificial lake. We perform the seismic moment tensor inversion by using the HybridMT code. After careful signal-to-noise based selection and manual picking of P-pulses, we obtain %DC, %ISO, %CLVD for each event.
DC and NDC components are analyzed and compared with the spatio-temporal distribution of seismicity, the local stress field, the injection parameters and the water level in the impoundment. We find significant NDC components and abrupt temporal variations in the %DC and %ISO components that appear linked to the extremely variable parameters of the injection tests into the disposal well.
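    The %ISO/%CLVD/%DC figures named above come from an eigenvalue-based decomposition of the moment tensor. A minimal sketch using a Vavryčuk-style convention follows; HybridMT's internal convention may differ in detail:

```python
import numpy as np

def iso_clvd_dc_percent(M):
    """Decompose a symmetric 3x3 moment tensor into ISO, CLVD and DC
    percentages from its ordered eigenvalues (Vavrycuk-style norm:
    |ISO| + |CLVD| + DC shares sum to 100 in absolute value)."""
    m1, m2, m3 = sorted(np.linalg.eigvalsh(M), reverse=True)
    m_iso = (m1 + m2 + m3) / 3.0
    m_clvd = (2.0 / 3.0) * (m1 + m3 - 2.0 * m2)
    m_dc = 0.5 * (m1 - m3 - abs(m1 + m3 - 2.0 * m2))
    scale = abs(m_iso) + abs(m_clvd) + m_dc
    return (100.0 * m_iso / scale,
            100.0 * m_clvd / scale,
            100.0 * m_dc / scale)

# Sanity checks: a pure shear source and a pure explosion.
pure_dc = iso_clvd_dc_percent(np.diag([1.0, 0.0, -1.0]))   # (0, 0, 100)
pure_iso = iso_clvd_dc_percent(np.eye(3))                  # (100, 0, 0)
```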

  8. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    NASA Astrophysics Data System (ADS)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
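    A minimal sketch of how an m-sequence is generated, using a Fibonacci linear-feedback shift register with the primitive polynomial x⁴+x³+1 (the paper's actual coding parameters are not reproduced here). The flat two-valued autocorrelation shown at the end is the property that gives coded sources their anti-noise advantage:

```python
def lfsr_msequence(taps, n):
    """Generate one period (2**n - 1 bits) of a maximal-length
    sequence from an n-bit Fibonacci LFSR. `taps` are the 1-based
    positions of the feedback polynomial."""
    state = [1] * n
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return seq

bits = lfsr_msequence([4, 3], 4)      # period-15 m-sequence
chips = [1 - 2 * b for b in bits]     # map {0,1} -> {+1,-1}
# Periodic autocorrelation at any nonzero lag is exactly -1.
acorr1 = sum(chips[i] * chips[(i + 1) % 15] for i in range(15))
```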

  9. A first step to compare geodynamical models and seismic observations of the inner core

    NASA Astrophysics Data System (ADS)

    Lasbleis, M.; Waszek, L.; Day, E. A.

    2016-12-01

    Seismic observations have revealed a complex inner core, with lateral and radial heterogeneities at all observable scales. The dominant feature is the east-west hemispherical dichotomy in seismic velocity and attenuation. Several geodynamical models have been proposed to explain the observed structure: convective instabilities, external forces, crystallisation processes or the influence of outer core convection. However, interpreting such geodynamical models in terms of the seismic observations is difficult, and has been performed only for very specific models (Geballe 2013, Lincot 2014, 2016). Here, we propose a common framework to make such comparisons. We have developed a Python code that propagates seismic ray paths through kinematic geodynamical models of the inner core, computing a synthetic seismic data set that can be compared to seismic observations. Following the method of Geballe 2013, we start with the simple model of translation. For this, the seismic velocity is proposed to be a function of the age or initial growth rate of the material (since there is no deformation included in our models); the assumption is reasonable when considering translation, growth and super rotation of the inner core. Using both artificial (random) seismic ray data sets and a real inner core data set (from Waszek et al. 2011), we compare these different models. Our goal is to determine the model which best matches the seismic observations. Preliminary results show that super rotation successfully creates an eastward shift in properties with depth, as has been observed seismically. Neither the growth rate of inner core material nor the relationship between crystal size and seismic velocity is well constrained; consequently our method does not directly compute seismic travel times. Instead, here we use age, growth rate and other parameters as proxies for the seismic properties, which represents a good first step in comparing geodynamical models and seismic observations. Ultimately we aim to release our codes to the broader scientific community, allowing researchers from all disciplines to test their models of inner core growth against seismic observations or create a kinematic model for the evolution of the inner core which matches new geophysical observations.

  10. Validation Data and Model Development for Fuel Assembly Response to Seismic Loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardet, Philippe; Ricciardi, Guillaume

    2016-01-31

    Vibrations are inherently present in nuclear reactors, especially in the cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and on wear and tear in the reactor, and often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics, multidisciplinary validation campaign is conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.

  11. GeoMO 2008--geotechnical earthquake engineering : site response.

    DOT National Transportation Integrated Search

    2008-10-01

    The theme of GeoMO2008 has recently become of more interest to the Midwest civil engineering community due to the perceived earthquake risks and new code requirements. The constant seismic reminder for the New Madrid Seismic Zone and new USGS hazard ...

  12. 49 CFR 41.119 - DOT regulated buildings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... compliance may include the engineer's and architect's authenticated verification of seismic design codes... and additions to existing buildings will ensure that each DOT regulated building is designed and constructed in accord with seismic design and construction standards as provided by this part. (b) This...

  13. Structural evaluation of the 2736Z Building for seismic loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giller, R.A.

    The 2736Z building structure is evaluated for high-hazard loads. The 2736Z building is analyzed herein for normal and seismic loads and is found to successfully meet the guidelines of UCRL-15910 along with the related code requirements.

  14. Using faults for PSHA in a volcanic context: the Etna case (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Azzaro, Raffaele; D'Amico, Salvatore; Gee, Robin; Pace, Bruno; Peruzza, Laura

    2016-04-01

    At Mt. Etna volcano (Southern Italy), recurrent volcano-tectonic earthquakes affect the urbanised areas, with an overall population of about 400,000 and with important infrastructures and lifelines. For this reason, seismic hazard analyses have been undertaken in the last decade focusing on the capability of local faults to generate damaging earthquakes, especially in the short term (30-5 yrs); these results are intended as complementary to the regulatory seismic hazard maps, and are devoted to establishing priorities in the seismic retrofitting of the exposed municipalities. Starting from past experience, in the framework of the V3 Project funded by the Italian Department of Civil Defense we performed a fully probabilistic seismic hazard assessment using an original definition of seismic sources and ground-motion prediction equations specifically derived for this volcanic area; calculations are referred to a brand-new topographic surface (Mt. Etna reaches more than 3,000 m in elevation, in less than 20 km from the coast), and to both Poissonian and time-dependent occurrence models. We first present the process of defining seismic sources, which includes individual faults, seismic zones and gridded seismicity; they are obtained by integrating geological field data with long-term (the historical macroseismic catalogue) and short-term earthquake data (the instrumental catalogue). The analysis of the frequency-magnitude distribution identifies areas in the volcanic complex with a- and b-values of the Gutenberg-Richter relationship representative of different dynamic processes. Then, we discuss the variability of the mean occurrence times of major earthquakes along the main Etnean faults estimated by using a purely geologic approach. This analysis has been carried out with the software code FiSH, a Matlab® tool developed to turn fault data representative of the seismogenic process into hazard models.
    The utilization of a magnitude-size scaling relationship specific to volcanic areas is a key element: the FiSH code may thus calculate the most probable values of the characteristic expected magnitude (Mchar) with the associated standard deviation σ, the corresponding mean recurrence times (Tmean) and the aperiodicity factor for each fault. Finally, we show some results obtained with the OpenQuake-engine by considering a conceptual logic tree model organised in several branches (zone and zoneless, historical and geological rates, Poisson and time-dependent assumptions). Maps are referred to various exposure periods (10% exceedance probability in 30-5 years) and different spectral accelerations. The volcanic region of Mt. Etna represents a perfect lab for fault-based PSHA; the large dataset of input parameters used in the calculations allows testing different methodological approaches and validating some conceptual procedures.
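    Two of the ingredients named above can be sketched briefly: the Gutenberg-Richter b-value via Aki's maximum-likelihood estimator, and a mean recurrence time read off an annual G-R rate. This is an illustration with made-up numbers; FiSH's geologic moment-rate machinery is not reproduced here:

```python
import math

def aki_b_value(mags, mc, dm=0.1):
    """Maximum-likelihood b-value (Aki 1965) for magnitudes >= mc,
    with the standard binning correction of dm/2."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

def gr_mean_recurrence(a, b, m):
    """Mean recurrence time (years) of events with magnitude >= m
    given an annual Gutenberg-Richter relation log10 N = a - b*m."""
    return 10.0 ** (b * m - a)

# With illustrative values a = 4.0, b = 1.0, an M >= 6 event
# recurs on average every 10**(6 - 4) = 100 years.
t_rec = gr_mean_recurrence(4.0, 1.0, 6.0)
```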

  15. HANFORD DST THERMAL & SEISMIC PROJECT ANSYS BENCHMARK ANALYSIS OF SEISMIC INDUCED FLUID STRUCTURE INTERACTION IN A HANFORD DOUBLE SHELL PRIMARY TANK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MACKEY, T.C.

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratories (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled ''Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses''. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The overall seismic analysis of the DSTs is being performed with the general-purpose finite element code ANSYS. The overall model used for the seismic analysis of the DSTs includes the DST structure, the contained waste, and the surrounding soil. The seismic analysis of the DSTs must address the fluid-structure interaction behavior and sloshing response of the primary tank and contained liquid. ANSYS has demonstrated capabilities for structural analysis, but the capabilities and limitations of ANSYS to perform fluid-structure interaction are less well understood. The purpose of this study is to demonstrate the capabilities and investigate the limitations of ANSYS for performing a fluid-structure interaction analysis of the primary tank and contained waste. To this end, the ANSYS solutions are benchmarked against theoretical solutions appearing in BNL 1995, when such theoretical solutions exist. When theoretical solutions were not available, comparisons were made to theoretical solutions of similar problems and to the results from Dytran simulations. The capabilities and limitations of the finite element code Dytran for performing a fluid-structure interaction analysis of the primary tank and contained waste were explored in a parallel investigation (Abatt 2006).
In conjunction with the results of the global ANSYS analysis reported in Carpenter et al. (2006), the results of the two investigations will be compared to help determine if a more refined sub-model of the primary tank is necessary to capture the important fluid-structure interaction effects in the tank and, if so, how to best utilize a refined sub-model of the primary tank. Both rigid tank and flexible tank configurations were analyzed with ANSYS. The response parameters of interest are total hydrodynamic reaction forces, impulsive and convective mode frequencies, waste pressures, and slosh heights. To a limited extent, tank stresses are also reported. The results of this study demonstrate that the ANSYS model has the capability to adequately predict global responses such as frequencies and overall reaction forces. Thus, the model is suitable for predicting the global response of the tank and contained waste. On the other hand, while the ANSYS model is capable of adequately predicting waste pressures and primary tank stresses in a large portion of the waste tank, the model does not accurately capture the convective behavior of the waste near the free surface, nor did the model give accurate predictions of slosh heights. Based on the ability of the ANSYS benchmark model to accurately predict frequencies and global reaction forces, and on the results presented in Abatt et al. (2006), the global ANSYS model described in Carpenter et al. (2006) is sufficient for the seismic evaluation of all tank components except for local areas of the primary tank. Due to the limitations of the ANSYS model in predicting the convective response of the waste, the evaluation of primary tank stresses near the waste free surface should be supplemented by results from an ANSYS sub-model of the primary tank that incorporates pressures from theoretical solutions or from Dytran solutions. However, the primary tank is expected to have low demand to capacity ratios in the upper wall.
Moreover, due to the less-than-desired mesh resolution in the primary tank knuckle of the global ANSYS model, the evaluation of the primary tank stresses in the lower knuckle should be supplemented by results from a more refined ANSYS sub-model of the primary tank that incorporates pressures from theoretical solutions or from Dytran solutions.
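    The theoretical solutions in BNL 1995 used as benchmarks above include closed-form impulsive and convective frequencies for liquid in a rigid vertical cylinder. As a sketch of the kind of quantity the ANSYS results were checked against, the convective (sloshing) frequencies from linear potential-flow theory can be computed as follows; the radius and depth are illustrative numbers, not the actual DST geometry.

```python
import math

def convective_frequency(radius_m, depth_m, mode=1, g=9.81):
    """Convective sloshing frequency (Hz) of liquid in a rigid vertical
    cylinder, from linear potential-flow theory:
        omega_n^2 = (g * xi_n / R) * tanh(xi_n * H / R)
    where xi_n are the roots of the first derivative of the Bessel
    function J1."""
    roots = [1.8412, 5.3314, 8.5363]   # first three roots of J1'
    xi = roots[mode - 1]
    omega_sq = (g * xi / radius_m) * math.tanh(xi * depth_m / radius_m)
    return math.sqrt(omega_sq) / (2.0 * math.pi)

# Illustrative tank: 11.4 m radius holding 9 m of liquid
f1 = convective_frequency(11.4, 9.0)            # fundamental sloshing mode
f2 = convective_frequency(11.4, 9.0, mode=2)    # second convective mode
```

    For tanks of this scale the fundamental convective frequency falls well below 1 Hz, which is why the free-surface response is so sensitive to the low-frequency content of the input motion.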

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S; Larsen, S; Wagoner, J

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities, coupled with innovative deployment, processing, and analysis methodologies, to allow seismic methods to be effectively utilized in seismic imaging and vehicle tracking applications where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, and statistical characterization of geological heterogeneity. These capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. 
A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the data necessary for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capability to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment was conducted that tested the full modeling-based approach to geological characterization using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and the need for geological heterogeneity to be accounted for in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32-million-node 3D model grid for E3D. Model linking issues were resolved and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed that the reflection signal would be too small to be observed in the data, due to trapped and attenuated energy in the weathered layer. An analysis of the few sensors coupled to bedrock did not improve the reflection signal strength sufficiently, because the shots, though buried, were within the surface layer and hence attenuated. 
The ability to model a complex 3D geological structure and calculate synthetic seismograms that are in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located, but simultaneous detonations would require strategic placement of the arrays.
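    The K-L statistical methodology mentioned above generates correlated random media that can be fed into E3D. A minimal one-dimensional sketch of a truncated Karhunen-Loeve expansion, assuming an exponential covariance with illustrative correlation length and perturbation strength (this is not the project's code):

```python
import numpy as np

def kl_random_field(n=64, length=1000.0, corr_len=100.0, sigma=0.05,
                    n_modes=32, seed=0):
    """1-D Karhunen-Loeve sketch: draw a correlated velocity-perturbation
    profile from the exponential covariance C(r) = sigma^2 exp(-|r|/a)."""
    x = np.linspace(0.0, length, n)
    cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    vals, vecs = np.linalg.eigh(cov)          # eigen-decomposition of C
    vals, vecs = vals[::-1], vecs[:, ::-1]    # sort modes by energy
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_modes)
    # truncated K-L sum: field(x) = sum_k sqrt(lambda_k) * z_k * phi_k(x)
    return vecs[:, :n_modes] @ (np.sqrt(np.clip(vals[:n_modes], 0.0, None)) * z)

field = kl_random_field()   # one heterogeneity realization
```

    Each call with a different seed yields a new statistically equivalent realization, which is what makes the approach useful for assessing detection performance over an ensemble of plausible geologies.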

  17. Numerical Simulations of 3D Seismic Data Final Report CRADA No. TC02095.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedmann, S. J.; Kostov, C.

    This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Schlumberger Cambridge Research (SCR), to develop synthetic seismic data sets and supporting codes.

  18. Detection of sinkholes or anomalies using full seismic wave fields : phase II.

    DOT National Transportation Integrated Search

    2016-08-01

    A new 2-D Full Waveform Inversion (FWI) software code was developed to characterize layering and anomalies beneath the ground surface using seismic testing. The software is capable of assessing the shear and compression wave velocities (Vs and Vp) fo...

  19. Coupled Hydrodynamic and Wave Propagation Modeling for the Source Physics Experiment: Study of Rg Wave Sources for SPE and DAG series.

    NASA Astrophysics Data System (ADS)

    Larmat, C. S.; Delorey, A.; Rougier, E.; Knight, E. E.; Steedman, D. W.; Bradley, C. R.

    2017-12-01

    This presentation reports numerical modeling efforts to improve knowledge of the processes that affect seismic wave generation and propagation from underground explosions, with a focus on Rg waves. The numerical model is based on the coupling of hydrodynamic simulation codes (Abaqus, CASH and HOSS) with a 3D full-waveform propagation code, SPECFEM3D. Validation datasets are provided by the Source Physics Experiment (SPE), a series of highly instrumented chemical explosions at the Nevada National Security Site with yields from 100 kg to 5000 kg. A first series of explosions in a granite emplacement has just been completed, and a second series in an alluvium emplacement is planned for 2018. The long-term goal of this research is to review and improve existing seismic source models (e.g. Mueller & Murphy, 1971; Denny & Johnson, 1991) through first-principles calculations enabled by the coupled-codes capability. The hydrodynamic codes Abaqus, CASH and HOSS model the shocked, hydrodynamic region via equations of state for the explosive, borehole stemming and jointed/weathered granite. A new material model for unconsolidated alluvium materials has been developed and validated against past nuclear explosions, including the 10 kT 1965 Merlin event (Perret, 1971; Perret and Bass, 1975). We use the efficient spectral-element-method code SPECFEM3D (e.g. Komatitsch, 1998; 2002) and Geologic Framework Models to model the evolution of the wavefield as it propagates across 3D complex structures. The coupling interface is a series of grid points of the SEM mesh situated at the edge of the hydrodynamic code domain. We will present validation tests and waveforms modeled for several SPE tests, which provide evidence that the damage processes happening in the vicinity of the explosions create secondary seismic sources. These sources interfere with the original explosion moment and reduce the apparent seismic moment at the origin of Rg waves by up to 20%.

  20. Applying new seismic analysis techniques to the lunar seismic dataset: New information about the Moon and planetary seismology on the eve of InSight

    NASA Astrophysics Data System (ADS)

    Dimech, J. L.; Weber, R. C.; Knapmeyer-Endrun, B.; Arnold, R.; Savage, M. K.

    2016-12-01

    The field of planetary science is poised for a major advance with the upcoming InSight mission to Mars, due to launch in May 2018. Seismic analysis techniques adapted for use on planetary data are therefore highly relevant to the field. The heart of this project is the application of new seismic analysis techniques to the lunar seismic dataset to learn more about the Moon's crust and mantle structure, with particular emphasis on `deep' moonquakes, which are situated half-way between the lunar surface and its core with no surface expression. Techniques proven to work on the Moon might also be beneficial for InSight and future planetary seismology missions, which face similar technical challenges. The techniques include: (1) an event-detection and classification algorithm based on `Hidden Markov Models' to reclassify known moonquakes and look for new ones; Apollo 17 gravimeter and geophone data will also be included in this effort. (2) Measurements of anisotropy in the lunar mantle and crust using `shear-wave splitting'; preliminary measurements on deep moonquakes using the MFAST program are encouraging, and continued evaluation may reveal new structural information on the Moon's mantle. (3) Probabilistic moonquake locations using NonLinLoc, a non-linear hypocenter location technique, with a modified version of the codes designed to work with the Moon's radius; successful application may provide a new catalog of moonquake locations with rigorous uncertainty information, which would be a valuable input to (4) new fault-plane constraints from focal mechanisms, using a novel application of Bayes' theorem that factors in uncertainties in hypocenter coordinates and S-P amplitude ratios. Preliminary results, such as shear-wave splitting measurements, will be presented and discussed.
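    Technique (1) can be sketched with a minimal two-state (noise/event) Gaussian hidden Markov model decoded by the Viterbi algorithm; the transition probability, state means, and the synthetic trace below are assumptions chosen for illustration, not the project's trained models.

```python
import numpy as np

def viterbi_detect(feature, p_stay=0.99, mu=(0.0, 3.0), sigma=(1.0, 1.0)):
    """Two-state Gaussian HMM detector: most likely state path (0 = noise,
    1 = event) for a 1-D feature series such as a smoothed envelope."""
    n = len(feature)
    log_trans = np.log(np.array([[p_stay, 1.0 - p_stay],
                                 [1.0 - p_stay, p_stay]]))
    mu, sigma = np.asarray(mu), np.asarray(sigma)
    # per-sample Gaussian log-likelihood under each state
    ll = -0.5 * ((feature[:, None] - mu) / sigma) ** 2 - np.log(sigma)
    delta = ll[0].copy()                       # uniform state prior
    back = np.zeros((n, 2), dtype=int)
    for t in range(1, n):
        scores = delta[:, None] + log_trans    # scores[i, j]: i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + ll[t]
    path = np.zeros(n, dtype=int)
    path[-1] = int(delta.argmax())
    for t in range(n - 2, -1, -1):             # backtrace
        path[t] = back[t + 1, path[t + 1]]
    return path

# Synthetic test: quiet noise with an elevated-amplitude burst in the middle
rng = np.random.default_rng(1)
x = rng.standard_normal(500)
x[200:260] += 3.0
states = viterbi_detect(x)   # 1 where an "event" state is inferred
```

    The strong self-transition probability is what suppresses single-sample false triggers: a lone noisy spike cannot pay the cost of two state switches, while a sustained moonquake-like burst can.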

  1. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

    CRISIS is a computer tool for probabilistic seismic hazard analysis (PSHA) whose development started in the late 1980s at the Instituto de Ingeniería, UNAM, Mexico. It began circulating outside Mexico at the beginning of the 1990s, when it was first distributed as part of the SEISAN tools. Throughout the years, CRISIS has been used for seismic hazard studies in several countries in Latin America (Mexico, Guatemala, Belize, El Salvador, Honduras, Nicaragua, Costa Rica, Panama, Colombia, Venezuela, Ecuador, Peru, Argentina and Chile) and in many other countries of the world. CRISIS has always circulated free of charge for non-commercial applications. It is worth noting that CRISIS has mainly been written by people who are, at the same time, PSHA practitioners. Therefore, the development loop has been relatively short, and most of the modifications and improvements have been made to satisfy the needs of the developers themselves. CRISIS has evolved from a rather simple FORTRAN code to a relatively complex program with a friendly graphical interface, able to handle a variety of modeling possibilities for source geometries, seismicity descriptions and ground motion prediction models (GMPM). We describe some of the improvements made for the newest version of the code, CRISIS 2012. These improvements, some of which were made in the frame of the Italian research project INGV-DPC S2 (http://nuovoprogettoesse2.stru.polimi.it/), funded by the Dipartimento della Protezione Civile (DPC; National Civil Protection Department), include: a wider variety of source geometries; a wider variety of seismicity models, including the ability to handle non-Poissonian occurrence models and Poissonian smoothed-seismicity descriptions; and enhanced capabilities for using different kinds of GMPM: attenuation tables, built-in models and generalized attenuation models. 
In the case of built-in models, a default set is ready to use in CRISIS, but additional custom GMPMs may be freely developed and integrated without recompiling the core code. Users can therefore build new external classes implementing custom GMPM modules by adhering to the programming-interface specification, which is delivered as part of the executable program. Generalized attenuation models, on the other hand, are non-parametric probabilistic descriptions of the ground motions produced by individual earthquakes with known magnitude and location. In the context of CRISIS, a generalized attenuation model is a collection of probabilistic footprints, one for each of the events considered in the analysis; each footprint gives the geographical distribution of the intensities produced by that event. CRISIS now permits the inclusion of local site effects in hazard computations. Site effects are given to CRISIS in terms of amplification factors that depend on site location, period, and ground-motion level (in order to account for soil non-linearity). Other improvements include enhanced capabilities for logic-tree computations and for producing seismic disaggregation charts, and a new presentation layer, developed for accessing the same functionality as the desktop version via the web (CRISISWeb). Examples will be presented and the program will be made available to all interested persons.
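    The core hazard integral that PSHA codes such as CRISIS evaluate can be sketched for a single source with a truncated Gutenberg-Richter magnitude distribution and a toy ground-motion model; all coefficients below are illustrative assumptions, not CRISIS internals.

```python
import math
import numpy as np

def hazard_curve(a_grid, rate=0.05, b=1.0, mmin=5.0, mmax=7.5,
                 dist_km=30.0, c=(-3.0, 1.0, 1.0), sigma_ln=0.6, nm=200):
    """Minimal single-source PSHA sketch: annual exceedance rate
        lambda(a) = nu * integral over m of f(m) * P(ln A > ln a | m, r)
    with a truncated Gutenberg-Richter density f(m) and the toy GMPE
    ln A = c0 + c1*m - c2*ln(r), lognormal scatter sigma_ln."""
    beta = b * math.log(10.0)
    m = np.linspace(mmin, mmax, nm)
    dm = m[1] - m[0]
    f_m = beta * np.exp(-beta * (m - mmin)) / (1.0 - math.exp(-beta * (mmax - mmin)))
    f_m /= f_m.sum() * dm                       # renormalise discretisation
    ln_med = c[0] + c[1] * m - c[2] * math.log(dist_km)
    lam = []
    for a in a_grid:
        z = (math.log(a) - ln_med) / sigma_ln
        p_exc = np.array([0.5 * math.erfc(v / math.sqrt(2.0)) for v in z])
        lam.append(rate * (p_exc * f_m).sum() * dm)
    return np.array(lam)

pga = np.array([0.01, 0.1, 0.5])    # illustrative PGA grid, in g
lam = hazard_curve(pga)             # annual exceedance rates, one per level
```

    Production codes sum this integral over many sources and logic-tree branches, but the monotone decrease of lambda with ground-motion level, bounded above by the total source rate, is already visible in this sketch.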

  2. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. 
A special article will compare and review available statistical seismology software packages.
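    As one example of the "basic techniques for modeling seismicity" such articles cover, the maximum-likelihood b-value estimator of Aki (1965) can be stated in a few lines; the synthetic Gutenberg-Richter catalogue below is for illustration.

```python
import numpy as np

def b_value_mle(mags, mc):
    """Aki (1965) maximum-likelihood b-value for magnitudes at or above a
    completeness threshold mc: b = log10(e) / (mean(M) - mc).
    (Binned catalogues additionally need Utsu's dm/2 correction.)"""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - mc), m.size

# Synthetic catalogue: above mc, Gutenberg-Richter magnitudes are
# exponentially distributed with scale log10(e)/b; here true b = 1.0.
rng = np.random.default_rng(42)
mags = 3.0 + rng.exponential(scale=np.log10(np.e), size=5000)
b_est, n = b_value_mle(mags, mc=3.0)
```

    Getting mc wrong is the classic pitfall: including magnitudes below completeness biases b downward, which is exactly the kind of trap the planned catalogue-quality articles address.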

  3. New Evaluation of Seismic Hazard in Cental America and la Hispaniola

    NASA Astrophysics Data System (ADS)

    Benito, B.; Camacho, E. I.; Rojas, W.; Climent, A.; Alvarado-Induni, G.; Marroquin, G.; Molina, E.; Talavera, E.; Belizaire, D.; Pierristal, G.; Torres, Y.; Huerfano, V.; Polanco, E.; García, R.; Zevallos, F.

    2013-05-01

    The results of seismic hazard studies carried out for two seismic scenarios, the Central America region (CA) and La Hispaniola island, are presented here. Both follow the Probabilistic Seismic Hazard Assessment (PSHA) methodology and are developed in terms of PGA and SA(T), for T of 0.1, 0.2, 0.5, 1 and 2 s. In both analyses, hybrid zonation models are considered, integrating seismogenic zones and faults for which slip-rate and recurrence-time data are available. First, we present a new evaluation of seismic hazard in CA, starting from the results of a previous study by Benito et al. (2011). Some improvements are now included, such as: a catalogue updated through 2011; corrections to the zoning model, in particular for the subduction regime, taking into account the variation of the dip in Costa Rica and Panama; and the modeling of some faults as independent units for the hazard estimation. The results allow us to carry out a sensitivity analysis comparing the estimates obtained with and without faults. In the second part we present the results of the PSHA for La Hispaniola, carried out as part of the cooperative project SISMO-HAITI, supported by UPM and developed in cooperation with ONEV. It started a few months after the 2010 event, in response to the Haitian government's request for assistance from UPM. The study was aimed at obtaining results suitable for seismic design purposes and started with the elaboration of a seismic catalogue for Hispaniola, requiring an exhaustive revision of data reported by around 30 seismic agencies, in addition to those from the Puerto Rico and Dominican Republic seismic networks. Seismotectonic models for the region were reviewed and a new regional zonation was proposed, taking into account different geophysical data. Attenuation models for subduction and crustal zones were also reviewed, and the most suitable were calibrated with data recorded inside the Caribbean plate. 
As a result of the PSHA, maps were generated for the quoted parameters, together with the UHS for the main cities of the country. The values obtained for PGA at a return period of 475 years are comparable to those of the Dominican Republic building code, with maximum PGA around 400 cm/s2 (at rock sites). However, the morphology of the map is quite similar to the previous one by Frankel et al. (2011), although ours presents lower PGA values. The results are available as a basis for the first Haitian building code.

  4. Safety Identifying of Integral Abutment Bridges under Seismic and Thermal Loads

    PubMed Central

    Easazadeh Far, Narges; Barghian, Majid

    2014-01-01

    Integral abutment bridges (IABs) have many advantages over conventional bridges in terms of strength and maintenance cost. Because of the integrity of these structures, uniform thermal and seismic loads are known to be important to their performance. Although all bridge design codes consider temperature and earthquake loads separately in their load combinations for conventional bridges, the thermal load is an “always on” load and, during the occurrence of an earthquake, these two important loads act on the bridge simultaneously. Evaluating the safety level of IABs under the combination of these loads therefore becomes important. In this paper, the safety of IABs designed by the AASHTO LRFD bridge design code under the combination of thermal and seismic loads is studied. To fulfill this aim, the target reliability indexes under seismic load were first calculated. The same analyses were then repeated for the same bridge under the combination of thermal and seismic loads, and the obtained reliability indexes were compared with the target indexes. It is shown that, for an IAB designed by AASHTO LRFD, the indexes are reduced under the combined effects. Thus, the target level of safety during the design life is not provided, and the code's load combination should be changed. PMID:25405232
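    The reliability indexes compared above can be illustrated with the first-order index for the linear limit state g = R - S with independent normal resistance R and load effect S; the statistics below are hypothetical numbers chosen to show the effect of adding a second load, not AASHTO calibration values.

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability index beta = (mu_R - mu_S) / sqrt(var_R + var_S)
    for independent normal R (resistance) and S (load effect);
    the failure probability is Phi(-beta)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

# Hypothetical demands: seismic load alone vs. seismic plus "always on" thermal
beta_seismic = reliability_index(mu_r=100.0, sigma_r=10.0,
                                 mu_s=55.0, sigma_s=12.0)
beta_combined = reliability_index(mu_r=100.0, sigma_r=10.0,
                                  mu_s=70.0, sigma_s=13.0)
```

    Raising the mean load effect (and its scatter) while the resistance stays fixed necessarily lowers beta, which is the mechanism behind the reduced indexes reported in the paper.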

  5. seismo-live: Training in Seismology using Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Igel, Heiner; Krischer, Lion; van Driel, Martin; Tape, Carl

    2017-04-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula, despite the increasing use of sometimes highly sophisticated simulation and data-processing technologies in research projects. At the same time, well-engineered community codes make it easy to produce results, with the attendant danger that the inherent traps of black-box solutions are not well understood. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup text, graphics, and equations with interactive, executable Python code. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing, noise analysis, and a variety of forward solvers for seismic wave propagation. In addition, an example shows how Jupyter notebooks can be used to increase the reproducibility of published results. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas. We present recent developments and new features.
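    The notebooks rely on ObsPy for processing steps such as bandpass filtering; below is a dependency-light sketch of that same step using SciPy on a synthetic trace, with illustrative corner frequencies (not taken from any particular notebook).

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Synthetic "seismogram": 0.2 Hz microseism-like noise plus a 5 Hz signal
fs = 100.0                          # sampling rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 0.2 * t) + 0.3 * np.sin(2 * np.pi * 5.0 * t)

# Zero-phase 1-10 Hz bandpass, the kind of step ObsPy's
# Stream.filter("bandpass", ...) performs under the hood
sos = butter(4, [1.0, 10.0], btype="bandpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, trace)  # forward-backward: no phase distortion
```

    Filtering forward and backward (sosfiltfilt) preserves arrival times, which matters when the filtered trace is used for picking, one of the traps of black-box filtering that the notebooks make explicit.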

  6. Effect of strong elastic contrasts on the propagation of seismic wave in hard-rock environments

    NASA Astrophysics Data System (ADS)

    Saleh, R.; Zheng, L.; Liu, Q.; Milkereit, B.

    2013-12-01

    Understanding the propagation of seismic waves in the presence of strong elastic contrasts, such as topography, tunnels and ore bodies, is still a challenge. Safety in mining is a major concern, and seismic monitoring is the main tool here. For engineering purposes, amplitudes (peak particle velocity/acceleration) and travel times of seismic events (mostly blasts or microseismic events) are critical parameters that have to be determined at various locations in a mine. These parameters are useful in preparing risk maps and in better understanding the process of spatial and temporal stress distribution in a mine. The simple constant-velocity models used for monitoring studies in mining cannot explain the observed complexities in scattered seismic waves. In hard-rock environments, modeling the elastic seismic wavefield requires detailed 3D petrophysical, infrastructure and topographic data to simulate the propagation of seismic waves with frequencies up to a few kilohertz. With the development of efficient numerical techniques and parallel computation facilities, a solution to such a problem is achievable. In this study, the effects of strong elastic contrasts such as ore bodies, rough topography and tunnels are illustrated using 3D modeling methods. The main tools here are the finite-difference code SOFI3D [1], which has been benchmarked for engineering studies, and the spectral-element code SPECFEM [2], which was developed for global seismology problems. The modeling results show locally enhanced peak particle velocity due to the presence of strong elastic contrasts and topography in the models. [1] Bohlen, T. Parallel 3-D viscoelastic finite difference seismic modeling. Computers & Geosciences 28 (2002) 887-899. [2] Komatitsch, D., and J. Tromp, Introduction to the spectral-element method for 3-D seismic wave propagation, Geophys. J. Int., 139, 806-822, 1999.
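    The effect of a strong elastic contrast can be sketched with a one-dimensional, second-order finite-difference scheme of the kind SOFI3D generalizes to 3D viscoelasticity; this toy model, its velocities, and its grid are illustrative assumptions, not the codes used in the study.

```python
import numpy as np

def fd1d(nx=400, nt=900, dx=5.0, dt=0.0008):
    """1-D acoustic finite differences (2nd order in space and time) with a
    strong velocity contrast at the midpoint, a toy analogue of a
    hard-rock/ore-body interface: part of the source pulse reflects there."""
    c = np.full(nx, 2000.0)
    c[nx // 2:] = 5000.0                  # strong elastic contrast
    assert (c * dt / dx).max() < 1.0      # CFL stability condition
    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    src = nx // 4                         # source in the slow half-space
    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
        u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
        # Ricker-wavelet source injection, f0 = 25 Hz
        tau = np.pi * 25.0 * (it * dt - 0.05)
        u_next[src] += (1.0 - 2.0 * tau**2) * np.exp(-tau**2) * dt**2
        u_prev, u = u, u_next
    return u

wavefield = fd1d()   # snapshot of the field after nt steps
```

    Even in 1D, the impedance jump splits the pulse into reflected and transmitted parts, which is the mechanism behind the locally enhanced peak particle velocities reported above; accurate amplitudes at kilohertz frequencies then demand the fine 3D grids and parallel solvers discussed in the abstract.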

  7. The Sparta Fault, Southern Greece: From segmentation and tectonic geomorphology to seismic hazard mapping and time dependent probabilities

    NASA Astrophysics Data System (ADS)

    Papanikolaou, Ioannis D.; Roberts, Gerald P.; Deligiannakis, Georgios; Sakellariou, Athina; Vassilakis, Emmanuel

    2013-06-01

    The Sparta Fault system is a major structure approximately 64 km long that bounds the eastern flank of the Taygetos Mountain front (2407 m) and shapes the present-day Sparta basin. It was activated in 464 B.C., devastating the city of Sparta. This fault is examined and described in terms of its geometry, segmentation, drainage pattern and post-glacial throw, emphasising how these parameters vary along strike. Qualitative analysis of long-profile catchments shows a significant difference in longitudinal convexity between the central part and both the southern and northern parts of the fault system, leading to the conclusion of varying uplift rate along strike. Catchments are sensitive to differential uplift, as observed from the calculated differences in the steepness index ksn between the outer (ksn < 83) and central (121 < ksn < 138) parts of the fault system along strike. Based on fault throw-rates and the bedrock geology, a seismic hazard map has been constructed that extracts a locality-specific long-term earthquake recurrence record. Based on this map, the town of Sparta would experience a destructive event similar to that of 464 B.C. approximately every 1792 ± 458 years. Since no other major earthquake M ~ 7.0 has been generated by this system since 464 B.C., a future event could be imminent. As a result, not only time-independent but also time-dependent probabilities, which incorporate the concept of the seismic cycle, have been calculated for the town of Sparta, showing a considerably higher time-dependent probability of 3.0 ± 1.5% over the next 30 years, compared with the time-independent probability of 1.66%. Half of the hanging-wall area of the Sparta Fault can experience intensities ≥ IX, yet it belongs to the lowest category of seismic risk in the national seismic building code. In view of these relatively high calculated probabilities, a reassessment of the building code might be necessary.
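    The quoted probabilities can be reproduced in outline. The time-independent value follows directly from the Poisson model with the 1792-year mean recurrence; the time-dependent value requires a renewal model conditioned on the roughly 2476 years elapsed since 464 B.C. The lognormal renewal distribution and its aperiodicity assumed below are illustrative choices, not the study's actual model.

```python
import math

def poisson_prob(t_window, mean_recurrence):
    """Time-independent (Poisson) probability of at least one event
    in a window of t_window years."""
    return 1.0 - math.exp(-t_window / mean_recurrence)

def conditional_prob(t_elapsed, t_window, mean_recurrence, cov=0.5):
    """Time-dependent probability from a lognormal renewal model:
    P(T <= t_elapsed + t_window | T > t_elapsed). The coefficient of
    variation (aperiodicity) is an assumed value."""
    sigma = math.sqrt(math.log(1.0 + cov**2))
    mu = math.log(mean_recurrence) - 0.5 * sigma**2
    def cdf(t):
        return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))
    return (cdf(t_elapsed + t_window) - cdf(t_elapsed)) / (1.0 - cdf(t_elapsed))

p_ti = poisson_prob(30.0, 1792.0)               # ~1.66%, as quoted above
p_td = conditional_prob(2476.0, 30.0, 1792.0)   # elapsed time since 464 B.C.
```

    With the elapsed time already exceeding the mean recurrence, any renewal model of this kind yields a conditional probability above the Poisson value, which is the qualitative point of the comparison in the abstract.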

  8. The Sparta Fault, Southern Greece: From Segmentation and Tectonic Geomorphology to Seismic Hazard Mapping and Time Dependent Probabilities

    NASA Astrophysics Data System (ADS)

    Papanikolaou, Ioannis; Roberts, Gerald; Deligiannakis, Georgios; Sakellariou, Athina; Vassilakis, Emmanuel

    2013-04-01

    The Sparta Fault system is a major structure approximately 64 km long that bounds the eastern flank of the Taygetos Mountain front (2,407 m) and shapes the present-day Sparta basin. It was activated in 464 B.C., devastating the city of Sparta. This fault is examined and described in terms of its geometry, segmentation, drainage pattern and postglacial throw, emphasizing how these parameters vary along strike. Qualitative analysis of long-profile catchments shows a significant difference in longitudinal convexity between the central part and both the southern and northern parts of the fault system, leading to the conclusion of varying uplift rate along strike. Catchments are sensitive to differential uplift, as observed from the calculated differences in the steepness index ksn between the outer (ksn<83) and central parts (121

  9. EDDIE Seismology: Introductory spectral analysis for undergraduates

    NASA Astrophysics Data System (ADS)

    Soule, D. C.; Gougis, R.; O'Reilly, C.

    2016-12-01

    We present a spectral seismology lesson in which students use spectral analysis to describe the frequency of seismic arrivals, based on a conceptual presentation of waveforms and filters. The goal is for students to move beyond basic waveform terminology and relate time-domain signals to their conjugates in the frequency domain. Although seismology instruction commonly engages students in the analysis of authentic seismological data, this is less true for lower-level undergraduate instruction, due to the coding barriers to many seismological analysis tasks. To address this, our module uses Seismic Canvas (Kroeger, 2015; https://seiscode.iris.washington.edu/projects/seismiccanvas), a graphically interactive application for accessing, viewing and analyzing waveform data, which we use to plot earthquake data in the time domain. Once students are familiar with the general components of the waveform (i.e., frequency, wavelength, amplitude and period), they use Seismic Canvas to transform the data into the frequency domain. Bypassing the mathematics of Fourier series allows a focus on conceptual understanding by plotting and manipulating seismic data in both the time and frequency domains. Pre/post-tests showed significant improvements in students' use of seismograms and spectrograms to estimate the frequency content of the primary wave, which demonstrated students' understanding of frequency and of how data on the spectrogram and seismogram are related. Students were also able to identify the time and frequency of the largest-amplitude arrival, indicating understanding of amplitude and use of a spectrogram as an analysis tool. Students were also asked to compare plots of raw data and the same data filtered with a high-pass filter, and to identify the filter used to create the second plot. Students demonstrated an improved understanding of how frequency content can be removed from a signal in the spectral domain.
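    The time-to-frequency transformation at the heart of the module can be sketched without Seismic Canvas; the synthetic two-arrival record below (a sustained 1 Hz surface-wave-like signal plus a short 10 Hz burst) is purely illustrative.

```python
import numpy as np

# Synthetic "seismogram": 20 s of a 1 Hz signal with a 2 s burst of 10 Hz
fs = 100.0                          # sampling rate, Hz
t = np.arange(0.0, 20.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 1.0 * t)
signal[200:400] += 0.8 * np.sin(2 * np.pi * 10.0 * t[200:400])

# Amplitude spectrum via the real-input discrete Fourier transform
spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
peak_freq = float(freqs[spec.argmax()])
```

    The spectrum shows both components at once, but it discards the timing of the 10 Hz burst; that is exactly the limitation that motivates the module's move from the spectrum to the spectrogram, where time and frequency are displayed together.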

  10. Finite element analyses for seismic shear wall international standard problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Hofmayer, C.H.

    Two identical reinforced concrete (RC) shear walls, which consist of web, flanges and massive top and bottom slabs, were tested up to ultimate failure under earthquake motions at the Nuclear Power Engineering Corporation's (NUPEC) Tadotsu Engineering Laboratory, Japan. NUPEC provided the dynamic test results to the OECD (Organization for Economic Cooperation and Development) Nuclear Energy Agency (NEA) for use as an International Standard Problem (ISP). The shear walls were intended to be part of a typical reactor building. One of the major objectives of the Seismic Shear Wall ISP (SSWISP) was to evaluate various seismic analysis methods for concrete structures used for design and seismic margin assessment. It also offered a unique opportunity to assess the state of the art in nonlinear dynamic analysis of reinforced concrete shear wall structures under severe earthquake loadings. As a participant of the SSWISP workshops, Brookhaven National Laboratory (BNL) performed finite element analyses under the sponsorship of the U.S. Nuclear Regulatory Commission (USNRC). Three types of analysis were performed, i.e., monotonic static (push-over), cyclic static and dynamic analyses. Additional monotonic static analyses were performed by two consultants, F. Vecchio of the University of Toronto (UT) and F. Filippou of the University of California at Berkeley (UCB). The analysis results by BNL and the consultants were presented during the second workshop in Yokohama, Japan in 1996. A total of 55 analyses were presented during the workshop by 30 participants from 11 different countries. The major findings on the presented analysis methods, as well as engineering insights regarding the applicability and reliability of the FEM codes, are described in detail in this report. 16 refs., 60 figs., 16 tabs.

  11. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad-impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation across the NSF Track 1 system NCSA Blue Waters, the DOE leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. 
We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.

  12. 49 CFR 41.115 - New buildings to be leased for DOT occupancy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... compliance may include the engineer's and architect's authenticated verifications of seismic design codes... design and construction of new buildings to be leased for DOT occupancy or use will ensure that each building is designed and constructed in accord with the seismic design and construction standards set out...

  13. 49 CFR 41.115 - New buildings to be leased for DOT occupancy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... compliance may include the engineer's and architect's authenticated verifications of seismic design codes... design and construction of new buildings to be leased for DOT occupancy or use will ensure that each building is designed and constructed in accord with the seismic design and construction standards set out...

  14. Mean and modal ϵ in the deaggregation of probabilistic ground motion

    USGS Publications Warehouse

    Harmsen, Stephen C.

    2001-01-01

    Mean and modal ϵ exhibit a wide variation geographically for any specified PE. Modal ϵ for the 2% in 50 yr PE exceeds 2 near the most active western California faults, is less than –1 near some less active faults of the western United States (principally in the Basin and Range), and may be less than 0 in areal fault zones of the central and eastern United States (CEUS). This geographic variation is useful for comparing probabilistic ground motions with ground motions from scenario earthquakes on dominating faults, often used in seismic-resistant provisions of building codes. An interactive seismic-hazard deaggregation menu item has been added to the USGS probabilistic seismic-hazard analysis Web site, http://geohazards.cr.usgs.gov/eq/, allowing visitors to compute mean and modal distance, magnitude, and ϵ corresponding to ground motions having mean return times from 250 to 5000 yr for any site in the United States.
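    The mean and modal values served by such a deaggregation tool are simple statistics over the hazard-contribution bins. A minimal sketch with hypothetical ε bins (the USGS site computes these jointly over distance, magnitude, and ε; this is not its implementation):

    ```python
    def mean_and_modal(bins):
        """bins: (value, hazard contribution) pairs from a deaggregation.
        Returns the contribution-weighted mean and the modal value
        (the value of the bin with the largest contribution)."""
        total = sum(w for _, w in bins)
        mean = sum(v * w for v, w in bins) / total
        modal = max(bins, key=lambda b: b[1])[0]
        return mean, modal

    # Hypothetical epsilon bins (bin-center epsilon, percent contribution):
    eps_bins = [(-1.0, 5.0), (0.0, 20.0), (1.0, 45.0), (2.0, 30.0)]
    mean_eps, modal_eps = mean_and_modal(eps_bins)
    ```

    The same weighting applies unchanged to the distance and magnitude bins of the deaggregation.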

  15. Exploring the Differences Between the European (SHARE) and the Reference Italian Seismic Hazard Models

    NASA Astrophysics Data System (ADS)

    Visini, F.; Meletti, C.; D'Amico, V.; Rovida, A.; Stucchi, M.

    2014-12-01

The recent release of the probabilistic seismic hazard assessment (PSHA) model for Europe by the SHARE project (Giardini et al., 2013, www.share-eu.org) raises questions about the comparison between its results for Italy and the official Italian seismic hazard model (MPS04; Stucchi et al., 2011) adopted by the building code. The goal of such a comparison is to identify the main input elements that produce the differences between the two models. It is worthwhile to remark that each PSHA is realized with the data and knowledge available at the time of its release. Therefore, even if a new model provides estimates significantly different from the previous ones, that does not mean that the old models are wrong, but rather that the current knowledge has changed and improved substantially. Looking at the hazard maps with 10% probability of exceedance in 50 years (adopted as the standard input in the Italian building code), the SHARE model shows increased expected values with respect to the MPS04 model, up to 70% for PGA. However, looking in detail at all output parameters of both models, we observe a different behaviour for other spectral accelerations. In fact, for spectral periods greater than 0.3 s, the current reference PSHA for Italy proposes higher values than the SHARE model over many large areas. This observation suggests that this behaviour may not be due to a different definition of seismic sources and the relevant seismicity rates; it mainly seems to be the result of the adoption of recent ground-motion prediction equations (GMPEs) that estimate higher values for PGA and for accelerations with periods below 0.3 s, and lower values at longer periods, with respect to older GMPEs. Another important set of tests consisted of analysing separately the PSHA results obtained with the three source models adopted in SHARE (i.e., area sources, fault sources with background, and a refined smoothed seismicity model), whereas MPS04 only uses area sources.
Results seem to confirm the strong impact of the new generation GMPEs on the seismic hazard estimates. Giardini D. et al., 2013. Seismic Hazard Harmonization in Europe (SHARE): Online Data Resource, doi:10.12686/SED-00000001-SHARE. Stucchi M. et al., 2011. Seismic Hazard Assessment (2003-2009) for the Italian Building Code. Bull. Seismol. Soc. Am. 101, 1885-1911.

  16. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. Earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area, available from recent studies, is used for the zonation of area sources. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each cell by considering contributions from all seismic sources. Peak ground acceleration (PGA) corresponding to 10% and 2% probabilities of exceedance in 50 years was calculated for all grid points for a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years, and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at a rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the third-generation building code of Ethiopia.
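    The correspondence between the two quoted hazard levels and the 475- and 2475-year return periods follows directly from the Poisson assumption applied after declustering; as a quick check:

    ```python
    import math

    def return_period(p_exceed, t_years):
        """Mean return period for a Poisson process with probability
        p_exceed of at least one exceedance in t_years:
        p = 1 - exp(-t/RP)  =>  RP = -t / ln(1 - p)."""
        return -t_years / math.log(1.0 - p_exceed)

    # The two code-standard hazard levels used in the abstract:
    rp_10_in_50 = return_period(0.10, 50.0)  # ~475 yr
    rp_02_in_50 = return_period(0.02, 50.0)  # ~2475 yr
    ```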

  17. Seismic Vulnerability and Performance Level of confined brick walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghalehnovi, M.; Rahdar, H. A.

    2008-07-08

There has been an increase in the interest of engineers and designers in design methods based on displacement and behavior (performance-based design), given the importance of designing structures to resist dynamic loads such as earthquakes and the inability of conventional design to predict the nonlinear behavior of elements caused by the nonlinear properties of constructional materials. Economically speaking, ease of construction and the accessibility of masonry materials have caused an enormous increase in masonry structures in villages, towns, and cities. On the other hand, it is necessary to study the behavior and seismic vulnerability of these kinds of structures, since Iran is located on the Alpide earthquake belt. Different environmental, economic, social, and cultural factors, as well as the availability of constructional materials, have led to different kinds of structures. In this study, some confined (tied) walls were modeled with software and subjected to dynamic analysis using accelerograms appropriate to the local geological conditions, in order to investigate the seismic vulnerability and performance level of confined brick walls. The results of this analysis appear satisfactory when compared with the values in ATC-40, FEMA, and Iran's Standard 2800.

  18. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    ,

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  19. The Sacred Mountain of Varallo in Italy: seismic risk assessment by acoustic emission and structural numerical models.

    PubMed

    Carpinteri, Alberto; Lacidogna, Giuseppe; Invernizzi, Stefano; Accornero, Federico

    2013-01-01

We examine an application of the Acoustic Emission (AE) technique for a probabilistic analysis in time and space of earthquakes, in order to preserve the valuable Italian Renaissance architectural complex named "The Sacred Mountain of Varallo." Among the forty-five chapels of the Renaissance complex, the structure of Chapel XVII is of particular concern due to its uncertain structural condition and the level of stress caused by the regional seismicity. Therefore, lifetime assessment, taking into account the evolution of damage phenomena, is necessary to preserve the reliability and safety of this masterpiece of cultural heritage. Continuous AE monitoring was performed to assess the structural behavior of the chapel. During the monitoring period, a correlation between peaks of AE activity in the masonry of the "Sacred Mountain of Varallo" and regional seismicity was found. Although the two phenomena take place on very different scales, AE in materials and earthquakes in the Earth's crust belong to the same class of invariance. In addition, an accurate finite element model, built with the DIANA finite element code, is presented to describe the dynamic behavior of the Chapel XVII structure, confirming visual and instrumental inspections of regional seismic effects.

  20. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

The present seismic risk assessment is based on two distinct approaches, actual and probable, with the produced results validated against each other afterwards. In the first part of this research, the seismic risk is evaluated from the available data on the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is then compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80% of the entire building stock in Greece; this information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio ag/ao, where ag is the regional peak ground acceleration (PGA), evaluated from macroseismic intensities estimated in earlier research, and ao is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the investigated financial data collected from the different national services responsible for post-earthquake crisis management, concerning the repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent support, demolitions, shorings), are used to determine the final total seismic risk factor.
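    The probable-loss side of such an assessment reduces, for each building type at a given intensity, to weighting the damage-state probabilities from the DPM by central repair-cost ratios. A minimal sketch with hypothetical numbers (not the paper's DPMs or cost data):

    ```python
    def expected_loss(dpm_row, cost_ratios, replacement_cost):
        """Probable loss for one building type at one intensity:
        sum over damage states of P(damage state) * cost ratio * replacement cost."""
        return sum(p * c for p, c in zip(dpm_row, cost_ratios)) * replacement_cost

    # Hypothetical damage-state probabilities (none ... collapse) and
    # hypothetical central repair-cost ratios for the same five states:
    p_ds = [0.50, 0.30, 0.15, 0.04, 0.01]
    ratios = [0.00, 0.10, 0.35, 0.70, 1.00]
    loss = expected_loss(p_ds, ratios, replacement_cost=1000.0)
    ```

    Summing this quantity over the exposed building stock, type by type, gives the scenario's probable structural loss.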

  1. Numerical investigation and Uncertainty Quantification of the Impact of the geological and geomechanical properties on the seismo-acoustic responses of underground chemical explosions

    NASA Astrophysics Data System (ADS)

    Ezzedine, S. M.; Pitarka, A.; Vorobiev, O.; Glenn, L.; Antoun, T.

    2017-12-01

We have performed three-dimensional high-resolution simulations of underground chemical explosions conducted recently in a jointed rock outcrop as part of the Source Physics Experiments (SPE) being conducted at the Nevada National Security Site (NNSS). The main goal of the current study is to investigate the effects of the structural and geomechanical properties on the spall phenomena due to underground chemical explosions and their subsequent effect on the seismo-acoustic signature at far distances. Two parametric studies have been undertaken to assess the impact of 1) different conceptual geological models, including single-layer and two-layer models, with and without joints and with and without varying geomechanical properties, and 2) the depth of burst and yield of the chemical explosions. Through these investigations we have explored not only the near-field response of the chemical explosions but also the far-field seismic and acoustic signatures. The near-field simulations were conducted using the Eulerian and Lagrangian codes GEODYN and GEODYN-L, respectively, while the far-field seismic simulations were conducted using the elastic wave propagation code WPP, and the acoustic response was computed using the Kirchhoff-Helmholtz-Rayleigh time-dependent approximation code KHR. Through a series of simulations we have recorded the velocity field histories a) at the ground surface on an acoustic-source patch for the acoustic simulations, and b) on a seismic-source box for the seismic simulations. We first analyzed the SPE3 experimental data and simulated results, then simulated SPE4-prime, SPE5, and SPE6 to anticipate their seismo-acoustic responses under conditions of uncertainty. The SPE experiments were conducted in a granitic formation; we have extended the parametric study to include other geological settings such as dolomite and alluvial formations.
These parametric studies enabled us 1) to investigate the geotechnical and geophysical key parameters that impact the seismo-acoustic responses of underground chemical explosions and 2) to decipher and rank, through a global sensitivity analysis, the most important key parameters to be characterized on site to minimize uncertainties in prediction and discrimination.

  2. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure system for maintaining the association between the data sets and their metadata.
To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs to calculations, and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools have been implemented that utilize ontological descriptions of SHA software and data to validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data, including GMT-based map-making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern Program, which provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as a part of this undergraduate research program.

  3. New Seismic Monitoring Station at Mohawk Ridge, Valles Caldera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Peter Morse

Two new broadband digital seismic stations were installed in the Valles Caldera in 2011 and 2012. The first is located on the summit of Cerros del Abrigo (station code CDAB) and the second is located on the flanks of San Antonio Mountain (station code SAMT). Seismic monitoring stations in the caldera serve multiple purposes. These stations augment and expand the current coverage of the Los Alamos Seismic Network (LASN), which is operated to support seismic and volcanic hazards studies for LANL and northern New Mexico (Figure 1). They also provide unique continuous seismic data within the caldera that can be used for scientific studies of the caldera’s substructure and detection of very small seismic signals that may indicate changes in the current and evolving state of remnant magma that is known to exist beneath the caldera. Since the installation of CDAB and SAMT, several very small earthquakes have already been detected near San Antonio Mountain just west of SAMT (Figure 2). These are the first events to be seen in that area. Caldera stations also improve the detection and epicenter determination quality for larger local earthquakes on the Pajarito Fault System east of the Preserve and the Nacimiento Uplift to the west. These larger earthquakes are a concern to LANL Seismic Hazards assessments, and seismic monitoring of the Los Alamos region, including the VCNP, is a DOE requirement. Currently the next closest seismic stations to the caldera are on Pipeline Road (PPR) just west of Los Alamos, and Peralta Ridge (PER) south of the caldera. There is no station coverage near the resurgent dome, Redondo Peak, in the center of the caldera. Filling this “hole” is the highest priority for the next new LASN station. We propose to install this station in 2018 on Mohawk Ridge just east of Redondito, in the same area already occupied by other scientific installations, such as the MCON flux tower operated by UNM.

  4. 49 CFR 41.110 - New DOT owned buildings and additions to buildings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... architect's authenticated verifications of seismic design codes, standards, and practices used in the design... for the design and construction of new DOT Federally owned buildings will ensure that each building is designed and constructed in accord with the seismic design and construction standards set out in § 41.120...

  5. Updates to building-code maps for the 2015 NEHRP recommended seismic provisions

    USGS Publications Warehouse

    Luco, Nicolas; Bachman, Robert; Crouse, C.B; Harris, James R.; Hooper, John D.; Kircher, Charles A.; Caldwell, Phillp; Rukstales, Kenneth S.

    2015-01-01

With the 2014 update of the U.S. Geological Survey (USGS) National Seismic Hazard Model (NSHM) as a basis, the Building Seismic Safety Council (BSSC) has updated the earthquake ground motion maps in the National Earthquake Hazards Reduction Program (NEHRP) Recommended Seismic Provisions for New Buildings and Other Structures, with partial funding from the Federal Emergency Management Agency. Anticipated adoption of the updated maps into the American Society of Civil Engineers Minimum Design Loads for Buildings and Other Structures and the International Building and Residential Codes is underway. Relative to the ground motions in the prior edition of each of these documents, most of the updated values are within a ±20% change. The larger changes are, in most cases, due to the USGS NSHM updates, reasons for which are given in companion publications. In some cases, the larger changes are partly due to a BSSC update of the slope of the fragility curve that is used to calculate the risk-targeted ground motions, and/or the introduction by BSSC of a quantitative definition of “active faults” used to calculate deterministic ground motions.
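    The risk-targeted ground motions mentioned in the abstract are found by iterating on the mapped ground motion until the collapse risk implied by the hazard curve and a lognormal collapse fragility meets a target. The core risk integral can be sketched as follows, with hypothetical hazard-curve ordinates and the fragility slope β left as a free parameter; this is a schematic of the general approach, not the BSSC/USGS implementation:

    ```python
    import math

    def lognormal_cdf(x, median, beta):
        """P(collapse | ground motion x) for a lognormal fragility curve
        with the given median and logarithmic standard deviation beta."""
        return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

    def collapse_rate(hazard, median, beta):
        """Annual collapse rate: the fragility integrated against the hazard
        curve. hazard = (ground motion, annual exceedance rate) pairs,
        ascending in ground motion; each bin's occurrence rate is the drop
        in exceedance rate across the bin."""
        rate = 0.0
        for (a0, l0), (a1, l1) in zip(hazard, hazard[1:]):
            a_mid = math.sqrt(a0 * a1)  # log-midpoint of the bin
            rate += lognormal_cdf(a_mid, median, beta) * (l0 - l1)
        return rate

    # Hypothetical hazard curve (PGA in g, annual exceedance rate):
    hazard = [(0.1, 1e-2), (0.2, 4e-3), (0.4, 1e-3), (0.8, 2e-4), (1.6, 2e-5)]
    risk = collapse_rate(hazard, median=0.8, beta=0.6)
    ```

    Steepening or flattening β changes the computed risk, which is why the BSSC's update of the fragility slope shifts the resulting mapped values.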

  6. Probabilistic and Scenario Seismic and Liquefaction Hazard Analysis of the Mississippi Embayment Incorporating Nonlinear Site Effects

    NASA Astrophysics Data System (ADS)

    Cramer, C. H.; Dhar, M. S.

    2017-12-01

The influence of the deep sediment deposits of the Mississippi Embayment (ME) on the propagation of seismic waves is poorly understood and remains a major source of uncertainty for site response analysis. Many researchers have studied the effects of these deposits on the seismic hazard of the area using the information available at the time. In this study, we have used updated and newly available resources for seismic and liquefaction hazard analyses of the ME. We have developed an improved 3D geological model. Additionally, we used surface geological maps from Cupples and Van Arsdale (2013) to prepare liquefaction hazard maps. Both equivalent linear and nonlinear site response codes were used to develop site amplification distributions for use in generating hazard maps. The site amplification distributions are created using the Monte Carlo approach of Cramer et al. (2004, 2006) on a 0.1-degree grid. The 2014 National Seismic Hazard model and attenuation relations (Petersen et al., 2014) are used to prepare seismic hazard maps. Liquefaction hazard maps are then generated using liquefaction probability curves from Holzer (2011) and Cramer et al. (2015). Equivalent linear response (with increased precision and restricted nonlinear behavior with depth) shows hazard for the ME similar to the nonlinear analysis (without pore pressure) results. At short periods nonlinear deamplification dominates the hazard, but at long periods resonance amplification dominates. The liquefaction hazard tends to be high in Holocene and late Pleistocene lowland sediments, even with lowered ground water levels, and low in the Pleistocene loess of the uplands. Considering pore pressure effects in nonlinear site response analysis at a test site in the lowlands shows amplification of ground motion at short periods. PGA estimates from ME liquefaction and MMI observations are in the 0.25 to 0.4 g range. Our estimated M7.5 PGA hazard within 10 km of the fault can exceed this.
Ground motion observations from liquefaction sites in New Zealand and Japan support PGAs below 0.4 g, except at sites within 20 km exhibiting pore-pressure-induced acceleration spikes due to cyclic mobility, where PGA ranges from 0.5 to 1.5 g. This study is being extended to more detailed seismic and liquefaction hazard studies in five western Tennessee counties under a five-year grant from HUD.

  7. The New Italian Seismic Hazard Model

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.

    2017-12-01

In 2015 the Seismic Hazard Center (Centro Pericolosità Sismica, CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community with the aim of elaborating a new reference seismic hazard model, mainly aimed at updating the seismic building code. The CPS designed a roadmap for releasing within three years a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with experts on earthquake engineering who will then participate in the revision of the building code. The activities were organized in 6 tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task selected the most updated information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models have been elaborated in terms of classic source areas, fault sources, and gridded seismicity, based on different approaches. The GMPEs task selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase was planned to design statistical procedures to test, against the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy.
We use a weighting scheme of the different components of the PSHA model that has been built through three different independent steps: a formal experts' elicitation, the outcomes of the testing phase, and the correlation between the outcomes. Finally, we explore through different techniques the influence on seismic hazard of the declustering procedure.

  8. EMERALD: Coping with the Explosion of Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2009-12-01

    The geosciences are currently generating an unparalleled quantity of new public broadband seismic data with the establishment of large-scale seismic arrays such as the EarthScope USArray, which are enabling new and transformative scientific discoveries of the structure and dynamics of the Earth’s interior. Much of this explosion of data is a direct result of the formation of the IRIS consortium, which has enabled an unparalleled level of open exchange of seismic instrumentation, data, and methods. The production of these massive volumes of data has generated new and serious data management challenges for the seismological community. A significant challenge is the maintenance and updating of seismic metadata, which includes information such as station location, sensor orientation, instrument response, and clock timing data. This key information changes at unknown intervals, and the changes are not generally communicated to data users who have already downloaded and processed data. Another basic challenge is the ability to handle massive seismic datasets when waveform file volumes exceed the fundamental limitations of a computer’s operating system. A third, long-standing challenge is the difficulty of exchanging seismic processing codes between researchers; each scientist typically develops his or her own unique directory structure and file naming convention, requiring that codes developed by another researcher be rewritten before they can be used. To address these challenges, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The overarching goal of the EMERALD project is to enable more efficient and effective use of seismic datasets ranging from just a few hundred to millions of waveforms with a complete database-driven system, leading to higher quality seismic datasets for scientific analysis and enabling faster, more efficient scientific research. 
We will present a preliminary (beta) version of EMERALD, an integrated, extensible, standalone database server system based on the open-source PostgreSQL database engine. The system is designed for fast and easy processing of seismic datasets, and provides the necessary tools to manage very large datasets and all associated metadata. EMERALD provides methods for efficient preprocessing of seismic records; large record sets can be easily and quickly searched, reviewed, revised, reprocessed, and exported. EMERALD can retrieve and store station metadata and alert the user to metadata changes. The system provides many methods for visualizing data, analyzing dataset statistics, and tracking the processing history of individual datasets. EMERALD allows development and sharing of visualization and processing methods using any of 12 programming languages. EMERALD is designed to integrate existing software tools; the system provides wrapper functionality for existing widely-used programs such as GMT, SOD, and TauP. Users can interact with EMERALD via a web browser interface, or they can directly access their data from a variety of database-enabled external tools. Data can be imported and exported from the system in a variety of file formats, or can be directly requested and downloaded from the IRIS DMC from within EMERALD.

  9. Improving Seismic Event Characterisation

    DTIC Science & Technology

    1996-07-22

    [Abstract text is OCR-damaged; only fragments are recoverable.] The system performs classification and further phase identification; seismic event interpretation is based on an assumption tree. Subject terms: seismic models, travel times, phase identification.

  10. New Site Coefficients and Site Classification System Used in Recent Building Seismic Code Provisions

    USGS Publications Warehouse

    Dobry, R.; Borcherdt, R.D.; Crouse, C.B.; Idriss, I.M.; Joyner, W.B.; Martin, G.R.; Power, M.S.; Rinne, E.E.; Seed, R.B.

    2000-01-01

Recent code provisions for buildings and other structures (1994 and 1997 NEHRP Provisions, 1997 UBC) have adopted new site amplification factors and a new procedure for site classification. Two amplitude-dependent site amplification factors are specified: Fa for short periods and Fv for longer periods. Previous codes included only a long-period factor S and did not provide for a short-period amplification factor. The new site classification system is based on definitions of five site classes in terms of a representative average shear wave velocity to a depth of 30 m (Vs30). This definition permits sites to be classified unambiguously. When the shear wave velocity is not available, other soil properties such as standard penetration resistance or undrained shear strength can be used. The new site classes, denoted by letters A-E, replace the site classes in previous codes denoted S1-S4. Site Classes A and B correspond to hard rock and rock, Site Class C corresponds to soft rock and very stiff/very dense soil, and Site Classes D and E correspond to stiff soil and soft soil. A sixth site class, F, is defined for soils requiring site-specific evaluations. Both Fa and Fv are functions of the site class and of the level of seismic hazard on rock, defined by parameters such as Aa and Av (1994 NEHRP Provisions), Ss and S1 (1997 NEHRP Provisions) or Z (1997 UBC). The values of Fa and Fv decrease as the seismic hazard on rock increases due to soil nonlinearity. The greatest impact of the new factors Fa and Fv as compared with the old S factors occurs in areas of low-to-medium seismic hazard. This paper summarizes the new site provisions, explains the basis for them, and discusses ongoing studies of site amplification in recent earthquakes that may influence future code developments.
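
The Vs30 definition and class boundaries described above can be sketched in a few lines. The boundary velocities follow the standard NEHRP values (1500, 760, 360, 180 m/s); the layer profile below is hypothetical illustration data, not from the paper.

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged Vs over the top 30 m: 30 / sum(d_i / v_i)."""
    depth = 0.0
    travel_time = 0.0
    for d, v in zip(thicknesses_m, velocities_mps):
        d = min(d, 30.0 - depth)          # clip the profile at 30 m depth
        travel_time += d / v
        depth += d
        if depth >= 30.0:
            break
    return 30.0 / travel_time

def site_class(v):
    """NEHRP site classes A-E by Vs30 (m/s); class F needs site-specific study."""
    if v > 1500: return "A"   # hard rock
    if v > 760:  return "B"   # rock
    if v > 360:  return "C"   # very dense soil / soft rock
    if v > 180:  return "D"   # stiff soil
    return "E"                # soft soil

# Hypothetical 4-layer profile: thicknesses (m) and velocities (m/s)
v = vs30([5, 10, 15, 20], [150, 250, 400, 600])
print(round(v), site_class(v))
```

Note that Vs30 is a travel-time (harmonic) average, not an arithmetic one, so slow near-surface layers dominate the result.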

  11. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    NASA Astrophysics Data System (ADS)

    Chock, G.

    2013-12-01

Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation and structural resilience of critical facilities, infrastructure, and key resources necessary for immediate response and economic and social recovery. Critical facilities include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects - for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code used in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The chapter will initially be applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and the Pacific Northwest states governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis.
Engineering design must consider the occurrence of events greater than scenarios in the historical record, and should properly be based on the underlying seismicity of subduction zones. Therefore, Probabilistic Tsunami Hazard Analysis (PTHA) consistent with source seismicity must be performed in addition to consideration of historical event scenarios. A method of Probabilistic Tsunami Hazard Analysis has been established that is generally consistent with Probabilistic Seismic Hazard Analysis in the treatment of uncertainty. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. Structural member acceptability criteria will be based on performance objectives for a 2,500-year Maximum Considered Tsunami. The approach developed by the ASCE Tsunami Loads and Effects Subcommittee of the ASCE 7 Standard would result in the first national unification of tsunami hazard criteria for design codes reflecting the modern approach of Performance-Based Engineering.

  12. System and method for generating micro-seismic events and characterizing properties of a medium with non-linear acoustic interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A.

    2015-12-29

A method and system includes generating a first coded acoustic signal composed of pulses, each containing a modulated signal at a central frequency, and a second coded acoustic signal in which each pulse contains a modulated signal whose central frequency is a fraction d of the central frequency of the modulated signal in the corresponding pulse of the first signal. A receiver detects a third signal generated by a non-linear mixing process in the mixing zone, and the received data are processed to extract the third signal, yielding an emulated micro-seismic event signal occurring at the mixing zone; this signal is used to characterize properties of the medium, to create a 3D image of those properties, or both.

  13. Random vibration analysis of train-bridge under track irregularities and traveling seismic waves using train-slab track-bridge interaction model

    NASA Astrophysics Data System (ADS)

    Zeng, Zhi-Ping; Zhao, Yan-Gang; Xu, Wen-Tao; Yu, Zhi-Wu; Chen, Ling-Kun; Lou, Ping

    2015-04-01

The frequent use of bridges in high-speed railway lines greatly increases the probability that trains are running on bridges when earthquakes occur. This paper investigates the random vibrations of a high-speed train traversing a slab track on a continuous girder bridge subjected to track irregularities and traveling seismic waves by the pseudo-excitation method (PEM). To derive the equations of motion of the train-slab track-bridge interaction system, multibody dynamics and finite element models are used for the train and for the track and bridge, respectively. By assuming track irregularities to be fully coherent random excitations with time lags between different wheels, and seismic accelerations to be uniformly modulated, non-stationary random excitations with time lags between different foundations, the random load vectors of the equations of motion are transformed into a series of deterministic pseudo-excitations based on PEM and the wheel-rail contact relationship. A computer code is developed to obtain the time-dependent random responses of the entire system. As a case study, the random vibration characteristics of an ICE-3 high-speed train traversing a seven-span continuous girder bridge simultaneously excited by track irregularities and traveling seismic waves are analyzed. The influence of train speed and seismic wave propagation velocity on the random vibration characteristics of the bridge and train is discussed.
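
The pseudo-excitation method named above can be illustrated on a single-degree-of-freedom oscillator: a stationary random load with power spectral density S(w) is replaced by the deterministic pseudo-excitation sqrt(S(w))*exp(iwt), and the response PSD is the squared magnitude of the harmonic response. A minimal sketch with hypothetical parameters; the actual train-track-bridge model is a large finite element system with time-lagged, non-stationary excitations.

```python
import math

m, c, k = 1.0, 0.4, 100.0            # mass, damping, stiffness (illustrative)

def load_psd(w):
    """Hypothetical input PSD of the random load."""
    return 1.0 / (1.0 + (w / 20.0) ** 2)

def response_psd(w):
    # PEM: apply the pseudo-excitation sqrt(S(w)) * exp(i*w*t); the steady
    # harmonic response is y(w) = H(w) * sqrt(S(w)), and S_yy(w) = |y(w)|^2.
    pseudo = math.sqrt(load_psd(w))
    H = 1.0 / complex(k - m * w ** 2, c * w)   # receptance of the oscillator
    return abs(H * pseudo) ** 2

w = 10.0   # at resonance: sqrt(k/m) = 10 rad/s
print(response_psd(w))
```

The appeal of PEM is that each frequency line becomes an ordinary deterministic harmonic analysis, which scales to large finite element models far better than direct random-vibration algebra.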

  14. Seismic hazard map of the western hemisphere

    USGS Publications Warehouse

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($ 25 billion), and 1995 Kobe, Japan (> $ 100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. 
PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of the Americas depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one- to two-story buildings). The largest seismic hazard values in the western hemisphere generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes. Although the largest earthquakes ever recorded are the 1960 Chile and 1964 Alaska subduction zone earthquakes, the largest seismic hazard (PGA) value in the Americas is in southern California (U.S.), along the San Andreas fault.
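
The map's hazard level of a 10% chance of exceedance in 50 years corresponds, under the usual Poisson occurrence assumption, to a mean return period of roughly 475 years (and 2% in 50 years to roughly 2475 years). A minimal sketch of the conversion:

```python
import math

def return_period(p_exceed, t_years):
    """Mean return period for exceedance probability p within t years,
    assuming Poisson occurrence: p = 1 - exp(-t/T), so T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))   # ~475 years
print(round(return_period(0.02, 50)))   # ~2475 years
```

These are the return periods that appear throughout seismic (and, below, tsunami) design provisions.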

  15. Building configuration and seismic design: The architecture of earthquake resistance

    NASA Astrophysics Data System (ADS)

    Arnold, C.; Reitherman, R.; Whitaker, D.

    1981-05-01

The architecture of a building is examined in relation to its ability to withstand earthquakes. Aspects of ground motion that are significant to building behavior are discussed. Results of a survey of configuration decisions that affect the performance of buildings are provided, with a focus on the architectural aspects of configuration design. Configuration derivation, building type as it relates to seismic design, and seismic issues in the design process are examined. Case studies of the Veterans' Administration Hospital in Loma Linda, California, and the Imperial Hotel in Tokyo, Japan, are presented. The seismic design process is described, paying special attention to configuration issues. The need is stressed for guidelines, codes, and regulations to ensure design solutions that respect and balance the full range of architectural, engineering, and material influences on seismic hazards.

  16. 2008 United States National Seismic Hazard Maps

    USGS Publications Warehouse

Petersen, M.D.

    2008-01-01

    The U.S. Geological Survey recently updated the National Seismic Hazard Maps by incorporating new seismic, geologic, and geodetic information on earthquake rates and associated ground shaking. The 2008 versions supersede those released in 1996 and 2002. These maps are the basis for seismic design provisions of building codes, insurance rate structures, earthquake loss studies, retrofit priorities, and land-use planning. Their use in design of buildings, bridges, highways, and critical infrastructure allows structures to better withstand earthquake shaking, saving lives and reducing disruption to critical activities following a damaging event. The maps also help engineers avoid costs from over-design for unlikely levels of ground motion.

  17. The effect of alternative seismotectonic models on PSHA results - a sensitivity study for two sites in Israel

    NASA Astrophysics Data System (ADS)

    Avital, Matan; Kamai, Ronnie; Davis, Michael; Dor, Ory

    2018-02-01

We present a full probabilistic seismic hazard analysis (PSHA) sensitivity study for two sites in southern Israel - one in the near field of a major fault system and one farther away. The PSHA is conducted for alternative source representations, using alternative model parameters for the main seismic sources, such as slip rate and Mmax, among others. The analysis also considers the effect of the ground motion prediction equation (GMPE) on the hazard results. In this way, the two types of epistemic uncertainty - modelling uncertainty and parametric uncertainty - are treated and addressed. We quantify the uncertainty propagation by testing its influence on the final calculated hazard, such that the controlling knowledge gaps are identified and can be treated in future studies. We find that current practice in Israel, as represented by the current version of the building code, grossly underestimates the hazard: by approximately 40 % at short return periods (e.g. 10 % in 50 years) and by as much as 150 % at long return periods (e.g. an annual exceedance probability of 10^-5). The analysis shows that this underestimation is most probably due to a combination of factors, including source definitions as well as the GMPE used for the analysis.
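
Epistemic alternatives like those above (source representations crossed with GMPEs) are commonly combined in a weighted logic tree: each branch carries a hazard curve and a weight, and the mean hazard is their weighted sum. A minimal sketch with hypothetical branch rates and weights, not values from the study:

```python
# Hypothetical annual exceedance rates at one PGA level, per branch
branch_hazard = {
    ("sources_A", "gmpe_1"): 0.12,
    ("sources_A", "gmpe_2"): 0.18,
    ("sources_B", "gmpe_1"): 0.08,
    ("sources_B", "gmpe_2"): 0.10,
}
# Branch weights; must sum to 1 across the tree
weights = {
    ("sources_A", "gmpe_1"): 0.3,
    ("sources_A", "gmpe_2"): 0.3,
    ("sources_B", "gmpe_1"): 0.2,
    ("sources_B", "gmpe_2"): 0.2,
}

assert abs(sum(weights.values()) - 1.0) < 1e-12
mean_rate = sum(branch_hazard[b] * weights[b] for b in branch_hazard)
print(mean_rate)
```

Sensitivity studies like the one above amount to inspecting how much the individual branch curves spread around this weighted mean.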

  18. Deterministic seismic hazard macrozonation of India

    NASA Astrophysics Data System (ADS)

    Kolathayar, Sreevalsa; Sitharam, T. G.; Vipin, K. S.

    2012-10-01

Earthquakes are known to have occurred in the Indian subcontinent from ancient times. This paper presents the results of a seismic hazard analysis of India (6°-38°N and 68°-98°E) based on the deterministic approach using the latest seismicity data (up to 2010). The hazard analysis was done using two different source models (linear sources and point sources) and 12 well-recognized attenuation relations covering the varied tectonic provinces in the region. The earthquake data obtained from different sources were homogenized and declustered, and a total of 27,146 earthquakes of moment magnitude 4 and above were listed for the study area. The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.1° × 0.1° (approximately 10 × 10 km), and the hazard parameters were calculated at the center of each cell by considering all seismic sources within a radius of 300 to 400 km. Rock-level peak horizontal acceleration (PHA) and spectral accelerations for periods of 0.1 and 1 s have been calculated for all grid points with a deterministic approach using a code written in MATLAB. Epistemic uncertainty in the hazard definition has been treated within a logic-tree framework considering two types of sources and three attenuation models for each grid point. Hazard evaluation without the logic-tree approach has also been carried out for comparison of the results. Contour maps showing the spatial variation of the hazard values are presented in the paper.
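
The deterministic grid calculation described above can be sketched as follows: for each grid cell, take the maximum ground motion produced by any source within the cutoff radius, as predicted by an attenuation relation. The attenuation form and coefficients below are generic placeholders, not any of the 12 relations used in the study.

```python
import math

def pga_g(magnitude, dist_km):
    """Generic attenuation form: ln(PGA) = a + b*M - ln(R + R0) - c*R.
    Coefficients are illustrative placeholders only."""
    return math.exp(-3.5 + 0.9 * magnitude
                    - math.log(dist_km + 10.0)
                    - 0.003 * dist_km)

# Hypothetical sources affecting one grid cell: (Mw, distance in km)
sources = [(7.0, 120.0), (5.5, 30.0), (6.2, 250.0)]
cutoff_km = 300.0

# Deterministic hazard at the cell: the worst case over in-range sources
pha = max(pga_g(m, r) for m, r in sources if r <= cutoff_km)
print(round(pha, 3))
```

Repeating this over every 0.1° x 0.1° cell yields the contour maps of deterministic PHA; the moderate nearby event often controls over the larger distant one, as it does here.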

  19. Sweetwater, Texas Large N Experiment

    NASA Astrophysics Data System (ADS)

    Sumy, D. F.; Woodward, R.; Barklage, M.; Hollis, D.; Spriggs, N.; Gridley, J. M.; Parker, T.

    2015-12-01

From 7 March to 30 April 2014, NodalSeismic, Nanometrics, and IRIS PASSCAL conducted a collaborative, spatially-dense seismic survey with several thousand nodal short-period geophones complemented by a backbone array of broadband sensors near Sweetwater, Texas. This pilot project demonstrated the efficacy of industry-academic partnerships and leveraged a larger, commercial 3D survey to collect passive-source seismic recordings to image the subsurface. This innovative deployment of a large-N mixed-mode array allows industry to explore array geometries and investigate the value of broadband recordings, while affording academics a dense wavefield imaging capability and an operational model for high-volume instrument deployment. The broadband array consists of 25 continuously-recording stations from IRIS PASSCAL and Nanometrics, with an array design that maximized recording of horizontally-traveling seismic energy for surface wave analysis over the primary target area, with sufficient offset for imaging objectives at depth. In addition, 2639 FairfieldNodal Zland nodes from NodalSeismic were deployed in three sub-arrays: the outlier, backbone, and active source arrays. The backbone array consisted of 292 nodes that covered the entire survey area, while the outlier array consisted of 25 continuously-recording nodes distributed at a ~3 km distance away from the survey perimeter. Both the backbone and outlier arrays provide valuable constraints for the passive-source portion of the analysis. This project serves as a learning platform to develop best practices in the support of large-N arrays with joint industry and academic expertise. Here we investigate lessons learned from a facility perspective, and present examples of data from the various sensors and array geometries. We explore first-order results from local and teleseismic earthquakes, and show visualizations of the data across the array. Data are archived at the IRIS DMC under station codes XB and 1B.

  20. Appalachian Basin Play Fairway Analysis: Thermal Quality Analysis in Low-Temperature Geothermal Play Fairway Analysis (GPFA-AB

    DOE Data Explorer

    Teresa E. Jordan

    2015-11-15

    This collection of files are part of a larger dataset uploaded in support of Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB, DOE Project DE-EE0006726). Phase 1 of the GPFA-AB project identified potential Geothermal Play Fairways within the Appalachian basin of Pennsylvania, West Virginia and New York. This was accomplished through analysis of 4 key criteria or ‘risks’: thermal quality, natural reservoir productivity, risk of seismicity, and heat utilization. Each of these analyses represent a distinct project task, with the fifth task encompassing combination of the 4 risks factors. Supporting data for all five tasks has been uploaded into the Geothermal Data Repository node of the National Geothermal Data System (NGDS). This submission comprises the data for Thermal Quality Analysis (project task 1) and includes all of the necessary shapefiles, rasters, datasets, code, and references to code repositories that were used to create the thermal resource and risk factor maps as part of the GPFA-AB project. The identified Geothermal Play Fairways are also provided with the larger dataset. Figures (.png) are provided as examples of the shapefiles and rasters. The regional standardized 1 square km grid used in the project is also provided as points (cell centers), polygons, and as a raster. Two ArcGIS toolboxes are available: 1) RegionalGridModels.tbx for creating resource and risk factor maps on the standardized grid, and 2) ThermalRiskFactorModels.tbx for use in making the thermal resource maps and cross sections. These toolboxes contain “item description” documentation for each model within the toolbox, and for the toolbox itself. 
This submission also contains three R scripts: 1) AddNewSeisFields.R to add seismic risk data to attribute tables of seismic risk, 2) StratifiedKrigingInterpolation.R for the interpolations used in the thermal resource analysis, and 3) LeaveOneOutCrossValidation.R for the cross validations used in the thermal interpolations. Some file descriptions make reference to various 'memos'. These are contained within the final report submitted October 16, 2015. Each zipped file in the submission contains an 'about' document describing the full Thermal Quality Analysis content available, along with key sources, authors, citation, use guidelines, and assumptions, with the specific file(s) contained within the .zip file highlighted.
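
The leave-one-out cross-validation strategy used for the thermal interpolations (the project's LeaveOneOutCrossValidation.R) can be sketched with a simple inverse-distance-weighted interpolator standing in for the kriging model; the points and temperature values below are hypothetical.

```python
def idw(x, y, points, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (px, py, value) data."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return v                        # exact hit on a data point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

def loocv_rmse(points):
    """Hold each point out, predict it from the rest, and report the RMSE."""
    errs = []
    for i, (px, py, v) in enumerate(points):
        rest = points[:i] + points[i + 1:]
        errs.append(idw(px, py, rest) - v)
    return (sum(e * e for e in errs) / len(errs)) ** 0.5

# Hypothetical (x, y, temperature) observations on a small grid
data = [(0, 0, 30.0), (1, 0, 32.0), (0, 1, 31.0), (1, 1, 33.0)]
print(round(loocv_rmse(data), 3))
```

The same held-out-error loop applies unchanged when the interpolator is stratified kriging rather than IDW; only the prediction function differs.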

  1. Probabilistic seismic hazard at the archaeological site of Gol Gumbaz in Vijayapura, south India

    NASA Astrophysics Data System (ADS)

    Patil, Shivakumar G.; Menon, Arun; Dodagoudar, G. R.

    2018-03-01

Probabilistic seismic hazard analysis (PSHA) is carried out for the archaeological site of Vijayapura in south India in order to obtain hazard-consistent seismic input ground motions for seismic risk assessment and for the design of seismic protection measures for monuments, where warranted. For this purpose the standard Cornell-McGuire approach, based on seismogenic zones with uniformly distributed seismicity, is employed. The main features of this study are the use of an updated and unified seismic catalogue based on moment magnitude, new seismogenic source models, and recent ground motion prediction equations (GMPEs) within a logic-tree framework. Seismic hazard at the site is evaluated for a level, rock site condition with 10% and 2% probabilities of exceedance in 50 years; the corresponding peak ground accelerations (PGAs) are 0.074 and 0.142 g, respectively. In addition, the uniform hazard spectra (UHS) of the site are compared to the Indian code-defined spectrum. Comparisons are also made with results from the National Disaster Management Authority (NDMA 2010) in terms of PGA and pseudo-spectral accelerations (PSAs) at T = 0.2, 0.5, 1.0 and 1.25 s for 475- and 2475-yr return periods. Results of the present study are in good agreement with the PGA calculated from the isoseismal map of the Killari earthquake, Mw = 6.4 (1993). Disaggregation of the PSHA results for PGA and for spectral acceleration (Sa) at 0.5 s shows that the controlling scenario earthquake for the study region is of low to moderate magnitude with its source at a short distance from the study site. Deterministic seismic hazard analysis (DSHA) is also carried out by taking into account three scenario earthquakes. The UHS corresponding to the 475-yr return period (RP) is used to define the target spectrum and, accordingly, spectrum-compatible natural accelerograms are selected from the suite of recorded accelerograms.

  2. Method for rapid high-frequency seismogram calculation

    NASA Astrophysics Data System (ADS)

    Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo

    2009-02-01

We present a method for rapid, high-frequency seismogram calculation that makes use of an algorithm to automatically generate an exhaustive set of seismic phases with appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account existing constraints on ray paths as well as physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian "COdice Multifase per il RAy-tracing Dinamico") uses a dynamic ray-tracing code as its core. To validate the code, we have computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete wavenumber method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude over the whole seismogram is negligible. Moreover, the computation time for the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is 3- to 4-fold less than that needed for the AXITRA code (up to a frequency of 25 Hz).

  3. A Real-time, Borehole, Geophysical Observatory Above The Cascadia Subduction Zone

    NASA Astrophysics Data System (ADS)

    Collins, J. A.; McGuire, J. J.; Becker, K.; O'Brien, J. K.; von der Heydt, K.; Heesemann, M.; Davis, E. E.

    2017-12-01

    In July 2016, a team from WHOI and RSMAS installed a suite of seismic, geodetic and geothermal sensors in IODP borehole U1364A on the Cascadia Accretionary Prism offshore Vancouver Island. The borehole observatory was connected to the Clayoquot Slope node of the Ocean Networks Canada NEPTUNE Observatory in June 2017. The 3 km long extension cable provides power, timing, and internet connectivity. The borehole sits 4 km above the subduction zone thrust interface, and when drilled in 2010 was instrumented with an ACORK (Advanced Circulation Obviation Retrofit Kit) that allows monitoring and sampling of fluids from multiple zones within the 330 m drilled formation. The borehole ground-motion sensors consist of a broadband seismometer and two geodetic-quality (nano-radian resolution) two-axis tilt sensors clamped to the borehole casing wall at a depth of 277 m below the seafloor. The tilt sensors were selected to detect non-seismic, strain-related transients. A 24-thermistor cable extends from the seafloor to just above the seismometer and tilt-sensor package. The seismic and geodetic data have been flowing from the observatory (network code NV, station code CQS64, location codes B1, B2, and B3) since June and are available from the IRIS DMC. Initial inspection of the seismic and geodetic data shows that all sensors are operating well. We will report on station performance and detection thresholds using an anticipated 5 month duration data set.

  4. Modeling the Fluid Withdraw and Injection Induced Earthquakes

    NASA Astrophysics Data System (ADS)

    Meng, C.

    2016-12-01

We present an open-source numerical code, Defmod, that allows one to model induced seismicity in an efficient and standalone manner. Fluid withdrawal- and injection-induced earthquakes have been a great concern to industries including oil/gas, wastewater disposal and CO2 sequestration, and the ability to numerically model induced seismicity has long been desired. To do so, one has to consider at least two processes: a steady process that describes the inducing and aseismic stages before and in between the seismic events, and an abrupt process that describes the dynamic fault rupture accompanied by seismic energy radiation during the events. The steady process can be adequately modeled by a quasi-static model, while the abrupt process has to be modeled by a dynamic model. In most published modeling works, only one of these processes is considered: geomechanicists and reservoir engineers focus more on quasi-static modeling, whereas geophysicists and seismologists focus more on dynamic modeling. The finite element code Defmod combines these two models into a hybrid model that uses failure criteria and frictional laws to adaptively switch between the (quasi-)static and dynamic states. The code is capable of modeling episodic fault rupture driven by quasi-static loading, e.g. due to reservoir fluid withdrawal and/or injection, and by dynamic loading, e.g. due to preceding earthquakes. We demonstrate a case study of the 2013 Azle earthquake.
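
The adaptive switch between quasi-static and dynamic states can be sketched as a Mohr-Coulomb check on fault stresses: stay quasi-static while every fault point is below frictional strength, and hand off to the dynamic solver once any point reaches it. The criterion form is standard, but the parameters are illustrative and this is not Defmod's actual implementation.

```python
def coulomb_strength(normal_stress, friction=0.6, cohesion=0.5):
    """Frictional strength tau_c = c + mu * sigma_n (compression positive).
    mu and c here are illustrative values, not calibrated ones."""
    return cohesion + friction * normal_stress

def solver_state(fault_points):
    """fault_points: list of (shear_stress, normal_stress) samples on the fault."""
    for tau, sigma_n in fault_points:
        if tau >= coulomb_strength(sigma_n):
            return "dynamic"      # rupture nucleates: switch to the dynamic solver
    return "quasi-static"         # aseismic loading continues

print(solver_state([(5.0, 10.0), (6.0, 10.0)]))   # all below strength of 6.5
print(solver_state([(6.6, 10.0)]))                # strength reached
```

In a hybrid scheme the quasi-static solver advances in large time steps between such checks, and the dynamic solver takes over with small steps only while rupture and wave radiation are underway.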

  5. Masonry Infilling Effect On Seismic Vulnerability and Performance Level of High Ductility RC Frames

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghalehnovi, M.; Shahraki, H.

    2008-07-08

In recent years, researchers have come to prefer behavior-based design of structures over force-based design for earthquake-resistant construction; this approach is known as performance-based design. Its main goal is to design structural members for a specified performance or behavior. In most buildings, however, the load-bearing frames are infilled with masonry materials, which leads to considerable changes in the mechanical properties of the frames. The effect of infill walls has usually been ignored in nonlinear analyses of structures because of the complexity of the problem and the lack of a simple rational solution. As a result, the lateral stiffness, strength, ductility and performance of the structure are computed with less accuracy. In this paper, using a smooth hysteretic model for the masonry infill, several high-ductility RC frames (4- and 8-story, with 1, 2 and 3 spans) designed according to the Iranian code are considered. They have been analyzed by the nonlinear dynamic method in two states, with and without infill. Their performance has then been determined with the criteria of ATC 40 and compared with the performance recommended in the Iranian seismic code (Standard No. 2800).

  6. Inversion of ground-motion data from a seismometer array for rotation using a modification of Jaeger's method

    USGS Publications Warehouse

    Chi, Wu-Cheng; Lee, W.H.K.; Aston, J.A.D.; Lin, C.J.; Liu, C.-C.

    2011-01-01

    We develop a new way to invert 2D translational waveforms using Jaeger's (1969) formula to derive rotational ground motions about one axis and estimate the errors in them using techniques from statistical multivariate analysis. This procedure can be used to derive rotational ground motions and strains using arrayed translational data, thus providing an efficient way to calibrate the performance of rotational sensors. This approach does not require a priori information about the noise level of the translational data and elastic properties of the media. This new procedure also provides estimates of the standard deviations of the derived rotations and strains. In this study, we validated this code using synthetic translational waveforms from a seismic array. The results after the inversion of the synthetics for rotations were almost identical with the results derived using a well-tested inversion procedure by Spudich and Fletcher (2009). This new 2D procedure can be applied three times to obtain the full, three-component rotations. Additional modifications can be implemented to the code in the future to study different features of the rotational ground motions and strains induced by the passage of seismic waves.
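
The array-based derivation of rotation can be sketched as a least-squares fit of linear spatial gradients to the horizontal displacements recorded at several stations, from which the rotation about the vertical axis is omega_z = (1/2)(dv/dx - du/dy). The station geometry and imposed gradients below are synthetic, and this is a simplified stand-in for the paper's Jaeger-based procedure, without its error estimation.

```python
import numpy as np

rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(6, 2))          # station coordinates (km)

# Impose known, spatially linear displacement fields (synthetic "records")
true_grad_u = np.array([2.0, -1.0])               # du/dx, du/dy
true_grad_v = np.array([0.5, 3.0])                # dv/dx, dv/dy
u = xy @ true_grad_u                              # east displacements
v = xy @ true_grad_v                              # north displacements

# Least-squares fit of gradients + offset: d(x, y) ~ gx*x + gy*y + d0
G = np.column_stack([xy, np.ones(len(xy))])       # design matrix [x, y, 1]
gu, _, _, _ = np.linalg.lstsq(G, u, rcond=None)
gv, _, _, _ = np.linalg.lstsq(G, v, rcond=None)

omega_z = 0.5 * (gv[0] - gu[1])                   # 0.5 * (dv/dx - du/dy)
print(round(float(omega_z), 6))                   # recovers 0.5*(0.5-(-1.0)) = 0.75
```

Applying the same 2D fit in the other two coordinate planes gives the full three-component rotation, as the abstract notes.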

  7. VS30, site amplifications and some comparisons: The Adapazari (Turkey) case

    NASA Astrophysics Data System (ADS)

    Ozcep, Tazegul; Ozcep, Ferhat; Ozel, Oguz

The aim of this study was to investigate the role of VS30 in site amplifications in the Adapazari region, Turkey. To fulfil this aim, amplifications from VS30 measurements were compared with earthquake data for the different soil types in the seismic design codes. The Adapazari area was selected as the study area, and the shear-wave velocity distribution was obtained by the multichannel analysis of surface waves (MASW) method at 100 sites for the top 50 m of soil. Aftershocks of the Mw 7.4 Izmit earthquake of 17 August 1999, with magnitudes between 4.0 and 5.6, were recorded at six stations installed in and around the Adapazari Basin, at Babalı, Şeker, Genç, Hastane, Toyota and Imar. These data were used to estimate site amplifications by the reference-station method. In addition, the fundamental periods of the station sites were estimated by the single-station method. Site classifications based on VS30 in the seismic design codes were compared with the fundamental periods and amplification values. It was found that site amplifications (from earthquake data) and the relevant spectra (from VS30) are not in good agreement for soils in Adapazari (Turkey).
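
The reference-station method named above estimates amplification as the spectral ratio of the soil-site record to a nearby rock (reference) record of the same event. A minimal sketch with synthetic signals; real use requires event windowing and spectral smoothing.

```python
import numpy as np

fs = 100.0                                       # sampling rate (Hz)
t = np.arange(0, 10.0, 1.0 / fs)

# Synthetic records of the same event: the soil site amplifies by a factor 3
rock = np.sin(2 * np.pi * 2.0 * t)               # reference (rock) record
soil = 3.0 * np.sin(2 * np.pi * 2.0 * t)         # soil-site record

spec_rock = np.abs(np.fft.rfft(rock))
spec_soil = np.abs(np.fft.rfft(soil))
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)

i = np.argmin(np.abs(freqs - 2.0))               # spectral bin nearest 2 Hz
print(round(float(spec_soil[i] / spec_rock[i]), 2))   # amplification ~3.0
```

The single-station (H/V) method mentioned in the abstract replaces the rock record with the site's own vertical component, trading accuracy for not needing a reference site.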

  8. Probabilistic Seismic Hazard Analysis of Victoria, British Columbia, Canada: Considering an Active Leech River Fault

    NASA Astrophysics Data System (ADS)

    Kukovica, J.; Molnar, S.; Ghofrani, H.

    2017-12-01

    The Leech River fault is situated on Vancouver Island near the city of Victoria, British Columbia, Canada. The 60 km transpressional reverse fault zone runs east to west along the southern tip of Vancouver Island, dividing the lithologic units of the Jurassic-Cretaceous Leech River Complex schists to the north and the Eocene Metchosin Formation basalts to the south. This fault system poses a considerable hazard due to its proximity to Victoria and to 3 major hydroelectric dams. The Canadian seismic hazard model for the 2015 National Building Code of Canada (NBCC) considered the fault system to be inactive. However, recent paleoseismic evidence suggests that at least 2 surface-rupturing events exceeding a moment magnitude (M) of 6.5 have occurred within the last 15,000 years (Morell et al. 2017). We perform a Probabilistic Seismic Hazard Analysis (PSHA) for the city of Victoria with the Leech River fault considered as an active source. A PSHA for Victoria that replicates the 2015 NBCC estimates is first carried out to calibrate our PSHA procedure, using the same seismic source zones, magnitude recurrence parameters, and Ground Motion Prediction Equations (GMPEs). We replicate the uniform hazard spectrum for a probability of exceedance of 2% in 50 years for a 500 km radial area around Victoria. An active Leech River fault zone, with its known length and dip, is then added. We determine magnitude recurrence parameters for the Leech River fault from a Gutenberg-Richter relationship fitted to catalogues of recorded seismicity (M 2-3) within the fault's vicinity and to the proposed paleoseismic events. We seek to understand whether inclusion of an active Leech River fault source will significantly increase the probabilistic seismic hazard for Victoria. Morell et al. 2017. Quaternary rupture of a crustal fault beneath Victoria, British Columbia, Canada. GSA Today, 27, doi: 10.1130/GSATG291A.1
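    One common way to obtain Gutenberg-Richter recurrence parameters from a catalogue, as described above for the Leech River fault, is the Aki maximum-likelihood b-value together with an a-value from the total count above the completeness magnitude. A sketch under those assumptions (not the authors' implementation):

    ```python
    import math

    def gutenberg_richter(mags, mc, years):
        """Aki (1965) maximum-likelihood b-value, plus the a-value implied by the
        annual count of events with magnitude >= completeness magnitude mc."""
        m = [x for x in mags if x >= mc]
        b = math.log10(math.e) / (sum(m) / len(m) - mc)
        a = math.log10(len(m) / years) + b * mc
        return a, b

    def annual_rate(a, b, m):
        """Annual rate of events with magnitude >= m: 10**(a - b*m)."""
        return 10 ** (a - b * m)
    ```

    With a = 2.0 and b = 1.0, for instance, the implied annual rate of M >= 6.5 events is 10**(-4.5), i.e. a recurrence interval of roughly 32,000 years; paleoseismic events are what constrain the rare large-magnitude end in practice.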

  9. New seismic hazard maps for Puerto Rico and the U.S. Virgin Islands

    USGS Publications Warehouse

    Mueller, C.; Frankel, A.; Petersen, M.; Leyendecker, E.

    2010-01-01

    The probabilistic methodology developed by the U.S. Geological Survey is applied to a new seismic hazard assessment for Puerto Rico and the U.S. Virgin Islands. Modeled seismic sources include gridded historical seismicity, subduction-interface and strike-slip faults with known slip rates, and two broad zones of crustal extension with seismicity rates constrained by GPS geodesy. We use attenuation relations from western North American and worldwide data, as well as a Caribbean-specific relation. Results are presented as maps of peak ground acceleration and 0.2- and 1.0-second spectral response acceleration for 2% and 10% probabilities of exceedance in 50 years (return periods of about 2,500 and 500 years, respectively). This paper describes the hazard model and maps that were balloted by the Building Seismic Safety Council and recommended for the 2003 NEHRP Provisions and the 2006 International Building Code. © 2010, Earthquake Engineering Research Institute.
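    The correspondence quoted above between exceedance probability and return period (2% in 50 years ≈ 2,500 years; 10% in 50 years ≈ 500 years) follows from a Poisson occurrence model, T = -t / ln(1 - p). A one-function sketch:

    ```python
    import math

    def return_period(p_exceed, t_years):
        """Return period for exceedance probability p_exceed within t_years,
        assuming Poisson occurrence: T = -t / ln(1 - p)."""
        return -t_years / math.log(1.0 - p_exceed)
    ```

    This gives 2474.9 years for 2%/50 yr and 474.6 years for 10%/50 yr, which are conventionally rounded to 2,475 (or "about 2,500") and 475 (or "about 500") years.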

  10. Development of damage probability matrices based on Greek earthquake damage data

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.

    2011-03-01

    A comprehensive study is presented for the empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database was created comprising 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified into specific structural types according to the materials, seismic codes and construction techniques of Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio αg/ao, where αg is the maximum peak ground acceleration (PGA) of the earthquake event and ao is the unique PGA value that characterizes each municipality on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of the damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. Finally, a comparative analysis is carried out between the produced and the existing vulnerability models.
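    The relative and cumulative damage-state frequencies that populate one row of a DPM can be sketched as follows (illustrative function, not the authors' code):

    ```python
    def damage_probability_row(counts):
        """One DPM row for a given structural type and intensity level.
        counts: observed building counts, ordered from the lightest to the
        heaviest damage state. Returns (relative frequencies,
        cumulative reach-or-exceed probabilities)."""
        total = sum(counts)
        rel = [c / total for c in counts]
        cum = [sum(rel[i:]) for i in range(len(rel))]
        return rel, cum
    ```

    For example, 50/30/20 buildings in three damage states give relative frequencies 0.5/0.3/0.2 and exceedance probabilities 1.0/0.5/0.2; the exceedance form is what vulnerability curves are fitted to.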

  11. Wavelet extractor: A Bayesian well-tie and wavelet extraction program

    NASA Astrophysics Data System (ADS)

    Gunning, James; Glinsky, Michael E.

    2006-06-01

    We introduce a new open-source toolkit for the well-tie or wavelet extraction problem of estimating seismic wavelets from seismic data, time-to-depth information, and well-log suites. The wavelet extraction model is formulated as a Bayesian inverse problem, and the software will simultaneously estimate wavelet coefficients, other parameters associated with uncertainty in the time-to-depth mapping, positioning errors in the seismic imaging, and useful amplitude-variation-with-offset (AVO) related parameters in multi-stack extractions. It is capable of multi-well, multi-stack extractions, and uses continuous seismic data-cube interpolation to cope with the problem of arbitrary well paths. Velocity constraints in the form of checkshot data, interpreted markers, and sonic logs are integrated in a natural way. The Bayesian formulation allows computation of full posterior uncertainties of the model parameters, and the important problem of the uncertain wavelet span is addressed using a multi-model posterior developed from Bayesian model selection theory. The wavelet extraction tool is distributed as part of the Delivery seismic inversion toolkit. A simple log and seismic viewing tool is included in the distribution. The code is written in Java, and is thus platform independent, but the Seismic Unix (SU) data model makes the inversion particularly suited to Unix/Linux environments. It is a natural companion piece of software to Delivery, having the capacity to produce maximum likelihood wavelet and noise estimates, but it will also be of significant utility to practitioners wanting to produce wavelet estimates for other inversion codes or purposes. The generation of full parameter uncertainties is a crucial function for workers wishing to investigate questions of wavelet stability before proceeding to more advanced inversion studies.

  12. Neo-deterministic seismic hazard scenarios for India—a preventive tool for disaster mitigation

    NASA Astrophysics Data System (ADS)

    Parvez, Imtiyaz A.; Magrin, Andrea; Vaccari, Franco; Ashish; Mir, Ramees R.; Peresan, Antonella; Panza, Giuliano Francesco

    2017-11-01

    Current computational resources and physical knowledge of the seismic wave generation and propagation processes allow for reliable numerical and analytical models of waveform generation and propagation. From the simulation of ground motion, it is easy to extract the desired earthquake hazard parameters. Accordingly, a scenario-based approach to seismic hazard assessment has been developed, namely the neo-deterministic seismic hazard assessment (NDSHA), which allows a wide range of possible seismic sources to be used in the definition of reliable scenarios by means of realistic waveform modelling. Such reliable and comprehensive characterization of expected earthquake ground motion is essential to improve building codes, particularly for the protection of critical infrastructures and for land use planning. Parvez et al. (Geophys J Int 155:489-508, 2003) published the first ever neo-deterministic seismic hazard map of India by computing synthetic seismograms with an input data set consisting of structural models, seismogenic zones, focal mechanisms and earthquake catalogues. As described in Panza et al. (Adv Geophys 53:93-165, 2012), the NDSHA methodology evolved with respect to the original formulation used by Parvez et al. (Geophys J Int 155:489-508, 2003): the computer codes were improved to better meet the need to produce realistic ground shaking maps and ground shaking scenarios, at different scale levels, exploiting the most significant pertinent progress in data acquisition and modelling. Accordingly, the present study supplies a revised NDSHA map for India. The seismic hazard, expressed in terms of maximum displacement (Dmax), maximum velocity (Vmax) and design ground acceleration (DGA), has been extracted from the synthetic signals and mapped on a regular grid over the studied territory.

  13. Evaluation of ground motion scaling methods for analysis of structural systems

    USGS Publications Warehouse

    O'Donnell, A. P.; Beltsar, O.A.; Kurama, Y.C.; Kalkan, E.; Taflanidis, A.A.

    2011-01-01

    Ground motion selection and scaling is undoubtedly the most important component of any seismic risk assessment study that involves time-history analysis. Ironically, it is also the single parameter with the least guidance provided in current building codes, resulting in the use of mostly subjective choices in design. The relevant research to date has been primarily on single-degree-of-freedom systems, with only a few studies using multi-degree-of-freedom systems. Furthermore, the previous research is based solely on numerical simulations, with no experimental data available for the validation of the results. By contrast, the research effort described in this paper focuses on an experimental evaluation of selected ground motion scaling methods based on small-scale shake-table experiments of re-configurable linear-elastic and nonlinear multi-story building frame structure models. Ultimately, the experimental results will lead to the development of guidelines and procedures to achieve reliable demand estimates from nonlinear response history analysis in seismic design. In this paper, an overview of this research effort is discussed and preliminary results based on linear-elastic dynamic response are presented. © ASCE 2011.

  14. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

    Probabilistic seismic loss estimation is a methodology that provides a quantitative and explicit expression of the performance of buildings in terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach for seismic loss assessment of buildings requires Incremental Dynamic Analysis (IDA), which needs hundreds of time-consuming analyses and thus hinders its wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to achieve this purpose, and their appropriateness was evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA-driven response predictions of 34 code-conforming benchmark structures and was proven to be sufficiently precise while offering a great deal of efficiency. The loss values were estimated by replacing IDA with the proposed ETM-based procedure in the ATC 58 framework, and it was found that these values suffer from varying inaccuracies, which were attributed to the discretized nature of the damage and loss prediction functions provided by ATC 58.

  15. Numerical assessment of the influence of different joint hysteretic models over the seismic behaviour of Moment Resisting Steel Frames

    NASA Astrophysics Data System (ADS)

    Giordano, V.; Chisari, C.; Rizzano, G.; Latour, M.

    2017-10-01

    The main aim of this work is to understand how the prediction of the seismic performance of moment-resisting (MR) steel frames depends on the modelling of their dissipative zones when the structure geometry (number of stories and bays) and seismic excitation source vary. In particular, a parametric analysis involving 4 frames was carried out, and, for each one, the full-strength beam-to-column connections were modelled according to 4 numerical approaches with different degrees of sophistication (Smooth Hysteretic Model, Bouc-Wen, Hysteretic and simple Elastic-Plastic models). Subsequently, Incremental Dynamic Analyses (IDA) were performed by considering two different earthquakes (Spitak and Kobe). The preliminary results collected so far pointed out that the influence of the joint modelling on the overall frame response is negligible up to interstorey drift ratio values equal to those conservatively assumed by the codes to define conventional collapse (0.03 rad). Conversely, if more realistic ultimate interstorey drift values are considered for the q-factor evaluation, the influence of joint modelling can be significant, and thus may require accurate modelling of its cyclic behavior.
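    Of the four joint models compared above, the Bouc-Wen model has a compact rate equation that lends itself to a short sketch. Below is a minimal forward-Euler integration of one common form of the model, with illustrative parameter values (not those calibrated in the paper):

    ```python
    import numpy as np

    def bouc_wen_force(x, k=1.0, alpha=0.1, A=1.0, beta=0.5, gamma=0.5, n=1):
        """Restoring force F = alpha*k*x + (1 - alpha)*k*z for a displacement
        history x, with the hysteretic variable z advanced by forward Euler:
        dz = dx * (A - (beta*sign(dx*z) + gamma) * |z|**n)."""
        z = np.zeros_like(x)
        for i in range(1, len(x)):
            dx = x[i] - x[i - 1]
            zi = z[i - 1]
            dz = dx * (A - (beta * np.sign(dx * zi) + gamma) * abs(zi) ** n)
            z[i] = zi + dz
        return alpha * k * x + (1 - alpha) * k * z
    ```

    Driving the element with a sinusoidal displacement traces a fat hysteresis loop: z saturates near (A/(beta+gamma))**(1/n), and the positive enclosed loop area is the energy dissipated per cycle, which is exactly the property that distinguishes these joint models in the IDA results discussed above.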

  16. Analytical simulation of nonlinear response to seismic test excitations of HDR-VKL (Heissdampfreaktor-Versuchskreislauf) piping system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinivasan, M.G.; Kot, C.A.; Mojtahed, M.

    The paper describes the analytical modeling, calculations, and results of the posttest nonlinear simulation of high-level seismic testing of the VKL piping system at the HDR Test Facility in Germany. One of the objectives of the tests was to evaluate analytical methods for calculating the nonlinear response of realistic piping systems subjected to high-level seismic excitation that would induce significant plastic deformation. Two out of the six different pipe-support configurations (ranging from a stiff system with struts and snubbers to a very flexible system with practically no seismic supports), subjected to simulated earthquakes, were tested at very high levels. The posttest nonlinear calculations cover the KWU configuration, a reasonably compliant system with only rigid struts. Responses for 800% safe-shutdown-earthquake loading were calculated using the NONPIPE code. The responses calculated with NONPIPE were found generally to have the same time trends as the measurements but contained under-, over-, and correct estimates of peak values, almost in equal proportions. The only exceptions were the peak strut forces, which were underestimated as a group. The scatter in the peak value estimates of displacements and strut forces was smaller than that for the strains. The possible reasons for the differences and the effort on further analysis are discussed.

  17. The Sacred Mountain of Varallo in Italy: Seismic Risk Assessment by Acoustic Emission and Structural Numerical Models

    PubMed Central

    Carpinteri, Alberto; Invernizzi, Stefano; Accornero, Federico

    2013-01-01

    We examine an application of the Acoustic Emission (AE) technique for a probabilistic analysis in time and space of earthquakes, in order to preserve the valuable Italian Renaissance Architectural Complex named “The Sacred Mountain of Varallo.” Among the forty-five chapels of the Renaissance Complex, the structure of Chapel XVII is of particular concern due to its uncertain structural condition and the level of stress caused by the regional seismicity. Therefore, lifetime assessment, taking into account the evolution of damage phenomena, is necessary to preserve the reliability and safety of this masterpiece of cultural heritage. Continuous AE monitoring was performed to assess the structural behavior of the Chapel. During the monitoring period, a correlation between peaks of AE activity in the masonry of the “Sacred Mountain of Varallo” and regional seismicity was found. Although the two phenomena, AE in materials and earthquakes in the Earth's crust, take place on very different scales, they belong to the same class of invariance. In addition, an accurate finite element model, built with the DIANA finite element code, is presented to describe the dynamic behavior of the Chapel XVII structure, confirming visual and instrumental inspections of regional seismic effects. PMID:24381511

  18. Experimental study on the seismic performance of new sandwich masonry walls

    NASA Astrophysics Data System (ADS)

    Xiao, Jianzhuang; Pu, Jie; Hu, Yongzhong

    2013-03-01

    Sandwich masonry walls are widely used as energy-saving panels since the interlayer between the outer leaves can act as an insulation layer. New types of sandwich walls are continually being introduced in research and applications, and due to their unique bond patterns, experimental studies have been performed to investigate their mechanical properties, especially with regard to their seismic performance. In this study, three new types of sandwich masonry wall have been designed, and cyclic lateral loading tests were carried out on five specimens. The results showed that the specimens failed mainly due to slippage along the bottom cracks or the development of diagonal cracks, and the failure patterns were considerably influenced by the aspect ratio. Analysis was undertaken on the seismic response of the new walls, which included ductility, stiffness degradation and energy dissipation capacity, and no obvious difference was observed between the seismic performance of the new walls and traditional walls. Comparisons were made between the experimental results and the calculated results of the shear capacity. It is concluded that the formulas in the two Chinese codes (GB 50011 and GB 50003) are suitable for the calculation of the shear capacity for the new types of walls, and the formula in GB 50011 tends to be more conservative.

  19. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

    Correct probabilistic seismic hazard and risk maps are the first step in advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach developed in the frame of the EMME (Earthquake Model of the Middle East) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground motion data. Because of this, an initial hybrid empirical ground motion model was developed for PGA and SA at selected periods, and its coefficients were used in the probabilistic seismic hazard assessment. The resulting seismic hazard maps show that there were gaps in seismic hazard assessment and that the present normative seismic hazard map needs a careful recalculation.

  20. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Structures. Copies are available from the American Society of Civil Engineers, Publications Marketing Department, 1801 Alexander Bell Drive, Reston, VA 20191-4400. E-mail: marketing@asce.org. Telephone: (800) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code...

  1. Revealing small-scale diffracting discontinuities by an optimization inversion algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Caixia; Zhao, Jingtao; Wang, Yanfei

    2017-02-01

    Small-scale diffracting geologic discontinuities play a significant role in studying carbonate reservoirs. Their seismic responses are encoded in diffracted/scattered waves. However, compared with reflections, the energy of these valuable diffractions is generally one or even two orders of magnitude weaker. This means that the information carried by diffractions is strongly masked by reflections in seismic images. Detecting small-scale cavities and tiny faults in deep carbonate reservoirs, mostly deeper than 6 km, poses an even bigger challenge for seismic diffractions, as the recorded signals are weak and have a low signal-to-noise ratio (SNR). After analyzing the mechanism of the Kirchhoff migration method, the residual of prestack diffractions located in the neighborhood of the first Fresnel aperture is found to remain in the image space. Therefore, a strategy for extracting diffractions in the image space is proposed, and a regularized L2-norm model with a smoothness constraint on the local slopes is suggested for predicting reflections. According to the focusing conditions of residual diffractions in the image space, two approaches are provided for extracting diffractions. Diffraction extraction can be accomplished directly by subtracting the predicted reflections from the seismic imaging data if the residual diffractions are focused. Otherwise, a diffraction velocity analysis is performed to refocus the residual diffractions. Two synthetic examples and one field application demonstrate the feasibility and efficiency of the two proposed methods in detecting small-scale geologic scatterers, tiny faults and cavities.

  2. Assessment of seismic loading on structures based on airborne LiDAR data from the Kalochori urban area (N. Greece)

    NASA Astrophysics Data System (ADS)

    Rovithis, Emmanouil; Kirtas, Emmanouil; Marini, Eleftheria; Bliziotis, Dimitris; Maltezos, Evangelos; Pitilakis, Dimitris; Makra, Konstantia; Savvaidis, Alexandros

    2016-08-01

    Airborne LiDAR monitoring integrated with field data is employed to assess the fundamental period and the seismic loading of the structures composing an urban area under prescribed earthquake scenarios. A piecewise workflow is adopted, combining geometrical data of the building stock derived from a LiDAR-based 3D city model, structural data from in-situ inspections of representative city blocks, and results of soil response analyses. The procedure is implemented in the residential area of Kalochori (west of Thessaloniki in Northern Greece). Special attention is paid to the in-situ inspection of the building stock, in order to discriminate between recordings of actual buildings and of man-made constructions that do not conform to seismic design codes, and to acquire additional building stock data on structural materials, typologies and number of stories that cannot be obtained by the LiDAR process. The processed LiDAR and field data are employed to compute the fundamental period of each building by means of code-defined formulas. Knowledge of the soil conditions in the Kalochori area allows soil response analyses to obtain the free-field response at the ground surface under earthquake scenarios with varying return periods. By combining the computed vibrational characteristics of the structures with the free-field response spectra, the seismic loading imposed on the structures of the urban area under investigation is derived for each of the prescribed seismic motions. Results are presented in a GIS environment in the form of spatially distributed spectral accelerations, with direct implications for seismic vulnerability studies of an urban area.
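    The "code-defined formulas" for the fundamental period mentioned above commonly take the form T1 = Ct · H^0.75 with H the building height in metres. The sketch below uses the EC8 values of Ct; the function name and class labels are illustrative, and the study's exact formula may differ.

    ```python
    def fundamental_period(height_m, system="rc_frame"):
        """Approximate fundamental period T1 = Ct * H**0.75 (EC8-style).
        Ct: 0.085 for steel moment frames, 0.075 for concrete moment frames,
        0.050 for all other structures."""
        ct = {"steel_frame": 0.085, "rc_frame": 0.075, "other": 0.050}[system]
        return ct * height_m ** 0.75
    ```

    A seven-story RC building of roughly 21 m height, for instance, gets T1 ≈ 0.74 s; reading the free-field response spectrum at that period yields the spectral acceleration mapped in the GIS output.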

  3. 78 FR 59732 - Revisions to Design of Structures, Components, Equipment, and Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-27

    ...,'' Section 3.7.2, ``Seismic System Analysis,'' Section 3.7.3, ``Seismic Subsystem Analysis,'' Section 3.8.1... Analysis,'' (Accession No. ML13198A223); Section 3.7.3, ``Seismic Subsystem Analysis,'' (Accession No..., ``Seismic System Analysis,'' Section 3.7.3, ``Seismic Subsystem Analysis,'' Section 3.8.1, ``Concrete...

  4. Updated Colombian Seismic Hazard Map

    NASA Astrophysics Data System (ADS)

    Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

    2013-05-01

    The Colombian seismic hazard map used by the National Building Code (NSR-98) in effect until 2009 was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics, which, together with seismic activity with destructive effects in Colombia, has motivated the interest in and need for a new seismic hazard assessment in this country. Taking advantage of new instrumental information sources such as new broadband stations of the National Seismological Network, new historical seismicity data, the availability of standardized global databases, and, in general, advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied because it incorporates the effects of all seismic sources that may affect a particular site while handling the uncertainties in the parameters and assumptions involved in this kind of study. First, the geometry of the seismic sources and a complete, homogeneous seismic catalog were defined; then the occurrence-rate parameters of each seismic source were calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México (UNAM, National Autonomous University of Mexico). A uniform grid spaced at 0.1° was used to calculate the peak ground acceleration (PGA) and spectral response values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds, with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated.
With these results, it is possible to determine the environments and scenarios in which the seismic hazard is a function of distance and magnitude, and also the principal seismic sources that contribute to the seismic hazard at each site (disaggregation). This project was conducted by the Servicio Geológico Colombiano (Colombian Geological Survey) and the Universidad Nacional de Colombia (National University of Colombia), with the collaboration of national and foreign experts and the National System of Prevention and Attention of Disasters (SNPAD). It is important to point out that this new seismic hazard map was used in the updated national building code (NSR-10). A new process is ongoing to improve and present the seismic hazard map in terms of intensity. This requires new knowledge of site effects at both local and regional scales, as well as checking the existing acceleration-to-intensity relationships and developing new ones, in order to obtain results that are more understandable and useful for a wider range of users, not only in the engineering field, but also for risk assessment and management institutions, researchers and the general community.

  5. An under-designed RC frame: Seismic assessment through displacement based approach and possible refurbishment with FRP strips and RC jacketing

    NASA Astrophysics Data System (ADS)

    Valente, Marco; Milani, Gabriele

    2017-07-01

    Many existing reinforced concrete buildings in Southern Europe were built, and hence designed, before the introduction of displacement-based design in national seismic codes; they are therefore highly vulnerable to seismic actions. In such a situation, simplified methodologies for the seismic assessment and retrofitting of existing structures are required. In this study, a displacement-based procedure using non-linear static analyses is applied to an existing four-story RC frame. The aim is to estimate both its overall structural inadequacy and the effectiveness of a specific retrofitting intervention by means of GFRP laminates and RC jacketing. Accurate numerical models are developed within a displacement-based approach to reproduce the seismic response of the RC frame in the original configuration and after strengthening.

  6. Geomorphic Proxies to Test Strain Accommodation in Southwestern Puerto Rico from Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Barrios Galindez, I. M.; Xue, L.; Laó-Dávila, D. A.

    2017-12-01

    The Puerto Rico-Virgin Islands microplate, located at the northeastern corner of the Caribbean plate boundary with North America, lies within an oblique subduction zone in which strain patterns remain unresolved. Seismic hazard is a major concern in the region, as seen from the seismic history of the Caribbean-North America plate boundary zone. Most tectonic models of the microplate place the accommodation of strain offshore, despite evidence from seismic activity, trench studies, and geodetic studies suggesting strain accommodation in southwest Puerto Rico. These studies also suggest active faulting, especially in the western part of the island, but limited work has been done regarding its mechanism. Therefore, this work aims to define and map these active faults in western Puerto Rico by integrating the analysis of fluvial terrains with detailed mapping using digital elevation models (DEMs) extracted from Shuttle Radar Topography Mission (SRTM) and LIDAR data. The goals are to (1) identify structural features such as surface lineaments and fault scarps for the Cerro Goden fault, the South Lajas fault, and other active faults in western Puerto Rico, and (2) correlate this information with the distribution patterns and values of geomorphic proxies, including the Chi integral (χ), normalized steepness (ksn) and asymmetry factor (AF). Our preliminary results from geomorphic proxies and LIDAR data provide some insight into the displacement and stage of activity of these faults (e.g. the Boqueron-Punta Malva fault and the Cerro Goden fault). The anomalies of the geomorphic proxies also generally correlate with the locations of landslides in southwestern Puerto Rico. The geomorphic model of this work includes new information on active faulting that is fundamental for producing better seismic hazard maps. Additionally, active tectonics studies are vital for issuing and adjusting building construction codes and zoning codes.

  7. Experimental study on lateral strength of wall-slab joint subjected to lateral cyclic load

    NASA Astrophysics Data System (ADS)

    Masrom, Mohd Asha'ari; Mohamad, Mohd Elfie; Hamid, Nor Hayati Abdul; Yusuff, Amer

    2017-10-01

    Tunnel-form construction has been utilised in building construction in Malaysia since 1960. This method has been applied extensively in the construction of high-rise residential buildings such as condominiums and apartments. Most tunnel-form buildings have been designed according to British Standards (BS), which make no provision for seismic loading, leaving high-rise tunnel-form buildings vulnerable to it. The connections between slabs and shear walls in a tunnel-form building constitute an essential link in the lateral load resisting mechanism. Malaysia has been shifting from the BS code to the Eurocode (EC) for building construction since the country recognised the safety threat posed by earthquakes. Hence, this study compares the performance of the interior wall-slab joint of a tunnel-form structure designed according to the Eurocode and to the British Standard. The experiment comprised full-scale tests of wall-slab joint sub-assemblages under reversible lateral cyclic loading. Two sub-assemblage specimens of the wall-slab joint were designed and constructed based on the two codes, and each specimen was tested using lateral displacement (drift) control. The specimen designed to the Eurocode was found to survive up to 3.0% drift, while the BS specimen lasted to 1.5% drift. The analysis results indicated that the BS specimen was governed by brittle failure modes with Ductility Class Low (DCL), while the EC specimen behaved in a ductile manner with Ductility Class Medium (DCM). The low ductility recorded in the BS specimen resulted from the insufficient reinforcement provided in that specimen; consequently, the BS specimen could not absorb energy efficiently (low energy dissipation) or sustain further inelastic deformation.

  8. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spears, Robert Edward; Coleman, Justin Leigh

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for evaluation of existing and new facilities. For larger levels of ground motion these tools likely overestimate the in-structure response (and therefore structural demand), since they do not consider geometric nonlinearities (such as gapping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure in both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e., faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure response. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gapping and sliding), it is necessary to develop a nonlinear time domain methodology.
This methodology will be known as NonLinear Soil-Structure Interaction (NLSSI). In general, NLSSI analysis should provide a more accurate representation of the seismic demands on nuclear facilities, their systems, and components. INL, in collaboration with a Nuclear Power Plant Vendor (NPP-V), will develop a generic Nuclear Power Plant (NPP) structural design to be used in development of the methodology and for comparison with SASSI. This generic NPP design has been evaluated for the INL soil site because of the ease of access to and quality of the site-specific data. It is now being evaluated for a second site at Vogtle, located approximately 15 miles east-northeast of Waynesboro, Georgia, adjacent to the Savannah River. The Vogtle site consists of many soil layers extending down to a depth of 1058 feet. Two soil sites were chosen in order to demonstrate the methodology across multiple soil profiles. The project will drive the models (soil and structure) using acceleration time histories of successively increasing amplitude. The models will be run in time domain codes such as ABAQUS, LS-DYNA, and/or ESSI and compared with the same models run in SASSI. The project is focused on developing and documenting a method for performing time domain, nonlinear seismic soil-structure interaction (SSI) analysis. Development of this method will provide the Department of Energy (DOE) and industry with another tool to perform seismic SSI analysis.
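    Driving a model with "acceleration time histories of successively increasing amplitude" amounts to amplitude-scaling a reference record to a ladder of target peak ground accelerations, as in incremental dynamic analysis. A hedged sketch (the record and PGA levels are hypothetical, not project data):

```python
import numpy as np

def scale_to_pga(accel, target_pga):
    """Linearly scale an acceleration time history so its peak
    absolute value equals target_pga (units follow the input)."""
    accel = np.asarray(accel, dtype=float)
    return accel * (target_pga / np.max(np.abs(accel)))

# Hypothetical record peaking at 0.25 g, scaled through increasing levels
record = np.array([0.05, -0.25, 0.10, 0.20, -0.15])
for pga in (0.1, 0.5, 1.0):
    run = scale_to_pga(record, pga)
    print(pga, round(float(np.max(np.abs(run))), 6))
```

    Each scaled run would then be applied as the input motion for one nonlinear time-domain analysis.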

  9. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human life and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters estimated for each of these source zones are input variables to the seismic hazard estimation. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part-I, Bureau of Indian Standards, New Delhi, 2002]. Not only has a holistic treatment of the earthquake catalog and seismogenic zones been performed, but a higher spatial resolution has also been achieved. Coefficient of variation (COV) maps are provided with the strong ground-motion maps under various conditions to show the confidence in the results obtained. The results of the present study should be helpful for risk assessment and other disaster mitigation studies.
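    The mapping above between probability of exceedance and return period follows from the Poisson model, T = -t / ln(1 - p); for example, the 10%- and 2%-in-50-years levels reproduce the familiar 475- and 2475-year return periods. A minimal sketch:

```python
import math

def return_period(pe, window_years=50.0):
    """Poisson-model return period (years) for a probability of
    exceedance `pe` (as a fraction) within `window_years`."""
    return -window_years / math.log(1.0 - pe)

for pe in (0.10, 0.02):
    print(f"{pe:.0%} in 50 yr -> {return_period(pe):.0f} yr")
```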

  10. Visualizing how Seismic Waves Propagate Across Seismic Arrays using the IRIS DMS Ground Motion Visualization (GMV) Products and Codes

    NASA Astrophysics Data System (ADS)

    Taber, J.; Bahavar, M.; Bravo, T. K.; Butler, R. F.; Kilb, D. L.; Trabant, C.; Woodward, R.; Ammon, C. J.

    2011-12-01

    Data from dense seismic arrays can be used to visualize the propagation of seismic waves, resulting in animations effective for teaching both general and advanced audiences. One of the first visualizations of this type was developed using Objective-C code and EarthScope/USArray data; it was then modified and ported to the Matlab platform and has now been standardized and automated as an IRIS Data Management System (IRIS-DMS) data product. These iterative code developments and improvements were completed by C. Ammon, R. Woodward and M. Bahavar, respectively. Currently, an automated script creates Ground Motion Visualizations (GMVs) for all global earthquakes over magnitude 6 recorded by EarthScope's USArray Transportable Array (USArray TA) network. The USArray TA network is a rolling array of 400 broadband stations deployed on a uniform 70-km grid. These near-real-time GMVs are typically available for download within 4 hours of an earthquake's occurrence (see: www.iris.edu/dms/products/usarraygmv/). The IRIS-DMS group has recently added a feature that allows users to highlight key elements within the GMVs by providing an online tool for creating customized GMVs. This new interface allows users to select the stations, channels, and time window of interest, adjust the mapped areal extent of the view, and specify high- and low-pass filters. An online tutorial available from the IRIS Education and Public Outreach (IRIS-EPO) website, listed below, steps through a teaching sequence that can be used to explain the basic features of the GMVs. For example, they can be used to demonstrate simple concepts such as relative P, S and surface wave velocities and corresponding wavelengths for middle-school students, or more advanced concepts such as the influence of focal mechanism on waveforms, or how seismic waves converge at an earthquake's antipode.
For those who desire a greater level of customization, including the ability to use the GMV framework with data sets not stored within the IRIS-DMS, the Matlab GMV code is now also available from the IRIS-DMS website. These GMV codes have been applied to SAC-formatted data from the Quake-Catcher Network (QCN). Through a collaboration between NSF-funded programs and projects (e.g., IRIS and QCN) we are striving to make these codes user-friendly enough to be routinely incorporated in undergraduate and graduate seismology classes. In this way, we will help provide a research tool for students to explore previously unexamined data, much as in actual seismology research. As technology advances quickly, we now have more data than seismologists can easily examine. Given this, we anticipate that students using our codes can play a 'citizen scientist' role, helping us identify key signals within the vast unexamined data streams we are acquiring.

  11. TOMO3D: 3-D joint refraction and reflection traveltime tomography parallel code for active-source seismic data—synthetic test

    NASA Astrophysics Data System (ADS)

    Meléndez, A.; Korenaga, J.; Sallarès, V.; Miniussi, A.; Ranero, C. R.

    2015-10-01

    We present a new 3-D traveltime tomography code (TOMO3D) for the modelling of active-source seismic data that uses the arrival times of both refracted and reflected seismic phases to derive the velocity distribution and the geometry of reflecting boundaries in the subsurface. This code is based on its popular 2-D version TOMO2D, from which it inherited the methods to solve the forward and inverse problems. The traveltime calculations are done using a hybrid ray-tracing technique combining the graph and bending methods. The LSQR algorithm is used to perform the iterative regularized inversion that improves the initial velocity and depth models. In order to cope with the increased computational demand due to the incorporation of the third dimension, the forward problem solver, which takes most of the run time (~90 per cent in the test presented here), has been parallelized with a combination of multi-processing and message-passing interface standards. This parallelization distributes the ray-tracing and traveltime calculations among the available computational resources. The code's performance is illustrated with a realistic synthetic example, including a checkerboard anomaly and two reflectors, which simulates the geometry of a subduction zone. The code is designed to invert for a single reflector at a time. A data-driven layer-stripping strategy is proposed for cases involving multiple reflectors, and it is tested for the successive inversion of the two reflectors. Layers are bound by consecutive reflectors, and the initial velocity model for each inversion step incorporates the results from previous steps. This strategy poses simpler inversion problems at each step, allowing the recovery of strong velocity discontinuities that would otherwise be smoothed out.
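    The LSQR-based regularized inversion mentioned above can be illustrated in miniature: given a linearized relation between slowness perturbations and traveltime residuals, a damped LSQR solve recovers the model update. This is a hedged sketch with a synthetic random kernel, not the TOMO3D code or its ray-based kernels:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)

# Toy linearized tomography: G maps slowness perturbations to
# traveltime residuals (rows = rays, columns = model cells).
n_rays, n_cells = 40, 10
G = csr_matrix(rng.random((n_rays, n_cells)))
true_dm = rng.normal(size=n_cells)
dt = G @ true_dm  # synthetic traveltime residuals

# Damped least squares: minimize ||G dm - dt||^2 + damp^2 ||dm||^2
dm = lsqr(G, dt, damp=0.1)[0]
print(np.max(np.abs(dm - true_dm)))
```

    In the real code the kernel rows come from ray tracing, and regularization also includes smoothing operators rather than simple damping alone.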

  12. United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.; ,

    2008-01-01

    The U.S. Geological Survey's maps of earthquake shaking hazards provide information essential to creating and updating the seismic design provisions of building codes and the insurance rates used in the United States. Periodic revisions of these maps incorporate the results of new research. Buildings, bridges, highways, and utilities built to meet modern seismic design provisions are better able to withstand earthquakes, not only saving lives but also enabling critical activities to continue with less disruption. These maps can also help people assess the hazard to their homes or places of work and can inform insurance rates.

  13. Seismic Waveform Modeling of Broadband Data From a Temporary High-Density Deployment in the Los Angeles Basin

    NASA Astrophysics Data System (ADS)

    Herrman, M.; Polet, J.

    2016-12-01

    A total of 73 broadband seismometers were deployed for a passive-source seismic experiment called the Los Angeles Syncline Seismic Interferometry Experiment (LASSIE) from September to November of 2014. The purpose of this experiment was to collect high-density seismic data for the Los Angeles Basin (LAB) to better understand basin structure and response. This research will use the data collected from LASSIE to assess and refine current velocity models of the LAB using a full waveform modeling approach. To this end we will compare seismograms recorded by LASSIE for a subset of the 53 earthquakes and quarry blasts located by the Southern California Seismic Network (SCSN) that occurred within or near the LAB during the deployment period with synthetic seismograms generated by the frequency-wavenumber (FK) code developed by Zhu and Rivera (2002). A first analysis of the data indicates that roughly 25 of the 53 events have waveforms with a sufficiently high signal-to-noise ratio, providing approximately 500 seismograms of suitable quality for comparison. We observe significant changes in waveform characteristics between stations separated by as little as approximately 1 km. Focal mechanisms for most of these events have been obtained from Dr. Egill Hauksson (personal communication). We will show comparisons between the broadband velocity waveforms recorded by stations across the LASSIE array and FK synthetics determined for a variety of 1D velocity models that have been developed for the LAB area (such as Hadley and Kanamori, 1977; Hauksson, 1989, 1995; and Magistrale, 1992). The results of these comparisons will be analyzed to provide additional constraints on the subsurface seismic velocity structure within the Los Angeles Basin.
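    Event screening of the kind described above (keeping the ~25 of 53 events with adequate signal-to-noise ratio) is often done by comparing RMS amplitude in a window after the arrival with a pre-event noise window. A hedged sketch with a synthetic trace and a hypothetical onset pick, not the authors' workflow:

```python
import numpy as np

def snr(trace, onset, pre=100, post=100):
    """RMS signal-to-noise ratio: a window of `post` samples after
    the picked onset vs. a `pre`-sample pre-event noise window."""
    trace = np.asarray(trace, dtype=float)
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    return rms(trace[onset:onset + post]) / rms(trace[onset - pre:onset])

# Synthetic trace: low-amplitude noise, then a stronger arrival
rng = np.random.default_rng(1)
trace = np.concatenate([0.1 * rng.normal(size=200),
                        1.0 * rng.normal(size=200)])
print(snr(trace, onset=200) > 3.0)  # True
```

    A threshold (here, hypothetically, 3) would separate usable seismograms from those dominated by noise.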

  14. VS30 – A site-characterization parameter for use in building Codes, simplified earthquake resistant design, GMPEs, and ShakeMaps

    USGS Publications Warehouse

    Borcherdt, Roger D.

    2012-01-01

    VS30, defined as the average seismic shear-wave velocity from the surface to a depth of 30 meters, has found widespread use as a parameter to characterize site response for simplified earthquake-resistant design as implemented in building codes worldwide. VS30, as initially introduced by the author for the US 1994 NEHRP Building Code, provides unambiguous definitions of site classes and site coefficients for site-dependent response spectra based on correlations derived from extensive borehole logging and comparative ground-motion measurement programs in California. Subsequent use of VS30 for development of strong ground motion prediction equations (GMPEs) and measurement of extensive sets of VS borehole data have confirmed the previous empirical correlations and established correlations of VS30 with VSZ at other depths. These correlations provide closed-form expressions to predict VS30 at a large number of additional sites and further justify VS30 as a parameter to characterize site response for simplified building codes, GMPEs, ShakeMap, and seismic hazard mapping.
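    The "average" in the VS30 definition is a time-average: VS30 = 30 / Σ(d_i / v_i), the depth divided by the total shear-wave travel time through the layers in the top 30 m. A minimal sketch with a hypothetical two-layer profile:

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity over the top 30 m:
    VS30 = 30 / sum(d_i / v_i), truncating the layer that crosses 30 m."""
    depth, travel_time = 0.0, 0.0
    for d, v in zip(thicknesses_m, velocities_mps):
        d = min(d, 30.0 - depth)  # clip the layer reaching 30 m
        travel_time += d / v
        depth += d
        if depth >= 30.0:
            break
    return 30.0 / travel_time

# Hypothetical profile: 10 m of 200 m/s soil over 400 m/s material
print(vs30([10, 40], [200, 400]))  # 300.0 m/s
```

    Note the result (300 m/s) differs from the thickness-weighted mean of the velocities (≈333 m/s); the slow surface layer dominates the travel time.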

  15. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC's worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC's 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC's monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC's quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  16. Romanian Data Center: A modern way for seismic monitoring

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin

    2014-05-01

    Seismic monitoring in Romania is performed mainly by the National Institute for Earth Physics (NIEP), which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with various high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Mark L4C, Ranger, GS21, Mark L22) and acceleration sensors (Kinemetrics Episensor). The data are transmitted to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD), composed of five seismic stations. NIEP has installed eight seismic stations equipped with broadband sensors and Episensors in the northern part of Bulgaria, and nine accelerometers (Episensors) in nine districts along the Danube River. All the data are acquired at NIEP for the Early Warning System and for primary estimation of earthquake parameters. Real-time (RT) acquisition and data exchange are done with Antelope software and Seedlink (from Seiscomp3). Real-time data communication is ensured by different types of transmission: GPRS, satellite, radio, Internet and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers, Antelope 5.2 is used, running on three workstations: one on a CentOS platform and two on MacOS. A Seiscomp3 server also stands as back-up for Antelope 5.2. Both the acquisition and the analysis of seismic data produce information about local and global parameters of earthquakes.
In addition, Antelope is used for manual processing (event association, calculation of magnitude, creating a database, sending seismic bulletins, calculation of PGA and PGV, etc.), generating ShakeMap products and interacting with global data centers. The National Data Center has developed tools to centralize data from software such as Antelope and Seiscomp3. These tools allow rapid distribution of information about damage observed after an earthquake to the public. Another feature of the developed application is the alerting of designated persons, via email and SMS, based on the earthquake parameters. In parallel, Seiscomp3 sends automatic notifications (emails) with the earthquake parameters. The real-time seismic network and the acquisition and processing software used at the National Data Center have increased the number of events detected locally and globally, improved the quality of the parameters obtained by data processing, and raised the network's national and international visibility.

  17. Effect of repeated earthquake on inelastic moment resisting concrete frame

    NASA Astrophysics Data System (ADS)

    Tahara, R. M. K.; Majid, T. A.; Zaini, S. S.; Faisal, A.

    2017-10-01

    This paper investigates the response of inelastic moment resisting concrete buildings under repeated earthquakes. 2D models consisting of 3-storey, 6-storey and 9-storey frames, representing low- to medium-rise buildings, were designed using seismic load and ductility class medium (DCM) according to the requirements of Eurocode 8. The behaviour factor and stiffness degradation were also taken into consideration. Seven sets of real (as opposed to artificial) repeated earthquake records were used. The response of the frames was measured in terms of inter-storey drift and maximum displacement. Under repeated earthquakes, the recorded mean inter-storey drift ratio (IDR) increased in the range of 3%-21%. Similarly, the maximum displacement increased from 20 mm to 40 mm. The findings show that using repeated earthquakes in seismic analysis considerably influences the inter-storey drift and the maximum displacement.
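    The inter-storey drift ratio used as a response measure above is the difference between the displacements of consecutive floors divided by the storey height, IDR_i = (u_i - u_{i-1}) / h. A minimal sketch with a hypothetical 3-storey displacement profile (not the paper's data):

```python
def interstorey_drift_ratios(displacements_mm, storey_height_mm):
    """Inter-storey drift ratio per storey from floor displacements;
    the ground level (zero displacement) is prepended."""
    u = [0.0] + list(displacements_mm)
    return [(u[i] - u[i - 1]) / storey_height_mm
            for i in range(1, len(u))]

# Hypothetical 3-storey frame, 3 m storeys, peak displacements in mm
idr = interstorey_drift_ratios([12.0, 30.0, 42.0], 3000.0)
print([f"{x:.1%}" for x in idr])  # ['0.4%', '0.6%', '0.4%']
```

    Here the middle storey governs, a common pattern when a storey is relatively soft.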

  18. Joint inversion of seismic refraction and resistivity data using layered models - applications to hydrogeology

    NASA Astrophysics Data System (ADS)

    Juhojuntti, N. G.; Kamm, J.

    2010-12-01

    We present a layered-model approach to joint inversion of shallow seismic refraction and resistivity (DC) data, which we believe is a seldom-tested way of addressing the problem. We developed this method because, for shallow sedimentary environments (roughly <100 m depth), we believe a model with a few layers and sharp layer boundaries represents the subsurface better than a smooth minimum-structure (grid) model. Because our model parameterization imposes a strong assumption on the subsurface, only a small number of well-resolved model parameters must be estimated, and provided this assumption holds, our method can also be applied to other environments. We use a least-squares inversion with lateral smoothness constraints, allowing lateral variations in the seismic velocity and the resistivity but no vertical variations. One exception is a positive gradient in the seismic velocity in the uppermost layer in order to obtain diving rays (the refractions in the deeper layers are modeled as head waves). We assume no connection between seismic velocity and resistivity, and these parameters are allowed to vary individually within the layers. The layer boundaries are, however, common to both parameters. During the inversion, lateral smoothing can be applied to the layer boundaries as well as to the seismic velocity and the resistivity. The number of layers is specified before the inversion; typically we use models with three layers. Depending on the type of environment it is possible to apply smoothing either to the depth of the layer boundaries or to the thickness of the layers, although normally the former is used for shallow sedimentary environments. The smoothing parameters can be chosen independently for each layer.
For the DC data we use a finite-difference algorithm to perform the forward modeling and to calculate the Jacobian matrix, while for the seismic data the corresponding entities are retrieved via ray tracing, using components from the RAYINVR package. The modular layout of the code makes it straightforward to include other types of geophysical data, e.g., gravity. The code has been tested using synthetic examples with fairly simple 2D geometries, mainly to check the validity of the calculations. The inversion generally converges towards the correct solution, although there can be stability problems if the starting model is too inaccurate. We have also applied the code to field data from seismic refraction and multi-electrode resistivity measurements at typical sand-gravel groundwater reservoirs. The tests are promising, as the calculated depths agree fairly well with information from drilling, and the velocity and resistivity values appear reasonable. Current work includes better regularization of the inversion as well as defining individual weight factors for the different data sets, as the present algorithm tends to constrain the depths mainly through the seismic data. More complex synthetic examples will also be tested, including models addressing the seismic hidden-layer problem.
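    The joint least-squares step described above, with two weighted data sets and a lateral smoothness constraint, can be sketched by stacking both Jacobians and a second-difference roughness operator into one augmented system. This is a hedged toy illustration (random synthetic Jacobians, a single model vector standing in for a row of layer depths), not the authors' code:

```python
import numpy as np

def smoothness_operator(n):
    """Second-difference roughness matrix for n laterally adjacent cells."""
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    return L

def joint_step(J_seis, r_seis, J_dc, r_dc, m, w_dc=1.0, beta=0.1):
    """One damped Gauss-Newton update for a joint seismic + DC
    objective with lateral smoothing on the model vector m."""
    L = smoothness_operator(len(m))
    A = np.vstack([J_seis, w_dc * J_dc, beta * L])
    b = np.concatenate([r_seis, w_dc * r_dc, -beta * L @ m])
    dm, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m + dm

# Synthetic linear test: both data sets see the same smooth model
rng = np.random.default_rng(2)
m_true = np.linspace(1.0, 2.0, 8)   # smooth "layer depth" profile
m0 = np.full(8, 1.5)
Js, Jd = rng.normal(size=(20, 8)), rng.normal(size=(20, 8))
m1 = joint_step(Js, Js @ (m_true - m0), Jd, Jd @ (m_true - m0), m0)
print(np.max(np.abs(m1 - m_true)) < 0.05)  # True
```

    The weight `w_dc` plays the role of the per-data-set weighting the authors mention as current work; in the real code the Jacobians come from finite differences and ray tracing.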

  19. Key science issues in the central and eastern United States for the next version of the USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Peterson, M.D.; Mueller, C.S.

    2011-01-01

    The USGS National Seismic Hazard Maps are updated about every six years by incorporating newly vetted science on earthquakes and ground motions. The 2008 hazard maps for the central and eastern United States region (CEUS) were updated by using revised New Madrid and Charleston source models, an updated seismicity catalog and an estimate of magnitude uncertainties, a distribution of maximum magnitudes, and several new ground-motion prediction equations. The new models resulted in significant ground-motion changes at 5 Hz and 1 Hz spectral acceleration with 5% damping compared to the 2002 version of the hazard maps. The 2008 maps have now been incorporated into the 2009 NEHRP Recommended Provisions, the 2010 ASCE-7 Standard, and the 2012 International Building Code. The USGS is now planning the next update of the seismic hazard maps, which will be provided to the code committees in December 2013. Science issues that will be considered for introduction into the CEUS maps include: 1) updated recurrence models for New Madrid sources, including new geodetic models and magnitude estimates; 2) new earthquake sources and techniques considered in the 2010 model developed by the nuclear industry; 3) new NGA-East ground-motion models (currently under development); and 4) updated earthquake catalogs. We will hold a regional workshop in late 2011 or early 2012 to discuss these and other issues that will affect the seismic hazard evaluation in the CEUS.

  20. Documentation for the 2008 Update of the United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Haller, Kathleen M.; Wheeler, Russell L.; Wesson, Robert L.; Zeng, Yuehua; Boyd, Oliver S.; Perkins, David M.; Luco, Nicolas; Field, Edward H.; Wills, Chris J.; Rukstales, Kenneth S.

    2008-01-01

    The 2008 U.S. Geological Survey (USGS) National Seismic Hazard Maps display earthquake ground motions for various probability levels across the United States and are applied in seismic provisions of building codes, insurance rate structures, risk assessments, and other public policy. This update of the maps incorporates new findings on earthquake ground shaking, faults, seismicity, and geodesy. The resulting maps are derived from seismic hazard curves calculated on a grid of sites across the United States that describe the frequency of exceeding a set of ground motions. The USGS National Seismic Hazard Mapping Project developed these maps by incorporating information on potential earthquakes and associated ground shaking obtained from interaction in science and engineering workshops involving hundreds of participants, review by several science organizations and State surveys, and advice from two expert panels. The National Seismic Hazard Maps represent our assessment of the 'best available science' in earthquake hazards estimation for the United States (maps of Alaska and Hawaii as well as further information on hazard across the United States are available on our Web site at http://earthquake.usgs.gov/research/hazmaps/).

  1. The 2008 U.S. Geological Survey national seismic hazard models and maps for the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Boyd, Oliver S.; Luco, Nicolas; Wheeler, Russell L.; Rukstales, Kenneth S.; Haller, Kathleen M.

    2012-01-01

    In this paper, we describe the scientific basis for the source and ground-motion models applied in the 2008 National Seismic Hazard Maps, the development of new products that are used for building design and risk analyses, relationships between the hazard maps and design maps used in building codes, and potential future improvements to the hazard maps.

  2. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental-scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. The resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 years. Soil amplification of ground shaking at each site is calculated by considering the uniform soil applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. The resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from future earthquake strong ground shaking.
National models should be developed by scientists and engineers in each country using the best available science.

  3. Forecasting probabilistic seismic shaking for greater Tokyo from 400 years of intensity observations

    USGS Publications Warehouse

    Bozkurt, S.B.; Stein, R.S.; Toda, S.

    2007-01-01

    The long recorded history of earthquakes in Japan affords an opportunity to forecast seismic shaking exclusively from past shaking. We calculate the time-averaged (Poisson) probability of severe shaking by using more than 10,000 intensity observations recorded since AD 1600 in a 350 km-wide box centered on Tokyo. Unlike other hazard-assessment methods, source and site effects are included without modeling, and we do not need to know the size or location of any earthquake, nor the location and slip rate of any fault. The two key assumptions are that the slope of the observed frequency-intensity relation at every site is the same, and that the 400-year record is long enough to encompass the full range of seismic behavior. Tests we conduct here suggest that both assumptions are sound. The resulting 30-year probability of IJMA ≥ 6 shaking (≈ PGA ≥ 0.4 g or MMI ≥ IX) is 30%-40% in Tokyo, Kawasaki, and Yokohama, and 10%-15% in Chiba and Tsukuba. This result means that there is a 30% chance that 4 million people will be subjected to IJMA ≥ 6 shaking during an average 30-year period. We also produce exceedance maps of PGA for building-code regulations, and calculate short-term hazard associated with a hypothetical catastrophe bond. Our results resemble an independent assessment developed from conventional seismic hazard analysis for greater Tokyo. © 2007, Earthquake Engineering Research Institute.
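    The time-averaged (Poisson) probability used above follows P = 1 - exp(-rate × t), where the rate is the site's mean annual frequency of exceeding the intensity threshold. A minimal sketch; the ~84-year recurrence below is a back-calculated illustration consistent with a 30%-in-30-years level, not a number from the paper:

```python
import math

def poisson_prob(rate_per_year, window_years):
    """Time-averaged (Poisson) probability of at least one
    exceedance in the window: P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_year * window_years)

# A mean recurrence of ~84 years gives roughly a 30% probability
# of at least one exceedance in a 30-year window
print(round(poisson_prob(1.0 / 84.0, 30.0), 2))  # 0.3
```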

  4. Viscoelastic Finite Difference Modeling Using Graphics Processing Units

    NASA Astrophysics Data System (ADS)

    Fabien-Ouellet, G.; Gloaguen, E.; Giroux, B.

    2014-12-01

    Full waveform seismic modeling requires a huge amount of computing power that still challenges today's technology. This limits the applicability of powerful processing approaches in seismic exploration such as full-waveform inversion. This paper explores the use of Graphics Processing Units (GPUs) to compute a time-based finite-difference solution to the viscoelastic wave equation. The aim is to investigate whether adopting GPU technology can significantly reduce the computing time of simulations. The code presented herein is based on the freely accessible 2D software of Bohlen (2002), provided under the GNU General Public License (GPL). The implementation uses a second-order centred-difference scheme to approximate time derivatives and staggered-grid schemes with centred differences of order 2, 4, 6, 8, and 12 for spatial derivatives. The code is fully parallel and written using the Message Passing Interface (MPI), and it thus supports simulations of vast seismic models on a cluster of CPUs. To port the code of Bohlen (2002) to GPUs, the OpenCL framework was chosen for its ability to work on both CPUs and GPUs and for its adoption by most GPU manufacturers. In our implementation, OpenCL works in conjunction with MPI, which allows computations on a cluster of GPUs for large-scale model simulations. We tested our code for model sizes between 100² and 6000² elements. Comparison shows a decrease in computation time of more than two orders of magnitude between the GPU implementation run on an AMD Radeon HD 7950 and the CPU implementation run on a 2.26 GHz Intel Xeon Quad-Core. The speed-up varies depending on the order of the finite-difference approximation and generally increases for higher orders. Speed-ups also increase with model size, which can be explained by kernel overheads and by delays introduced by memory transfers to and from the GPU through the PCI-E bus, both of which weigh more heavily on small models. These tests indicate that the GPU memory size and the slow memory transfers are the limiting factors of our GPU implementation. The results show the benefits of using GPUs instead of CPUs for time-based finite-difference seismic simulations. The reductions in computation time and in hardware costs are significant and open the door to new approaches in seismic inversion.
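A sketch of the kind of update loop such finite-difference codes parallelize, reduced here to a 1-D elastic velocity-stress staggered-grid scheme in NumPy (the actual code is 2-D viscoelastic; all grid and material parameters below are illustrative):

```python
import numpy as np

# Illustrative 1-D elastic velocity-stress staggered-grid scheme,
# second order in time and space. GPU ports parallelize exactly
# these pointwise array updates across grid cells.
nx, nt = 600, 500
dx, dt = 5.0, 5e-4          # grid step (m), time step (s); CFL = 0.3
rho, vp = 2000.0, 3000.0    # density (kg/m^3), wave speed (m/s)
mu = rho * vp**2            # elastic modulus for the 1-D case

v = np.zeros(nx)            # particle velocity at integer nodes
s = np.zeros(nx - 1)        # stress at half nodes (staggered)

for it in range(nt):
    # Velocity update from the stress gradient (interior nodes only).
    v[1:-1] += dt / rho * (s[1:] - s[:-1]) / dx
    # Gaussian source wavelet injected at the grid centre.
    v[nx // 2] += np.exp(-((it * dt - 0.05) / 0.01) ** 2)
    # Stress update from the velocity gradient.
    s += dt * mu * (v[1:] - v[:-1]) / dx
```

Because each cell's update depends only on its immediate neighbours, the two vectorized lines map naturally onto one GPU work-item per cell, which is what makes the OpenCL port effective.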

  5. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Bonito, L.; Lampasi, A.; Revellino, P.; Guerriero, L.; Sappa, G.; Guadagno, F. M.

    2015-06-01

    SiSeRHMap is a computerized methodology capable of drawing up prediction maps of seismic response. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code-architecture composed of five interdependent modules. A GIS (Geographic Information System) Cubic Model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A metamodeling process confers a hybrid nature to the methodology. In this process, the one-dimensional linear equivalent analysis produces acceleration response spectra of shear wave velocity-thickness profiles, defined as trainers, which are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated Evolutionary Algorithm (EA) and the Levenberg-Marquardt Algorithm (LMA) as the final optimizer. In the final step, the GCM Maps Executor module produces a serial map-set of stratigraphic seismic response at different periods, grid-solving the calibrated Spectra model. In addition, the spectral topographic amplification is also computed by means of a numerical prediction model. The latter is built to match the results of numerical simulations related to isolated reliefs, using GIS topographic attributes. In this way, different sets of seismic response maps are developed, from which maps of seismic design response spectra are also defined by means of an enveloping technique.

  6. Slope topography-induced spatial variation correlation with observed building damages in Corso during the May 21, 2003, Mw 6.8, Boumerdes earthquake (Algeria)

    NASA Astrophysics Data System (ADS)

    Messaoudi, Akila; Laouami, Nasser; Mezouar, Nourredine

    2017-07-01

    During the May 21, 2003, Mw 6.8 Boumerdes earthquake, heavy damage was observed in the "Cité des 102 Logements" built on a hilltop in Corso: near the crest, a four-story RC building collapsed while others experienced severe structural damage, whereas far from the crest only slight damage was observed. In the present paper, we perform a 2D slope topography seismic analysis and investigate its effects on the response at the plateau as well as the correlation with the observed damage distribution. A site-specific seismic scenario is used, involving seismological, geological, and geotechnical data. A 2D finite element numerical seismic study of the idealized Corso site subjected to vertical SV wave propagation is carried out with the code FLUSH. The results highlight the main factors that explain the collapse of the blocks located 8-26 m from the crest: (i) a significant spatial variation of ground response along the plateau due to the topographic effect, (ii) a high loss of coherence in this spatial variation, (iii) seismic ground responses (PGA and response spectra) that reach their maxima there, and (iv) the coincidence of the fundamental frequency of the collapsed blocks with the frequency content of the topographic component. For distances far from the crest, where slight damage was observed, the topographic contribution is found to be negligible. On the basis of these results, it is important to take into account the topographic effect and the induced spatial variability in the seismic design of structures sited near the crest of a slope.

  7. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. 
Economic collapse may ensue if damaged workplaces, disruption of utilities, and resultant loss of income produces widespread default on payments. With increased computational power and more complete inventories of exposure, Monte Carlo methods may provide more accurate estimation of severe losses and the opportunity to increase resilience of vulnerable systems and communities.
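The definition of risk as the product of quantified hazard, vulnerability, and exposure reduces to a one-line calculation; a minimal sketch with hypothetical community numbers (not drawn from the abstract):

```python
def expected_loss(hazard_prob: float, damage_ratio: float,
                  exposure_value: float) -> float:
    """Expected loss = quantified hazard (probability of the damaging
    shaking level) x vulnerability (mean damage ratio given that
    shaking) x exposure (replacement value at risk)."""
    return hazard_prob * damage_ratio * exposure_value

# Hypothetical community: 10% chance of the design-level shaking in
# the planning window, 25% mean damage ratio, $400M building stock.
loss = expected_loss(0.10, 0.25, 400e6)
print(f"Expected loss: ${loss / 1e6:.0f}M")  # -> $10M
```

Real assessments replace each scalar with a curve (hazard curve, fragility function, exposure inventory) and integrate, but the multiplicative structure is the same.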

  8. New constraints on the magmatic system beneath Newberry Volcano from the analysis of active and passive source seismic data, and ambient noise

    NASA Astrophysics Data System (ADS)

    Heath, B.; Toomey, D. R.; Hooft, E. E. E.

    2014-12-01

    Magmatic systems beneath arc-volcanoes are often poorly resolved by seismic imaging due to the small spatial scale and large magnitude of crustal heterogeneity in combination with field experiments that sparsely sample the wavefield. Here we report on our continued analysis of seismic data from a line of densely-spaced (~300 m), three-component seismometers installed on Newberry Volcano in central Oregon for ~3 weeks; the array recorded an explosive shot, ~20 teleseismic events, and ambient noise. By jointly inverting both active and passive-source travel time data, the resulting tomographic image reveals a more detailed view of the presumed rhyolitic magma chamber at ~3-5 km depth, previously imaged by Achauer et al. (1988) and Beachly et al. (2012). The magma chamber is elongated perpendicular to the trend of extensional faulting and encircled by hypocenters of small (M < 2) earthquakes located by PNSN. We also model teleseismic waveforms using a 2-D synthetic seismogram code to recreate anomalous amplitudes observed in the P-wave coda for sites within the caldera. Autocorrelation of ambient noise data also reveals large amplitude waveforms for a small but spatially grouped set of stations, also located within the caldera. On the basis of these noise observations and 2-D synthetic models, which both require slow seismic speeds at depth, we conclude that our tomographic model underestimates low-velocity anomalies associated with the inferred crustal magma chamber; this is due in large part to wavefront healing, which reduces observed travel time anomalies, and regularization constraints, which minimize model perturbations. Only by using various methods that interrogate different aspects of the seismic data are we able to more realistically constrain the complicated, heterogeneous volcanic system. In particular, modeling of waveform characteristics provides a better measure of the spatial scale and magnitude of crustal velocities near magmatic systems.

  9. Geophysical investigation and dynamic modelling of unstable slopes: case-study of Kainama (Kyrgyzstan)

    NASA Astrophysics Data System (ADS)

    Danneels, G.; Bourdeau, C.; Torgoev, I.; Havenith, H.-B.

    2008-10-01

    The presence of massive Quaternary loess units at the eastern border of the Fergana Basin (Kyrgyzstan, Central Asia) makes this area particularly prone to the development of catastrophic loess earthflows, causing damage and injuries almost every year. Efficient disaster management requires a good understanding of the main causes of these mass movements, that is, increased groundwater pressure and seismic shaking. This paper focuses on the Kainama earthflow, mainly composed of loess, which occurred in April 2004. Its high velocity and long run-out zone caused the destruction of 12 houses and the death of 33 people. In summer 2005, a field survey consisting of geophysical and seismological measurements was carried out along the adjacent slope. By combination and geostatistical analysis of these data, a reliable 3-D model of the geometry and properties of the subsurface layers, as shown in the first part of the paper, was created. The analysis of the seismological data allowed us to point out a correlation between the thickness of the loess cover and the measured resonance frequencies and associated amplification potential. The second part of this paper is focused on the study of the seismic response of the slope by numerical simulations, using a 2-D finite difference code named FLAC. Modelling of the seismic amplification potential along the slope confirmed the results obtained from the seismological survey: strong amplifications at the crest and bottom of the slope, where there is a thick loess cover, and almost no amplification in the middle part of the slope. Furthermore, dynamic slope stability analyses were conducted to assess the influence of local amplifications and increased groundwater pressures on the slope failure. The results of the dynamic modelling, although preliminary, show that a combination of seismic and hydrologic causes (pore pressure build-up during the seismic shaking) is the most probable scenario responsible for the 2004 failure.

  10. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    ,

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

  11. Considering potential seismic sources in earthquake hazard assessment for Northern Iran

    NASA Astrophysics Data System (ADS)

    Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila

    2014-07-01

    Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historical and instrumental documented seismicity indicates the occurrence of severe earthquakes leading to many deaths and large losses in the region. With growth of seismological and tectonic data, updated seismic hazard assessment is a worthwhile issue in emergency management programs and long-term developing plans in urban and rural areas of this region. In the present study, being armed with up-to-date information required for seismic hazard assessment including geological data and active tectonic setting for thorough investigation of the active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue, probabilistic seismic hazard assessment is carried out for the region using three recent ground motion prediction equations. The logic tree method is utilized to capture epistemic uncertainty of the seismic hazard assessment in delineation of the seismic sources and selection of attenuation relations. The results are compared to a recent practice in code-prescribed seismic hazard of the region and are discussed in detail to explore their variation in each branch of logic tree approach. Also, seismic hazard maps of peak ground acceleration in rock site for 475- and 2,475-year return periods are provided for the region.
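The logic-tree treatment of epistemic uncertainty amounts to a weighted combination of the hazard values produced by the alternative branches; a minimal sketch (branch weights and PGA values are hypothetical, not from the study):

```python
def weighted_mean_hazard(branches):
    """Combine logic-tree branches: each branch pairs an expert weight
    with the hazard value (e.g. 475-yr PGA in g) that branch yields.
    Weights must sum to 1."""
    total_weight = sum(w for w, _ in branches)
    assert abs(total_weight - 1.0) < 1e-9, "branch weights must sum to 1"
    return sum(w * pga for w, pga in branches)

# Hypothetical branches: three ground-motion prediction equations
# with expert-assigned weights, each giving a 475-yr PGA for one site.
branches = [(0.5, 0.32), (0.3, 0.28), (0.2, 0.41)]
mean_pga = weighted_mean_hazard(branches)  # ~0.326 g
```

In practice each branch carries a full hazard curve rather than a single value, but the weighted aggregation over source-zonation and attenuation-relation alternatives works the same way.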

  12. Site Effect Analysis in the Izmit Basin of Turkey: Preliminary Results from the Wave Propagation Simulation using the Spectral Element Method

    NASA Astrophysics Data System (ADS)

    Firtana Elcomert, Karolin; Kocaoglu, Argun

    2014-05-01

    Sedimentary basins affect the propagation characteristics of the seismic waves and cause significant ground motion amplification during an earthquake. While the impedance contrast between the sedimentary layer and bedrock predominantly controls the resonance frequencies and their amplitudes (seismic amplification), surface waves generated within the basin, make the waveforms more complex and longer in duration. When a dense network of weak and/or strong motion sensors is available, site effect or more specifically sedimentary basin amplification can be directly estimated experimentally provided that significant earthquakes occur during the period of study. Alternatively, site effect can be investigated through simulation of ground motion. The objective of this study is to investigate the 2-D site effect in the Izmit Basin located in the eastern Marmara region of Turkey, using the currently available bedrock topography and shear-wave velocity data. The Izmit Basin was formed in Plio-Quaternary period and is known to be a pull-apart basin controlled by the northern branch of the North Anatolian Fault Zone (Şengör et al. 2005). A thorough analysis of seismic hazard is important since the city of Izmit and its metropolitan area is located in this region. Using a spectral element code, SPECFEM2D (Komatitsch et al. 1998), this work presents some of the preliminary results of the 2-D seismic wave propagation simulations for the Izmit basin. The spectral-element method allows accurate and efficient simulation of seismic wave propagation due to its advantages over the other numerical modeling techniques by means of representation of the wavefield and the computational mesh. The preliminary results of this study suggest that seismic wave propagation simulations give some insight into the site amplification phenomena in the Izmit basin. 
Comparison of seismograms recorded on the top of sedimentary layer with those recorded on the bedrock show more complex waveforms with higher amplitudes on seismograms recorded at the free surface. Furthermore, modeling reveals that observed seismograms include surface waves whose excitation is clearly related to the basin geometry.
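The role of the impedance contrast in setting resonance frequencies and their amplitudes can be illustrated with the standard 1-D quarter-wavelength estimates; a sketch with a hypothetical Izmit-like soil column (values are illustrative, not from the study):

```python
def fundamental_frequency(vs_sediment: float, thickness: float) -> float:
    """Quarter-wavelength estimate of a 1-D soil column's fundamental
    resonance frequency: f0 = Vs / (4 H)."""
    return vs_sediment / (4.0 * thickness)

def impedance_amplification(rho_rock: float, vs_rock: float,
                            rho_sed: float, vs_sed: float) -> float:
    """First-order (lossless, 1-D) amplification at resonance given by
    the sediment/bedrock impedance contrast."""
    return (rho_rock * vs_rock) / (rho_sed * vs_sed)

# Hypothetical column: 300 m of Vs = 400 m/s sediments over
# Vs = 2000 m/s bedrock.
f0 = fundamental_frequency(400.0, 300.0)                   # ~0.33 Hz
amp = impedance_amplification(2400.0, 2000.0, 1900.0, 400.0)  # ~6.3
```

These 1-D estimates ignore the basin-edge surface waves that the spectral-element simulations capture, which is precisely why 2-D modelling changes the picture.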

  13. Technology Innovation for the CTBT, the National Laboratory Contribution

    NASA Astrophysics Data System (ADS)

    Goldstein, W. H.

    2016-12-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its Protocol are the result of a long history of scientific engagement and international technical collaboration. The U.S. Department of Energy National Laboratories have been conducting nuclear explosive test-ban research for over 50 years and have made significant contributions to this legacy. Recent examples include the RSTT (regional seismic travel time) computer code and the Smart Sampler—both of these products are the result of collaborations among Livermore, Sandia, Los Alamos, and Pacific Northwest National Laboratories. The RSTT code enables fast and accurate seismic event locations using regional data. This code solves the long-standing problem of using teleseismic and regional seismic data together to locate events. The Smart Sampler is designed for use in On-site Inspections to sample soil gases to look for noble gas fission products from a potential underground nuclear explosive test. The Smart Sampler solves the long-standing problem of collecting soil gases without contaminating the sample with gases from the atmosphere by operating only during atmospheric low-pressure events. Both these products are being evaluated by the Preparatory Commission for the CTBT Organization and the international community. In addition to R&D, the National Laboratories provide experts to support U.S. policy makers in ongoing discussions such as CTBT Working Group B, which sets policy for the development of the CTBT monitoring and verification regime.

  14. The SERGISAI procedure for seismic risk assessment

    NASA Astrophysics Data System (ADS)

    Zonno, G.; Garcia-Fernandez, M.; Jimenez, M.J.; Menoni, S.; Meroni, F.; Petrini, V.

    The European project SERGISAI developed a computational tool where a methodology for seismic risk assessment at different geographical scales has been implemented. Experts of various disciplines, including seismologists, engineers, planners, geologists, and computer scientists, co-operated in an actual multidisciplinary process to develop this tool. Standard procedural codes, Geographical Information Systems (GIS), and Artificial Intelligence (AI) techniques compose the whole system, which will enable the end user to carry out a complete seismic risk assessment at three geographical scales: regional, sub-regional and local. At present, the single codes or models that have been incorporated are not new in general, but the modularity of the prototype, based on a user-friendly front-end, offers potential users the possibility of updating or replacing any code or model if desired. The proposed procedure is a first attempt to integrate tools, codes and methods for assessing expected earthquake damage, and it was mainly designed to become a useful support for civil defence and land use planning agencies. Risk factors have been treated in the most suitable way for each one, in terms of level of detail, kind of parameters and units of measure. Identifying various geographical scales is not a mere question of dimension, since the entities to be studied correspond to areas defined by administrative and geographical borders. The procedure was applied in the following areas: Toscana in Italy, for the regional scale; the Garfagnana area in Toscana, for the sub-regional scale; and a part of Barcelona city, Spain, for the local scale.

  15. Preliminary Earthquake Hazard Map of Afghanistan

    USGS Publications Warehouse

    Boyd, Oliver S.; Mueller, Charles S.; Rukstales, Kenneth S.

    2007-01-01

    Introduction Earthquakes represent a serious threat to the people and institutions of Afghanistan. As part of a United States Agency for International Development (USAID) effort to assess the resource potential and seismic hazards of Afghanistan, the Seismic Hazard Mapping group of the United States Geological Survey (USGS) has prepared a series of probabilistic seismic hazard maps that help quantify the expected frequency and strength of ground shaking nationwide. To construct the maps, we do a complete hazard analysis for each of ~35,000 sites in the study area. We use a probabilistic methodology that accounts for all potential seismic sources and their rates of earthquake activity, and we incorporate modeling uncertainty by using logic trees for source and ground-motion parameters. See the Appendix for an explanation of probabilistic seismic hazard analysis and discussion of seismic risk. Afghanistan occupies a southward-projecting, relatively stable promontory of the Eurasian tectonic plate (Ambraseys and Bilham, 2003; Wheeler and others, 2005). Active plate boundaries, however, surround Afghanistan on the west, south, and east. To the west, the Arabian plate moves northward relative to Eurasia at about 3 cm/yr. The active plate boundary trends northwestward through the Zagros region of southwestern Iran. Deformation is accommodated throughout the territory of Iran; major structures include several north-south-trending, right-lateral strike-slip fault systems in the east and, farther to the north, a series of east-west-trending reverse- and strike-slip faults. This deformation apparently does not cross the border into relatively stable western Afghanistan. In the east, the Indian plate moves northward relative to Eurasia at a rate of about 4 cm/yr. A broad, transpressional plate-boundary zone extends into eastern Afghanistan, trending southwestward from the Hindu Kush in northeast Afghanistan, through Kabul, and along the Afghanistan-Pakistan border. 
Deformation here is expressed as a belt of major, north-northeast-trending, left-lateral strike-slip faults and abundant seismicity. The seismicity intensifies farther to the northeast and includes a prominent zone of deep earthquakes associated with northward subduction of the Indian plate beneath Eurasia that extends beneath the Hindu Kush and Pamir Mountains. Production of the seismic hazard maps is challenging because the geological and seismological data required to produce a seismic hazard model are limited. The data that are available for this project include historical seismicity and poorly constrained slip rates on only a few of the many active faults in the country. Much of the hazard is derived from a new catalog of historical earthquakes: from 1964 to the present, with magnitude equal to or greater than about 4.5, and with depth between 0 and 250 kilometers. We also include four specific faults in the model: the Chaman fault with an assigned slip rate of 10 mm/yr, the Central Badakhshan fault with an assigned slip rate of 12 mm/yr, the Darvaz fault with an assigned slip rate of 7 mm/yr, and the Hari Rud fault with an assigned slip rate of 2 mm/yr. For these faults and for shallow seismicity less than 50 km deep, we incorporate published ground-motion estimates from tectonically active regions of western North America, Europe, and the Middle East. Ground-motion estimates for deeper seismicity are derived from data in subduction environments. We apply estimates derived for tectonic regions where subduction is the main tectonic process for intermediate-depth seismicity between 50- and 250-km depth. Within the framework of these limitations, we have developed a preliminary probabilistic seismic-hazard assessment of Afghanistan, the type of analysis that underpins the seismic components of modern building codes in the United States. The assessment includes maps of estimated peak ground acceleration (PGA), 0.2-second spectral acceleration (SA), and 1.0-second SA.
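The probabilistic methodology described above sums, over all sources, each source's activity rate times the probability that it produces ground motion above a given level at the site; a minimal sketch with a lognormal ground-motion model standing in for a real GMPE (all source parameters are hypothetical):

```python
import math

def annual_exceedance_rate(sources, threshold):
    """Sum over sources of (activity rate x probability that the source
    exceeds `threshold` at the site). Each source is a tuple
    (events/yr, median ground motion in g, ln-sigma); a lognormal
    distribution stands in for a real ground-motion model."""
    total = 0.0
    for rate, median, sigma_ln in sources:
        z = (math.log(threshold) - math.log(median)) / sigma_ln
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))  # 1 - Phi(z)
        total += rate * p_exceed
    return total

# Hypothetical sources: a frequent nearby fault and a rare larger one.
sources = [(0.05, 0.15, 0.6), (0.002, 0.40, 0.6)]
lam = annual_exceedance_rate(sources, 0.2)   # rate of PGA > 0.2 g
p50 = 1.0 - math.exp(-lam * 50.0)            # 50-yr exceedance probability
```

Evaluating the rate over a grid of thresholds gives the hazard curve from which map values at fixed probabilities of exceedance are read off.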

  16. Study on Frequency content in seismic hazard analysis in West Azarbayjan and East Azarbayjan provinces (Iran)

    NASA Astrophysics Data System (ADS)

    Behzadafshar, K.; Abbaszadeh Shahri, A.; Isfandiari, K.

    2012-12-01

    The Iran plate is earthquake-prone, as certified by the occurrence of destructive earthquakes roughly every five years. Given the great historical earthquakes and the large number of potential seismic sources (active faults), some of them responsible for major events, northwestern Iran, located at the junction of the Alborz and Zagros seismotectonic provinces (Mirzaii et al., 1998), is of particular interest to seismologists. Considering the population and the presence of large cities such as Tabriz, Ardabil, and Orumiyeh, which play a crucial role in the industry and economy of Iran, the authors focus on seismic hazard assessment in these two provinces, computing ground acceleration for different frequency contents and identifying critical frequencies in the studied area. Although many studies have addressed northwestern Iran, building code modifications also require frequency-content analysis to assess seismic hazard more precisely, which is done in the present study. Furthermore, whereas previous studies applied freely downloadable software written before 2000, the present study applies professional software written in 2009 and provided by the authors; it addresses the weak points of the earlier tools, such as gridding of potential sources, treatment of the seismogenic zone, and direct application of attenuation relationships. The resulting hazard maps illustrate that maximum accelerations will be experienced along a northwest-southeast direction, increasing as frequency decreases from 100 Hz to 10 Hz and then decreasing with further frequency reduction (down to 0.25 Hz). The maximum acceleration at the basement occurs at a frequency content of 10 Hz. Keywords: hazard map, frequency content, seismogenic zone, Iran

  17. Drift Reliability Assessment of a Four Storey Frame Residential Building Under Seismic Loading Considering Multiple Factors

    NASA Astrophysics Data System (ADS)

    Sil, Arjun; Longmailai, Thaihamdau

    2017-09-01

    The lateral displacement of a Reinforced Concrete (RC) frame building during an earthquake has an important impact on structural stability and integrity. Seismic analysis and design of RC buildings deserve particular care because of their complex behavior: the performance of the structure depends on many influencing parameters and other inherent uncertainties. A reliability approach accounts for the design factors and uncertainties influencing the performance or response of the structure, so that the safety level, or probability of failure, can be ascertained. The present study assesses the reliability of the seismic performance of a four-storey residential RC building located in seismic Zone V, following the code provisions of Indian Standard IS: 1893-2002. The reliability assessment is performed by deriving an explicit expression for the maximum lateral roof displacement as a failure function using regression. A total of 319 four-storey RC buildings were analyzed by the linear static method using SAP2000, and the change in lateral roof displacement with variation of the parameters (column dimensions, beam dimensions, grade of concrete, floor height, and total weight of the structure) was observed. A generalized relation established by regression can be used to estimate the expected lateral displacement for those parameters; a comparison between the displacements obtained from analysis and from the proposed equation shows that the relation can be used directly to determine the expected maximum lateral displacement. The data obtained from the statistical computations were then used to obtain the probability of failure and the reliability.
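The probability of failure obtained from such statistical computations is commonly expressed through a reliability index; a minimal sketch assuming a normally distributed safety margin (the drift values below are hypothetical, not taken from the study):

```python
import math

def probability_of_failure(mean_margin: float, std_margin: float) -> float:
    """For a normally distributed safety margin M = capacity - demand,
    the reliability index is beta = mean(M) / std(M) and the
    probability of failure is Pf = Phi(-beta)."""
    beta = mean_margin / std_margin
    return 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)

# Hypothetical drift margin (mm): allowable minus expected roof drift.
pf = probability_of_failure(mean_margin=12.0, std_margin=4.0)  # beta = 3
print(f"Probability of failure: {pf:.2e}")
```

A reliability index of 3 corresponds to a failure probability of about 1.3e-3, a common target range for serviceability-type limit states.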

  18. High-resolution seismicity catalog of Italian peninsula in the period 1981-2015

    NASA Astrophysics Data System (ADS)

    Michele, M.; Latorre, D.; Castello, B.; Di Stefano, R.; Chiaraluce, L.

    2017-12-01

    In order to provide an updated reference catalog of Italian seismicity, the absolute location of the last 35 years (1981-2015) of seismic activity was computed with a three-dimensional VP and VS velocity model covering the whole Italian territory. The NonLinLoc code (Lomax et al., 2000), which is based on a probabilistic approach, was used to provide a complete and robust description of the uncertainties associated to the locations corresponding to the hypocentral solutions with the highest probability density. Moreover, the code using a finite difference approximation of the eikonal equation (Podvin and Lecomte, 1991), allows to manage very contrasted velocity models in the arrival time computation. To optimize the earthquakes location, we included the station corrections in the inverse problem. For each year, the number of available earthquakes depends on both the network detection capability and the occurrence of major seismic sequences. The starting earthquakes catalog was based on 2.6 million P and 1.9 million S arrival time picks for 278.607 selected earthquakes, recorded at least by 3 seismic stations of the Italian seismic network. The new catalog compared to the previous ones consisting of hypocentral locations retrieved with linearized location methods, shows a very good improvement as testified by the location parameters assessing the quality of the solution (i.e., RMS, azimuthal gap, formal error on horizontal and vertical components). In addition, we used the distance between the expected and the maximum likelihood hypocenter location to establish the unimodal (high-resolved location) or multimodal (poor-resolved location) character of the probability distribution. We used these parameters to classify the resulting locations in four classes (A, B, C and D) considering the simultaneous goodness of the previous parameters. 
The upper classes (A and B) include 65% of the relocated earthquakes, while the lowest class (D) includes only 7% of the seismicity. We present the new catalog, consisting of 272,847 events, and show examples of earthquake locations related to the background seismicity as well as to small to large seismic sequences that occurred in Italy over the last 35 years.
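    The probabilistic flavor of such a location scheme can be illustrated with a toy example: a grid search that evaluates a Gaussian likelihood of arrival-time residuals at every node and returns both the maximum-likelihood node and the full PDF over the grid. This is only a minimal 2-D sketch with a uniform velocity, not the NonLinLoc implementation, which uses 3-D models and an eikonal travel-time solver; all names and values below are illustrative.

```python
import numpy as np

def locate(stations, t_obs, v=6.0, sigma=0.1, extent=20.0, n=41):
    """Toy probabilistic hypocenter search on a 2-D grid.

    stations : (m, 2) array of station x, y coordinates (km)
    t_obs    : (m,) observed arrival times (s); origin time unknown
    Returns the maximum-likelihood grid node and the PDF over the grid.
    """
    xs = np.linspace(-extent, extent, n)
    X, Y = np.meshgrid(xs, xs)
    logL = np.zeros_like(X)
    for i in range(n):
        for j in range(n):
            d = np.hypot(stations[:, 0] - X[i, j], stations[:, 1] - Y[i, j])
            t_pred = d / v
            # demeaning the residuals removes the unknown origin time
            r = (t_obs - t_pred) - np.mean(t_obs - t_pred)
            logL[i, j] = -0.5 * np.sum((r / sigma) ** 2)
    pdf = np.exp(logL - logL.max())
    pdf /= pdf.sum()            # normalize to a probability distribution
    k = np.unravel_index(np.argmax(pdf), pdf.shape)
    return (X[k], Y[k]), pdf

# synthetic check: source at (3, -4) km recorded by four stations
sta = np.array([[10.0, 0.0], [-10.0, 5.0], [0.0, 12.0], [-5.0, -10.0]])
true = np.array([3.0, -4.0])
t = np.hypot(sta[:, 0] - true[0], sta[:, 1] - true[1]) / 6.0
best_xy, pdf = locate(sta, t)
```

A unimodal, sharply peaked `pdf` corresponds to the well-resolved (class A/B) case described above; a flat or multimodal `pdf` would signal a poorly resolved location.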

  19. PSHAe (Probabilistic Seismic Hazard enhanced): the case of Istanbul.

    NASA Astrophysics Data System (ADS)

    Stupazzini, Marco; Allmann, Alexander; Infantino, Maria; Kaeser, Martin; Mazzieri, Ilario; Paolucci, Roberto; Smerzini, Chiara

    2016-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) relying only on GMPEs tends to be insufficiently constrained at short distances, and the data only partially account for the rupture process, seismic wave propagation, and three-dimensional (3D) complex configurations. Given a large and representative set of numerical results from 3D scenarios, analysing the resulting database statistically and implementing the results as a generalized attenuation function (GAF) in classical PSHA is an appealing way to deal with this problem (Villani et al., 2014). Nonetheless, the limited computational resources or time available tend to pose substantial constraints on a broad application of this method, which is, furthermore, only partially suitable for taking into account the spatial correlation of ground motion as modelled by each forward physics-based simulation (PBS). We therefore envision a streamlined alternative implementation of the approach, aiming at selecting a limited number of wisely chosen scenarios and associating a probability of occurrence with each of them. The experience gathered in past years with 3D modelling of seismic wave propagation in complex alluvial basins (Pilz et al., 2011; Guidotti et al., 2011; Smerzini and Villani, 2012) allowed us to refine the choice of simulated scenarios so as to explore the variability of ground motion, preserving on one hand the full spatial correlation necessary for risk modelling, and on the other the simulated losses for a given location and a given building stock. 3D numerical modelling of scenarios occurring on the North Anatolian Fault in the proximity of Istanbul is carried out with the spectral element code SPEED (http://speed.mox.polimi.it). The results are introduced into a PSHA, demonstrating the capabilities of the proposed methodology against a traditional approach based on GMPEs. 
    References:
    Guidotti R, Stupazzini M, Smerzini C, Paolucci R, Ramieri P, "Numerical Study on the Role of Basin Geometry and Kinematic Seismic Source in 3D Ground Motion Simulation of the 22 February 2011 Mw 6.2 Christchurch Earthquake", SRL 2011; 82(6):767-782. DOI: 10.1785/gssrl.82.6.767
    Pilz M, Parolai S, Stupazzini M, Paolucci R, Zschau J, "Modelling basin effects on earthquake ground motion in the Santiago de Chile basin by a spectral element code", GJI 2011; 187(2):929-945. DOI: 10.1111/j.1365-246X.2011.05183.x
    Smerzini C, Villani M, "Broadband Numerical Simulations in Complex Near-Field Geological Configurations: The Case of the 2009 Mw 6.3 L'Aquila Earthquake", BSSA 2012; 102(6):2436-2451. DOI: 10.1785/0120120002
    Villani M, Faccioli E, Ordaz M, Stupazzini M, "High-Resolution Seismic Hazard Analysis in a Complex Geological Configuration: The Case of the Sulmona Basin in Central Italy", Earthquake Spectra 2014; 30(4):1801-1824. DOI: 10.1193/112911EQS288M
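    The core bookkeeping of a scenario-based PSHA of this kind can be sketched as follows: each simulated scenario carries an assigned annual occurrence rate, and the hazard curve at a site follows from the total rate of scenarios whose ground motion exceeds each level, under a Poisson assumption. This is a schematic illustration, not the GAF methodology of Villani et al. (2014); the scenario values and rates below are invented.

```python
import numpy as np

def hazard_curve(gm, rates, thresholds):
    """Annual probability of exceedance from a finite scenario set.

    gm         : (k,) peak ground motion at the site, one value per scenario
    rates      : (k,) annual occurrence rate assigned to each scenario
    thresholds : (t,) ground-motion levels at which to evaluate the curve
    Poisson assumption: P = 1 - exp(-sum of rates of exceeding scenarios).
    """
    gm = np.asarray(gm)
    rates = np.asarray(rates)
    lam = np.array([rates[gm > a].sum() for a in thresholds])
    return 1.0 - np.exp(-lam)

# three hypothetical scenarios with site PGA (g) and annual rates
p = hazard_curve([0.1, 0.3, 0.5], [0.01, 0.002, 0.0005], [0.05, 0.2, 0.4])
```

Because each physics-based scenario supplies a full shaking field, the same rates can weight spatially correlated loss estimates, which is the advantage over a GMPE-only hazard integral.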

  20. Nuclear Test Depth Determination with Synthetic Modelling: Global Analysis from PNEs to DPRK-2016

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Stachnik, Joshua; Baker, Ben; Epiphansky, Alexey; Bobrov, Dmitry

    2016-04-01

    Seismic event depth determination is critical to the event screening process at the International Data Centre (IDC) of the CTBTO. Because the IDC's Event Definition Criteria rely, in particular, on depth estimation uncertainties, a thorough determination of event depth can mostly be achieved only through additional special analysis. As a result, a large number of events in the Reviewed Event Bulletin have their depth constrained to the surface, making the depth screening criterion inapplicable, and a heavier workload may be needed to manually distinguish between subsurface and deeper crustal events. Since the shape of the first few seconds of the signal of very shallow events is very sensitive to the depth phases, cross-correlation between observed and theoretical seismograms can provide a basis for event depth estimation, and thus an extension of the screening process. We applied this approach mostly to events at teleseismic and partly at regional distances. The approach was found effective for the seismic event screening process, with certain caveats related mostly to poorly defined source and receiver crustal models, which can shift the depth estimate. An adjustable teleseismic attenuation model (t*) was used for the synthetics, since this characteristic is not known for most of the ray paths we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and the recent DPRK nuclear tests. The teleseismic synthetics are based on the stationary phase approximation implemented in the hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified to account for complex source topography. The software prototype is designed to be used for Expert Technical Analysis at the IDC. Its design effectively reuses the NDC-in-a-Box code and can be comfortably utilized by NDC users. 
The package uses Geotool as a front-end for data retrieval and pre-processing. After the event database is compiled, control is passed to driver software that runs the external processing and plotting toolboxes and produces the final result. The modules are mostly coded in Python, with C code (Raysynth3D complex-topography regional synthetics) and FORTRAN synthetics from the CPS330 software package of Robert Herrmann of Saint Louis University. An extension of this single-station depth determination method, which uses joint information from all stations participating in processing, is under development. It is based on simultaneous depth and moment tensor determination from both short- and long-period seismic phases. A novel approach recently developed for microseismic event location, utilizing only phase waveform information, was migrated to the global scale. It should provide faster computation, as it does not require intensive synthetic modelling, and may be beneficial when processing noisy signals. A consistent depth estimate was produced for all recent nuclear tests from the large number of IMS stations (primary and auxiliary) used in processing.
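    The single-station depth idea, matching the depth-phase delay between observed and synthetic traces, can be sketched with a toy vertical-incidence model in which the pP-P delay is 2h/v and the free-surface reflection flips polarity. This is a schematic illustration only; the actual prototype builds its synthetics with hudson96 and a generalized-ray code.

```python
import numpy as np

def depth_scan(obs, depths, v=5.0, dt=0.01, n=512):
    """Toy depth estimation: correlate an observed trace against
    synthetics whose pP-P delay is 2*depth/v (vertical incidence,
    free-surface reflection with polarity flip).

    obs    : (n,) observed trace samples
    depths : candidate depths (km); returns the best one and all
             normalized correlation coefficients."""
    def synth(h):
        s = np.zeros(n)
        s[50] = 1.0                       # direct P arrival
        lag = int(round(2 * h / v / dt))  # depth-phase delay in samples
        s[50 + lag] = -0.9                # pP, reflected with sign flip
        return s
    cc = [np.dot(obs, synth(h)) / (np.linalg.norm(obs) * np.linalg.norm(synth(h)))
          for h in depths]
    return depths[int(np.argmax(cc))], cc

# observed trace built for a 3 km deep source (pP lag = 120 samples)
depths = np.arange(0.5, 10.5, 0.5)
obs = np.zeros(512)
obs[50] = 1.0
obs[50 + int(round(2 * 3.0 / 5.0 / 0.01))] = -0.9
best, cc = depth_scan(obs, depths)
```

In the real problem the correlation is computed over the first few seconds of waveform, and uncertainty in the source-side crustal model maps directly into a shifted `best` depth, which is the caveat noted in the abstract.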

  1. 49 CFR 41.120 - Acceptable model codes.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... seismic safety substantially equivalent to that provided by use of the 1988 National Earthquake Hazards Reduction Program (NEHRP) Recommended Provisions (Copies are available from the Office of Earthquakes and...

  2. 49 CFR 41.120 - Acceptable model codes.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... seismic safety substantially equivalent to that provided by use of the 1988 National Earthquake Hazards Reduction Program (NEHRP) Recommended Provisions (Copies are available from the Office of Earthquakes and...

  3. 49 CFR 41.120 - Acceptable model codes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... seismic safety substantially equivalent to that provided by use of the 1988 National Earthquake Hazards Reduction Program (NEHRP) Recommended Provisions (Copies are available from the Office of Earthquakes and...

  4. 49 CFR 41.120 - Acceptable model codes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... seismic safety substantially equivalent to that provided by use of the 1988 National Earthquake Hazards Reduction Program (NEHRP) Recommended Provisions (Copies are available from the Office of Earthquakes and...

  5. 49 CFR 41.120 - Acceptable model codes.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... seismic safety substantially equivalent to that provided by use of the 1988 National Earthquake Hazards Reduction Program (NEHRP) Recommended Provisions (Copies are available from the Office of Earthquakes and...

  6. Seismometers on Europa: Insights from Modeling and Antarctic Ice Shelf Analogs (Invited)

    NASA Astrophysics Data System (ADS)

    Schmerr, N. C.; Brunt, K. M.; Cammarano, F.; Hurford, T. A.; Lekic, V.; Panning, M. P.; Rhoden, A.; Sauber, J. M.

    2013-12-01

    The outer satellites of the Solar System are a diverse suite of objects spanning a large spectrum of sizes, compositions, and evolutionary histories; constraining their internal structures is key to understanding their formation, evolution, and dynamics. In particular, Jupiter's icy satellite Europa shows compelling evidence for a global subsurface ocean beneath a surface layer of water ice. This ocean decouples the ice shell from the solid silicate mantle and amplifies tidally driven large-scale surface deformation. The complex fissures and cracks seen during orbital flybys suggest that brittle failure is an ongoing and active process in the ice crust, indicating a high level of associated seismic activity. Seismic probing of the ice, ocean, and rocky layers would provide altogether new information on the structure, evolution, and even habitability of Europa. Any future missions (penetrators, landers, and rovers) planning to take advantage of seismometers to image the Europan interior would need to be built around predictions of the expected background noise levels, seismicity, wavefields, and elastic properties of the interior. A preliminary suite of seismic velocity profiles for Europa has been calculated using moment-of-inertia constraints, planetary mass and density, estimates of the moon's composition and thermal structure, and experimentally determined relationships among the elastic properties of relevant materials at pressure, temperature, and depth. While the uncertainties in these models are high, they allow us to calculate a first-order seismic response using 1-D and 3-D high-frequency wave propagation codes for global- and regional-scale structures. Here, we show how future seismic instruments could provide detailed elastic information and reduced uncertainties on the internal structure of Europa. 
For example, receiver functions and surface wave orbits calculated for a single seismic instrument would provide information on crustal thickness and the depth of an ocean layer. Likewise, evaluation of the arrival times of reflected-wave multiples observed at a single seismic station would constrain properties of the mantle and core of Europa. Cluster analysis of waveforms from various seismic source mechanisms could be used to classify different types of seismicity originating from the icy and rocky parts of the moon. We examine examples of single-station results from analogous seismic experiments on Earth, e.g., where broadband, 3-component seismometers have been placed on the Ross Ice Shelf of Antarctica. Ultimately this work reveals that seismometer deployments will be essential for understanding the internal dynamics, habitability, and surface evolution of Europa, and that seismic instruments need to be a key component of future missions to the surface of Europa and the outer satellites.
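    As a back-of-the-envelope illustration of what a single reflected phase can constrain, the thickness of a uniform ice shell follows directly from the two-way travel time of a reflection off the ice-ocean interface. The velocity used here is an assumed illustrative value, not a published Europa model.

```python
def ice_thickness(t_twoway, v_ice=3.8):
    """Invert ice-shell thickness (km) from the two-way travel time (s)
    of a P reflection off the ice-ocean interface, assuming a uniform
    shell with P velocity v_ice (km/s): h = v * t / 2."""
    return v_ice * t_twoway / 2.0

h = ice_thickness(10.0)  # a 10 s two-way time implies a 19 km shell here
```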

  7. A C Language Implementation of the SRO (Murdock) Detector/Analyzer

    USGS Publications Warehouse

    Murdock, James N.; Halbert, Scott E.

    1991-01-01

    A signal detector and analyzer algorithm was described by Murdock and Hutt in 1983. The algorithm emulates the performance of a human interpreter of seismograms. It estimates the signal onset, the direction of onset (positive or negative), the quality of these determinations, the period and amplitude of the signal, and the background noise at the time of the signal. The algorithm has been coded in the C language for implementation as a 'black box' for data similar to those of the China Digital Seismic Network. A driver for the algorithm is included, as are suggestions for other drivers. None of these routines, nor the several FIR filters that are also included, requires floating-point operations. Multichannel operation is supported. Although the primary use of the code has been in-house processing of broadband and short-period data of the China Digital Seismic Network, provisions have been made to process the long-period and very-long-period data of that system as well. The code for the in-house detector, which runs on a minicomputer, is very similar to that of the field system, which runs on a microprocessor. The code is documented.
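    The Murdock-Hutt algorithm itself is not reproduced here, but the integer-only style of detector it exemplifies can be sketched with a short-term-average/long-term-average (STA/LTA) trigger that avoids floating point by cross-multiplying the ratio test; the thresholds and window lengths below are arbitrary placeholders.

```python
def int_sta_lta(samples, ns=5, nl=50, ratio_num=4, ratio_den=1):
    """Integer-only STA/LTA detector sketch.

    Triggers where STA/ns > (ratio_num/ratio_den) * LTA/nl, tested by
    cross-multiplication so every quantity stays an integer."""
    triggers = []
    for k in range(nl + ns, len(samples)):
        sta = sum(abs(x) for x in samples[k - ns:k])        # short window
        lta = sum(abs(x) for x in samples[k - ns - nl:k - ns])  # long window
        # STA/ns > ratio * LTA/nl  <=>  sta*nl*ratio_den > ratio_num*lta*ns
        if sta * nl * ratio_den > ratio_num * lta * ns:
            triggers.append(k)
    return triggers

data = [1, -1] * 50 + [20, -20] * 5   # low-level noise, then a burst
hits = int_sta_lta(data)
```

The real algorithm goes well beyond detection (onset direction, quality, period, amplitude), but the cross-multiplication trick is the essence of how such code runs on hardware without floating-point support.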

  8. Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering

    NASA Astrophysics Data System (ADS)

    Filiatrault, Andre; Sullivan, Timothy

    2014-08-01

    With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical, or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that in structural components and framing. It is therefore not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering them. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research in this area has been sparse, and the available codes and guidelines are for the most part based on past experience, engineering judgment, and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. 
This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments made in the field of seismic design and hinting at the specific considerations required for nonstructural components.
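    To make the design side concrete, the equivalent-static force on a component in current US practice takes an ASCE 7-style form that scales with spectral acceleration, component amplification, and height in the building. The parameter values below are illustrative defaults, not recommendations, and the expression is a sketch of the code formula rather than a substitute for it.

```python
def component_force(Wp, SDS, z_over_h, ap=1.0, Rp=2.5, Ip=1.0):
    """Equivalent static design force for a nonstructural component,
    following the ASCE 7-style expression
        Fp = 0.4 * ap * SDS * Wp * (1 + 2*z/h) / (Rp / Ip),
    clipped to the code bounds 0.3*SDS*Ip*Wp <= Fp <= 1.6*SDS*Ip*Wp.

    Wp       : component operating weight
    SDS      : short-period design spectral acceleration (g)
    z_over_h : attachment height ratio (0 at grade, 1 at roof)
    ap, Rp, Ip : amplification, response modification, importance factors
    """
    Fp = 0.4 * ap * SDS * Wp * (1.0 + 2.0 * z_over_h) / (Rp / Ip)
    return min(max(Fp, 0.3 * SDS * Ip * Wp), 1.6 * SDS * Ip * Wp)

F = component_force(Wp=10.0, SDS=1.0, z_over_h=1.0)  # roof-mounted component
```

The height term is why roof-mounted equipment is designed for roughly three times the force of the same unit at grade, one of the few nonstructural provisions with a clear dynamic rationale.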

  9. Extending the life of mature basins in the North Sea and imaging sub-basalt and sub-intrusive structures using seismic intensity monitoring.

    NASA Astrophysics Data System (ADS)

    De Siena, Luca; Rawlinson, Nicholas

    2016-04-01

    Non-standard seismic imaging (velocity, attenuation, and scattering tomography) of the North Sea basins, using previously unexploited seismic intensities from past passive and active surveys, is key to better imaging and monitoring of fluids in the subsurface. These intensities provide unique solutions to the problems of locating and tracking gas and fluid movements in the crust and of imaging sub-basalt and sub-intrusive structures in volcanic reservoirs. The proposed techniques have been tested on volcanic islands (Deception Island) and have proven effective at monitoring fracture opening, imaging buried fluid-filled bodies, and tracking water/gas interfaces. These novel seismic attributes are modelled in space and time and connected with the lithology of the sampled medium, specifically density and permeability, with a novel computational code with strong commercial potential as a key output.

  10. Seismic risk management of non-engineered buildings

    NASA Astrophysics Data System (ADS)

    Winar, Setya

    Earthquakes have long been feared as among nature's most terrifying and devastating events. Although seismic codes exist in countries with high seismic risk to save lives and reduce human suffering, earthquakes still cause tragic events with high death tolls, particularly due to the collapse of widespread non-engineered buildings without seismic resistance in developing countries such as Indonesia. The implementation of seismic codes in non-engineered construction is the key to ensuring earthquake safety. In practice, such implementation is not simple, because it involves cross-disciplinary and cross-sectoral linkages at different levels of understanding, commitment, and skill. This suggests that a widely agreed framework could help harmonise the various perspectives. Hence, this research aimed to develop an integrated framework for guiding and monitoring seismic risk reduction of non-engineered buildings in Indonesia via a risk management method. The proposed framework drew heavily on the wider literature, on three existing frameworks from around the world, and on the contributions of the various stakeholders who participated in the study. A postal questionnaire survey, selected interviews, and a workshop event constituted the primary data collection methods. To ensure a robust framework, two further workshop events, held in Yogyakarta City and Bengkulu City in Indonesia, were carried out to assess practicality and validity and to identify any needed improvements. 
The data collected were analysed with the assistance of the SPSS and NVivo software programmes. This research found that the content of the proposed framework comprises 63 pairs of characteristics and indicators, complemented by (a) three important factors for effective seismic risk management of non-engineered buildings, (b) three guiding principles for sustainable dissemination to grass-roots communities, and (c) a map of agents of change. Among the 63 pairs, there are 19 technical and 44 non-technical interventions. These findings contribute to the wider knowledge in the domain of seismic risk management of non-engineered buildings by: (a) providing a basis for effective political advocacy, (b) reflecting the multidimensional and interdisciplinary nature of seismic risk reduction, (c) assisting a wide range of users in determining roles, responsibilities, and accountabilities, and (d) providing a basis for setting goals and targets.

  11. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...

  12. Implied preference for seismic design level and earthquake insurance.

    PubMed

    Goda, K; Hong, H P

    2008-04-01

    Seismic risk can be reduced by implementing newly developed seismic provisions in design codes. Furthermore, financial protection, or enhanced utility and happiness for stakeholders, can be gained through the purchase of earthquake insurance; otherwise, there would be no market for such insurance. However, the perceived benefit associated with insurance is not universally shared by stakeholders, partly because of their diverse risk attitudes. This study investigates the implied seismic design preference with insurance options for decision-makers of bounded rationality whose preferences can be adequately represented by cumulative prospect theory (CPT). The investigation focuses on assessing the sensitivity of the implied seismic design preference with insurance options to the model parameters of the CPT and to fair and unfair insurance arrangements. Numerical results suggest that human cognitive limitations and risk perception can significantly affect the seismic design preference implied by the CPT. The mandatory purchase of fair insurance leads the implied seismic design preference to the optimum design level dictated by the minimum expected lifecycle cost rule. Unfair insurance decreases the expected gain as well as its associated variability, which is preferred by risk-averse decision-makers. The results obtained for the implied preference for combinations of seismic design level and insurance option suggest that property owners, financial institutions, and municipalities can take advantage of affordable insurance to establish successful seismic risk management strategies.
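    The CPT machinery referred to above can be sketched with the standard Tversky-Kahneman (1992) value and probability-weighting functions: concave for gains, convex and loss-averse for losses, with small probabilities overweighted. The parameter values are the commonly cited 1992 estimates, not those calibrated in this study.

```python
def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """CPT value function (Tversky & Kahneman 1992 parameterization):
    gains are valued concavely, losses convexly and scaled by the
    loss-aversion coefficient lam > 1."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def cpt_weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities (e.g., of a
    rare damaging earthquake) are overweighted, large ones underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# loss aversion: a loss of 100 units weighs more than a gain of 100
gain_v = cpt_value(100.0)
loss_v = cpt_value(-100.0)
```

The overweighting of small probabilities is what can make even "unfair" earthquake insurance attractive to a CPT decision-maker, consistent with the sensitivity discussed in the abstract.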

  13. Causal instrument corrections for short-period and broadband seismometers

    USGS Publications Warehouse

    Haney, Matthew M.; Power, John; West, Michael; Michaels, Paul

    2012-01-01

    Of all the filters applied to recordings of seismic waves, which include source, path, and site effects, the one we know most precisely is the instrument filter. Therefore, it behooves seismologists to accurately remove the effect of the instrument from raw seismograms. Applying instrument corrections allows analysis of the seismogram in terms of physical units (e.g., displacement or particle velocity of the Earth’s surface) instead of the output of the instrument (e.g., digital counts). The instrument correction can be considered the most fundamental processing step in seismology, since it relates the raw data to an observable quantity of interest to seismologists. Complicating matters is the fact that, in practice, the term “instrument correction” refers to more than simply the seismometer. The instrument correction compensates for the complete recording system, including the seismometer, telemetry, digitizer, and any anti‐alias filters. Knowledge of all these components is necessary to perform an accurate instrument correction. The subject of instrument corrections has been covered extensively in the literature (Seidl, 1980; Scherbaum, 1996). However, the prospect of applying instrument corrections still evokes angst among many seismologists, the authors of this paper included. There may be several reasons for this. For instance, the seminal paper by Seidl (1980) exists in a journal that is not currently available in electronic format and cannot be accessed online. Also, a standard method for applying instrument corrections involves the programs TRANSFER and EVALRESP in the Seismic Analysis Code (SAC) package (Goldstein et al., 2003). The exact mathematical methods implemented in these codes are not thoroughly described in the documentation accompanying SAC.
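    The mathematical core of an instrument correction is a frequency-domain deconvolution, which the sketch below stabilizes with a simple water level. This is a generic illustration, not the TRANSFER/EVALRESP implementation in SAC, and the flat response used in the sanity check is obviously artificial.

```python
import numpy as np

def remove_response(raw, resp_fft, water_level=0.01):
    """Frequency-domain instrument correction with a water level:
    divide the record spectrum by the instrument response, clamping
    |response| below a floor so the deconvolution stays stable where
    the instrument has little sensitivity."""
    R = np.fft.rfft(raw)
    H = resp_fft.copy()
    floor = water_level * np.abs(H).max()
    small = np.abs(H) < floor
    # replace tiny response values by the floor, keeping their phase
    H[small] = floor * np.exp(1j * np.angle(H[small]))
    return np.fft.irfft(R / H, n=len(raw))

# sanity check: deconvolving a flat (all-ones) response is a no-op
x = np.random.default_rng(0).standard_normal(256)
H = np.ones(len(np.fft.rfftfreq(256)), dtype=complex)
y = remove_response(x, H)
```

In practice `resp_fft` would be evaluated from the poles, zeros, and gains of the full recording chain, and the water level trades restored bandwidth against noise amplification.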

  14. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    NASA Astrophysics Data System (ADS)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.
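    The conditional probabilities discussed here reduce, for a synthetic catalog, to straightforward counting: of all triggering events above one magnitude, what fraction is followed within a time window by an event above another magnitude. A minimal sketch on an invented toy catalog (the real comparison uses millions of RSQSim events):

```python
import numpy as np

def conditional_prob(times, mags, m_trig, m_follow, window_yr):
    """P(an event >= m_follow occurs within window_yr after an event
    >= m_trig), estimated by counting in a synthetic catalog.

    times : (k,) event times in years, ascending
    mags  : (k,) event magnitudes
    """
    trig = np.flatnonzero(mags >= m_trig)
    if trig.size == 0:
        return 0.0
    hits = 0
    for i in trig:
        later = (times > times[i]) & (times <= times[i] + window_yr)
        if np.any(mags[later] >= m_follow):
            hits += 1
    return hits / trig.size

# toy catalog: an M7 followed 0.1 yr later by an M6.5, plus background
t = np.array([1.0, 5.0, 5.1, 9.0])
m = np.array([5.0, 7.0, 6.5, 5.2])
p = conditional_prob(t, m, m_trig=7.0, m_follow=6.0, window_yr=1.0)
```

With million-year catalogs, the same counting yields stable estimates even for rare multi-event sequences, which is precisely where empirical catalogs run out of data.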

  15. Dynamic response analysis of a 24-story damped steel structure

    NASA Astrophysics Data System (ADS)

    Feng, Demin; Miyama, Takafumi

    2017-10-01

    In the Japanese and Chinese building codes, a two-stage design philosophy is adopted: damage limitation (small earthquake, Level 1) and life safety (extreme large earthquake, Level 2). It is instructive to compare the design methods for a damped structure based on the two codes. In the Chinese code, to remain consistent with the conventional seismic design method, the damped structure is also designed at the small earthquake level. The effect of the damper system is considered through the additional damping ratio concept: the design force is obtained from the damped design spectrum, which accounts for the reduction due to the additional damping ratio. The additional damping ratio provided by the damper system is usually calculated by time history analysis at the small earthquake level. Velocity-dependent dampers such as viscous dampers can function well even at the small earthquake level, but if a steel damper, which usually remains elastic in a small earthquake, is used, no additional damping ratio is achieved. In Japan, on the other hand, time history analysis is used at both the small earthquake and the extreme large earthquake levels, so the characteristics of the damper system and the ductility of the structure can be modelled well. An existing 24-story steel frame is modified to demonstrate the design process for a damped structure based on the two building codes. A viscous wall-type damper and low-yield-steel panel dampers are studied as the damper systems.
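    The spectrum-reduction step described for the Chinese code can be illustrated with the Eurocode 8-style damping correction factor, which maps a total (inherent plus additional) damping ratio to reduced design ordinates. The Chinese code uses a different expression, so this is an analogous sketch, not the GB formula.

```python
import math

def damping_correction(xi_percent):
    """Eurocode 8-style spectrum correction factor for viscous damping
    xi (percent of critical): eta = sqrt(10 / (5 + xi)), floored at 0.55.
    Added damping from a damper system lowers design ordinates by eta."""
    return max(math.sqrt(10.0 / (5.0 + xi_percent)), 0.55)

eta_bare = damping_correction(5.0)     # 1.0 at the inherent 5% damping
eta_damped = damping_correction(15.0)  # with an additional 10% damping
```

The ratio `eta_damped / eta_bare` is the design-force reduction a viscous damper system earns at the small earthquake level; an elastic steel damper earns none, as the abstract notes.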

  16. Light Steel-Timber Frame with Composite and Plaster Bracing Panels

    PubMed Central

    Scotta, Roberto; Trutalli, Davide; Fiorin, Laura; Pozza, Luca; Marchi, Luca; De Stefani, Lorenzo

    2015-01-01

    The proposed light-frame structure comprises steel columns for vertical loads and an innovative bracing system to efficiently resist seismic actions. This seismic-force-resisting system consists of a light timber frame braced with an Oriented Strand Board (OSB) sheet and an external technoprene plaster-infilled slab. Steel brackets are used as foundation and floor connections. Experimental cyclic-loading tests were conducted to study the seismic response of two shear-wall specimens. A numerical model was calibrated against the experimental results, and the dynamic non-linear behavior of a case-study building was assessed. The numerical results were then used to estimate the proper behavior factor value according to European seismic codes. The results obtained demonstrate that this innovative system is suitable for use in seismic-prone areas thanks to the high ductility and dissipative capacity achieved by the bracing system. This favorable behavior is mainly due to the fasteners and materials used and to the correct application of the capacity design approach. PMID:28793642

  17. Light Steel-Timber Frame with Composite and Plaster Bracing Panels.

    PubMed

    Scotta, Roberto; Trutalli, Davide; Fiorin, Laura; Pozza, Luca; Marchi, Luca; De Stefani, Lorenzo

    2015-11-03

    The proposed light-frame structure comprises steel columns for vertical loads and an innovative bracing system to efficiently resist seismic actions. This seismic-force-resisting system consists of a light timber frame braced with an Oriented Strand Board (OSB) sheet and an external technoprene plaster-infilled slab. Steel brackets are used as foundation and floor connections. Experimental cyclic-loading tests were conducted to study the seismic response of two shear-wall specimens. A numerical model was calibrated against the experimental results, and the dynamic non-linear behavior of a case-study building was assessed. The numerical results were then used to estimate the proper behavior factor value according to European seismic codes. The results obtained demonstrate that this innovative system is suitable for use in seismic-prone areas thanks to the high ductility and dissipative capacity achieved by the bracing system. This favorable behavior is mainly due to the fasteners and materials used and to the correct application of the capacity design approach.
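    Behavior-factor estimates like the one mentioned above are often grounded in ductility-based rules; a common textbook approximation is the Newmark-Hall mapping from displacement ductility to q, sketched here as an illustration rather than the paper's calibration procedure.

```python
import math

def behavior_factor(mu, period, Tc=0.5):
    """Newmark-Hall-style ductility-based behavior factor estimate:
    q = mu for T >= Tc (equal-displacement rule) and
    q = sqrt(2*mu - 1) for shorter periods (equal-energy rule).
    Tc is an assumed corner period for the illustration."""
    return mu if period >= Tc else math.sqrt(2.0 * mu - 1.0)

q_long = behavior_factor(4.0, 1.0)   # long-period structure
q_short = behavior_factor(4.0, 0.2)  # short-period structure
```

In practice, q for a new system such as this braced wall is backed out from non-linear dynamic analyses of the calibrated model, but the ductility rules above explain why high dissipative capacity translates into a high behavior factor.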

  18. Seismic response of reinforced concrete frames at different damage levels

    NASA Astrophysics Data System (ADS)

    Morales-González, Merangeli; Vidot-Vega, Aidcer L.

    2017-03-01

    Performance-based seismic engineering is focused on the definition of limit states that represent different levels of damage, which can be described by material strains, drifts, displacements, or even changes in the dissipating properties and stiffness of the structure. This study evaluates the behavior of reinforced concrete (RC) moment-resistant frames at the different performance levels established by the ASCE 41-06 seismic rehabilitation standard. Sixteen RC plane moment frames with different span-to-depth ratios and three 3D RC frames were analyzed to evaluate their seismic behavior at the different damage levels established by ASCE 41-06. For each span-to-depth ratio, four beam longitudinal reinforcement ratios, varying from 0.85% to 2.5%, were used for the 2D frames. Nonlinear time history analyses of the frames were performed using scaled ground motions. The impact of the different span-to-depth and reinforcement ratios on the damage levels was evaluated, and the material strains, rotations, and seismic hysteretic energy changes at the different damage levels were studied.
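    A drastically simplified version of damage-level classification can be sketched as a drift check against threshold values. The limits below are illustrative global drift values; actual ASCE 41 acceptance criteria are component-level and deformation-based, as the study's use of strains and rotations reflects.

```python
def performance_level(drift_ratio, limits=(0.01, 0.02, 0.04)):
    """Classify a peak interstory drift ratio against illustrative
    thresholds (1%, 2%, 4% here) into Immediate Occupancy (IO),
    Life Safety (LS), Collapse Prevention (CP), or Collapse."""
    labels = ("IO", "LS", "CP", "Collapse")
    for lim, lab in zip(limits, labels):
        if drift_ratio <= lim:
            return lab
    return labels[-1]

level = performance_level(0.015)  # between 1% and 2% drift
```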

  19. The Optimizer Topology Characteristics in Seismic Hazards

    NASA Astrophysics Data System (ADS)

    Sengor, T.

    2015-12-01

    The characteristic data of natural phenomena are examined in a topological-space approach to illuminate whether there is an algorithm behind them that brings the physics of the phenomena to optimized states, even when they are hazards. The optimized code designing the hazard on a topological structure meshes with the metric of the phenomena. Deviations in the metric of one phenomenon push and/or pull the folds of other suitable phenomena; for example, the metric of a specific phenomenon A may fit the metric of another phenomenon B after variation processes generated by the deviation of the metric of phenomenon A. Manifold processes covering the metric characteristics of each and every phenomenon can be defined for all physical events, i.e., natural hazards. There are suitable folds in these manifold groups such that each subfold fits the metric characteristics of at least one natural-hazard category. Certain variation algorithms on these metric structures produce a gauge effect that underlies the long-term stability of the Earth over large time scales. The realization of that stability depends on specific conditions, which are called optimized codes. The analytical basics of processes in topological structures are developed in [1]. The codes are generated according to the structures in [2]. Some optimized codes are derived for the seismicity of the NAF, beginning with the earthquakes of 1999. References: 1. Taner SENGOR, "Topological theory and analytical configuration for a universal community model," Procedia - Social and Behavioral Sciences, Vol. 81, pp. 188-194, 28 June 2013. 2. Taner SENGOR, "Seismic-Climatic-Hazardous Events Estimation Processes via the Coupling Structures in Conserving Energy Topologies of the Earth," The 2014 AGU Fall Meeting, Abstract no. 31374, USA.

  20. Evaluation of liquefaction potential for building code

    NASA Astrophysics Data System (ADS)

    Nunziata, C.; De Nisco, G.; Panza, G. F.

    2008-07-01

The standard approach for the evaluation of liquefaction susceptibility is based on the estimation of a safety factor between the cyclic shear resistance to liquefaction and the earthquake-induced shear stress. Recently, an updated procedure based on shear-wave velocities (Vs) has been proposed, which can be applied more easily. Both methods have been applied at La Plaja beach of Catania, which experienced liquefaction during the 1693 earthquake. The detailed geotechnical and Vs information and the realistic ground motion computed for the 1693 event allowed us to compare the two approaches. The successful application of the Vs procedure, slightly modified to fit historical and safety-factor information, even if additional field performances are needed, encourages the development of a guide for liquefaction potential analysis, based on well-defined Vs profiles, to be included in the Italian seismic code.
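
    The Vs-based screening referred to above reduces to a safety factor FS = CRR/CSR, with the cyclic resistance ratio (CRR) estimated from the overburden-corrected shear-wave velocity. A minimal sketch, assuming the commonly cited Andrus-Stokoe functional form with clean-sand coefficients; these are illustrative values, not the ones calibrated in this study:

    ```python
    # Vs-based liquefaction screening sketch. Coefficients follow the widely
    # quoted Andrus & Stokoe clean-sand form and are assumptions here.

    def crr_from_vs1(vs1, vs1_limit=215.0, a=0.022, b=2.8):
        """Cyclic resistance ratio from overburden-corrected Vs1 (m/s)."""
        if vs1 >= vs1_limit:
            return float("inf")  # treated as non-liquefiable
        return a * (vs1 / 100.0) ** 2 + b * (1.0 / (vs1_limit - vs1) - 1.0 / vs1_limit)

    def safety_factor(vs1, csr):
        """FS = CRR / CSR; FS < 1 flags liquefaction susceptibility."""
        return crr_from_vs1(vs1) / csr

    # Example: loose sand (Vs1 = 160 m/s) under a strong demand (CSR = 0.25)
    fs = safety_factor(160.0, 0.25)
    print(f"FS = {fs:.2f} -> {'liquefiable' if fs < 1 else 'safe'}")
    ```
    
    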

  1. CISN ShakeAlert: Faster Warning Information Through Multiple Threshold Event Detection in the Virtual Seismologist (VS) Early Warning Algorithm

    NASA Astrophysics Data System (ADS)

    Cua, G. B.; Fischer, M.; Caprio, M.; Heaton, T. H.; Cisn Earthquake Early Warning Project Team

    2010-12-01

The Virtual Seismologist (VS) earthquake early warning (EEW) algorithm is one of 3 EEW approaches being incorporated into the California Integrated Seismic Network (CISN) ShakeAlert system, a prototype EEW system that could potentially be implemented in California. The VS algorithm, implemented by the Swiss Seismological Service at ETH Zurich, is a Bayesian approach to EEW, wherein the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the ongoing earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS codes have been running in real-time at the Southern California Seismic Network since July 2008, and at the Northern California Seismic Network since February 2009. We discuss recent enhancements to the VS EEW algorithm that are being integrated into CISN ShakeAlert. We developed and continue to test a multiple-threshold event detection scheme, which uses different association/location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to initiate an event declaration, with the goal of reducing false alarms. This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and the requirement of at least 4 picks as implemented by the Binder Earthworm phase associator) into an on-site/regional approach capable of providing a continuously evolving stream of EEW information starting from the first P-detection.
Real-time and offline analysis on Swiss and California waveform datasets indicates that the multiple-threshold approach is faster and more reliable for larger events than the earlier version of the VS codes. In addition, we provide evolutionary estimates of the probability of false alarms (PFA), which is an envisioned output stream of the CISN ShakeAlert system. The real-time decision-making approach envisioned for CISN ShakeAlert users, where users specify a threshold PFA in addition to thresholds on peak ground motion estimates, has the potential to increase the available warning time for users with high tolerance to false alarms without compromising the needs of users with lower tolerances to false alarms.
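
    The multiple-threshold detection logic described above can be sketched as follows; the amplitude threshold, the four-pick fallback and the `Pick` structure are illustrative assumptions, not the actual VS/ShakeAlert implementation:

    ```python
    # Sketch: high-amplitude P picks declare an event from a single station,
    # while low-amplitude picks must accumulate (as with a conventional
    # phase associator) before declaration. Thresholds are assumed values.

    from dataclasses import dataclass

    @dataclass
    class Pick:
        station: str
        peak_amp: float  # e.g. peak acceleration in cm/s^2 (assumed units)

    HIGH_AMP = 10.0    # single-station declaration threshold (assumed)
    MIN_PICKS = 4      # picks needed otherwise, mirroring the 4-pick associator

    def declare_event(picks):
        """Return True once the pick set justifies an event declaration."""
        if any(p.peak_amp >= HIGH_AMP for p in picks):
            return True                        # strong shaking: declare at once
        return len({p.station for p in picks}) >= MIN_PICKS

    print(declare_event([Pick("STA1", 12.0)]))                  # one strong pick
    print(declare_event([Pick("STA1", 2.0), Pick("STA2", 3.0)]))  # too few weak picks
    ```
    
    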

  2. Seismic performance of spherical liquid storage tanks: a case study

    NASA Astrophysics Data System (ADS)

    Fiore, Alessandra; Demartino, Cristoforo; Greco, Rita; Rago, Carlo; Sulpizio, Concetta; Vanzi, Ivo

    2018-02-01

    Spherical storage tanks are widely used for various types of liquids, including hazardous contents, thus requiring suitable and careful design for seismic actions. On this topic, a significant case study is described in this paper, dealing with the dynamic analysis of a spherical storage tank containing butane. The analyses are based on a detailed finite element (FE) model; moreover, a simplified single-degree-of-freedom idealization is also set up and used for verification of the FE results. Particular attention is paid to the influence of sloshing effects and of the soil-structure interaction for which no special provisions are contained in technical codes for this reference case. Sloshing effects are investigated according to the current literature state of the art. An efficient methodology based on an "impulsive-convective" decomposition of the container-fluid motion is adopted for the calculation of the seismic force. With regard to the second point, considering that the tank is founded on piles, soil-structure interaction is taken into account by computing the dynamic impedances. Comparison between seismic action effects, obtained with and without consideration of sloshing and soil-structure interaction, shows a rather important influence of these parameters on the final results. Sloshing effects and soil-structure interaction can produce, for the case at hand, beneficial effects. For soil-structure interaction, this depends on the increase of the fundamental period and of the effective damping of the overall system, which leads to reduced design spectral values.
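
    The "impulsive-convective" decomposition mentioned above splits the liquid into an impulsive mass moving rigidly with the shell and a convective (sloshing) mass responding on its own longer period, and combines the two spectral responses into a design base shear. A minimal sketch; the mass fraction, spectral values and the SRSS combination rule are illustrative assumptions, not the butane-tank results of the paper:

    ```python
    # Two-mass idealization sketch of the impulsive-convective decomposition.
    # All numbers below are assumed for illustration.

    import math

    def base_shear(m_liquid, f_imp, sa_imp, sa_conv):
        """Base shear (N) from an SRSS combination of the two components."""
        m_imp = f_imp * m_liquid              # mass moving rigidly with the tank
        m_conv = (1.0 - f_imp) * m_liquid     # sloshing mass
        v_imp = m_imp * sa_imp                # spectral accelerations in m/s^2
        v_conv = m_conv * sa_conv
        return math.hypot(v_imp, v_conv)      # SRSS combination (assumed rule)

    v = base_shear(m_liquid=1.2e6,            # kg of stored liquid (assumed)
                   f_imp=0.6,                 # impulsive mass fraction (assumed)
                   sa_imp=3.0,                # short-period Sa (assumed)
                   sa_conv=0.5)               # long-period sloshing Sa (assumed)
    print(f"design base shear ~ {v / 1e6:.2f} MN")
    ```
    
    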

  3. A Bayesian approach to the modelling of α Cen A

    NASA Astrophysics Data System (ADS)

    Bazot, M.; Bourguignon, S.; Christensen-Dalsgaard, J.

    2012-12-01

    Determining the physical characteristics of a star is an inverse problem consisting of estimating the parameters of models for the stellar structure and evolution, and knowing certain observable quantities. We use a Bayesian approach to solve this problem for α Cen A, which allows us to incorporate prior information on the parameters to be estimated, in order to better constrain the problem. Our strategy is based on the use of a Markov chain Monte Carlo (MCMC) algorithm to estimate the posterior probability densities of the stellar parameters: mass, age, initial chemical composition, etc. We use the stellar evolutionary code ASTEC to model the star. To constrain this model both seismic and non-seismic observations were considered. Several different strategies were tested to fit these values, using either two free parameters or five free parameters in ASTEC. We are thus able to show evidence that MCMC methods become efficient with respect to more classical grid-based strategies when the number of parameters increases. The results of our MCMC algorithm allow us to derive estimates for the stellar parameters and robust uncertainties thanks to the statistical analysis of the posterior probability densities. We are also able to compute odds for the presence of a convective core in α Cen A. When using core-sensitive seismic observational constraints, these can rise above ˜40 per cent. The comparison of results to previous studies also indicates that these seismic constraints are of critical importance for our knowledge of the structure of this star.
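
    The MCMC strategy described above can be illustrated with a minimal Metropolis sampler on a toy one-parameter problem; the Gaussian "observation" and flat prior below are assumptions for illustration, whereas the actual study samples ASTEC stellar-model parameters against seismic and non-seismic constraints:

    ```python
    # Minimal Metropolis MCMC sketch. The target is a toy Gaussian posterior
    # for a single parameter (think "mass in solar units"); numbers are assumed.

    import math, random

    random.seed(0)

    def log_posterior(theta, obs=1.08, sigma=0.05):
        """Toy log-posterior: flat prior on [0.5, 1.5], Gaussian likelihood."""
        if not 0.5 <= theta <= 1.5:
            return -math.inf
        return -0.5 * ((theta - obs) / sigma) ** 2

    def metropolis(n_steps=20000, step=0.05, theta0=1.0):
        samples, theta, lp = [], theta0, log_posterior(theta0)
        for _ in range(n_steps):
            prop = theta + random.gauss(0.0, step)     # random-walk proposal
            lp_prop = log_posterior(prop)
            if math.log(random.random()) < lp_prop - lp:  # accept/reject
                theta, lp = prop, lp_prop
            samples.append(theta)
        return samples

    chain = metropolis()
    mean = sum(chain) / len(chain)
    print(f"posterior mean ~ {mean:.3f}")   # close to the assumed 'observation'
    ```
    
    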

  4. Effects of non-structural components and soil-structure interaction on the seismic response of framed structures

    NASA Astrophysics Data System (ADS)

    Ditommaso, Rocco; Auletta, Gianluca; Iacovino, Chiara; Nigro, Antonella; Carlo Ponzo, Felice

    2017-04-01

In this paper, several nonlinear numerical models of reinforced concrete framed structures have been defined in order to evaluate the effects of non-structural elements and soil-structure interaction on the elastic dynamic behaviour of buildings. In the last few years, many studies have highlighted the significant effects of the interaction between structural and non-structural components on the main dynamic characteristics of a building. Usually, structural and non-structural elements act together, adding both mass and stiffness. The presence of infill panels is generally neglected in the design of structural elements, although these elements can significantly increase the lateral stiffness of a structure, leading to a modification of its dynamic properties. In particular, at the Damage Limit State (where elastic behaviour is expected), soil-structure interaction effects and non-structural elements may further affect the elastic natural period of buildings, changing the spectral accelerations compared with those provided by seismic codes for static analyses. In this work, a parametric study has been performed in order to evaluate the elastic fundamental period of vibration of buildings as a function of structural morphology (height, plan area, ratio between plan dimensions), infill presence and distribution, and soil characteristics. Acknowledgements This study was partially funded by the Italian Department of Civil Protection within the project DPC-RELUIS 2016 - RS4 ''Seismic observatory of structures and health monitoring'' and by the "Centre of Integrated Geomorphology for the Mediterranean Area - CGIAM" within the Framework Agreement with the University of Basilicata "Study, Research and Experimentation in the Field of Analysis and Monitoring of Seismic Vulnerability of Strategic and Relevant Buildings for the purposes of Civil Protection and Development of Innovative Strategies of Seismic Reinforcement".
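
    For context, the elastic fundamental period studied parametrically above is often compared against the simple code-style estimate T1 = Ct * H^(3/4) (Ct = 0.075 for RC moment frames in EC8/NTC-type codes). A sketch, with an assumed, purely illustrative reduction factor for the period-shortening effect of stiff infills:

    ```python
    # Code-formula period estimate vs an assumed infilled-frame period.
    # Ct = 0.075 is the EC8/NTC value for RC frames; the 0.7 infill factor
    # is an illustrative assumption, not a result of this study.

    def code_period(height_m, ct=0.075):
        """Elastic fundamental period of an RC frame, code-style estimate (s)."""
        return ct * height_m ** 0.75

    for h in (9.0, 15.0, 24.0):
        t_bare = code_period(h)
        t_infilled = 0.7 * t_bare   # assumed stiffening reduction from infills
        print(f"H = {h:4.1f} m: T_bare = {t_bare:.2f} s, T_infilled ~ {t_infilled:.2f} s")
    ```
    
    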

  5. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, Gerardo; Bonito, Laura; Lampasi, Alessandro; Revellino, Paola; Guerriero, Luigi; Sappa, Giuseppe; Guadagno, Francesco Maria

    2016-04-01

The SiSeRHMap (simulator for mapped seismic response using a hybrid model) is a computerized methodology capable of producing prediction maps of seismic response in terms of acceleration spectra. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (geographic information system) cubic model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A meta-modelling process confers a hybrid nature on the methodology. In this process, the one-dimensional (1-D) linear equivalent analysis produces acceleration response spectra for a specified number of site profiles using one or more input motions. The shear-wave velocity-thickness profiles, defined as trainers, are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Emul-spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated evolutionary algorithm (EA) and the Levenberg-Marquardt algorithm (LMA) as the final optimizer. In the final step, the GCM maps executor module produces a serial map set of the stratigraphic seismic response at different periods, grid-solving the calibrated Emul-spectra model. In addition, the spectral topographic amplification is also computed by means of a 3-D validated numerical prediction model. This model is built to match the results of numerical simulations related to isolated reliefs, using GIS morphometric data. In this way, different sets of seismic response maps are developed, on which maps of design acceleration response spectra are also defined by means of an enveloping technique.

  6. Magma intrusion near Volcan Tancitaro: Evidence from seismic analysis

    DOE PAGES

    Pinzon, Juan I.; Nunez-Cornu, Francisco J.; Rowe, Charlotte Anne

    2016-11-17

Between May and June 2006, an earthquake swarm occurred near Volcan Tancítaro in Mexico, which was recorded by a temporary seismic deployment known as the MARS network. We located ~1000 events from this seismic swarm. Previous earthquake swarms in the area were reported in the years 1997, 1999 and 2000. We relocate and analyze the evolution and properties of the 2006 earthquake swarm, employing a waveform cross-correlation-based phase repicking technique. Hypocenters from 911 events were located and divided into eighteen families having a correlation coefficient at or above 0.75. 90% of the earthquakes provide at least sixteen phase picks. We used the single-event location code Hypo71 and the P-wave velocity model used by the Jalisco Seismic and Accelerometer Network to improve hypocenters based on the correlation-adjusted phase arrival times. We relocated 121 earthquakes, which clearly show two clusters, between 9–10 km and 3–4 km depth respectively. The average location error estimates are <1 km epicentrally, and <2 km in depth, for the largest event in each cluster. Depths of seismicity migrate upward from 16 to 3.5 km and exhibit a NE-SW trend. The swarm first migrated toward Paricutin Volcano but by mid-June began propagating back toward Volcán Tancítaro. In addition to its persistence, noteworthy aspects of this swarm include a quasi-exponential increase in the rate of activity within the first 15 days; a b-value of 1.47; a jug-shaped hypocenter distribution; a shoaling rate of ~5 km/month within the deeper cluster, and a composite focal mechanism solution indicating largely reverse faulting. As a result, these features of the swarm suggest a magmatic source elevating the crustal strain beneath Volcan Tancítaro.
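
    The cross-correlation repicking step used above can be sketched as follows: the lag that maximizes the normalized cross-correlation between two similar event waveforms gives the adjustment of one pick relative to the other (the study's families required a coefficient of at least 0.75). The synthetic wavelet, the 7-sample offset and the 100 Hz sampling rate are assumptions for illustration:

    ```python
    # Brute-force normalized cross-correlation repicking sketch (pure Python).

    import math

    def xcorr_lag(a, b):
        """Lag (in samples) by which b trails a, at the peak of normalized CC."""
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        sa = math.sqrt(sum((x - ma) ** 2 for x in a))
        sb = math.sqrt(sum((x - mb) ** 2 for x in b))
        best_lag, best_cc = 0, -2.0
        for lag in range(-n + 1, n):
            s = 0.0
            for i in range(n):
                j = i - lag                 # b[i] is compared with a[i - lag]
                if 0 <= j < n:
                    s += (b[i] - mb) * (a[j] - ma)
            cc = s / (sa * sb)
            if cc > best_cc:
                best_lag, best_cc = lag, cc
        return best_lag, best_cc

    fs = 100.0                               # samples per second (assumed)
    t = [i / fs for i in range(200)]
    wavelet = [math.exp(-((x - 1.0) ** 2) / 0.01) * math.sin(2 * math.pi * 5 * x)
               for x in t]
    shifted = [0.0] * 7 + wavelet[:-7]       # same waveform, picked 7 samples late

    lag, cc = xcorr_lag(wavelet, shifted)
    print(f"pick adjustment: {lag} samples ({lag / fs:.2f} s), cc = {cc:.2f}")
    ```
    
    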

  7. Magma intrusion near Volcan Tancitaro: Evidence from seismic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinzon, Juan I.; Nunez-Cornu, Francisco J.; Rowe, Charlotte Anne

Between May and June 2006, an earthquake swarm occurred near Volcan Tancítaro in Mexico, which was recorded by a temporary seismic deployment known as the MARS network. We located ~1000 events from this seismic swarm. Previous earthquake swarms in the area were reported in the years 1997, 1999 and 2000. We relocate and analyze the evolution and properties of the 2006 earthquake swarm, employing a waveform cross-correlation-based phase repicking technique. Hypocenters from 911 events were located and divided into eighteen families having a correlation coefficient at or above 0.75. 90% of the earthquakes provide at least sixteen phase picks. We used the single-event location code Hypo71 and the P-wave velocity model used by the Jalisco Seismic and Accelerometer Network to improve hypocenters based on the correlation-adjusted phase arrival times. We relocated 121 earthquakes, which clearly show two clusters, between 9–10 km and 3–4 km depth respectively. The average location error estimates are <1 km epicentrally, and <2 km in depth, for the largest event in each cluster. Depths of seismicity migrate upward from 16 to 3.5 km and exhibit a NE-SW trend. The swarm first migrated toward Paricutin Volcano but by mid-June began propagating back toward Volcán Tancítaro. In addition to its persistence, noteworthy aspects of this swarm include a quasi-exponential increase in the rate of activity within the first 15 days; a b-value of 1.47; a jug-shaped hypocenter distribution; a shoaling rate of ~5 km/month within the deeper cluster, and a composite focal mechanism solution indicating largely reverse faulting. As a result, these features of the swarm suggest a magmatic source elevating the crustal strain beneath Volcan Tancítaro.

  8. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment. 
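
    The rigid sliding-block analysis mentioned above (a Newmark-type calculation) can be sketched as follows: the block accumulates relative velocity whenever ground acceleration exceeds the slope's yield acceleration, and the permanent displacement is the integral of that velocity. The synthetic pulse train and yield acceleration below are illustrative assumptions, not values from the Oregon study:

    ```python
    # Newmark rigid sliding-block displacement sketch. The input motion and
    # yield acceleration are assumed for illustration.

    import math

    def newmark_displacement(acc, dt, a_yield):
        """Permanent downslope displacement (m) of a rigid block."""
        v, d = 0.0, 0.0
        for a in acc:
            if a > a_yield or v > 0.0:      # sliding initiates or continues
                v += (a - a_yield) * dt     # relative acceleration -> velocity
                v = max(v, 0.0)             # block cannot slide upslope
                d += v * dt
        return d

    dt = 0.01                                # time step, s
    # Synthetic ground motion: 5 s of a 0.4 g, 1 Hz sine train (assumed)
    acc = [0.4 * 9.81 * math.sin(2 * math.pi * 1.0 * i * dt) for i in range(500)]
    d = newmark_displacement(acc, dt, a_yield=0.1 * 9.81)
    print(f"Newmark displacement ~ {d:.3f} m")
    ```
    
    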

  9. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    NASA Astrophysics Data System (ADS)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) use of a scenario-based neodeterministic approach (NDSHA) for the calculation of the seismic input, and b) control of the numerical modeling of an existing building, using free-vibration measurements of the real structure. The key point of this approach is the strict collaboration of the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the response of the building in the calculation phase. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, the realistic values of spectral acceleration, which include the appropriate amplification obtained through the modeling of a "scenario" input applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced from the national codes (i.e. NTC 2008, for Italy). The task of the verifying engineer is to ensure that the outcome of the verification is conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g. schools) of the Trieste Province. The adoption of the scenario input has in most cases increased the number of critical elements that must be taken into account in the design of reinforcements. However, the higher cost associated with the increase of elements to reinforce is reasonable, especially considering the important reduction of the risk level.

  10. Magma intrusion near Volcan Tancítaro: Evidence from seismic analysis

    NASA Astrophysics Data System (ADS)

    Pinzón, Juan I.; Núñez-Cornú, Francisco J.; Rowe, Charlotte A.

    2017-01-01

    Between May and June 2006, an earthquake swarm occurred near Volcan Tancítaro in Mexico, which was recorded by a temporary seismic deployment known as the MARS network. We located ∼1000 events from this seismic swarm. Previous earthquake swarms in the area were reported in the years 1997, 1999 and 2000. We relocate and analyze the evolution and properties of the 2006 earthquake swarm, employing a waveform cross-correlation-based phase repicking technique. Hypocenters from 911 events were located and divided into eighteen families having a correlation coefficient at or above 0.75. 90% of the earthquakes provide at least sixteen phase picks. We used the single-event location code Hypo71 and the P-wave velocity model used by the Jalisco Seismic and Accelerometer Network to improve hypocenters based on the correlation-adjusted phase arrival times. We relocated 121 earthquakes, which show clearly two clusters, between 9-10 km and 3-4 km depth respectively. The average location error estimates are <1 km epicentrally, and <2 km in depth, for the largest event in each cluster. Depths of seismicity migrate upward from 16 to 3.5 km and exhibit a NE-SW trend. The swarm first migrated toward Paricutin Volcano but by mid-June began propagating back toward Volcán Tancítaro. In addition to its persistence, noteworthy aspects of this swarm include a quasi-exponential increase in the rate of activity within the first 15 days; a b-value of 1.47; a jug-shaped hypocenter distribution; a shoaling rate of ∼5 km/month within the deeper cluster, and a composite focal mechanism solution indicating largely reverse faulting. These features of the swarm suggest a magmatic source elevating the crustal strain beneath Volcan Tancítaro.

  11. Fault Specific Seismic Hazard Maps as Input to Loss Reserves Calculation for Attica Buildings

    NASA Astrophysics Data System (ADS)

    Deligiannakis, Georgios; Papanikolaou, Ioannis; Zimbidis, Alexandros; Roberts, Gerald

    2014-05-01

Greece is prone to various natural disasters, such as wildfires, floods, landslides and earthquakes, due to the special environmental and geological conditions prevailing at tectonic plate boundaries. Seismicity is the predominant risk in terms of damage and casualties in the Greek territory. The historical record of earthquakes in Greece has been published by various researchers, providing useful data for the seismic hazard assessment of Greece. However, the completeness of the historical record in Greece, despite being one of the longest worldwide, reaches only 500 years for M ≥ 7.3 and less than 200 years for M ≥ 6.5. Considering that active faults in the area have recurrence intervals of a few hundred to several thousand years, it is clear that many active faults have not been activated during the completeness period covered by the historical records. New seismic hazard assessment methodologies tend to follow fault-specific approaches, in which seismic sources are geologically constrained active faults, in order to address problems related to the incompleteness of the historical record, obtain higher spatial resolution and calculate realistic source-to-locality distances, since seismic sources are very accurately located. Fault-specific approaches provide quantitative assessments, as they measure fault slip rates from geological data, providing a more reliable estimate of seismic hazard. We used a fault-specific seismic hazard assessment approach for the region of Attica. The method of seismic hazard mapping from geological fault throw-rate data combines three major factors: empirical relationships between fault rupture length, earthquake magnitude and coseismic slip; the radii of the VI, VII, VIII and IX isoseismals on the Modified Mercalli (MM) intensity scale; and attenuation-amplification functions for seismic shaking on bedrock compared to basin-filling sediments.
We explicitly modeled 22 active faults that could affect the region of Attica, including Athens, using detailed data derived from published papers, neotectonic maps and fieldwork observations. Moreover, we incorporated background seismicity models from the historic record and also the distribution of subduction zone earthquakes, for the integration of strong deep earthquakes that could also affect the Attica region. We created 4 high spatial resolution seismic hazard maps for the region of Attica, one for each of the intensities VII - X (MM). These maps offer a locality-specific shaking recurrence record, which represents the long-term shaking record in a more complete way, since they incorporate several seismic cycles of the active faults that could affect Attica. Each one of these high resolution seismic hazard maps displays both the spatial distribution and the recurrence, over a specific time period, of the relevant intensity. Time-independent probabilities were extracted based on these average recurrence intervals, using the stationary Poisson model P = 1 - e^(-Λt). The Λ value was provided by the intensity recurrence rates, as displayed in the seismic hazard maps. However, insurance contracts usually lack detailed spatial information and refer to the Postal Code level, akin to CRESTA zones. To this end, a time-independent probability of shaking at intensities VII - X was calculated for every Postal Code, for a given time period, using the Poisson model. The reserves calculation on a buildings portfolio combines the probability of events of specific intensities within the Postal Codes with the building characteristics, such as the building construction type and the insured value.
We propose a standard approach for the reserves calculation K(t) for a specific time period: K(t) = x2 · [x1·y1·P1(t) + x1·y2·P2(t) + x1·y3·P3(t) + x1·y4·P4(t)], where P1(t) - P4(t) are the probabilities of occurrence of the seismic intensities VII - X over that period, x1 is the value of the building, x2 is the insured value, and y1 - y4 are coefficients accounting for the characteristics of the building, such as the construction type, age, height and use of the property. Furthermore, a stochastic approach is also adopted in order to obtain the relevant reserve value K(t) for the specific time period. This calculation draws a set of simulations from the Poisson random variable and then takes the respective expectations.
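
    The stationary Poisson model and the reserve formula above can be sketched numerically; the recurrence rates, damage coefficients y1-y4 and monetary values below are illustrative assumptions, not figures from the study:

    ```python
    # Poisson exceedance probabilities and the K(t) reserve formula.
    # All numerical inputs are assumed for illustration.

    import math

    def shaking_probability(lam, t):
        """P = 1 - exp(-Lambda * t): at least one exceedance in t years."""
        return 1.0 - math.exp(-lam * t)

    def reserves(x1, x2, y, p):
        """K(t) = x2 * sum_i x1 * y_i * P_i(t), intensities VII-X."""
        return x2 * sum(x1 * yi * pi for yi, pi in zip(y, p))

    t = 50.0                                           # planning horizon, years
    lambdas = [1 / 100, 1 / 300, 1 / 1000, 1 / 5000]   # assumed rates, VII-X
    p = [shaking_probability(lam, t) for lam in lambdas]
    y = [0.05, 0.15, 0.40, 0.90]                       # assumed damage ratios
    k = reserves(x1=1.0, x2=200_000.0, y=y, p=p)       # x1 normalized (assumed)
    print([round(pi, 3) for pi in p])
    print(f"reserve K(t) = {k:,.0f}")
    ```
    
    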

  12. Influence of the bond-slip relationship on the flexural capacity of R.C. joints damaged by corrosion

    NASA Astrophysics Data System (ADS)

    Imperatore, Stefania

    2016-06-01

In moderate and aggressive environmental conditions, old reinforced concrete structures are often subjected to corrosive phenomena. Corrosion causes cracking, loss of diameter in the reinforcement and variation of the bond behavior between steel and concrete. Consequently, in the presence of cyclic actions like seismic ones, old R.C. elements vary their ultimate drift, ductility, plastic rotation capacity and energy dissipation with the corrosion level. The problem is of current interest: the issue has been introduced in some paragraphs of the Model Code 2010, and a committee is now drafting a new document on assessment strategies for existing concrete structures damaged by corrosion. In this work, a first step in the analysis of the impact of corrosion on the seismic behavior of R.C. elements is presented: by means of FEM analyses, a poorly detailed column/foundation joint is analyzed parametrically in order to evaluate the influence of corrosion-induced bond-slip degradation on the flexural capacity of the element.

  13. Concordia CCD - A Geoscope station in continental Antarctica

    NASA Astrophysics Data System (ADS)

    Maggi, A.; Lévêque, J.; Thoré, J.; Bes de Berc, M.; Bernard, A.; Danesi, S.; Morelli, A.; Delladio, A.; Sorrentino, D.; Stutzmann, E.; Geoscope Team

    2010-12-01

Concordia (Dome C, Antarctica) has had a permanent seismic station since 2005. It is run by EOST and INGV in collaboration with the French and Italian polar institutes (IPEV and PNRA). It is installed in an ice vault at 12 m depth, 1 km from the permanent scientific base at Concordia. The temperature in the vault is a constant -55°C. The data quality at the station has improved continuously since its installation. In 2007, the station was declared at the ISC as an open station with station code CCD (ConCorDia), with data available upon request. It is only the second permanent station on the Antarctic continent, after South Pole. In 2010, CCD was included in the Geoscope network. Data from CCD starting in 2007 are now freely available from the Geoscope Data Center and IRIS. We present an analysis of the data quality at CCD, and describe the technical difficulties of operating an observatory-quality seismic station in the extreme environmental conditions of continental Antarctica.

  14. Earthquake Loading Assessment to Evaluate Liquefaction Potential in Emilia-Romagna Region

    NASA Astrophysics Data System (ADS)

    Daminelli, R.; Marcellini, A.; Tento, A.

    2016-12-01

The May-June 2012 seismic sequence that struck Lombardia and Emilia-Romagna consisted of seven main events of magnitude greater than 5, followed by numerous aftershocks. The strongest earthquakes occurred on May 20 (M=5.9) and May 29 (M=5.8). The widespread soil liquefaction, unexpected given the moderate magnitude of the events, pushed the local authorities to launch research projects aimed at defining the earthquake loading for evaluating the liquefaction safety factor. The reasons explained below led us to adopt a deterministic hazard approach to evaluate the seismic parameters relevant to liquefaction assessment, despite the fact that the Italian Seismic Building Code (NTC08) is based on probabilistic hazard analysis. For urban planning and building design, geologists generally adopt the CRR/CSR technique to assess liquefaction potential; therefore we considered PGA and a design magnitude to be representative of the seismic loading. The procedure adopted consists of: a) identification of seismic source zones and characterization of each zone by its maximum magnitude; b) evaluation of the source-to-site distance; and c) adoption of a suitable attenuation law to compute the expected PGA at the site, given the site conditions and the design magnitude. The design magnitude can be the maximum magnitude, the magnitude that causes the largest PGA, or both. The PGA values obtained are larger than the 474-year return period PGA prescribed by NTC08 for the seismic design of ordinary buildings. We conducted a CPTU resistance test intended to define the CRR at the village of Cavezzo, situated in the epicentral area of the 2012 earthquake. The CRR/CSR ratio indicated an elevated liquefaction risk at the analysed site. On the contrary, the adoption of the 474-year return period PGA prescribed by NTC08 for the Cavezzo site led to a negligible liquefaction risk. Note that very close to the investigated site several liquefaction phenomena were observed.
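
    The demand side of the CRR/CSR screening mentioned above is commonly computed with the Seed-Idriss simplified equation. A sketch follows; the PGA, vertical stresses and layer depth are illustrative assumptions, not the values computed for Cavezzo:

    ```python
    # Seed-Idriss simplified cyclic stress ratio with a Liao & Whitman style
    # depth-reduction coefficient. All site numbers are assumed.

    def cyclic_stress_ratio(pga_g, sigma_v, sigma_v_eff, rd):
        """CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd."""
        return 0.65 * pga_g * (sigma_v / sigma_v_eff) * rd

    def rd_liao_whitman(z):
        """Stress-reduction coefficient vs depth z (m)."""
        return 1.0 - 0.00765 * z if z <= 9.15 else 1.174 - 0.0267 * z

    z = 6.0                                    # depth of the sand layer, m (assumed)
    csr = cyclic_stress_ratio(pga_g=0.25,      # deterministic PGA in g (assumed)
                              sigma_v=110.0,   # total vertical stress, kPa (assumed)
                              sigma_v_eff=70.0,  # effective stress, kPa (assumed)
                              rd=rd_liao_whitman(z))
    crr = 0.12                                 # CPTU-based resistance (assumed)
    print(f"CSR = {csr:.3f}, FS = CRR/CSR = {crr / csr:.2f}")
    ```
    
    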

  15. Earthquake Loading Assessment to Evaluate Liquefaction Potential in Emilia-Romagna Region

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Marcellini, Alberto; Tento, Alberto

    2017-04-01

The May-June 2012 seismic sequence that struck Lombardia and Emilia-Romagna consisted of seven main events of magnitude greater than 5, followed by numerous aftershocks. The strongest earthquakes occurred on May 20 (M=5.9) and May 29 (M=5.8). The widespread soil liquefaction, unexpected given the moderate magnitude of the events, pushed the local authorities to launch research projects aimed at defining the earthquake loading for evaluating the liquefaction safety factor. The reasons explained below led us to adopt a deterministic hazard approach to evaluate the seismic parameters relevant to liquefaction assessment, despite the fact that the Italian Seismic Building Code (NTC08) is based on probabilistic hazard analysis. For urban planning and building design, geologists generally adopt the CRR/CSR technique to assess liquefaction potential; therefore we considered PGA and a design magnitude to be representative of the seismic loading. The procedure adopted consists of: a) identification of seismic source zones and characterization of each zone by its maximum magnitude; b) evaluation of the source-to-site distance; and c) adoption of a suitable attenuation law to compute the expected PGA at the site, given the site conditions and the design magnitude. The design magnitude can be the maximum magnitude, the magnitude that causes the largest PGA, or both. The PGA values obtained are larger than the 474-year return period PGA prescribed by NTC08 for the seismic design of ordinary buildings. We conducted a CPTU resistance test intended to define the CRR at the village of Cavezzo, situated in the epicentral area of the 2012 earthquake. The CRR/CSR ratio indicated an elevated liquefaction risk at the analysed site. On the contrary, the adoption of the 474-year return period PGA prescribed by NTC08 for the Cavezzo site led to a negligible liquefaction risk. Note that very close to the investigated site several liquefaction phenomena were observed.

  16. A permanent seismic station beneath the Ocean Bottom

    NASA Astrophysics Data System (ADS)

    Harris, David; Cessaro, Robert K.; Duennebier, Fred K.; Byrne, David A.

    1987-03-01

    The Hawaii Institute of Geophysics began development of the Ocean Subbottom Seismometer (OSS) system in 1978, and OSS systems were installed in four locations between 1979 and 1982. The OSS system is a permanent, deep-ocean borehole seismic recording system composed of a borehole sensor package (tool), an electromechanical cable, a recorder package, and a recovery system. Installed near the bottom of a borehole (drilled by the D/V Glomar Challenger), the tool contains three orthogonal 4.5-Hz geophones, two orthogonal tiltmeters, and a temperature sensor. Signals from these sensors are multiplexed, digitized (with a floating point technique), and telemetered through approximately 10 km of electromechanical cable to a recorder package located near the ocean bottom. Electrical power for the tool is supplied from the recorder package. The digital seismic signals are demultiplexed, converted back to analog form, processed through an automatic gain control (AGC) circuit, and recorded along with a time code on magnetic tape cassettes in the recorder package. Data may be recorded continuously for up to two months in the self-contained recorder package. Data may also be recorded in real time (digital format) during the installation and subsequent recorder package servicing. The recorder package is connected to a submerged recovery buoy by a length of buoyant polypropylene rope. The anchor on the recovery buoy is released by activating either of the acoustical command releases. The polypropylene rope may also be seized with a grappling hook to effect recovery. The recorder package may be repeatedly serviced as long as the tool remains functional. A wide range of data has been recovered from the OSS system. Recovered analog records include signals from natural seismic sources such as earthquakes (teleseismic and local), man-made seismic sources such as refraction seismic shooting (explosives and air cannons), and nuclear tests. Lengthy continuous recording has permitted analysis of wideband noise levels and of the slowly varying parameters, temperature and tilt.
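    The AGC stage mentioned above was an analog circuit in the OSS hardware; the sketch below is only a generic digital analogue, a one-pole envelope-follower AGC with illustrative constants, not the OSS circuit design.

```python
def agc(samples, target=1.0, attack=0.01):
    """Generic automatic gain control: track the signal envelope with a
    one-pole follower and scale the output toward a constant target level."""
    env = 1e-6  # running envelope estimate
    out = []
    for s in samples:
        env += attack * (abs(s) - env)  # smoothed |amplitude| estimate
        out.append(s * target / max(env, 1e-6))
    return out

# A quiet segment followed by a 100x louder one: AGC equalizes the levels,
# which is what lets weak teleseisms and strong local events share one tape.
trace = [0.1] * 500 + [10.0] * 500
leveled = agc(trace)
print(round(leveled[499], 2), round(leveled[999], 2))  # both settle near 1.0
```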

  17. Reducing disk storage of full-3D seismic waveform tomography (F3DT) through lossy online compression

    NASA Astrophysics Data System (ADS)

    Lindstrom, Peter; Chen, Po; Lee, En-Jui

    2016-08-01

    Full-3D seismic waveform tomography (F3DT) is the latest seismic tomography technique that can assimilate broadband, multi-component seismic waveform observations into high-resolution 3D subsurface seismic structure models. The main drawback in the current F3DT implementation, in particular the scattering-integral implementation (F3DT-SI), is the high disk storage cost and the associated I/O overhead of archiving the 4D space-time wavefields of the receiver- or source-side strain tensors. The strain tensor fields are needed for computing the data sensitivity kernels, which are used for constructing the Jacobian matrix in the Gauss-Newton optimization algorithm. In this study, we have successfully integrated a lossy compression algorithm into our F3DT-SI workflow to significantly reduce the disk space for storing the strain tensor fields. The compressor supports a user-specified tolerance for bounding the error, and can be integrated into our finite-difference wave-propagation simulation code used for computing the strain fields. The decompressor can be integrated into the kernel calculation code that reads the strain fields from the disk and computes the data sensitivity kernels. During the wave-propagation simulations, we compress the strain fields before writing them to the disk. To compute the data sensitivity kernels, we read the compressed strain fields from the disk and decompress them before using them in kernel calculations. Experiments using a realistic dataset in our California statewide F3DT project have shown that we can reduce the strain-field disk storage by at least an order of magnitude with acceptable loss, and also improve the overall I/O performance of the entire F3DT-SI workflow significantly. The integration of the lossy online compressor may potentially open up the possibility of wide adoption of F3DT-SI in routine seismic tomography practice in the near future.
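    The authors' compressor is not reproduced here; the sketch below only illustrates the key property the abstract describes, a user-specified absolute error tolerance, using naive uniform quantization rather than the actual compression scheme.

```python
def compress(values, tol):
    """Lossy 'compression' by uniform quantization: each reconstructed value
    is guaranteed to differ from the original by at most tol."""
    step = 2.0 * tol
    return [round(v / step) for v in values]

def decompress(quantized, tol):
    step = 2.0 * tol
    return [q * step for q in quantized]

strain = [1.0e-6 * i ** 0.5 for i in range(100)]  # stand-in for a strain field
tol = 1.0e-7
restored = decompress(compress(strain, tol), tol)
max_err = max(abs(a - b) for a, b in zip(strain, restored))
print(max_err <= tol)  # the error bound holds for every sample
```

    In the real workflow the integer codes would additionally be entropy-coded; the point here is only that an absolute tolerance bounds the pointwise reconstruction error, which is what makes the loss "acceptable" for kernel calculations.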

  18. Improving Station Performance by Building Isolation Walls in the Tunnel

    NASA Astrophysics Data System (ADS)

    Jia, Yan; Horn, Nikolaus; Leohardt, Roman

    2014-05-01

    Conrad Observatory is situated far from roads and industrial areas on the Trafelberg in Lower Austria. At the end of the seismic tunnel is the Observatory's main seismic instrument, with the station code CONA. This station is one of the most important seismic stations in the Austrian Seismic Network (network code OE). The seismic observatory consists of a 145-m-long gallery and an underground laboratory building with several working areas. About 25 meters away from the station CONA, six temporary seismic stations were installed for research purposes. Two of them were installed with the same equipment as CONA, while the remaining four stations were set up with lower-noise, higher-resolution digitizers (Q330HR) and sensors of the same type (STS-2). In order to prevent possible disturbances from air pressure and temperature fluctuations, three walls were built inside the tunnel. The first wall is located about 63 meters from the tunnel entrance, while a set of double walls spaced 1.5 meters apart is placed about 53 meters beyond the first isolation wall, between the station CONA and the six temporary stations. To assess the impact of the isolation walls on noise reduction and detection performance, investigations are conducted in two steps. The first study compares the noise level and detection performance of the station CONA behind the double walls with those of the stations in front of the double walls, to verify the noise isolation provided by the double walls. To evaluate the effect of the single wall, station noise level and detection performance were compared before and after the installation of the wall. Results and discussions will be presented. An additional experiment is conducted by filling insulation material inside the aluminium boxes of the sensors (above and around the sensors). This should help us determine an optimal insulation of the sensors with respect to pressure and temperature fluctuations.
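    The before/after comparison above can be sketched as a simple RMS level ratio between two recordings; a real station study would compare power spectral densities against the Peterson noise models, and the traces below are synthetic.

```python
import math
import random

def rms(samples):
    """Root-mean-square amplitude of a noise record."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def noise_reduction_db(before, after):
    """Noise-level change in dB between two recordings (negative = quieter)."""
    return 20.0 * math.log10(rms(after) / rms(before))

# Synthetic example: pretend the isolation wall halves the noise amplitude.
random.seed(0)
before = [random.gauss(0.0, 1.0) for _ in range(10_000)]
after = [0.5 * s for s in before]
print(round(noise_reduction_db(before, after), 2))  # about -6.02 dB
```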

  19. Reducing Disk Storage of Full-3D Seismic Waveform Tomography (F3DT) Through Lossy Online Compression

    DOE PAGES

    Lindstrom, Peter; Chen, Po; Lee, En-Jui

    2016-05-05

    Full-3D seismic waveform tomography (F3DT) is the latest seismic tomography technique that can assimilate broadband, multi-component seismic waveform observations into high-resolution 3D subsurface seismic structure models. The main drawback in the current F3DT implementation, in particular the scattering-integral implementation (F3DT-SI), is the high disk storage cost and the associated I/O overhead of archiving the 4D space-time wavefields of the receiver- or source-side strain tensors. The strain tensor fields are needed for computing the data sensitivity kernels, which are used for constructing the Jacobian matrix in the Gauss-Newton optimization algorithm. In this study, we have successfully integrated a lossy compression algorithm into our F3DT-SI workflow to significantly reduce the disk space for storing the strain tensor fields. The compressor supports a user-specified tolerance for bounding the error, and can be integrated into our finite-difference wave-propagation simulation code used for computing the strain fields. The decompressor can be integrated into the kernel calculation code that reads the strain fields from the disk and computes the data sensitivity kernels. During the wave-propagation simulations, we compress the strain fields before writing them to the disk. To compute the data sensitivity kernels, we read the compressed strain fields from the disk and decompress them before using them in kernel calculations. Experiments using a realistic dataset in our California statewide F3DT project have shown that we can reduce the strain-field disk storage by at least an order of magnitude with acceptable loss, and also improve the overall I/O performance of the entire F3DT-SI workflow significantly. The integration of the lossy online compressor may potentially open up the possibility of wide adoption of F3DT-SI in routine seismic tomography practice in the near future.

  20. A new discrete-element approach for the assessment of the seismic resistance of composite reinforced concrete-masonry buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calio, I.; Cannizzaro, F.; Marletta, M.

    2008-07-08

    In the present study a new discrete-element approach for the evaluation of the seismic resistance of composite reinforced concrete-masonry structures is presented. In the proposed model, unreinforced masonry panels are modelled by means of two-dimensional discrete-elements, conceived by the authors for modelling masonry structures, whereas the reinforced concrete elements are modelled by lumped plasticity elements interacting with the masonry panels through nonlinear interface elements. The proposed procedure was adopted for the assessment of the seismic response of a case study confined-masonry building which was conceived to be a typical representative of a wide class of residential buildings designed to the requirements of the 1909 issue of the Italian seismic code and widely adopted in the aftermath of the 1908 earthquake for the reconstruction of the cities of Messina and Reggio Calabria.

  1. Seismic anisotropy in deforming salt bodies

    NASA Astrophysics Data System (ADS)

    Prasse, P.; Wookey, J. M.; Kendall, J. M.; Dutko, M.

    2017-12-01

    Salt is often involved in forming hydrocarbon traps, so studying salt dynamics and deformation processes is important for the exploration industry. We have performed numerical texture simulations of single halite crystals deformed by simple shear and axial extension using the visco-plastic self-consistent (VPSC) approach. A methodology from subduction studies for estimating strain in a geodynamic simulation is applied to a complex high-resolution salt diapir model. The salt diapir deformation is modelled by our industrial partner Rockfield with the ELFEN software, which is based on a finite-element code. High-strain areas at the bottom of the head-like structures of the salt diapir show a high degree of seismic anisotropy due to lattice-preferred orientation (LPO) development of halite crystals. The results demonstrate that a significant degree of seismic anisotropy can be generated, validating the view that it should be accounted for in the treatment of seismic data in, for example, salt diapir settings.

  2. A new discrete-element approach for the assessment of the seismic resistance of composite reinforced concrete-masonry buildings

    NASA Astrophysics Data System (ADS)

    Caliò, I.; Cannizzaro, F.; D'Amore, E.; Marletta, M.; Pantò, B.

    2008-07-01

    In the present study a new discrete-element approach for the evaluation of the seismic resistance of composite reinforced concrete-masonry structures is presented. In the proposed model, unreinforced masonry panels are modelled by means of two-dimensional discrete-elements, conceived by the authors for modelling masonry structures, whereas the reinforced concrete elements are modelled by lumped plasticity elements interacting with the masonry panels through nonlinear interface elements. The proposed procedure was adopted for the assessment of the seismic response of a case study confined-masonry building which was conceived to be a typical representative of a wide class of residential buildings designed to the requirements of the 1909 issue of the Italian seismic code and widely adopted in the aftermath of the 1908 earthquake for the reconstruction of the cities of Messina and Reggio Calabria.

  3. Large Historical Earthquakes and Tsunami Hazards in the Western Mediterranean: Source Characteristics and Modelling

    NASA Astrophysics Data System (ADS)

    Harbi, Assia; Meghraoui, Mustapha; Belabbes, Samir; Maouche, Said

    2010-05-01

    The western Mediterranean region was the site of numerous large earthquakes in the past. Most of these earthquakes are located at the east-west trending Africa-Eurasia plate boundary and along the coastline of North Africa. The most recent recorded tsunamigenic earthquake occurred in 2003 at Zemmouri-Boumerdes (Mw 6.8) and generated a ~2-m-high tsunami wave. The destructive wave affected the Balearic Islands and Almeria in southern Spain and Carloforte in southern Sardinia (Italy). The earthquake provided a unique opportunity to gather instrumental records of seismic waves and tide gauges in the western Mediterranean. A database that includes a historical catalogue of main events, seismic sources, and related fault parameters was prepared in order to assess the tsunami hazard of this region. In addition to the analysis of the 2003 records, we study the 1790 Oran and 1856 Jijel historical tsunamigenic earthquakes (Io = IX and X, respectively), which provide detailed observations of the heights and extents of past tsunamis and of damage in coastal zones. We performed modelling of wave propagation using the NAMI-DANCE code and tested different fault sources against synthetic tide-gauge records. We observe that the characteristics of the seismic sources control the size and directivity of tsunami wave propagation on both the northern and southern coasts of the western Mediterranean.

  4. Non-Seismology Seismology: Using QuakeCatchers to Analyze the Frequency of Bridge Vibrations

    NASA Astrophysics Data System (ADS)

    Courtier, A. M.; Constantin, C.; Wilson, C. F.

    2013-12-01

    We conducted an experiment to test the feasibility of measuring seismic waves generated by traffic near James Madison University. We used QuakeCatcher seismometers (originally designed for passive seismic measurement) to measure vibrations associated with traffic on a wooden bridge as well as a nearby concrete bridge. This experiment was a signal processing exercise for a student research project and did not draw any conclusions regarding bridge safety or security. The experiment consisted of two temporary measurement stations, each comprising a laptop computer and a QuakeCatcher, a small seismometer that plugs directly into the laptop via a USB cable. The QuakeCatcher was taped to the ground at the edge of the bridge to achieve good coupling, and vibrational events were triggered repeatedly with a control vehicle to accumulate a consistent dataset of the bridge response. For the wooden bridge, the resulting 'seismograms' were converted to Seismic Analysis Code (SAC) format and analyzed in MATLAB. The concrete bridge did not generate vibrations significant enough to trigger the recording mechanism on the QuakeCatchers. We will present an overview of the experimental design and the frequency content of the traffic patterns, as well as a discussion of the instructional benefits of using the QuakeCatcher sensors in this non-traditional setting.
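    The triggered recording behaviour described above resembles a classic short-term-average / long-term-average (STA/LTA) detector; the implementation, window lengths, and threshold below are an illustrative sketch, not the QuakeCatcher's actual firmware logic.

```python
def sta_lta(samples, sta_len, lta_len):
    """Ratio of short-term to long-term average |amplitude| at the last sample."""
    sta = sum(abs(s) for s in samples[-sta_len:]) / sta_len
    lta = sum(abs(s) for s in samples[-lta_len:]) / lta_len
    return sta / lta if lta > 0 else 0.0

def triggers(samples, sta_len=5, lta_len=50, threshold=4.0):
    """Return sample indices where the STA/LTA ratio exceeds the threshold."""
    hits = []
    for i in range(lta_len, len(samples)):
        if sta_lta(samples[: i + 1], sta_len, lta_len) > threshold:
            hits.append(i)
    return hits

# Quiet background with a burst (a vehicle crossing) starting at sample 80.
trace = [0.01] * 200
for i in range(80, 90):
    trace[i] = 1.0
print(triggers(trace)[0])  # detection fires at the onset of the burst
```

    A concrete bridge that barely vibrates never pushes the short-term average above the threshold, which is consistent with the non-detections reported above.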

  5. Predicted Attenuation Relation and Observed Ground Motion of Gorkha Nepal Earthquake of 25 April 2015

    NASA Astrophysics Data System (ADS)

    Singh, R. P.; Ahmad, R.

    2015-12-01

    A comparison of the observed ground motion parameters of the recent Gorkha, Nepal earthquake of 25 April 2015 (Mw 7.8) with the parameters predicted using existing attenuation relations for the Himalayan region will be presented. The earthquake took about 8,000 lives and destroyed thousands of poorly constructed buildings, and it was felt by millions of people living in Nepal, China, India, Bangladesh, and Bhutan. Knowledge of ground motion parameters is very important in developing seismic codes for seismically active regions like the Himalaya and in designing better buildings. The ground motion parameters recorded during the recent earthquake and its aftershocks are compared with attenuation relations for the Himalayan region; the predicted parameters show good correlation with the observations. The results will be of great use to civil engineers in updating existing building codes in the Himalayan and surrounding regions and in evaluating seismic hazards. They clearly show that only attenuation relations developed for the Himalayan region should be used; relations based on other regions fail to provide good estimates of the observed ground motion parameters.
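    Attenuation relations (GMPEs) of the kind compared above generally take a functional form like ln(PGA) = c1 + c2*M - c3*ln(R + c4). The coefficients below are purely hypothetical placeholders, not any published Himalayan relation; the sketch only shows how such a relation is evaluated and compared against observations.

```python
import math

def predicted_pga(magnitude, distance_km, c=(-5.0, 1.0, 1.2, 15.0)):
    """Evaluate a generic attenuation form ln(PGA) = c1 + c2*M - c3*ln(R + c4).
    Coefficients are illustrative placeholders; PGA in g."""
    c1, c2, c3, c4 = c
    return math.exp(c1 + c2 * magnitude - c3 * math.log(distance_km + c4))

def log_residual(observed_pga, magnitude, distance_km):
    """Misfit in natural-log units between an observed and a predicted PGA;
    near-zero residuals indicate the relation fits the observations."""
    return math.log(observed_pga / predicted_pga(magnitude, distance_km))

# Attenuation with distance for an Mw 7.8 event (Gorkha-sized), illustrative only.
near, far = predicted_pga(7.8, 10.0), predicted_pga(7.8, 200.0)
print(near > far)  # predicted shaking decays with distance
```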

  6. Should ground-motion records be rotated to fault-normal/parallel or maximum direction for response history analysis of buildings?

    USGS Publications Warehouse

    Reyes, Juan C.; Kalkan, Erol

    2012-01-01

    In the United States, regulatory seismic codes (for example, the California Building Code) require at least two sets of horizontal ground-motion components for three-dimensional (3D) response history analysis (RHA) of building structures. For sites within 5 kilometers (3.1 miles) of an active fault, these records should be rotated to the fault-normal and fault-parallel (FN/FP) directions, and two RHAs should be performed separately, first with the FN and then with the FP direction aligned with the transverse axis of the building. This approach is assumed to lead to two sets of responses that envelope the range of possible responses over all nonredundant rotation angles. The validity of this assumption is examined here using 3D computer models of single-story structures having symmetric (torsionally stiff) and asymmetric (torsionally flexible) layouts subjected to an ensemble of near-fault ground motions with and without apparent velocity pulses. In this parametric study, the elastic vibration period is varied from 0.2 to 5 seconds, and the yield-strength reduction factor, R, is varied from a value that leads to linear-elastic design to values of 3 and 5. Further validations are performed using 3D computer models of 9-story structures having symmetric and asymmetric layouts subjected to the same ground-motion set. The influence of the ground-motion rotation angle on several engineering demand parameters (EDPs) is examined in both the linear-elastic and nonlinear-inelastic domains to form benchmarks for evaluating the use of the FN/FP directions and also the maximum direction (MD). The MD ground motion is a new definition of horizontal ground motion for use in site-specific ground-motion procedures for seismic design according to the provisions of the American Society of Civil Engineers/Structural Engineering Institute (ASCE/SEI) 7-10. 
    The results of this study have important implications for current practice, suggesting that ground motions rotated to the MD or FN/FP directions do not necessarily produce the most critical EDPs in the nonlinear-inelastic domain; they do, however, tend to produce larger EDPs than as-recorded (arbitrarily oriented) motions.
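    Rotating a recorded pair of horizontal components into the FN/FP (or any trial) orientation is a plane rotation of the two traces. The sketch below assumes the components share a common time base and that azimuths are measured clockwise from north; the pulse is synthetic.

```python
import math

def rotate_components(north, east, angle_deg):
    """Rotate horizontal components N/E into a new orthogonal pair (r1, r2),
    with r1 pointing angle_deg clockwise from north."""
    a = math.radians(angle_deg)
    r1 = [n * math.cos(a) + e * math.sin(a) for n, e in zip(north, east)]
    r2 = [-n * math.sin(a) + e * math.cos(a) for n, e in zip(north, east)]
    return r1, r2

# A pulse polarized at 30 degrees from north: rotating by 30 degrees
# concentrates it entirely on r1 (the fault-normal axis in an FN/FP rotation).
pulse = (0.0, 1.0, 0.0)
north = [math.cos(math.radians(30)) * s for s in pulse]
east = [math.sin(math.radians(30)) * s for s in pulse]
r1, r2 = rotate_components(north, east, 30.0)
print(round(r1[1], 6), round(r2[1], 6))  # prints 1.0 0.0
```

    Sweeping angle_deg over all nonredundant angles and recording the peak EDP at each one is exactly the benchmark exercise the study above performs.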

  7. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

    The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing, with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, reusable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts, and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1-Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10-Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1-Hz deterministic simulation results with 10-Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new PSHA hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. 
    Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4,096 cores) and simulations that ran on more than 10,000 cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher from the 10-Hz ShakeOut 1.2 scenario simulation data were used by the USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high resolution PSHA hazard maps containing more than 1.6 million hazard curves.

  8. Determination of Paleoseismic Ground Motions from Inversion of Block Failures in Masonry Structures

    NASA Astrophysics Data System (ADS)

    Yagoda-Biran, G.; Hatzor, Y. H.

    2010-12-01

    Accurate estimation of ground motion parameters such as expected peak ground acceleration (PGA), predominant frequency, and duration of motion in seismically active regions is crucial for hazard preparedness and sound engineering design. The best way to estimate these parameters quantitatively would be to investigate long-term recorded data from past strong earthquakes in the studied region. In some regions of the world, however, recorded data are scarce due to the lack of seismic network infrastructure, and in all regions the availability of recorded data is restricted to the late 19th century onward. Therefore, existing instrumental data are hardly representative of the true seismicity of a region. When recorded data are scarce or unavailable, alternative methods may be applied, for example adopting a quantitative paleoseismic approach. In this research we suggest the use of seismically damaged masonry structures as paleoseismic indicators. Visitors to archeological sites all over the world are often struck by structural failure features that seem to be "seismically driven", particularly when inspecting old masonry structures. While it is widely accepted that no other loading mechanism can explain the preserved damage, the actual driving mechanism remains enigmatic even now. In this research we wish to explore how such failures may be triggered by earthquake-induced ground motions and to use observed block displacements to determine the characteristic parameters of the paleoseismic earthquake motion, namely duration, frequency, and amplitude. This is performed utilizing a 3D, fully dynamic numerical analysis with the Discontinuous Deformation Analysis (DDA) method. Several case studies are selected for 3D numerical analysis. First we study a simple structure in the old city of L'Aquila, Italy. L'Aquila was hit by an earthquake on April 6th, 2009, with over 300 casualties and many of its medieval buildings damaged. 
    This case study is an excellent opportunity to validate our method, since in the case of L'Aquila both the damaged structure and the ground motions were recorded. The 3D modeling of the structure is rather complicated, and is performed by first modeling the structure with CAD software and then "translating" the model into the numerical code used. In the future, several more case studies will be analyzed, such as Kedesh and Avdat in Israel and, in collaboration with Hough and Bilham, the Temple of Shiva at Pandrethan, Kashmir. Establishing numerical 3D dynamic back analysis of stone displacements in masonry structures as a paleoseismic tool can provide much-needed data on ground motion parameters in regions where instrumental data are scarce or completely absent.

  9. Microzonation of Seismic Hazard Potential in Taipei, Taiwan

    NASA Astrophysics Data System (ADS)

    Liu, K. S.; Lin, Y. P.

    2017-12-01

    The island of Taiwan lies at the boundary between the Philippine Sea plate and the Eurasian plate. Accordingly, the majority of seismic energy released near Taiwan originates from the two subduction zones. It is therefore not surprising that Taiwan has repeatedly been struck by large earthquakes, such as the 1986 Hualien, 1999 Chi-Chi, and 2002 Hualien earthquakes. Microzonation of seismic hazard potential became necessary in Taipei City after the Central Geological Survey designated the Sanchiao active fault as Category II. In this study, a catalog of more than 2000 shallow earthquakes that occurred from 1900 to 2015 with Mw magnitudes ranging from 5.0 to 8.2, 11 disastrous earthquakes that occurred from 1683 to 1899, and the nearby Sanchiao active fault are used to estimate the seismic hazard potential in Taipei City for seismic microzonation. Furthermore, the probabilities of seismic intensity exceeding CWB intensity 5, 6, and 7 and MMI VI, VII, and VIII in 10-, 30-, and 50-year periods in the above areas are also analyzed for the seismic microzonation. Finally, by comparison with the seismic zoning map of Taiwan in the current building code, which was revised after the 921 earthquake, the results of this study show which areas in Taipei City have higher earthquake hazard potential. They provide a valuable database for the seismic design of critical facilities, will help mitigate earthquake disaster losses in Taipei City in the future, and will provide critical information for emergency response plans.
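    Exceedance probabilities over 10-, 30-, and 50-year windows, as quoted above, typically follow from a Poisson occurrence model, P = 1 - exp(-λt). The annual exceedance rate below is an illustrative placeholder, not a value from the study.

```python
import math

def exceedance_probability(annual_rate, years):
    """Poisson probability that shaking exceeds a given intensity at least
    once in `years`, given the annual rate of exceedance."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative: an intensity level exceeded on average once every 475 years
# gives the familiar ~10% probability of exceedance in 50 years.
rate = 1.0 / 475.0
for t in (10, 30, 50):
    print(t, round(exceedance_probability(rate, t), 3))
```

    Mapping this probability cell by cell over the city, for each intensity threshold, is what produces the microzonation maps described above.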

  10. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  11. Array seismological investigation of the South Atlantic 'Superplume'

    NASA Astrophysics Data System (ADS)

    Hempel, Stefanie; Gassmöller, Rene; Thomas, Christine

    2015-04-01

    We apply AxiSEM, an axisymmetric spherical-Earth spectral-element code, to model seismic compressional waves that sample complex 'superplume' structures in the lower mantle. High-resolution array seismological stacking techniques are evaluated with regard to their capability to resolve large-scale high-density, low-velocity bodies, including interior structure such as internal upwellings, high-density lenses, ultra-low velocity zones (ULVZs), neighboring remnant slabs, and adjacent small-scale uprisings. Synthetic seismograms are also computed and processed for Earth models resulting from geodynamic modelling of the South Atlantic mantle, including plate reconstruction. We discuss the interference and suppression of the resulting seismic signals and the implications for a seismic data study in terms of the visibility of the South Atlantic 'superplume' structure. This knowledge is used to process, invert, and interpret our data set of seismic sources from the Andes and the South Sandwich Islands recorded at seismic arrays spanning from Ethiopia over Cameroon to South Africa, mapping the South Atlantic 'superplume' structure including its interior. In order to present the model of the South Atlantic 'superplume' structure that best fits the seismic data set, we iteratively compute synthetic seismograms while adjusting the model according to the dependencies found in the parameter study.
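    Array stacking techniques like those evaluated above are, at their core, delay-and-sum beams: traces are time-shifted according to a trial slowness and summed so that coherent arrivals add constructively. The sketch below uses integer sample shifts and synthetic traces rather than any particular published stacking method.

```python
def delay_and_sum(traces, shifts):
    """Stack traces after shifting each by its (integer) sample delay.
    Arrivals aligned by the shifts add constructively; noise averages down."""
    n = len(traces[0])
    stack = [0.0] * n
    for trace, shift in zip(traces, shifts):
        for i in range(n):
            j = i + shift
            if 0 <= j < n:
                stack[i] += trace[j]
    return [s / len(traces) for s in stack]

# Three stations see the same pulse arriving 0, 2, and 4 samples late (moveout).
base = [0.0] * 20
base[5] = 1.0
traces = [base[:], [0.0] * 2 + base[:18], [0.0] * 4 + base[:16]]
aligned = delay_and_sum(traces, shifts=[0, 2, 4])
print(max(range(20), key=lambda i: aligned[i]))  # stacked pulse back at sample 5
```

    Scanning over candidate shift patterns (slownesses) and keeping the best-aligned stack is what lets an array separate weak 'superplume'-related arrivals from interfering phases.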

  12. Ground-motion modeling of the 1906 San Francisco earthquake, part I: Validation using the 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Aagaard, Brad T.; Brocher, T.M.; Dolenc, D.; Dreger, D.; Graves, R.W.; Harmsen, S.; Hartzell, S.; Larsen, S.; Zoback, M.L.

    2008-01-01

    We compute ground motions for the Beroza (1991) and Wald et al. (1991) source models of the 1989 magnitude 6.9 Loma Prieta earthquake using four different wave-propagation codes and recently developed 3D geologic and seismic velocity models. In preparation for modeling the 1906 San Francisco earthquake, we use this well-recorded earthquake to characterize how well our ground-motion simulations reproduce the observed shaking intensities and the amplitudes and durations of recorded motions throughout the San Francisco Bay Area. All of the simulations generate ground motions consistent with the large-scale spatial variations in shaking associated with rupture directivity and the geologic structure. We attribute the small variations among the synthetics to the minimum shear-wave speed permitted in the simulations and how they accommodate topography. Our long-period simulations, on average, underpredict shaking intensities by about one-half modified Mercalli intensity (MMI) unit (25%-35% in peak velocity), while our broadband simulations, on average, underpredict the shaking intensities by one-fourth MMI unit (16% in peak velocity). Discrepancies with observations arise from errors in the source models and geologic structure. The consistency of the synthetic waveforms across the wave-propagation codes for a given source model suggests that the uncertainty in the source parameters tends to exceed the uncertainty in the seismic velocity structure. In agreement with earlier studies, we find that a source model with slip more evenly distributed northwest and southeast of the hypocenter would be preferable to both the Beroza and Wald source models. Although the new 3D seismic velocity model improves upon previous velocity models, we identify two areas needing improvement. Nevertheless, we find that the seismic velocity model and the wave-propagation codes are suitable for modeling the 1906 earthquake and scenario events in the San Francisco Bay Area.

  13. Evaluation of building fundamental periods and effects of local geology on ground motion parameters in the Siracusa area, Italy

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; D'Amico, Sebastiano; Lombardo, Giuseppe; Longo, Emanuela

    2016-07-01

    The Siracusa area, located on the southeastern coast of Sicily (Italy), is mainly characterized by an outcropping limestone formation. This lithotype, which is overlain by soft sediments such as sandy clays and detritus, can be considered the local bedrock. Records of ambient noise, processed through spectral ratio techniques, were used to assess the dynamic properties of a sample survey of both reinforced concrete and masonry buildings. The results show that the experimental periods of existing buildings are consistently lower than those proposed by the European seismic code. This disagreement could be related to the role played by stiff masonry infills, as well as to the influence of adjacent buildings, especially in downtown Siracusa. Numerical modeling was also used to study the effect of local geology on the seismic site response of the Siracusa area. Seismic urban scenarios were simulated for a moderate-magnitude earthquake (December 13th, 1990) to assess the shaking level of the different outcropping formations. Spectral accelerations at different periods, peak ground acceleration, and peak ground velocity were obtained through a stochastic approach adopting an extended-source model code. The seismic ground motion scenario highlighted that amplification mainly occurs in the sedimentary deposits that are widespread to the south of the study area, as well as in some spot areas where coarse detritus and sandy clay outcrop. On the other hand, the level of shaking appears moderate in all zones with outcropping limestone and volcanics.
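    The code periods that the measured values fall short of come from empirical height formulas of the Eurocode 8 type, T1 = Ct * H^(3/4). The Ct value below is the EC8 default for RC moment frames; the building height and measured period are made-up examples, not data from the study.

```python
def code_period(height_m, ct=0.075):
    """Eurocode 8 empirical fundamental period, T1 = Ct * H**0.75 (H in m, T in s).
    Ct = 0.075 for RC moment frames, 0.050 for 'other' structures."""
    return ct * height_m ** 0.75

# Hypothetical 21-m (roughly 7-storey) RC building vs. a made-up measured period.
t_code = code_period(21.0)
t_measured = 0.45  # illustrative ambient-noise estimate, shorter than the code value
print(round(t_code, 2), t_measured < t_code)
```

    A measured period below the code value, as found above, means the real building is stiffer than the code formula assumes, here attributed to masonry infills and interaction with adjacent buildings.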

  14. Filtering, Coding, and Compression with Malvar Wavelets

    DTIC Science & Technology

    1993-12-01

    speech coding techniques being investigated by the military (38). Imagery: Space imagery often requires adaptive restoration to deblur out-of-focus...and blurred image, find an estimate of the ideal image using a priori information about the blur, noise, and the ideal image" (12). The research for...recording can be described as the original signal convolved with impulses, which appear as echoes in the seismic event. The term deconvolution indicates

  15. Fast in-memory elastic full-waveform inversion using consumer-grade GPUs

    NASA Astrophysics Data System (ADS)

    Sivertsen Bergslid, Tore; Birger Raknes, Espen; Arntsen, Børge

    2017-04-01

    Full-waveform inversion (FWI) is a technique to estimate subsurface properties by using the recorded waveform produced by a seismic source and applying inverse theory. This is done through an iterative optimization procedure, where each iteration requires solving the wave equation many times and then minimizing the difference between the modeled and the measured seismic data. Having to model many seismic sources per iteration makes this a highly computationally demanding procedure, which usually involves writing a lot of data to disk. We have written code that does forward modeling and inversion entirely in memory. A typical HPC cluster has many more CPUs than GPUs. Since FWI involves modeling many seismic sources per iteration, the obvious approach is to parallelize the code on a source-by-source basis, where each core of the CPU performs one modeling and all modelings run simultaneously. With this approach, the GPU is at a major disadvantage in pure numbers. Fortunately, GPUs can more than make up for this hardware disadvantage by performing each modeling much faster than a CPU. Another benefit of parallelizing each individual modeling is that it lets each modeling use much more RAM. If one node has 128 GB of RAM and 20 CPU cores, each modeling can use only 6.4 GB of RAM when running the node at full capacity with source-by-source parallelization on the CPU. A per-source parallelized code using GPUs can use 64 GB of RAM per modeling. Whenever a modeling uses more RAM than is available and has to fall back on disk, the runtime increases dramatically due to slow file I/O. The extremely high computational speed of the GPUs, combined with the large amount of RAM available for each modeling, lets us do high-frequency FWI for fairly large models very quickly. For a single modeling, our GPU code outperforms the single-threaded CPU code by a factor of about 75.
Successful inversions have been run on data with frequencies up to 40 Hz for a model of 2001 by 600 grid points with 5 m grid spacing and 5000 time steps, in less than 2.5 minutes per source. In practice, using 15 nodes (30 GPUs) to model 101 sources, each iteration took approximately 9 minutes. For reference, the same inversion run with our CPU code took about two hours per iteration. This was done using only a very simple wavefield interpolation technique, saving every second time step. Using a more sophisticated checkpointing or wavefield reconstruction method would allow us to increase the model size significantly. Our results show that ordinary gaming GPUs are a viable alternative to the expensive professional GPUs often used today when performing large-scale modeling and inversion in geophysics.
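
The resource arithmetic in this record can be made explicit. A quick sketch using the node configuration stated above (two GPUs per node is inferred from "15 nodes (30 GPUs)"):

```python
# Per-modeling RAM under the two parallelization strategies described above.
ram_gb = 128
cpu_cores = 20
gpus_per_node = 2            # inferred from "15 nodes (30 GPUs)"

ram_per_modeling_cpu = ram_gb / cpu_cores       # source-parallel CPU code
ram_per_modeling_gpu = ram_gb / gpus_per_node   # per-source GPU code

# Reported wall-clock times per iteration: ~9 min (GPU) vs ~2 h (CPU),
# i.e. roughly a 13x end-to-end speedup at the iteration level.
iteration_speedup = 120.0 / 9.0
```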

  16. Seismic facies analysis based on self-organizing map and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian

    2015-01-01

    Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and of their time window has an obvious effect on the validity of the classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform itself serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis can be improved and can better support interpretation. Its strong tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
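
As an illustration of the classification step, here is a minimal 1D-grid SOM in NumPy applied to two synthetic waveform classes. The EMD de-noising stage is omitted, and all training parameters are illustrative assumptions rather than the authors' settings:

```python
import numpy as np

def train_som_1d(data, n_nodes=4, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 1D-grid SOM for waveform classification (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Initialize node weights from random input waveforms.
    weights = data[rng.choice(len(data), n_nodes, replace=False)].astype(float)
    grid = np.arange(n_nodes)
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)                 # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 1e-3    # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

def classify(data, weights):
    return np.array([np.argmin(np.linalg.norm(weights - x, axis=1)) for x in data])

# Two synthetic waveform "facies": windowed traces with different dominant frequency.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
class_a = np.sin(2 * np.pi * 3 * t) + 0.1 * rng.normal(size=(20, 50))
class_b = np.sin(2 * np.pi * 7 * t) + 0.1 * rng.normal(size=(20, 50))
data = np.vstack([class_a, class_b])

w = train_som_1d(data, n_nodes=2)
labels = classify(data, w)
```

With two well-separated synthetic facies, the two SOM nodes converge to the two waveform shapes and the labels split the gather accordingly.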

  17. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different, and seismic risk is the more important of the two in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is primarily a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.

  18. Using 3D Simulation of Elastic Wave Propagation in Laplace Domain for Electromagnetic-Seismic Inverse Modeling

    NASA Astrophysics Data System (ADS)

    Petrov, P.; Newman, G. A.

    2010-12-01

    Quantitative imaging of subsurface objects is an essential part of modern geophysical technology, important in oil and gas exploration and in wide-ranging engineering applications. A significant advancement in developing a robust, high-resolution imaging technology involves using the different geophysical measurements (gravity, EM and seismic) that sense the subsurface structure. A joint image of the subsurface geophysical attributes (velocity, electrical conductivity and density) requires consistent treatment of the different geophysical data (electromagnetic and seismic) owing to their differing physical nature: diffusive, attenuated propagation of electromagnetic energy versus nonlinear, multiply scattered propagation of seismic energy. Recent progress has been reported on this problem by reducing the complexity of the seismic wave field. Work by Shin and Cha (2008, 2009) suggests that low-pass filtering the seismic trace via the Laplace-Fourier transformation can be an effective approach for obtaining seismic data with spatial resolution similar to that of EM data. The Laplace-Fourier transformation changes the modeling of the low-pass filtered seismic wave field from multi-wave propagation to diffusion. The key benefit of the transformation is that diffusive wave-field inversion works well for both seismic (Shin and Cha, 2008) and electromagnetic (Commer and Newman, 2008; Newman et al., 2010) data sets. Moreover, the different data sets can also be matched for similar and consistent resolution. Finally, the low-pass seismic image is also an excellent choice for a starting model when analyzing the entire seismic waveform to recover the high-spatial-frequency components of the seismic image, i.e., its reflectivity (Shin and Cha, 2009). Without a good starting model, full-waveform seismic imaging and migration can encounter serious difficulties.
To produce seismic wave fields consistent for joint imaging in the Laplace-Fourier domain, we have developed a 3D code for full-wave-field simulation in elastic media that takes into account the nonlinearity introduced by free-surface effects. Our approach is based on the velocity-stress formulation. In contrast to the conventional formulation, we define the material properties, such as density and the Lame constants, not at nodal points but within cells. This second-order finite-difference method, formulated on a cell-based grid, generates numerical solutions compatible with analytical ones within the error range determined by dispersion analysis. Our simulator will be embedded in an inversion scheme for joint seismic-electromagnetic imaging. It also offers possibilities for preconditioning seismic wave propagation problems in the frequency domain. References: Shin, C. & Cha, Y. (2009), Waveform inversion in the Laplace-Fourier domain, Geophys. J. Int. 177(3), 1067-1079. Shin, C. & Cha, Y. H. (2008), Waveform inversion in the Laplace domain, Geophys. J. Int. 173(3), 922-931. Commer, M. & Newman, G. (2008), New advances in three-dimensional controlled-source electromagnetic inversion, Geophys. J. Int. 172(2), 513-535. Newman, G. A., Commer, M. & Carazzone, J. J. (2010), Imaging CSEM data in the presence of electrical anisotropy, Geophysics, in press.
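
The Laplace transformation of a recorded trace, central to the Shin & Cha approach cited above, amounts to damping the trace with exp(-st) and integrating. A minimal numerical sketch (the simple Riemann-sum discretization is an assumption for illustration):

```python
import numpy as np

def laplace_transform_trace(trace, dt, s):
    """Discrete approximation of U(s) = integral of u(t)*exp(-s*t) dt.

    Real s gives the Laplace domain of Shin & Cha (2008); complex
    s = s_r + i*omega gives the Laplace-Fourier domain.
    """
    t = np.arange(len(trace)) * dt
    return np.sum(trace * np.exp(-s * t)) * dt

# Check against the analytic transform of u(t) = exp(-a*t): U(s) = 1/(s + a).
a, s, dt = 2.0, 3.0, 1e-4
t = np.arange(0, 20.0, dt)
u = np.exp(-a * t)
numeric = laplace_transform_trace(u, dt, s)
analytic = 1.0 / (s + a)
```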

  19. Dominant seismic sources for the cities in South Sumatra

    NASA Astrophysics Data System (ADS)

    Sunardi, Bambang; Sakya, Andi Eka; Masturyono, Murjaya, Jaya; Rohadi, Supriyanto; Sulastri, Putra, Ade Surya

    2017-07-01

    The subduction zone west of Sumatra and the Sumatran fault zone are active seismic sources. Seismotectonically, South Sumatra can be affected by earthquakes triggered by these seismic sources. This paper discusses the contribution of each seismic source to the earthquake hazard for the cities of Palembang, Prabumulih, Banyuasin, Ogan Ilir, Ogan Komering Ilir, South Oku, Musi Rawas and Empat Lawang. These hazards are presented in the form of seismic hazard curves. The study was conducted using Probabilistic Seismic Hazard Analysis (PSHA) for a 2% probability of exceedance in 50 years. The seismic sources used in the analysis included the megathrust zone M2 of Sumatra and South Sumatra, background seismic sources, and shallow crustal seismic sources consisting of the Ketaun, Musi, Manna and Kumering faults. The results show that for cities relatively far from the seismic sources, the subduction/megathrust source at depths ≤ 50 km contributes most to the seismic hazard, while in the other areas deep background sources at depths greater than 100 km dominate.
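
The 2%-in-50-years hazard level used above maps to a return period under the usual Poisson (stationary occurrence) assumption; a quick check:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by exceedance probability p in t years,
    assuming Poisson occurrence: p = 1 - exp(-t / T)."""
    return -t_years / math.log(1.0 - p_exceed)

# The 2%-in-50-years level corresponds to the familiar ~2475-year return period.
T = return_period(0.02, 50.0)
```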

  20. Active Fault Near-Source Zones Within and Bordering the State of California for the 1997 Uniform Building Code

    USGS Publications Warehouse

    Petersen, M.D.; Toppozada, Tousson R.; Cao, T.; Cramer, C.H.; Reichle, M.S.; Bryant, W.A.

    2000-01-01

    The fault sources in the Project 97 probabilistic seismic hazard maps for the state of California were used to construct maps for defining near-source seismic coefficients, Na and Nv, incorporated in the 1997 Uniform Building Code (ICBO 1997). The near-source factors are based on the distance from a known active fault that is classified as either Type A or Type B. To determine the near-source factor, four pieces of geologic information are required: (1) recognizing a fault and determining whether or not the fault has been active during the Holocene, (2) identifying the location of the fault at or beneath the ground surface, (3) estimating the slip rate of the fault, and (4) estimating the maximum earthquake magnitude for each fault segment. This paper describes the information used to produce the fault classifications and distances.

  1. Fast principal component analysis for stacking seismic data

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Bai, Min

    2018-04-01

    Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can help mitigate seismic noise and enhance the principal components to a great extent. Traditional average-based seismic stacking methods cannot obtain optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data that is insensitive to the noise level. Considering the computational bottleneck of the classic PCA algorithm in processing massive seismic data, we propose an efficient PCA algorithm that makes the method readily applicable in industrial applications. Two numerically designed examples and one real seismic data set are used to demonstrate the performance of the presented method.
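
A minimal sketch of the idea behind PCA stacking, using a plain SVD on a synthetic gather (this illustrates the principle only, not the authors' fast algorithm):

```python
import numpy as np

def pca_stack(gather):
    """Stack an NMO-corrected gather (n_traces x n_samples) using the
    first principal component of its SVD. The rank-1 term captures the
    laterally coherent signal; averaging the trace weights restores
    amplitude units comparable to a conventional mean stack."""
    u, s, vt = np.linalg.svd(gather, full_matrices=False)
    return s[0] * np.mean(u[:, 0]) * vt[0]

# Synthetic gather: one Ricker-like wavelet repeated on 50 noisy traces.
rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 200)
a = 2 * np.pi * t
wavelet = (1 - 2 * a**2) * np.exp(-(a**2))
gather = wavelet + 0.5 * rng.normal(size=(50, 200))

pca = pca_stack(gather)
corr_pca = float(np.corrcoef(pca, wavelet)[0, 1])
corr_mean = float(np.corrcoef(gather.mean(axis=0), wavelet)[0, 1])
```

Both stacks recover the wavelet well here; the PCA stack's advantage grows as the noise becomes stronger or non-Gaussian, which is the regime the paper targets.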

  2. Naval Research Laboratory Arctic Initiatives

    DTIC Science & Technology

    2011-06-01

    Campaign Code 7420 Arctic Modeling Code 7320/7500/7600 In-situ NRL, CRREL NRL boreholes Strategy Remote Sensing Synergism −Collect in-situ...Navy and Marine Corps Corporate Laboratory An array of BMFCs being prepared for deployment. Each BMFC consists of a weighted anode laid flat onto...Gas CH4 E C D CO2 BGHS Free Methane Gas Hydrates HCO3- HCO3- Seismic and geochemical data to predict deep sediment hydrates Estimate spatial

  3. Non-Linear Seismic Velocity Estimation from Multiple Waveform Functionals and Formal Assessment of Constraints

    DTIC Science & Technology

    2011-09-01

    tectonically active regions such as the Middle East. For example, we previously applied the code to determine the crust and upper mantle structure...Objective Optimization (MOO) for Multiple Datasets The primary goal of our current project is to develop a tool for estimating crustal structure that...be used to obtain crustal velocity structures by modeling broadband waveform, receiver function, and surface wave dispersion data. The code has been

  4. Analysis and Simulation of Far-Field Seismic Data from the Source Physics Experiment

    DTIC Science & Technology

    2012-09-01

    ANALYSIS AND SIMULATION OF FAR-FIELD SEISMIC DATA FROM THE SOURCE PHYSICS EXPERIMENT Arben Pitarka, Robert J. Mellors, Arthur J. Rodgers, Sean...Security Site (NNSS) provides new data for investigating the excitation and propagation of seismic waves generated by buried explosions. A particular... seismic model. The 3D seismic model includes surface topography. It is based on regional geological data, with material properties constrained by shallow

  5. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, the various methodologies and levels of quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by the removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is a proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on these databases and the selected GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near the faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater.
Infrastructural damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.
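
The GMPE-evaluation step described above boils down to computing log-residuals between observed and predicted motions. A sketch with a hypothetical GMPE of the common functional form (coefficients and "observations" are synthetic assumptions, not values from this study):

```python
import numpy as np

# Hypothetical GMPE of the common functional form
#   ln(PGA) = c0 + c1*M - c2*ln(R + c3)
# (coefficients are illustrative assumptions, not a published model).
def gmpe_ln_pga(mag, r_km, c=(-3.5, 1.2, 1.6, 10.0)):
    c0, c1, c2, c3 = c
    return c0 + c1 * mag - c2 * np.log(r_km + c3)

def model_bias(obs_ln_pga, mag, r_km):
    """Mean and std of the log-residuals (observed minus predicted);
    a mean near zero indicates the GMPE matches regional attenuation."""
    resid = obs_ln_pga - gmpe_ln_pga(mag, r_km)
    return float(np.mean(resid)), float(np.std(resid))

# Synthetic "observations" drawn from the same model with 0.6 ln-unit scatter:
rng = np.random.default_rng(0)
mags = rng.uniform(5.0, 7.5, 300)
dists = rng.uniform(10.0, 300.0, 300)
obs = gmpe_ln_pga(mags, dists) + rng.normal(0.0, 0.6, 300)

bias, sigma = model_bias(obs, mags, dists)
```

In practice one would compute such residuals against each candidate published GMPE and retain the model with the smallest bias and scatter for the region.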

  6. Coupling Hydrodynamic and Wave Propagation Codes for Modeling of Seismic Waves recorded at the SPE Test.

    NASA Astrophysics Data System (ADS)

    Larmat, C. S.; Rougier, E.; Delorey, A.; Steedman, D. W.; Bradley, C. R.

    2016-12-01

    The goal of the Source Physics Experiment (SPE) is to bring empirical and theoretical advances to the problem of detection and identification of underground nuclear explosions. To this end, the SPE program includes a strong modeling effort based on first-principles calculations, with the challenge of capturing both the source and near-source processes and those taking place later in time as seismic waves propagate within complex 3D geologic environments. In this paper, we report on results of modeling that uses hydrodynamic simulation codes (Abaqus and CASH) coupled with a 3D full-waveform propagation code, SPECFEM3D. For modeling the near-source region, we employ a fully coupled Eulerian-Lagrangian (CEL) modeling capability with a new continuum-based visco-plastic fracture model for the simulation of damage processes, called AZ_Frac. These capabilities produce high-fidelity models of various factors believed to be key in the generation of seismic waves: the explosion dynamics, a weak grout-filled borehole, the surrounding jointed rock, and the damage creation and deformations occurring around the source and the free surface. SPECFEM3D, based on the Spectral Element Method (SEM), is a direct numerical method for full-wave modeling with high numerical accuracy. The coupling interface consists of a series of grid points of the SEM mesh situated inside the hydrodynamic code's domain. Displacement time series at these points are computed using output data from CASH or Abaqus (by interpolation if needed) and fed into the time-marching scheme of SPECFEM3D. We present validation tests with Sharpe's model and comparisons of modeled waveforms with Rg waves (2-8 Hz) recorded up to 2 km from the source for SPE. We especially show the effects of the local topography, velocity structure, and spallation. Our models predict smaller amplitudes of Rg waves for the first five SPE shots than purely elastic models such as Denny & Johnson (1991).
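
The coupling step described above, resampling hydrodynamic-code output onto the SEM solver's time-marching grid at each interface point, can be sketched as follows; `np.interp` stands in for whatever interpolation the production workflow actually uses (an assumption here):

```python
import numpy as np

def resample_to_sem(t_hydro, u_hydro, dt_sem, n_steps):
    """Resample a displacement time series from the hydrodynamic code's
    time axis onto the SEM solver's (usually finer) time steps."""
    t_sem = np.arange(n_steps) * dt_sem
    return t_sem, np.interp(t_sem, t_hydro, u_hydro)

# Hydrodynamic output at 1 ms sampling, SEM marching at 0.25 ms:
t_h = np.arange(0, 0.101, 1e-3)
u_h = np.sin(2 * np.pi * 30.0 * t_h)   # 30 Hz test motion at one interface point
t_s, u_s = resample_to_sem(t_h, u_h, 2.5e-4, 400)
```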

  7. 78 FR 13911 - Proposed Revision to Design of Structures, Components, Equipment and Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    ... Analysis Reports for Nuclear Power Plants: LWR Edition,'' Section 3.7.1, ``Seismic Design Parameters,'' Section 3.7.2, ``Seismic System Analysis,'' Section 3.7.3, ``Seismic Subsystem Analysis,'' Section 3.8.1... and analysis issues, (2) updates to review interfaces to improve the efficiency and consistency of...

  8. Viking-2 Seismometer Measurements on Mars: PDS Data Archive and Meteorological Applications

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.; Nakamura, Yosio; Murphy, James R.

    2017-11-01

    A data product has been generated and archived on the NASA Planetary Data System (Geosciences Node), which presents the seismometer readings of Viking Lander 2 in an easy-to-access form, for both the raw ("high rate") waveform records and the compressed ("event mode") amplitude and frequency records. In addition to the records themselves, a separate summary file for each instrument mode lists key statistics of each record together with the meteorological measurements made closest in time to the seismic record. This juxtaposition facilitates correlation of the seismometer instrument response with different meteorological conditions, and the selection of seismic data during which wind disturbances can be expected to be small. We summarize data quality issues and also discuss lander-generated seismic signals, due to operation of the sampling arm or other systems, which may be of interest for prospective missions to other bodies. We review wind-seismic correlation and the "Martian solar day (sol) 80" candidate seismic event, and identify the seismic signature of a probable dust devil vortex on sol 482: the seismometer data allow an estimate of the peak wind, occurring between coarsely spaced meteorology measurements. We present code to generate the plots in this paper to illustrate use of the data product.

  9. Seismic risk assessment of architectural heritages in Gyeongju considering local site effects

    NASA Astrophysics Data System (ADS)

    Park, H.-J.; Kim, D.-S.; Kim, D.-M.

    2013-02-01

    A seismic risk assessment is conducted for cultural heritage sites in Gyeongju, the capital of Korea's ancient Silla Kingdom. Gyeongju, home to UNESCO World Heritage sites, contains remarkable artifacts of Korean Buddhist art. An extensive geotechnical survey including a series of in situ tests is presented, providing pertinent soil profiles for site response analyses at thirty cultural heritage sites. After the shear wave velocity profiles and dynamic material properties were obtained, site response analyses were carried out at each historical site, and the amplification characteristics, site period, and response spectrum of each site were determined for earthquake levels with 2400 yr and 1000 yr return periods based on the Korean seismic hazard map. The response spectra and corresponding site coefficients obtained from site response analyses considering geologic conditions differ significantly from the current Korean seismic code. This study confirms the importance of site-specific ground response analyses that consider local geological conditions. Results are given in the form of the spatial distribution of bedrock depth, site period, and site amplification coefficients, which are particularly valuable in the context of a seismic vulnerability study. This study presents potential amplification hazard maps and provides primary data for the seismic risk assessment of each cultural heritage site.

  10. Expected damages of retrofitted bridges with RC jacketing

    NASA Astrophysics Data System (ADS)

    Montes, O.; Jara, J. M.; Jara, M.; Olmos, B. A.

    2015-07-01

    The bridge infrastructure in many countries of the world consists of medium-span structures built several decades ago and designed for very low seismic forces. Many of them are reinforced concrete structures that, according to current code regulations, have to be rehabilitated to increase their seismic capacity. One way to reduce the vulnerability of bridges is to use retrofitting techniques that increase the strength of the structure or to incorporate devices that reduce the seismic demand. One of the most common retrofit techniques for bridge substructures is RC jacketing; this research assesses the expected damage of seismically deficient medium-length highway bridges retrofitted with reinforced concrete jacketing by conducting a parametric study. We selected a suite of twenty accelerograms of subduction earthquakes recorded close to the Pacific coast of Mexico. The original structures consist of five 30 m span simply supported bridges with pier heights of 5 m, 10 m, 15 m, 20 m and 25 m, and the analyses include three different jacket thicknesses and three steel ratios. The bridges were subjected to the seismic records, and non-linear time history analyses were carried out using the OpenSees platform. The results allow selecting the reinforced concrete jacketing that best improves the expected seismic behavior of the bridge models.
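
The parametric grid described above multiplies out as follows (the jacket thickness and steel-ratio values are placeholders, since the abstract does not list them):

```python
from itertools import product

# Enumerate the parametric study: 5 pier heights x 3 jacket thicknesses
# x 3 steel ratios. Thickness/ratio labels are placeholders.
pier_heights_m = [5, 10, 15, 20, 25]
jacket_thicknesses = ["t1", "t2", "t3"]
steel_ratios = ["r1", "r2", "r3"]

cases = list(product(pier_heights_m, jacket_thicknesses, steel_ratios))
n_models = len(cases)          # 45 retrofitted bridge configurations
n_analyses = n_models * 20     # 20 accelerograms each -> 900 nonlinear time histories
```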

  11. Seismic intensity monitoring: from mature basins in the North Sea to sample-scale porosity measurements.

    NASA Astrophysics Data System (ADS)

    De Siena, Luca; Sketsiou, Panayiota

    2017-04-01

    We plan the application of a joint velocity, attenuation, and scattering tomography to the North Sea basins. By using seismic phases and intensities from previous passive and active surveys, our aim is to image and monitor fluids in the subsurface. Seismic intensities provide unique solutions to the problem of locating and tracking gas/fluid movements in volcanoes and of depicting sub-basalt and sub-intrusive structures in volcanic reservoirs. The proposed techniques have been tested on volcanic islands (Deception Island), continental calderas (Campi Flegrei) and Quaternary volcanoes (Mount St. Helens) and have proved effective at monitoring fracture opening, imaging buried fluid-filled bodies, and tracking water/gas interfaces. These novel seismic attributes are modelled in space and time and connected with the lithology of the sampled medium, specifically density and permeability, with a key output being a novel computational code with strong commercial potential. Data are readily available in the framework of the NERC CDT Oil & Gas project.

  12. Seismic site coefficients and acceleration design response spectra based on conditions in South Carolina : final report.

    DOT National Transportation Integrated Search

    2014-11-15

    The simplified procedure in design codes for determining earthquake response spectra involves : estimating site coefficients to adjust available rock accelerations to site accelerations. Several : investigators have noted concerns with the site coeff...

  13. The Effectiveness of the Italian Seismic Code NT08 Prescriptions in Reproducing the Observed Site Amplification: the City of Rome and the Acquasanta Terme Town, Italy, Case Study

    NASA Astrophysics Data System (ADS)

    Caserta, A.; Doumaz, F.; Pischiutta, M.; Costanzo, A.

    2017-12-01

    In the European design code EC8, used in Italy as NT08, site effects are accounted for through several scaling factors depending on Vs30 and topographic conditions. The effectiveness of this approach has been tested in two case studies. The first is located in the Tiber valley, the main sedimentary basin of the city of Rome. The second is located in the town of Acquasanta Terme (central Apennines). In both cases, the amplification levels expected according to the Italian design code were calculated on the basis of the velocity profile and other geological information collected in situ. The expected values were compared, in the former case (Rome), with data recorded during the seismic sequence following the 2009 April 6th, Mw = 6.3 L'Aquila earthquake (mainshock and aftershocks), and in the latter case (Acquasanta Terme) with moderate-magnitude aftershocks following the 2016 August 24th, Mw = 6.0, Amatrice main shock. Our results highlight that the parameterizations adopted by the design code are not sufficient to reproduce the real ground shaking occurring during earthquakes. This means that the Vs30 parameter ignores three-dimensional and frequency-dependent effects, as well as the influence of near-surface geology deeper than 30 meters.

  14. Tsunami Hazard in the Algerian Coastline

    NASA Astrophysics Data System (ADS)

    Amir, L. A.

    2008-05-01

    The Algerian coastline is located at the border between the African and Eurasian tectonic plates. The convergence between these two plates is approximately 4 to 7 mm/yr. The Alps and the Tellian Atlas result from this convergence. Historical and present-day data show the occurrence of earthquakes with magnitudes up to 7 on the Richter scale in the northern part of the country. Cities have been destroyed and the casualties have been heavy. Recently, small seismic waves generated by a destructive earthquake (epicenter: 36.90N, 3.71E; Mw = 6.8; Algeria, 2003, NEIC) were recorded on the French and Spanish coasts. This event again raised the issue of tsunami hazard in the western Mediterranean region. For the Algerian case, the assessment of seismic and tsunami hazard is a matter of great interest because of the fast urban development of cities like Algiers. This study aims to provide scientific arguments to help in the development of the Mediterranean tsunami alert program. This is a genuinely complex issue because (1) the western part of the sea is narrow, (2) constructions on the Algerian coastline do not respect safety standards, and (3) the seismic hazard is significant. The present work is based on a numerical modeling approach. First, a database was created to gather and list information related to seismology, tectonics, recorded or observed abnormal sea-level variations, and submarine and coastal topographic data for the western Mediterranean margin. This database helped us propose a series of scenarios that could trigger a tsunami in the Mediterranean Sea. Seismic moment, rake, and focal depth are the major parameters that constrain the input seismic data for the modeling. The undersea earthquake modeling and the seabed deformations are then computed with a program adapted from the rngchn code based on Okada's analytic equations. The last task of this work consisted of calculating the initial water surface displacement and simulating the triggered tsunami.
The generation and propagation of the induced tsunami waves were estimated with another program adapted from the swan code for the resolution of the hydrodynamic shallow-water equations. The results obtained will be presented first. Then, based on wave travel times and run-up height values, a broader discussion will focus on the tsunami alert program for cities marked by fast urban development.
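
The travel times central to the alert discussion follow from the long-wave (shallow-water) phase speed c = sqrt(g*h); a first-order sketch with an assumed mean depth and an assumed Algiers-to-Balearics crossing distance (illustrative values, not from the study):

```python
import math

# Long-wave phase speed and crossing time; depth and distance are
# rough assumptions for a western Mediterranean crossing.
g = 9.81               # gravitational acceleration, m/s^2
mean_depth_m = 2500.0  # assumed mean basin depth
distance_km = 300.0    # assumed source-to-coast distance

c = math.sqrt(g * mean_depth_m)                  # phase speed, m/s
travel_time_min = distance_km * 1e3 / c / 60.0   # crossing time, minutes
```

Times of a few tens of minutes are exactly why the narrowness of the western basin makes an alert program so demanding.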

  15. PWL 1.0 Personal WaveLab: an object-oriented workbench for seismogram analysis on Windows systems

    NASA Astrophysics Data System (ADS)

    Bono, Andrea; Badiali, Lucio

    2005-02-01

    Personal WaveLab 1.0 is intended as the starting point for an ex novo development of seismic time-series analysis procedures for Windows-based personal computers. Our objective is two-fold. Firstly, being itself a stand-alone application, it allows the user to perform basic analysis of digital or digitised seismic waveforms. Secondly, thanks to its architectural characteristics, it can be the basis for the development of more complex and more powerful applications. An expanded version of PWL, called SisPick!, is currently in use at the Istituto Nazionale di Geofisica e Vulcanologia (Italian Institute of Geophysics and Volcanology) for real-time monitoring for Civil Protection purposes. This means that about 90 users tested the application for more than a year, making its features more robust and efficient. SisPick! was also employed in the United Nations Nyiragongo Project, in Congo, and during the Stromboli emergency in the summer of 2002. The main strengths of the application package are ease of use, object-oriented design, good computational speed, minimal disk-space requirements, and the complete absence of third-party components (including ActiveX). The Windows environment spares the user scripting or complex interaction with the system. The system is in constant development to answer the needs and suggestions of its users. The Microsoft Visual Basic 6 source code, installation package, test data sets and documentation are available at no cost.

  16. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area; the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
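The three hazard levels quoted above map directly to return periods under the usual Poisson assumption for earthquake occurrence. A small sketch of that conversion (illustrative, not code from the study):

```python
import math

def return_period(p_exceed, years):
    """Return period implied by an exceedance probability over a time
    window, assuming Poisson occurrence: p = 1 - exp(-years / T)."""
    return years / -math.log(1.0 - p_exceed)

# The study's three hazard levels:
# 10% in 50 yr -> ~475 yr, 5% in 50 yr -> ~975 yr, 2% in 50 yr -> ~2475 yr
levels = {p: return_period(p, 50) for p in (0.10, 0.05, 0.02)}
```

These are the familiar 475-, 975-, and 2475-year ground motions used in many building codes.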

  17. Seismicity in Pennsylvania: Evidence for Anthropogenic Events?

    NASA Astrophysics Data System (ADS)

    Homman, K.; Nyblade, A.

    2015-12-01

The deployment and operation of the USArray Transportable Array (TA) and the PASEIS (XY) seismic networks in Pennsylvania during 2013 and 2014 provide a unique opportunity for investigating the seismicity of Pennsylvania. These networks, along with several permanent stations in Pennsylvania, resulted in a total of 104 seismometers in and around Pennsylvania that have been used in this study. Event locations were first obtained with Antelope Environmental Monitoring Software using P-wave arrival times. Arrival times were hand-picked using a 1-5 Hz bandpass filter to within 0.1 seconds. Events were then relocated using a velocity model developed for Pennsylvania and the HYPOELLIPSE location code. In this study, 1593 seismic events were located in Pennsylvania between February 2013 and December 2014. These events ranged between magnitude (ML) 1.04 and 2.89, with an average ML of 1.90. The events occur across the state, in many areas where no seismicity has previously been reported. Preliminary results indicate that most of these events are related to mining activity. Additional work using cross-correlation techniques is underway to examine a number of event clusters for evidence of hydraulic fracturing or wastewater injection sources.

  18. BARENTS16: a 1-D velocity model for the western Barents Sea

    NASA Astrophysics Data System (ADS)

    Pirli, Myrto; Schweitzer, Johannes

    2018-01-01

A minimum 1-D seismic velocity model for routine seismic event location was determined for the area of the western Barents Sea, using a modified version of the VELEST code. The resulting model, BARENTS16, and the corresponding station corrections were produced using data from stations at regional distances, the vast majority located at the periphery of the recorded seismic activity due to the unfavorable land-sea distribution. Recorded seismicity is approached through the listings of a joint bulletin, resulting from the merging of several international and regional bulletins for the region, as well as additional parametric data from temporary deployments. We discuss the challenges posed by this extreme network-seismicity geometry in terms of velocity estimation resolution and result stability. Although the conditions do not facilitate the estimation of meaningful station corrections at the farthermost stations, and even well-resolved corrections do not contribute convincingly, we show that the process can still converge to a stable average velocity structure for the crust and upper mantle, in good agreement with a priori information about the regional structure and geology, which adequately reduces errors in event location estimates.
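Conceptually, a static station correction is the systematic part of a station's travel-time residuals. VELEST solves for these jointly with the velocity model, but the core idea can be sketched in isolation (a toy illustration, not the VELEST algorithm):

```python
import statistics

def station_correction(residuals_s):
    """Static correction for one station: the mean of its travel-time
    residuals (observed minus predicted arrival, in seconds), to be
    subtracted from future picks at that station."""
    return statistics.fmean(residuals_s)

# A station that systematically records arrivals ~0.3 s late,
# e.g. because of slow sediments beneath it:
corr = station_correction([0.25, 0.35, 0.28, 0.32])
```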

  19. Application of Adjoint Method and Spectral-Element Method to Tomographic Inversion of Regional Seismological Structure Beneath Japanese Islands

    NASA Astrophysics Data System (ADS)

    Tsuboi, S.; Miyoshi, T.; Obayashi, M.; Tono, Y.; Ando, K.

    2014-12-01

Recent progress in large-scale computing using waveform modeling techniques and high-performance computing facilities has demonstrated the possibility of performing full-waveform inversion of the three-dimensional (3D) seismological structure inside the Earth. We apply the adjoint method (Liu and Tromp, 2006) to obtain the 3D structure beneath the Japanese Islands. First we implemented the Spectral-Element Method on the K computer in Kobe, Japan. We optimized SPECFEM3D_GLOBE (Komatitsch and Tromp, 2002) using OpenMP so that the code fits the hybrid architecture of the K computer. We can now use 82,134 nodes of the K computer (657,072 cores) to compute synthetic waveforms with about 1 s accuracy for a realistic 3D Earth model, with a performance of 1.2 PFLOPS. We use this optimized SPECFEM3D_GLOBE code, take one chunk around the Japanese Islands from the global mesh, and compute synthetic seismograms with an accuracy of about 10 s. We use the GAP-P2 mantle tomography model (Obayashi et al., 2009) as the initial 3D model and use as many broadband seismic stations available in this region as possible to perform the inversion. We then use time windows for body waves and surface waves to compute adjoint sources and calculate adjoint kernels for the seismic structure. We have performed several iterations and obtained an improved 3D structure beneath the Japanese Islands. The result demonstrates that waveform misfits between observed and theoretical seismograms improve as the iteration proceeds. We are now preparing to use much shorter periods in our synthetic waveform computation and to obtain seismic structure for basin-scale models, such as the Kanto basin, where there is a dense seismic network and high seismic activity. Acknowledgements: This research was partly supported by the MEXT Strategic Program for Innovative Research. We used F-net seismograms of the National Research Institute for Earth Science and Disaster Prevention.
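The misfit driving such an adjoint inversion is, in its simplest waveform form, a least-squares difference between observed and synthetic seismograms, whose adjoint source is the time-reversed residual. A schematic sketch of just that piece (real SPECFEM3D_GLOBE workflows use windowed, weighted variants):

```python
import numpy as np

def waveform_misfit(obs, syn, dt):
    """L2 waveform misfit 0.5 * integral of (syn - obs)^2 dt; for this
    misfit the adjoint source is the time-reversed residual."""
    r = syn - obs
    return 0.5 * dt * float(np.sum(r * r)), r[::-1]

t = np.linspace(0.0, 10.0, 1001)
obs = np.sin(2 * np.pi * 0.5 * t)
syn = np.sin(2 * np.pi * 0.5 * (t - 0.1))   # slightly delayed synthetic
chi, adj_src = waveform_misfit(obs, syn, dt=t[1] - t[0])
```

Iterating the model to shrink `chi` across all station-event pairs is what "waveform misfits improve as the iteration proceeds" refers to.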

  20. The discrimination of man-made explosions from earthquakes using seismo-acoustic analysis in the Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Che, Il-Young; Jeon, Jeong-Soo

    2010-05-01

Korea Institute of Geoscience and Mineral Resources (KIGAM) operates an infrasound network consisting of seven seismo-acoustic arrays in South Korea. Development of the arrays began in 1999, partially in collaboration with Southern Methodist University, with the goal of detecting distant infrasound signals from natural and anthropogenic phenomena in and around the Korean Peninsula. The main operational purpose of this network is to discriminate man-made seismic events from the regional seismicity, which includes thousands of seismic events per year. Man-made seismic events are a major source of error in estimating natural seismicity, especially where seismic activity is weak or moderate, as in the Korean Peninsula. In order to discriminate man-made explosions from earthquakes, we have applied seismo-acoustic analysis, associating seismic and infrasonic signals generated by surface explosions. Observations of infrasound at multiple arrays make it possible to discriminate surface explosions, because small or moderate earthquakes are generally not sufficient to generate infrasound. To date, we have annually reclassified hundreds of seismic events in the seismological catalog as surface explosions by means of this seismo-acoustic analysis. Besides surface explosions, the network has also detected infrasound signals from other sources, such as bolides, typhoons, rocket launches, and the underground nuclear tests that occurred in and around the Korean Peninsula. In this study, ten years of seismo-acoustic data are reviewed with a recent infrasonic detection algorithm and association method, finally linked to the seismic monitoring system of KIGAM to increase the detection rate of surface explosions. We present the long-term results of the seismo-acoustic analysis, the detection capability of the multiple arrays, and implications for seismic source location. Since seismo-acoustic analysis has proven to be a reliable method for discriminating surface explosions, it will continue to be used for estimating natural seismicity and understanding infrasonic sources.

  1. Multiple scattering of waves in random media: Application to the study of the city-site effect in Mexico City area.

    NASA Astrophysics Data System (ADS)

    Ishizawa, O. A.; Clouteau, D.

    2007-12-01

The long duration, amplification, and spatial variability of the response in the seismic records registered in Mexico City during the September 1985 earthquake cannot be explained by the soil velocity model alone. We try to explain these phenomena by studying the extent of the wave fields diffracted by buildings during an earthquake. The main question is whether the presence of a large number of buildings can significantly modify the seismic wave field. We are interested in the interaction between the incident wave field propagating in a stratified half-space and a large number of structures at the free surface, i.e., the coupled city-site effect. We study and characterize the seismic wave propagation regimes in a city using the theory of wave propagation in random media. In the coupled city-site system, the buildings are modeled as resonant scatterers uniformly distributed at the surface of a deterministic, horizontally layered elastic half-space representing the soil. Based on the mean-field and field-correlation equations, we build a theoretical model that takes into account the multiple scattering of seismic waves and allows us to describe the behavior of the coupled city-site system in a simple and rapid way. The results obtained for the configurationally averaged field quantities are validated against 3D results for the seismic response of a deterministic model. The numerical simulations of this model are computed with the MISS3D code, based on classical soil-structure interaction techniques and on a variational coupling between boundary integral equations for a layered soil and a modal finite element approach for the buildings. This work proposes a detailed numerical and theoretical analysis of the city-site interaction (CSI) in the Mexico City area. The principal parameters in the study of the CSI are the distribution of the buildings' resonant frequencies, the soil characteristics of the site, the urban density and position of the buildings in the city, and the type of incident wave. The main results of the theoretical and numerical models allow us to characterize seismic motion in urban areas.

  2. On the Seismic Safety of Nuclear Power Plant Sites in South Korea

    NASA Astrophysics Data System (ADS)

    Choi, H.; Park, S.; Yang, J.; Shim, T.; Im, C. B.

    2016-12-01

The Korean Peninsula is located at the far eastern part of the Eurasian Plate, in an intra-plate region several hundred km away from the nearest plate boundary. Earthquakes around the Korean Peninsula show the typical characteristics of intra-plate earthquakes: low seismicity, magnitudes smaller than those of inter-plate earthquakes, and spatially irregular epicenters. There are 24 nuclear power plants (NPPs) in operation, 4 NPPs whose construction is complete, and 4 NPPs in preparation for construction in South Korea. Even though the seismicity of the Korean Peninsula is relatively low, because there are more than 30 NPPs within a rather small territory, thorough preparedness of the NPPs against earthquakes is required. The earthquake preparedness of NPPs in South Korea comprises 4 stages: site selection, design, construction, and operation. Since regulatory codes and standards are strictly applied in each stage, the NPPs in South Korea are believed to be safe enough against the maximum potential earthquake ground motion. Through analysis of the geological and seismological characteristics of the region within a radius of 320 km from each site and a detailed geological survey of the area within a radius of 8 km from the site, the design earthquake ground motion of NPPs in South Korea has been determined to be 0.2g (0.3g for newly constructed NPPs), considering the maximum potential earthquake ground motion and a safety margin. Ground motions and surface deformation caused by capable faults are also considered in the seismic design of NPPs. In addition, the Korea Institute of Nuclear Safety, as a regulatory technical expert organization, has been operating an independent real-time earthquake monitoring network as part of securing the seismic safety of NPP sites in South Korea since the late 1990s. If an earthquake of more than magnitude 3.0 occurs in the Korean Peninsula, or if peak ground motions of more than 0.01g occur at NPP sites, this information is reported to the government and shared with the public through the website, together with information on the seismic safety of NPP sites.
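The reporting rule described above amounts to a simple threshold check on two measurements. A sketch (thresholds as stated in the text; the function name is illustrative):

```python
def must_report(magnitude, pga_g):
    """Report to the government and public if an earthquake of more than
    magnitude 3.0 occurs in the Korean Peninsula, or if peak ground
    acceleration at an NPP site exceeds 0.01 g."""
    return magnitude > 3.0 or pga_g > 0.01
```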

  3. Earth science: lasting earthquake legacy

    USGS Publications Warehouse

    Parsons, Thomas E.

    2009-01-01

    On 31 August 1886, a magnitude-7 shock struck Charleston, South Carolina; low-level activity continues there today. One view of seismic hazard is that large earthquakes will return to New Madrid and Charleston at intervals of about 500 years. With expected ground motions that would be stronger than average, that prospect produces estimates of earthquake hazard that rival those at the plate boundaries marked by the San Andreas fault and Cascadia subduction zone. The result is two large 'bull's-eyes' on the US National Seismic Hazard Maps — which, for example, influence regional building codes and perceptions of public safety.

  4. Joint inversion of seismic and magnetotelluric data in the Parkfield Region of California using the normalized cross-gradient constraint

    USGS Publications Warehouse

    Bennington, Ninfa L.; Zhang, Haijiang; Thurber, Cliff; Bedrosian, Paul A.

    2015-01-01

    We present jointly inverted models of P-wave velocity (Vp) and electrical resistivity for a two-dimensional profile centered on the San Andreas Fault Observatory at Depth (SAFOD). Significant structural similarity between main features of the separately inverted Vp and resistivity models is exploited by carrying out a joint inversion of the two datasets using the normalized cross-gradient constraint. This constraint favors structurally similar Vp and resistivity images that adequately fit the seismic and magnetotelluric (MT) datasets. The new inversion code, tomoDDMT, merges the seismic inversion code tomoDD and the forward modeling and sensitivity kernel subroutines of the MT inversion code OCCAM2DMT. TomoDDMT is tested on a synthetic dataset and demonstrates the code’s ability to more accurately resolve features of the input synthetic structure relative to the separately inverted resistivity and velocity models. Using tomoDDMT, we are able to resolve a number of key issues raised during drilling at SAFOD. We are able to infer the distribution of several geologic units including the Salinian granitoids, the Great Valley sequence, and the Franciscan Formation. The distribution and transport of fluids at both shallow and great depths is also examined. Low values of velocity/resistivity attributed to a feature known as the Eastern Conductor (EC) can be explained in two ways: the EC is a brine-filled, high porosity region, or this region is composed largely of clay-rich shales of the Franciscan. The Eastern Wall, which lies immediately adjacent to the EC, is unlikely to be a fluid pathway into the San Andreas Fault’s seismogenic zone due to its observed higher resistivity and velocity values.
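The structural-similarity constraint at the heart of tomoDDMT can be illustrated with the cross-gradient function: in 2-D it is the out-of-plane component of the cross product of the two model gradients, and it vanishes wherever the models are structurally similar. A sketch of the unnormalized version (the normalized form used in the paper additionally divides by the gradient magnitudes):

```python
import numpy as np

def cross_gradient(m1, m2, dx=1.0, dz=1.0):
    """Out-of-plane component of grad(m1) x grad(m2) on a 2-D (x, z)
    grid; zero where the two gradients are parallel or vanish."""
    g1x, g1z = np.gradient(m1, dx, dz)
    g2x, g2z = np.gradient(m2, dx, dz)
    return g1x * g2z - g1z * g2x

# Two structurally identical models (one an affine copy of the other):
x = np.linspace(0.0, 1.0, 50)
vp = np.outer(x, x)            # toy velocity model
res = 3.0 * vp + 1.0           # toy resistivity model with the same structure
t = cross_gradient(vp, res)    # identically zero
```

Minimizing this quantity alongside the seismic and MT data misfits is what drives the two images toward common structure.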

  5. Static Corrections to Improve Seismic Monitoring of the North Korean Nuclear Test Site with Regional Arrays

    NASA Astrophysics Data System (ADS)

    Wilkins, N.; Wookey, J. M.; Selby, N. D.

    2017-12-01

Seismology is an important part of the International Monitoring System (IMS) installed to detect, identify, and locate nuclear detonations in breach of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), both prior to and after its entry into force. Seismic arrays in particular provide not only a means of detecting and locating underground nuclear explosions, but also of discriminating them from naturally occurring earthquakes of similar magnitude. One potential discriminant is the amplitude ratio of high-frequency (> 2 Hz) P waves to S waves (P/S) measured at regional distances (3-17°). Accurate measurement of such discriminants, and the ability to detect low-magnitude seismicity from a suspicious event, relies on high signal-to-noise ratio (SNR) data. A correction to the slowness vector of the incident seismic wavefield, together with static corrections applied to the waveforms recorded at each receiver within the array, can be shown to improve the SNR. We apply codes we have developed to calculate slowness-azimuth station corrections (SASCs) and static corrections to the arrival time and amplitude of the seismic waveform to seismic arrays regional to the DPRK nuclear test site at Punggye-ri, North Korea. We use the F-statistic to demonstrate the SNR improvement for data from the nuclear tests and other seismic events in the vicinity of the test site. We also make new measurements of P/S with the corrected waveforms and compare these with existing measurements.
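Once phase windows are picked, the P/S discriminant itself is straightforward to compute. A simplified sketch (assumes the trace has already been bandpass filtered above 2 Hz; window lengths and names are illustrative):

```python
import numpy as np

def p_s_ratio(trace, fs, p_onset_s, s_onset_s, window_s=5.0):
    """Ratio of peak absolute amplitude in a P window to that in an
    S window; explosions tend to show higher P/S than earthquakes."""
    n = int(window_s * fs)
    ip = int(p_onset_s * fs)
    i_s = int(s_onset_s * fs)
    p_amp = np.max(np.abs(trace[ip:ip + n]))
    s_amp = np.max(np.abs(trace[i_s:i_s + n]))
    return p_amp / s_amp

# Synthetic trace with P energy twice the S amplitude -> ratio 2.0
fs = 100.0
trace = np.zeros(3000)
trace[100:200] = 2.0     # "P" energy starting at 1 s
trace[1000:1100] = 1.0   # "S" energy starting at 10 s
ratio = p_s_ratio(trace, fs, 1.0, 10.0, window_s=2.0)
```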

  6. Updating the USGS seismic hazard maps for Alaska

    USGS Publications Warehouse

    Mueller, Charles; Briggs, Richard; Wesson, Robert L.; Petersen, Mark D.

    2015-01-01

    The U.S. Geological Survey makes probabilistic seismic hazard maps and engineering design maps for building codes, emergency planning, risk management, and many other applications. The methodology considers all known earthquake sources with their associated magnitude and rate distributions. Specific faults can be modeled if slip-rate or recurrence information is available. Otherwise, areal sources are developed from earthquake catalogs or GPS data. Sources are combined with ground-motion estimates to compute the hazard. The current maps for Alaska were developed in 2007, and included modeled sources for the Alaska-Aleutian megathrust, a few crustal faults, and areal seismicity sources. The megathrust was modeled as a segmented dipping plane with segmentation largely derived from the slip patches of past earthquakes. Some megathrust deformation is aseismic, so recurrence was estimated from seismic history rather than plate rates. Crustal faults included the Fairweather-Queen Charlotte system, the Denali–Totschunda system, the Castle Mountain fault, two faults on Kodiak Island, and the Transition fault, with recurrence estimated from geologic data. Areal seismicity sources were developed for Benioff-zone earthquakes and for crustal earthquakes not associated with modeled faults. We review the current state of knowledge in Alaska from a seismic-hazard perspective, in anticipation of future updates of the maps. Updated source models will consider revised seismicity catalogs, new information on crustal faults, new GPS data, and new thinking on megathrust recurrence, segmentation, and geometry. Revised ground-motion models will provide up-to-date shaking estimates for crustal earthquakes and subduction earthquakes in Alaska.
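Areal seismicity sources like those in the Alaska model are parameterized from catalog statistics, most commonly a Gutenberg-Richter magnitude distribution. A sketch of the standard maximum-likelihood b-value estimate (Aki's formula, without the magnitude-binning correction a production code would apply):

```python
import math

def b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965):
    b = log10(e) / (mean(M) - m_min), over magnitudes >= m_min,
    where m_min is the catalog completeness magnitude."""
    complete = [m for m in mags if m >= m_min]
    mean_m = sum(complete) / len(complete)
    return math.log10(math.e) / (mean_m - m_min)
```

With the a- and b-values in hand, the annual rate of events above any magnitude follows from log10 N = a - b*M, which is what feeds the hazard integration.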

  7. Detailed fault structure of the Tarutung Pull-Apart Basin in Sumatra, Indonesia, derived from local earthquake data

    NASA Astrophysics Data System (ADS)

    Muksin, Umar; Haberland, Christian; Nukman, Mochamad; Bauer, Klaus; Weber, Michael

    2014-12-01

The Tarutung Basin is located at a right step-over in the northern central segment of the dextral strike-slip Sumatran Fault System (SFS). Details of the fault structure along the Tarutung Basin are derived from relocated seismicity as well as from focal mechanisms and structural geology. The seismicity distribution, derived by 3D inversion for hypocenter relocation, clusters along fault-like lineations. The seismicity was relocated with a double-difference technique (HYPODD) involving waveform cross-correlations. We used 46,904 and 3191 arrival-time differences obtained from catalogue data and cross-correlation analysis, respectively. Focal mechanisms of events were analyzed by applying a grid-search method (HASH code). Although there is no significant shift of the hypocenters (10.8 m on average) or centroids (167 m on average), the application of the double-difference relocation sharpens the earthquake distribution. The earthquake lineation reflects the fault system, the extensional duplex fault system, and the negative flower structure within the Tarutung Basin. The focal mechanisms of events at the edge of the basin are dominantly strike-slip, representing the dextral strike-slip Sumatran Fault System. The almost north-south-striking normal-fault events along extensional zones beneath the basin correlate with the maximum principal stress direction, which is the direction of the Indo-Australian plate motion. The extensional zones form an en-echelon pattern, indicated by the presence of strike-slip fault events striking NE-SW to NW-SE. The detailed characteristics of the fault system derived from the seismological study are also corroborated by structural geology at the surface.
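The waveform cross-correlation step behind a double-difference relocation produces differential arrival times between event pairs. A minimal sketch of how one such lag is measured (illustrative, not the HYPODD implementation):

```python
import numpy as np

def cc_lag(w1, w2, dt):
    """Lag (in seconds) that best aligns two waveforms, taken from the
    peak of their full cross-correlation; positive if w1 is delayed
    relative to w2. This is the differential time fed to the
    double-difference equations."""
    cc = np.correlate(w1, w2, mode="full")
    return (np.argmax(cc) - (len(w2) - 1)) * dt

t = np.arange(0.0, 2.0, 0.01)
w = np.exp(-((t - 0.5) ** 2) / 0.01)          # pulse centered at 0.5 s
w_late = np.exp(-((t - 0.7) ** 2) / 0.01)     # same pulse, 0.2 s later
lag = cc_lag(w_late, w, dt=0.01)              # ~ +0.2 s
```

Because similar waveforms from nearby events correlate to a small fraction of a sample, these lags are far more precise than independently picked arrivals, which is why the relocated distribution sharpens.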

  8. How sensitive is earthquake ground motion to source parameters? Insights from a numerical study in the Mygdonian basin

    NASA Astrophysics Data System (ADS)

    Chaljub, Emmanuel; Maufroy, Emeline; deMartin, Florent; Hollender, Fabrice; Guyonnet-Benaize, Cédric; Manakou, Maria; Savvaidis, Alexandros; Kiratzi, Anastasia; Roumelioti, Zaferia; Theodoulidis, Nikos

    2014-05-01

Understanding the origin of the variability of earthquake ground motion is critical for seismic hazard assessment. Here we present the results of a numerical analysis of the sensitivity of earthquake ground motion to seismic source parameters, focusing on the Mygdonian basin near Thessaloniki (Greece). We use an extended model of the basin (65 km [EW] x 50 km [NS]) which was elaborated during the Euroseistest Verification and Validation Project. The numerical simulations are performed with two independent codes, both implementing the Spectral Element Method. They rely on a robust, semi-automated mesh design strategy together with a simple homogenization procedure to define a smooth velocity model of the basin. Our simulations are accurate up to 4 Hz, and include the effects of surface topography and of intrinsic attenuation. Two kinds of simulations are performed: (1) direct simulations of the surface ground motion for real regional events with various back azimuths with respect to the center of the basin; (2) reciprocity-based calculations where the ground motion due to 980 different seismic sources is computed at a few stations in the basin. In the reciprocity-based calculations, we consider epicentral distances varying from 2.5 km to 40 km, source depths from 1 km to 15 km, and we span the range of possible back azimuths with a 10-degree bin. We will present results showing (1) the sensitivity of ground motion parameters to the location and focal mechanism of the seismic sources; and (2) the variability of the amplification caused by site effects, as measured by standard spectral ratios, with respect to the source characteristics.
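Standard spectral ratios, the site-effect measure mentioned above, are simply amplitude spectra at a basin station divided by those at a reference rock site. A minimal sketch (no smoothing, tapering, or windowing, which real processing would add):

```python
import numpy as np

def spectral_ratio(site_trace, ref_trace, fs):
    """Standard spectral ratio: amplitude spectrum at a site (basin)
    station divided by that at a reference (rock) station, a common
    frequency-dependent measure of site amplification."""
    site_spec = np.abs(np.fft.rfft(site_trace))
    ref_spec = np.abs(np.fft.rfft(ref_trace))
    freqs = np.fft.rfftfreq(len(site_trace), d=1.0 / fs)
    return freqs, site_spec / ref_spec

# A site record that is an exact 3x amplified copy of the reference
# gives a flat ratio of 3 at all frequencies:
rng = np.random.default_rng(0)
ref = rng.standard_normal(1024)
freqs, ratio = spectral_ratio(3.0 * ref, ref, fs=100.0)
```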

  9. Analysis of seismic patterns observed at Nevado del Ruiz volcano, Colombia during August September 1985

    NASA Astrophysics Data System (ADS)

    Martinelli, Bruno

    1990-07-01

    The seismic activity of the Nevado del Ruiz volcano was monitored during August-September 1985 using a three-component portable seismograph station placed on the upper part of the volcano. The objective was to investigate the frequency content of the seismic signals and the possible sources of the volcanic tremor. The seismicity showed a wide spectrum of signals, especially at the beginning of September. Some relevant patterns from the collected records, which have been analyzed by spectrum analysis, are presented. For the purpose of analysis, the records have been divided into several categories such as long-period events, tremor, cyclic tremor episodes, and strong seismic activity on September 8, 1985. The origin of the seismic signals must be considered in relation to the dynamical and acoustical properties of fluids and the shape and dimensions of the volcano's conduits. The main results of the present experiment and analysis show that the sources of the seismic signals are within the volcanic edifice. The signal characteristics indicate that the sources lie in fluid-phase interactions rather than in brittle fracturing of solid components.

  10. Tsunami Generation Modelling for Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Annunziato, A.; Matias, L.; Ulutas, E.; Baptista, M. A.; Carrilho, F.

    2009-04-01

In the frame of a collaboration between the European Commission Joint Research Centre and the Institute of Meteorology in Portugal, a complete analytical tool to support early warning systems is being developed. The tool will be part of the Portuguese National Early Warning System and will also be used in the frame of the UNESCO North Atlantic Section of the Tsunami Early Warning System. The system, called the Tsunami Analysis Tool (TAT), includes a worldwide scenario database pre-calculated using the SWAN-JRC code (Annunziato, 2007). This code uses a simplified fault generation mechanism, and its hydraulic model is based on the SWAN code (Mader, 1988). In addition to the pre-defined scenarios, a system of computers is always ready to start a new calculation whenever a new earthquake is detected by the seismic networks (such as USGS or EMSC) and is judged capable of generating a tsunami. The calculation is performed using minimal parameters (the epicentre and magnitude of the earthquake): the programme calculates the rupture length and rupture width using the empirical relationship proposed by Ward (2002). The database calculations, as well as newly generated calculations for the current conditions, are therefore available to TAT, where the real online analysis is performed. The system also allows analysis of sea level measurements available worldwide, in order to compare them and decide whether a tsunami is actually occurring. Although TAT, connected with the scenario database and the online calculation system, is at the moment the only software that can support tsunami analysis on a global scale, we are convinced that the fault generation mechanism is too simplified to give a correct tsunami prediction. Furthermore, short tsunami arrival times in particular require earthquake source parameters describing the tectonic features of the faults, such as strike, dip, rake, and slip, in order to minimize the real-time uncertainty of the rupture parameters. Indeed, the earthquake parameters available right after an earthquake are preliminary and could be inaccurate. Determining which earthquake source parameters affect the initial height and time series of tsunamis will show the sensitivity of the tsunami time series to seismic source details. Therefore a new fault generation model will be adopted, according to the seismotectonic properties of the different regions, and finally included in the calculation scheme. To this end, within the collaboration framework with the Portuguese authorities, a new model is being defined, starting from the seismic sources in the North Atlantic, the Caribbean, and the Gulf of Cadiz. As earthquakes occurring in North Atlantic and Caribbean sources may affect mainland Portugal and the Azores and Madeira archipelagos, these sources will also be included in the analysis. We have started by examining the geometries of those sources that spawn tsunamis, to understand the effect of fault geometry and earthquake depth. References: Annunziato, A., 2007. The Tsunami Assessment Modelling System by the Joint Research Center, Science of Tsunami Hazards, Vol. 26, pp. 70-92. Mader, C.L., 1988. Numerical Modelling of Water Waves, University of California Press, Berkeley, California. Ward, S.N., 2002. Tsunamis, Encyclopedia of Physical Science and Technology, Vol. 17, pp. 175-191, ed. Meyers, R.A., Academic Press.
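Empirical relationships of the kind used here to derive rupture dimensions from magnitude alone are typically log-linear. A generic sketch of the form (the default coefficients below are illustrative placeholders in the common log10(L) = a + b*M shape, not the Ward (2002) values used by the TAT system):

```python
def rupture_length_km(magnitude, a=-3.22, b=0.69):
    """Generic magnitude-to-rupture-length scaling, log10(L_km) = a + b*M.
    Coefficients a, b are illustrative defaults; real systems take them
    from published regressions for the relevant fault type and region."""
    return 10.0 ** (a + b * magnitude)
```

This single-parameter approach is exactly what the authors flag as too simplified: it fixes rupture geometry from magnitude while ignoring strike, dip, rake, and slip.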

  11. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
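The Monte Carlo variability analysis can be sketched in a few lines: each draw picks one branch of the logic tree according to its weight, and the spread of the draws is summarized with a COV. The branch values and weights below are invented for illustration, not taken from the study:

```python
import random
import statistics

def sample_hazard(branch_values, weights, n=50_000, seed=42):
    """Monte Carlo sampling of a toy one-level logic tree: each draw
    selects one branch hazard value by weight; the coefficient of
    variation (COV) of the draws measures knowledge-based uncertainty."""
    rng = random.Random(seed)
    draws = rng.choices(branch_values, weights=weights, k=n)
    mean = statistics.fmean(draws)
    cov = statistics.pstdev(draws) / mean
    return mean, cov

# Three alternative PGA branches (in g), purely illustrative:
mean, cov = sample_hazard([0.2, 0.4, 0.8], [0.3, 0.4, 0.3])
```

A real analysis samples every node of the tree (source location, magnitude, attenuation relation, recurrence) per draw; the COV map is built by repeating this at each grid point.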

  12. Investigation of the 27 February 2010 Mw 8.8 Chilean earthquake integrating aftershock analysis, back-projection imaging and cGPS results

    NASA Astrophysics Data System (ADS)

    Clévédé, E.; Satriano, C.; Bukchin, B.; Lancieri, M.; Fuenzalida, A.; Vilotte, J.; Lyon-Caen, H.; Vigny, C.; Socquet, A.; Aranda, C.; Campos, J. A.; Scientific Team of the Lia Montessus de Ballore (Cnrs-Insu, U. Chile)

    2010-12-01

The Mw 8.8 earthquake in central Chile ruptured more than 400 km along the subduction boundary between the Nazca and South American plates. The aftershock distribution clearly shows that this earthquake filled a well-known seismic gap corresponding to the rupture extent of the 1835 earthquake. The triggered post-seismic activity extends farther north of the gap, partially overlapping the 1985 and 1960 Valparaiso earthquakes. However, the analysis of continuous GPS (cGPS) recordings and back-projection imaging of teleseismic body-wave energy indicate that the rupture stopped south of Valparaiso, around -33.5 degrees of latitude. An important question is how far the rupture actually extended to the north, and the potential relation between the northernmost aftershock activity and remaining asperities within the ruptured zone of the previous Valparaiso earthquakes. The extension of the rupture offshore, towards the west, also deserves further investigation. The aftershock distribution and the back-projection analysis support the hypothesis that, in the northern part, the rupture may have reached the surface at the trench. In this work, we performed a CMT and depth-location study for more than 10 of the largest immediate aftershocks using teleseismic surface-wave analysis constrained by P-wave polarity. In parallel, a detailed analysis of aftershocks in the northern part of the rupture, between 2010-03-11 and 2010-05-13, has been performed using data from the stations of the Chilean Servicio Sismológico Nacional (SSN) and of the post-seismic network deployed by the French CNRS-INSU, GFZ, IRIS, and Caltech. We carefully hand-picked 153 larger events, which have been located using a non-linear probabilistic code with improved depth location. Focal mechanisms have been computed for the larger events. These results have been integrated with the analysis of cGPS and teleseismic back projection, and the overall kinematics of the Maule earthquake are discussed, as well as the extension of the rupture along strike and dip.

  13. FY08 LDRD Final Report A New Method for Wave Propagation in Elastic Media LDRD Project Tracking Code: 05-ERD-079

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petersson, A

    The LDRD project 'A New Method for Wave Propagation in Elastic Media' developed several improvements to the traditional finite difference technique for seismic wave propagation, including a summation-by-parts discretization which is provably stable for arbitrary heterogeneous materials, an accurate treatment of non-planar topography, local mesh refinement, and stable outflow boundary conditions. This project also implemented these techniques in a parallel open source computer code called WPP, and participated in several seismic modeling efforts to simulate ground motion due to earthquakes in Northern California. This research has been documented in six individual publications which are summarized in this report. Of these publications, four are published refereed journal articles, one is an accepted refereed journal article which has not yet been published, and one is a non-refereed software manual. The report concludes with a discussion of future research directions and exit plan.

  14. The 2014 United States National Seismic Hazard Model

    USGS Publications Warehouse

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Mueller, Charles; Haller, Kathleen; Frankel, Arthur; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen; Boyd, Oliver; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nicolas; Wheeler, Russell; Williams, Robert; Olsen, Anna H.

    2015-01-01

    New seismic hazard maps have been developed for the conterminous United States using the latest data, models, and methods available for assessing earthquake hazard. The hazard models incorporate new information on earthquake rupture behavior observed in recent earthquakes; fault studies that use both geologic and geodetic strain rate data; earthquake catalogs through 2012 that include new assessments of locations and magnitudes; earthquake adaptive smoothing models that more fully account for the spatial clustering of earthquakes; and 22 ground motion models, some of which consider more than double the shaking data applied previously. Alternative input models account for larger earthquakes, more complicated ruptures, and more varied ground shaking estimates than assumed in earlier models. The ground motions, for levels applied in building codes, differ from the previous version by less than ±10% over 60% of the country, but can differ by ±50% in localized areas. The models are incorporated in insurance rates, risk assessments, and as input into the U.S. building code provisions for earthquake ground shaking.

  15. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing Rigid Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options is then invoked by setting parameter values in the Bulk Data.
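    The base-acceleration loading that all four solution paths share is easiest to see in a single-degree-of-freedom form: writing the equation of motion in coordinates relative to the base turns the boundary acceleration into an effective force, ü + 2ζωu̇ + ω²u = -a_g(t). The sketch below is a plain Python illustration of that idea (it is not NASTRAN; the time step, damping ratio, and step-like ground acceleration are illustrative assumptions), integrated with a central-difference scheme.

```python
import math

def sdof_base_excitation(ag, dt, omega, zeta):
    """Relative displacement of a damped SDOF oscillator whose base
    undergoes the acceleration history ag (central-difference scheme)."""
    n = len(ag)
    u = [0.0] * n
    c = 2.0 * zeta * omega          # damping coefficient per unit mass
    for i in range(1, n - 1):
        # (u+ - 2u + u-)/dt^2 + c (u+ - u-)/(2 dt) + w^2 u = -ag
        rhs = (-ag[i] - omega**2 * u[i]
               + (2.0 * u[i] - u[i - 1]) / dt**2
               + c * u[i - 1] / (2.0 * dt))
        u[i + 1] = rhs / (1.0 / dt**2 + c / (2.0 * dt))
    return u

omega = 2.0 * math.pi            # 1 s natural period
dt, steps = 0.005, 6000          # 30 s of response
u = sdof_base_excitation([1.0] * steps, dt, omega, zeta=0.05)
# for a constant base acceleration the relative displacement settles
# near the pseudo-static value -ag / omega**2
```

    For a constant base acceleration of 1 m/s², the damped response oscillates about and converges to -1/ω², which is a convenient sanity check on any base-excitation implementation.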

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, Robert; Rivers, Wilmer

    Any single computer program for seismic data analysis will not have all the capabilities needed to study reference events, since these detailed studies will be highly specialized. It may be necessary to develop and test new algorithms, and then these special codes must be integrated with existing software to use their conventional data-processing routines. We have investigated two means of establishing communications between the legacy and new codes: CORBA and XML/SOAP Web services. We have investigated making new Java code communicate with a legacy C-language program, geotool, running under Linux. Both methods were successful, but both were difficult to implement. C programs on UNIX/Linux are poorly supported for Web services, compared with the Java and .NET languages and platforms. Easier-to-use middleware will be required for scientists to construct distributed applications as easily as stand-alone ones. Considerable difficulty was encountered in modifying geotool, and this problem shows the need to use component-based user interfaces instead of large C-language codes where changes to one part of the program may introduce side effects into other parts. We have nevertheless made bug fixes and enhancements to that legacy program, but it remains difficult to expand it through communications with external software.

  17. Improved design of special boundary elements for T-shaped reinforced concrete walls

    NASA Astrophysics Data System (ADS)

    Ji, Xiaodong; Liu, Dan; Qian, Jiaru

    2017-01-01

    This study examines the design provisions of the Chinese GB 50011-2010 code for seismic design of buildings for the special boundary elements of T-shaped reinforced concrete walls and proposes an improved design method. Comparison of the design provisions of the GB 50011-2010 code and those of the American code ACI 318-14 indicates a possible deficiency in the T-shaped wall design provisions in GB 50011-2010. A case study of a typical T-shaped wall designed in accordance with GB 50011-2010 also indicates the insufficient extent of the boundary element at the non-flange end and overly conservative design of the flange end boundary element. Improved designs for special boundary elements of T-shaped walls are developed using a displacement-based method. The proposed design formulas produce a longer boundary element at the non-flange end and a shorter boundary element at the flange end, relative to those of the GB 50011-2010 provisions. Extensive numerical analysis indicates that T-shaped walls designed using the proposed formulas develop inelastic drift of 0.01 for both cases of the flange in compression and in tension.

  18. Seismic design parameters - A user guide

    USGS Publications Warehouse

    Leyendecker, E.V.; Frankel, A.D.; Rukstales, K.S.

    2001-01-01

    The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings (1997 NEHRP Provisions) introduced a seismic design procedure based on the explicit use of spectral response acceleration rather than the traditional peak ground acceleration and/or peak ground velocity or zone factors. The spectral response accelerations are obtained from spectral response acceleration maps accompanying the report. Maps are available for the United States and a number of U.S. territories. Since 1997, additional codes and standards have adopted seismic design approaches based on the same procedure used in the NEHRP Provisions and the accompanying maps. The design documents using the 1997 NEHRP Provisions procedure may be divided into three categories: (1) design of new construction, (2) design and evaluation of existing construction, and (3) design of residential construction. A CD-ROM has been prepared for use in conjunction with the design documents in each of these three categories. The spectral accelerations obtained using the software on the CD are the same as those that would be obtained by using the maps accompanying the design documents. The software has been prepared to operate on a personal computer using a Windows (Microsoft Corporation) operating environment and a point-and-click interface. The user can obtain the spectral acceleration values that would be obtained by use of the maps accompanying the design documents, include site factors appropriate for the Site Class provided by the user, calculate a response spectrum that includes the site factor, and plot the response spectrum. Sites may be located by providing the latitude-longitude or zip code for all areas covered by the maps. All of the maps used in the various documents are also included on the CD-ROM.
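    The calculation the CD-ROM automates, from mapped spectral accelerations plus site factors to a plotted response spectrum, follows the generic NEHRP-style two-parameter spectral shape. The Python sketch below shows that shape; the mapped values Ss and S1 and the site coefficients Fa and Fv used here are illustrative numbers, not values taken from the maps or the software.

```python
def design_spectrum(Ss, S1, Fa, Fv):
    """Return a function Sa(T) for a generic NEHRP-style design spectrum.

    Ss, S1 : mapped short-period and 1-s spectral accelerations (g)
    Fa, Fv : site coefficients for the user's Site Class
    """
    SDS = (2.0 / 3.0) * Fa * Ss      # design short-period acceleration
    SD1 = (2.0 / 3.0) * Fv * S1      # design 1-s acceleration
    Ts = SD1 / SDS                   # corner period of the plateau
    T0 = 0.2 * Ts

    def Sa(T):
        if T < T0:                    # rising branch from 0.4 SDS
            return SDS * (0.4 + 0.6 * T / T0)
        if T <= Ts:                   # constant-acceleration plateau
            return SDS
        return SD1 / T                # descending 1/T branch
    return Sa

# illustrative inputs: Ss = 1.5 g, S1 = 0.6 g, soft-soil-like coefficients
Sa = design_spectrum(Ss=1.5, S1=0.6, Fa=1.0, Fv=1.5)
```

    Evaluating Sa over a range of periods reproduces the familiar three-branch spectrum; changing Fa and Fv shows how the Site Class shifts the plateau and the long-period branch.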

  19. Unsupervised seismic facies analysis with spatial constraints using regularized fuzzy c-means

    NASA Astrophysics Data System (ADS)

    Song, Chengyun; Liu, Zhining; Cai, Hanpeng; Wang, Yaojun; Li, Xingming; Hu, Guangmin

    2017-12-01

    Seismic facies analysis techniques combine classification algorithms and seismic attributes to generate a map that describes the main reservoir heterogeneities. However, most current classification algorithms treat the seismic attributes as isolated data regardless of their spatial locations, and the resulting map is generally sensitive to noise. In this paper, a regularized fuzzy c-means (RegFCM) algorithm is used for unsupervised seismic facies analysis. Due to the regularized term of the RegFCM algorithm, data whose spatial neighbours belong to the same class play a more important role in the iterative process than other data. Therefore, the method can reduce the effect of seismic data noise present in discontinuous regions. Synthetic data with different signal-to-noise ratios are used to demonstrate the noise tolerance of the RegFCM algorithm. Meanwhile, the fuzzy factor, the neighbour window size and the regularized weight are tested over various values, to provide a reference for how to set these parameters. The new approach is also applied to a real seismic data set from the F3 block of the Netherlands. The results show improved spatial continuity, with clear facies boundaries and channel morphology, which reveals that the method is an effective seismic facies analysis tool.
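    The core of such an approach is the classic fuzzy c-means update (centroids from membership-weighted means, memberships from inverse distance ratios) plus a spatial term. The sketch below is a simplified stand-in for the paper's RegFCM, not a reproduction of it: after each membership update it blends each sample's memberships with those of its neighbours in trace order, controlled by a regularization weight `alpha`. All parameter values and the toy data are illustrative.

```python
import math
import random

def reg_fcm(points, c=2, m=2.0, alpha=0.3, iters=40, seed=1):
    """Fuzzy c-means with a simple spatial regularization: memberships are
    blended with those of neighbouring samples (here: adjacent indices,
    standing in for spatially adjacent traces)."""
    random.seed(seed)
    n, dim = len(points), len(points[0])
    norm = lambda row: [v / sum(row) for v in row]
    U = [norm([random.random() + 0.1 for _ in range(c)]) for _ in range(n)]
    for _ in range(iters):
        # centroid update: mean of the data weighted by U**m
        cents = []
        for j in range(c):
            w = [U[i][j] ** m for i in range(n)]
            tot = sum(w)
            cents.append([sum(w[i] * points[i][d] for i in range(n)) / tot
                          for d in range(dim)])
        # membership update from inverse distance ratios
        for i in range(n):
            d = [max(math.dist(points[i], cents[j]), 1e-12) for j in range(c)]
            U[i] = [1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                              for k in range(c)) for j in range(c)]
        # spatial regularization: blend with neighbouring memberships
        U = [norm([(1.0 - alpha) * U[i][j]
                   + alpha * U[max(i - 1, 0)][j] / 2.0
                   + alpha * U[min(i + 1, n - 1)][j] / 2.0
                   for j in range(c)])
             for i in range(n)]
    return cents, U

# two spatially contiguous groups of attribute vectors
points = ([(0.1 * i, 0.0) for i in range(20)]
          + [(10.0 + 0.1 * i, 10.0) for i in range(20)])
cents, U = reg_fcm(points)
```

    With well-separated groups the memberships converge to near-crisp values; the smoothing step is what damps isolated noisy samples relative to plain fuzzy c-means.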

  20. Comparing Low-Frequency Earthquakes During Triggered and Ambient Tremor in Taiwan

    NASA Astrophysics Data System (ADS)

    Alvarado Lara, F., Sr.; Ledezma, C., Sr.

    2014-12-01


  1. Revision of the Applicability of the NGA's in South America, Chile - Argentina.

    NASA Astrophysics Data System (ADS)

    Alvarado Lara, F., Sr.; Ledezma, C., Sr.

    2015-12-01

    In South America, the largest seismic events originate in the subduction zone between the Nazca and South American plates, as opposed to crustal events. Crustal seismic events are important in areas very close to active fault lines; however, seismic hazard analyses incorporate crustal events up to a maximum distance from the site under study. In order to use crustal events as part of a seismic hazard analysis, it is necessary to use attenuation relationships which represent the seismic behavior of the site under study. Unfortunately, in South America the amount of compiled historical data for crustal events is not yet sufficient to generate a firm regional attenuation relationship. In the absence of attenuation relationships for crustal earthquakes in the region, the conventional approach is to use attenuation relationships from other regions which have a large amount of compiled data and seismic conditions similar to the site under study. This practice permits seismic hazard analysis with a certain margin of accuracy. In South American engineering practice, the new generation attenuation relationships (NGA-W) are used, among other alternatives, to incorporate the effect of crustal events in a seismic hazard analysis. In 2014, NGA-W Version 2 (NGA-W2) was presented, with a database containing information from Taiwan, Turkey, Iran, USA, Mexico, Japan, and Alaska. This paper examines whether it is acceptable to use NGA-W2 in seismic hazard analysis in South America. To support the examination, seismic-hazard response spectra prepared in accordance with NGA-W2 are compared with actual response spectra of crustal events from Argentina. The seismic data were gathered from equipment installed in the cities of Santiago, Chile and Mendoza, Argentina.
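    The attenuation relationships in question all share the same basic behaviour: ground motion increases with magnitude and decays with distance. The sketch below uses a deliberately generic textbook-style functional form with hypothetical placeholder coefficients; it is not an NGA-W2 model (the real models carry many additional terms for site response, faulting style, and so on), only an illustration of what "attenuation relationship" means operationally.

```python
import math

def ln_pga(M, R, c0=-4.5, c1=1.0, c2=1.3, c3=10.0):
    """Hypothetical attenuation relationship: ln PGA (g) as a function of
    moment magnitude M and distance R (km). Coefficients are placeholders,
    not fitted values from any published model."""
    return c0 + c1 * M - c2 * math.log(R + c3)

# larger magnitude -> stronger shaking; larger distance -> weaker shaking
pga_near = math.exp(ln_pga(M=7.0, R=10.0))
pga_far = math.exp(ln_pga(M=7.0, R=100.0))
```

    Comparing such a curve against recorded spectra from local crustal events is, in miniature, the kind of check the paper performs against the Argentinian data.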

  2. Slope Stability Analysis In Seismic Areas Of The Northern Apennines (Italy)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo Presti, D.; Fontana, T.; Marchetti, D.

    2008-07-08

    Several research works have been published on slope stability in northern Tuscany (central Italy), particularly in the seismic areas of Garfagnana and Lunigiana (Lucca and Massa-Carrara districts), aimed at analysing slope stability under static and dynamic conditions and mapping the landslide hazard. In addition, in situ and laboratory investigations are available for the study area, thanks to the activities undertaken by the Tuscany Seismic Survey. Based on this wealth of information, the co-seismic stability of a few idealized slope profiles has been analysed by means of the limit equilibrium method (LEM, pseudo-static) and Newmark sliding block analysis (pseudo-dynamic). The analysis results gave indications about the most appropriate seismic coefficient to be used in pseudo-static analysis after establishing an allowable permanent displacement. These indications are discussed in light of the Italian and European prescriptions for seismic stability analysis with the pseudo-static approach. The stability conditions obtained from these analyses could be used to define microzonation criteria for the study area.
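    The Newmark sliding-block procedure has a compact numerical core: the block accumulates relative velocity whenever the ground acceleration exceeds the yield acceleration of the slope, decelerates at the yield level otherwise, and the permanent displacement is the integral of that relative velocity. A minimal Python sketch follows; the rectangular pulse and the yield level are illustrative inputs, not values from the study.

```python
def newmark_displacement(acc, dt, ay):
    """Permanent displacement (m) of a rigid sliding block.

    acc : ground acceleration history (m/s^2)
    dt  : time step (s)
    ay  : yield acceleration of the slope (m/s^2)
    """
    v = d = 0.0
    for a in acc:
        if v > 0.0 or a > ay:          # block is sliding (or starts to)
            v = max(v + (a - ay) * dt, 0.0)
            d += v * dt
    return d

# rectangular 0.3 g pulse lasting 1 s against a 0.1 g yield acceleration;
# the closed-form answer for this simple case is 2.943 m
g, dt = 9.81, 1e-3
acc = [0.3 * g] * 1000 + [0.0] * 4000
d = newmark_displacement(acc, dt, 0.1 * g)
```

    Running the same routine over a suite of records while varying `ay` is exactly how a seismic coefficient can be tied to an allowable permanent displacement.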

  3. Spatial pattern recognition of seismic events in South West Colombia

    NASA Astrophysics Data System (ADS)

    Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber

    2013-09-01

    Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in the instrumental data of the South West Colombian seismic database. In this research, clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of centroids, we propose a methodology for initializing centroids that generates partitions stable with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South-West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretation of seismic activity in South West Colombia.

  4. Signal-to-noise ratio application to seismic marker analysis and fracture detection

    NASA Astrophysics Data System (ADS)

    Xu, Hui-Qun; Gui, Zhi-Xian

    2014-03-01

    Seismic data with high signal-to-noise ratios (SNRs) are useful in reservoir exploration. To obtain high-SNR seismic data, significant effort is required to achieve noise attenuation in seismic data processing, which is costly in materials and in human and financial resources. We introduce a method for improving the SNR of seismic data, in which the SNR is calculated using a frequency-domain method. Furthermore, we optimize and discuss the critical parameters and the calculation procedure. We applied the proposed method to real data and found that the SNR is high at the seismic marker and low in the fracture zone. Consequently, the method can be used to extract detailed information about fracture zones that are inferred by structural analysis but not observed in conventional seismic data.
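    A frequency-domain SNR estimate of this kind amounts to comparing spectral power inside a signal band with the average power outside it. The self-contained sketch below illustrates the idea with a naive DFT so it needs no external libraries; the band limits and the synthetic test trace are illustrative assumptions, not the paper's actual parameters.

```python
import cmath
import math
import random

def power_spectrum(x):
    """Naive DFT power spectrum, |X_k|^2 for k = 0 .. N-1 (O(N^2))."""
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) ** 2 for k in range(N)]

def snr_db(x, fs, band):
    """SNR (dB): power inside `band` (Hz) versus mean out-of-band power
    scaled to the band width, using the positive-frequency half only."""
    N = len(x)
    P = power_spectrum(x)[:N // 2]
    freqs = [k * fs / N for k in range(N // 2)]
    sig = [p for f, p in zip(freqs, P) if band[0] <= f <= band[1]]
    noise = [p for f, p in zip(freqs, P) if not band[0] <= f <= band[1]]
    noise_in_band = len(sig) * sum(noise) / len(noise)
    return 10.0 * math.log10(sum(sig) / noise_in_band)

# 12.5 Hz sine sampled at 100 Hz, with weak uniform noise
random.seed(0)
fs, N = 100.0, 256
x = [math.sin(2 * math.pi * 12.5 * n / fs) + 0.01 * (random.random() - 0.5)
     for n in range(N)]
snr = snr_db(x, fs, band=(10.0, 15.0))
```

    Sliding such an estimator along a section produces exactly the kind of map described above: high values where a coherent marker dominates the band, low values in noisy fractured zones.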

  5. Enhanced characterization of faults and fractures at EGS sites by CO2 injection coupled with active seismic monitoring, pressure-transient testing, and well logging

    NASA Astrophysics Data System (ADS)

    Oldenburg, C. M.; Daley, T. M.; Borgia, A.; Zhang, R.; Doughty, C.; Jung, Y.; Altundas, B.; Chugunov, N.; Ramakrishnan, T. S.

    2016-12-01

    Faults and fractures in geothermal systems are difficult to image and characterize because they are nearly indistinguishable from host rock using traditional seismic and well-logging tools. We are investigating the use of CO2 injection and production (push-pull) in faults and fractures for contrast enhancement, allowing better characterization by active seismic methods, well logging, and push-pull pressure-transient analysis. Our approach consists of numerical simulation and feasibility assessment using conceptual models of potential enhanced geothermal system (EGS) sites such as Brady's Hot Spring and others. Faults in the deep subsurface typically have associated damage and gouge zones that provide a larger volume for uptake of CO2 than the slip plane alone. CO2 injected for push-pull well testing preferentially flows in the fault and fractures because CO2 is non-wetting relative to water and the permeability of open fractures and fault gouge is much higher than that of the matrix. We are carrying out numerical simulations of injection and withdrawal of CO2 using TOUGH2/ECO2N. Simulations show that CO2 flows into the slip plane and the gouge and damage zones and is driven upward by buoyancy during the push cycle over day-long time scales. Recovery of CO2 during the pull cycle is limited because of buoyancy effects. We then use the CO2 saturation field simulated by TOUGH2 as input to our anisotropic finite-difference code (derived from SPICE, with modifications for fracture compliance) to model elastic wave propagation. Results show time-lapse differences in seismic response using a surface source, and suggest that CO2 can be best imaged using time-lapse differencing of the P-wave and P-to-S-wave scattering in a vertical seismic profile (VSP) configuration.
Wireline well-logging tools that measure electrical conductivity show promise as another means to detect and image the CO2-filled fracture near the injection well and potential monitoring well(s), especially if a saline-water pre-flush is carried out to enhance conductivity contrast. Pressure-transient analysis is also carried out to further constrain fault zone characteristics. These multiple complementary characterization approaches are being used to develop working models of fault and fracture zone characteristics relevant to EGS energy recovery.

  6. Analysis of the Earthquake Impact towards water-based fire extinguishing system

    NASA Astrophysics Data System (ADS)

    Lee, J.; Hur, M.; Lee, K.

    2015-09-01

    Recently, fire-extinguishing systems installed in buildings have been given separate performance requirements for earthquakes: the extinguishing function must be maintained until the building collapses during the earthquake. In particular, automatic sprinkler systems must keep their piping watertight and undamaged even after a massive earthquake. In this study, experiments were performed to assess the effect of earthquakes on the piping of water-based fire-extinguishing systems installed in buildings. Test structures with water-based fire-extinguishing piping were built with step-by-step levels of seismic construction and subjected to seismic testing, and the earthquake response of the extinguishing piping was measured. The magnitude of the acceleration and displacement caused by the applied vibration was measured and compared with the response of the piping, in order to analyse where the water piping needs seismic enhancement. Seismic design categories (SDC) are defined for four groups of building structures designed according to the Korean seismic criteria (KBC 2009), based on the importance group and the earthquake intensity. The analysis indicates that, in the event of a real earthquake, current fire-fighting facilities in buildings of seismic design categories A and B can retain their seismic performance, whereas buildings in categories C and D require seismic retrofit design to preserve the extinguishing function at the required level.

  7. New discovered Izmir and Busan Mud Volcanoes and Application of Seismic Attributes and AVO Analysis in the Easternmost Black Sea.

    NASA Astrophysics Data System (ADS)

    Okay, S.; Cifci, G.; Ozel, S.; Atgin, O.; Ozel, O.; Barin, B.; Er, M.; Dondurur, D.; Kucuk, M.; Gurcay, S.; Choul Kim, D.; Sung-Ho, B.

    2012-04-01

    Recently, the continental margins of the Black Sea have become important for their gas content. There has been no scientific research offshore of the Trabzon-Giresun area apart from oil-company exploration; this is the first survey performed in that area. In total, 1700 km of high-resolution multichannel seismic and chirp data were collected simultaneously onboard the R/V K. Piri Reis. The seismic data reveal BSRs, bright spots and acoustic masking, especially in the eastern part of the survey area. The survey area in the Eastern Black Sea includes the continental slope, apron and deep basin. Two mud volcanoes were discovered and named Busan and Izmir. The observed fold belt is believed to be the main driving force for the growth of the mud volcanoes. Faults have developed at the flanks of the diapiric uplift. Seismic attributes and AVO analysis were applied to 9 seismic sections with probable gassy sediments and BSR zones. In the seismic attribute analysis, high-amplitude horizons with reverse polarity are observed in the instantaneous frequency, envelope and apparent polarity sections, together with low frequencies in the instantaneous frequency sections. These analyses verify the existence of gas accumulations in the sediments. AVO analysis, cross-plotting and gradient analysis show a Class 1 AVO anomaly and indicate gas in the sediments. Keywords: BSR, Bright spot, Mud volcano, Seismic Attributes, AVO
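    The Class 1 designation comes from the intercept-gradient description of reflection amplitude versus angle, R(θ) ≈ A + B sin²θ (the two-term Shuey approximation): Class 1 anomalies have a strong positive intercept A and a negative gradient B, so amplitude dims with offset. The sketch below illustrates that classification; the ±0.02 near-zero threshold separating Class 2 is an illustrative choice, not a standard constant.

```python
import math

def shuey_reflectivity(A, B, theta_deg):
    """Two-term Shuey approximation: R(theta) = A + B * sin^2(theta)."""
    s = math.sin(math.radians(theta_deg))
    return A + B * s * s

def avo_class(A, B, near_zero=0.02):
    """Conventional AVO classes from intercept A and gradient B."""
    if abs(A) <= near_zero and B < 0:
        return 2                     # near-zero intercept, dimming
    if A > 0 and B < 0:
        return 1                     # strong positive intercept, dimming
    if A < 0 and B < 0:
        return 3                     # negative intercept, further dimming
    return 4                         # negative intercept, rising gradient

# Class 1 example: positive intercept, amplitude decreasing with angle
r0 = shuey_reflectivity(0.10, -0.30, 0.0)
r30 = shuey_reflectivity(0.10, -0.30, 30.0)
```

    Cross-plotting picked (A, B) pairs from the nine sections against these quadrants is what places the observed anomaly in Class 1.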

  8. SWRT: A package for semi-analytical solutions of surface wave propagation, including mode conversion, across transversely aligned vertical discontinuities

    NASA Astrophysics Data System (ADS)

    Datta, Arjun

    2018-03-01

    We present a suite of programs that implement decades-old algorithms for computation of seismic surface wave reflection and transmission coefficients at a welded contact between two laterally homogeneous quarter-spaces. For Love as well as Rayleigh waves, the algorithms are shown to be capable of modelling multiple mode conversions at a lateral discontinuity, which was not shown in the original publications or in the subsequent literature. Only normal incidence at a lateral boundary is considered so there is no Love-Rayleigh coupling, but incidence of any mode and coupling to any (other) mode can be handled. The code is written in Python and makes use of SciPy's Simpson's rule integrator and NumPy's linear algebra solver for its core functionality. Transmission-side results from this code are found to be in good agreement with those from finite-difference simulations. In today's research environment of extensive computing power, the coded algorithms are arguably redundant but SWRT can be used as a valuable testing tool for the ever evolving numerical solvers of seismic wave propagation. SWRT is available via GitHub (https://github.com/arjundatta23/SWRT.git).

  9. Nonlinear to Linear Elastic Code Coupling in 2-D Axisymmetric Media.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preston, Leiph

    Explosions within the earth nonlinearly deform the local media, but at typical seismological observation distances the seismic waves can be considered linear. Although nonlinear algorithms can simulate explosions in the very near field well, these codes are computationally expensive and inaccurate when propagating signals to great distances. A linearized wave propagation code, coupled to a nonlinear code, provides an efficient mechanism both to accurately simulate the explosion itself and to propagate the signals to distant receivers. To this end we have coupled Sandia's nonlinear simulation algorithm CTH to a linearized elastic wave propagation code for 2-D axisymmetric media (axiElasti) by passing information from the nonlinear to the linear code via time-varying boundary conditions. In this report, we first develop the 2-D axisymmetric elastic wave equations in cylindrical coordinates. Next we show how we design the time-varying boundary conditions passing information from CTH to axiElasti, and finally we demonstrate the coupled codes via a simple study of the elastic radius.
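    The coupling idea, recording motion on a surface enclosing the nonlinear region and replaying it as a time-varying boundary condition for the linear solver, can be illustrated in one dimension. In the sketch below a Gaussian source time function stands in for the nonlinear code's recorded output; the grid uses a Courant number of one, for which this finite-difference scheme transports the boundary signal exactly. All sizes are illustrative, and this is a toy analogue of the CTH/axiElasti handoff, not either code.

```python
import math

def propagate_from_boundary(src, nx, nt, courant=1.0):
    """Linear 1-D wave equation driven purely by a time-varying Dirichlet
    condition at the left boundary (a stand-in for the nonlinear code's
    recorded motion). Returns the space-time field u[t][x]."""
    C2 = courant ** 2
    u = [[0.0] * nx for _ in range(nt)]
    for t in range(1, nt - 1):
        u[t][0] = src[t]                     # time-varying boundary condition
        for x in range(1, nx - 1):
            # standard second-order leapfrog update of the wave equation
            u[t + 1][x] = (2 * u[t][x] - u[t - 1][x]
                           + C2 * (u[t][x + 1] - 2 * u[t][x] + u[t][x - 1]))
    return u

# pulse fed in at the boundary; with courant == 1 it arrives at interior
# node x after exactly x time steps, amplitude intact
nt, nx = 120, 60
src = [math.exp(-0.5 * ((t - 20) / 4.0) ** 2) for t in range(nt)]
u = propagate_from_boundary(src, nx, nt)
```

    The same pattern scales up: the linear solver never needs to know anything about the nonlinear physics, only the motion history imposed on the coupling surface.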

  10. Metadata Management on the SCEC PetaSHA Project: Helping Users Describe, Discover, Understand, and Use Simulation Data in a Large-Scale Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.

    2007-12-01

    Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but is needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. 
Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
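    The plain-text attribute-value pairing used for the seismological metadata can be handled with a few lines of parsing. The sketch below assumes a hypothetical `key = value` layout with `#` comment lines; both the layout and the example keys are illustrative assumptions, not the actual PetaSHA format.

```python
def parse_metadata(text):
    """Parse hypothetical 'key = value' metadata lines into a dict,
    skipping blank lines and '#' comments."""
    meta = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:                          # keep only well-formed pairs
            meta[key.strip()] = value.strip()
    return meta

example = """
# hypothetical simulation metadata
event = scenario_01
grid_spacing_m = 100
min_vs_m_per_s = 500
"""
meta = parse_metadata(example)
```

    The "more advanced handling" the text calls for typically means replacing such ad hoc pairs with a schema-validated format, so that downstream codes can rely on types and required fields rather than string conventions.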

  11. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering. Microscopically, its results provide parameters for seismic design; macroscopically, it is requisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by a spatial analysis model constructed in ArcGIS's Model Builder platform.
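    A fuzzy comprehensive evaluation of this kind combines a weight vector over the indices with a membership matrix mapping each index to hazard grades. The minimal sketch below uses three invented indices and three grades as a stand-in for the paper's 11-index model; the weights, memberships, and grade names are all illustrative.

```python
def fuzzy_evaluate(weights, membership, grades):
    """Weighted-average fuzzy comprehensive evaluation: score each grade as
    the weight-weighted sum of index memberships, then pick the top grade."""
    scores = [sum(w * row[j] for w, row in zip(weights, membership))
              for j in range(len(grades))]
    return grades[scores.index(max(scores))], scores

# three illustrative indices (e.g. fault density, historical seismicity,
# gravity anomaly gradient) rated against three hazard grades
weights = [0.5, 0.3, 0.2]                 # index importance, sums to 1
membership = [                            # rows: indices, cols: grades
    [0.1, 0.3, 0.6],
    [0.2, 0.5, 0.3],
    [0.6, 0.3, 0.1],
]
grade, scores = fuzzy_evaluate(weights, membership, ["low", "medium", "high"])
```

    In the GIS setting, each spatial cell carries its own membership matrix derived from the index layers, and the evaluation assigns the cell a hazard grade.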

  12. Review on Rapid Seismic Vulnerability Assessment for Bulk of Buildings

    NASA Astrophysics Data System (ADS)

    Nanda, R. P.; Majhi, D. R.

    2013-09-01

    This paper provides a brief overview of rapid visual screening (RVS) procedures available in different countries, with a comparison among all the methods. Seismic evaluation guidelines from the USA, Canada, Japan, New Zealand, India, Europe, Italy and the UNDP, along with other methods, are reviewed from the perspective of their applicability to developing countries. The review shows clearly that some of the RVS procedures are unsuited for potential use in developing countries. It is expected that this comparative assessment of various evaluation schemes will help to identify the most essential components of a procedure for use in India and other developing countries that is not only robust and reliable but also easy to use with available resources. It appears that the Federal Emergency Management Agency (FEMA) 154 and New Zealand Draft Code approaches can be suitably combined to develop a transparent, reasonably rigorous and generalized procedure for seismic evaluation of buildings in developing countries.

  13. Challenges in making a seismic hazard map for Alaska and the Aleutians

    USGS Publications Warehouse

    Wesson, R.L.; Boyd, O.S.; Mueller, C.S.; Frankel, A.D.; Freymueller, J.T.

    2008-01-01

    We present a summary of the data and analyses leading to the revision of the time-independent probabilistic seismic hazard maps of Alaska and the Aleutians. These maps represent a revision of existing maps based on newly obtained data, and reflect best current judgments about methodology and approach. They have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States, and will be proposed for adoption in future revisions to the International Building Code. We present example maps for peak ground acceleration, 0.2 s spectral amplitude (SA), and 1.0 s SA at a probability level of 2% in 50 years (annual probability of 0.000404). In this summary, we emphasize issues encountered in preparation of the maps that motivate or require future investigation and research.
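The two probability figures quoted above are linked by a standard Poisson-process identity: a 2% chance of exceedance in 50 years corresponds to an annual rate of about 0.000404. A quick check (this is the general identity, not code from the report):

```python
import math

# Annual exceedance rate equivalent to probability p_exceed in `years`,
# assuming Poisson occurrence: lambda = -ln(1 - P) / T.
def annual_rate(p_exceed, years):
    return -math.log(1.0 - p_exceed) / years

lam = annual_rate(0.02, 50)    # ~4.04e-4 per year, as quoted
return_period = 1.0 / lam      # the familiar ~2475-year return period
```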

  14. Spatial Temporal Analysis Of Mine-induced Seismicity

    NASA Astrophysics Data System (ADS)

    Fedotova, I. V.; Yunga, S. L.

    The results of an analysis of the influence of mine-induced seismicity on the state of stress of a rock mass are presented. A spatial-temporal analysis of the influence of mass explosions on rock-mass deformation was carried out for the Yukspor wing of the Joined Kirovsk mine of JSC "Apatite". The influence of mass explosions on the massif was estimated, first, from the parameters of the natural seismic regime and, second, by taking into account changes in seismic energy release. After long series of explosions, variations in the average number of seismic events were recorded. It is shown that as the volume of rock involved in deformation increases, the energy released by seismic events and the characteristic time intervals of their preparation also vary. At the same time, the mechanism of failure changes: from shear-separation failure to failure within a quasi-solid heterogeneous massif (in oxidized zones and zones of reactivated faults). Analysis of a database of massif seismicity from 1993 to 1999 confirmed that the response of the massif to explosions is related to the stress-strain state of the massif and to the mining parameters. Analysis of the spatial-temporal distribution of hypocenters of seismic events made it possible to identify the migration of active regions of failure after mass explosions. This research was supported by the Russian Foundation for Basic Research, projects 00-05-64758 and 01-05-65340.

  15. Regional Observation of Seismic Activity in Baekdu Mountain

    NASA Astrophysics Data System (ADS)

    Kim, Geunyoung; Che, Il-Young; Shin, Jin-Soo; Chi, Heon-Cheol

    2015-04-01

    Seismic unrest in the Baekdu Mountain area, between North Korea and Northeast China, has drawn the attention of the geological research community in Northeast Asia because of the mountain's historical and cultural importance. Seismic bulletins show that the level of seismic activity in the area is higher than that of Jilin Province in Northeast China. Local volcanic observations show symptoms of magmatic unrest in the period between 2002 and 2006. Regional seismic data have been used to analyze the seismic activity of the area, and this analysis allows the activity to be differentiated from other seismic phenomena in the region.

  16. Modeling and inversion Matlab algorithms for resistivity, induced polarization and seismic data

    NASA Astrophysics Data System (ADS)

    Karaoulis, M.; Revil, A.; Minsley, B. J.; Werkema, D. D.

    2011-12-01

    M. Karaoulis (1), D.D. Werkema (3), A. Revil (1,2), B. Minsley (4). (1) Colorado School of Mines, Dept. of Geophysics, Golden, CO, USA. (2) ISTerre, CNRS, UMR 5559, Université de Savoie, Equipe Volcan, Le Bourget du Lac, France. (3) U.S. EPA, ORD, NERL, ESD, CMB, Las Vegas, Nevada, USA. (4) USGS, Federal Center, Lakewood, CO 80225-0046. Abstract: We propose a 2D and 3D forward modeling and inversion package for DC resistivity, time-domain induced polarization (IP), frequency-domain IP, and seismic refraction data. For the resistivity and IP case, discretization is based on rectangular cells, where each cell has an unknown resistivity in the case of DC modelling, a resistivity and chargeability in time-domain IP modelling, and a complex resistivity in spectral IP modelling. The governing partial-differential equations are solved with the finite element method, which can handle both the real and complex variables that are solved for. For the seismic case, forward modeling is based on solving the eikonal equation using a second-order fast marching method. The wavepaths are materialized by Fresnel volumes rather than by conventional rays. This approach accounts for complicated velocity models and is advantageous because it considers frequency effects on the velocity resolution. The inversion can accommodate data at a single time step, or as a time-lapse dataset if the geophysical data are gathered for monitoring purposes. The aim of time-lapse inversion is to find the change in the velocities or resistivities of each model cell as a function of time. Different time-lapse algorithms can be applied, such as independent inversion, difference inversion, 4D inversion, and 4D active time constraint inversion. The forward algorithms are benchmarked against analytical solutions, and inversion results are compared with existing ones. The algorithms are packaged as Matlab codes with a simple Graphical User Interface. 
Although the code is parallelized for multi-core CPUs, it is not as fast as compiled machine code; for large datasets, one should consider porting parts of the code to C or Fortran through MEX files. The code is available through the EPA's website at http://www.epa.gov/esd/cmb/GeophysicsWebsite/index.html. Although this work was reviewed by the EPA and approved for publication, it may not necessarily reflect official Agency policy.
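The first-arrival travel-time computation at the heart of the seismic module can be illustrated compactly. The sketch below is a crude graph-based (Dijkstra, 8-connected) stand-in for the package's second-order fast marching eikonal solver, on an assumed uniform grid; it illustrates the idea only and does not reproduce fast-marching accuracy:

```python
import heapq

# Crude first-arrival travel times on a 2D grid via Dijkstra's algorithm.
# Edge cost = geometric step length times the average slowness of the two
# cells. This is an illustrative stand-in, not the fast marching method.
def travel_times(vel, src, h=1.0):
    ny, nx = len(vel), len(vel[0])
    INF = float("inf")
    t = [[INF] * nx for _ in range(ny)]
    t[src[0]][src[1]] = 0.0
    pq = [(0.0, src[0], src[1])]
    while pq:
        ti, i, j = heapq.heappop(pq)
        if ti > t[i][j]:
            continue  # stale queue entry
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if di == 0 and dj == 0:
                    continue
                ni, nj = i + di, j + dj
                if 0 <= ni < ny and 0 <= nj < nx:
                    d = h * (di * di + dj * dj) ** 0.5
                    slow = 0.5 * (1.0 / vel[i][j] + 1.0 / vel[ni][nj])
                    cand = ti + d * slow
                    if cand < t[ni][nj]:
                        t[ni][nj] = cand
                        heapq.heappush(pq, (cand, ni, nj))
    return t

# Homogeneous model: along a grid row, time = distance / velocity exactly.
v = [[2.0] * 5 for _ in range(5)]
tt = travel_times(v, (0, 0), h=1.0)
```

A real eikonal solver avoids the angular discretization error of this graph approach, which is precisely why the package uses fast marching.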

  17. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. 
Site condition and fault type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
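The core Monte Carlo loop described above can be sketched in a few lines: simulate catalogs, attenuate each event to the site, and count exceedances. All numerical values below (activity rate, b-value, attenuation coefficients, aleatory sigma, distance range) are invented placeholders, not the Aegean source or ground-motion models of the study:

```python
import math, random

# Toy Monte Carlo PSHA: many simulated 50-year catalogs from a truncated
# Gutenberg-Richter source, a placeholder attenuation relation with
# aleatory scatter, and a count of PGA exceedances at the site.
random.seed(42)

def gr_magnitude(b=1.0, mmin=4.5, mmax=7.5):
    # inverse-CDF sample of a truncated Gutenberg-Richter distribution
    u = random.random()
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (mmax - mmin))
    return mmin - math.log(1.0 - u * c) / beta

def pga_gal(m, r_km):
    # placeholder attenuation: ln PGA = a + b*M - c*ln(R) + aleatory term
    ln_pga = -1.0 + 1.2 * m - 1.1 * math.log(r_km) + random.gauss(0.0, 0.6)
    return math.exp(ln_pga)

def exceedance_prob(threshold_gal, years=50, rate_per_yr=2.0, nsim=500):
    hits = 0
    for _ in range(nsim):
        t, exceeded = 0.0, False
        while not exceeded:
            t += random.expovariate(rate_per_yr)  # Poisson inter-event time
            if t > years:
                break
            m = gr_magnitude()
            r = random.uniform(10.0, 100.0)  # site-to-source distance, km
            exceeded = pga_gal(m, r) > threshold_gal
        hits += exceeded
    return hits / nsim

p_low = exceedance_prob(50.0)    # modest shaking, frequently exceeded
p_high = exceedance_prob(500.0)  # strong shaking, rarely exceeded
```

Epistemic uncertainty enters, as in the paper, by repeating this loop over alternative source and attenuation models and combining the results.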

  18. Preliminary Analysis of Remote Triggered Seismicity in Northern Baja California Generated by the 2011, Tohoku-Oki, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Wong-Ortega, V.; Castro, R. R.; Gonzalez-Huizar, H.; Velasco, A. A.

    2013-05-01

    We analyze possible variations of seismicity in northern Baja California due to the passage of seismic waves from the 2011, M9.0, Tohoku-Oki, Japan earthquake. The northwestern area of Baja California is characterized by a mountain range composed of crystalline rocks. These Peninsular Ranges of Baja California exhibit high microseismic activity and moderate-size earthquakes. In the eastern region of Baja California, shearing between the Pacific and North American plates takes place, and the Imperial and Cerro Prieto faults generate most of the seismicity. The seismicity in these regions is monitored by the seismic network RESNOM, operated by the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE). This network consists of 13 three-component seismic stations. We use the seismic catalog of RESNOM to search for changes in local seismic rates after the passage of surface waves generated by the Tohoku-Oki earthquake. Comparing one month of seismicity before and after the M9.0 earthquake, the preliminary analysis shows an absence of triggered seismicity in the northern Peninsular Ranges and an increase of seismicity south of the Mexicali Valley, where the Imperial fault jumps southwest and the Cerro Prieto fault continues.
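A standard way to quantify this kind of before/after rate comparison is the beta-statistic of Matthews and Reasenberg (1988), widely used in remote-triggering studies. The sketch below uses invented event counts, not the RESNOM catalog values:

```python
import math

# Beta-statistic: is the post-mainshock event count anomalous relative to
# the rate implied by the whole window? tau is the fraction of total time
# falling after the mainshock. Counts below are hypothetical.
def beta_statistic(n_after, n_total, t_after, t_total):
    """beta = (Na - N*tau) / sqrt(N*tau*(1-tau)), tau = t_after/t_total."""
    tau = t_after / t_total
    expected = n_total * tau
    return (n_after - expected) / math.sqrt(n_total * tau * (1.0 - tau))

# 60 days of catalog: 40 events before, 80 after the surface-wave passage
beta = beta_statistic(n_after=80, n_total=120, t_after=30.0, t_total=60.0)
significant = beta > 2.0   # common rule of thumb for declaring triggering
```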

  19. An efficient implementation of 3D high-resolution imaging for large-scale seismic data with GPU/CPU heterogeneous parallel computing

    NASA Astrophysics Data System (ADS)

    Xu, Jincheng; Liu, Wei; Wang, Jin; Liu, Linong; Zhang, Jianfeng

    2018-02-01

    De-absorption pre-stack time migration (QPSTM) compensates for the absorption and dispersion of seismic waves by introducing an effective Q parameter, making it an effective tool for 3D, high-resolution imaging of seismic data. Although the optimal aperture obtained via stationary-phase migration reduces the computational cost of 3D QPSTM and yields 3D stationary-phase QPSTM, computational efficiency remains the main problem in processing 3D, high-resolution images from real large-scale seismic data. In this paper, we propose a division method for large-scale 3D seismic data to optimize the performance of stationary-phase QPSTM on clusters of graphics processing units (GPUs). We then design an imaging-point parallel strategy to achieve optimal parallel computing performance, and adopt an asynchronous double-buffering scheme for multiple streams to perform the GPU/CPU parallel computing. Moreover, several key optimization strategies for computation and storage based on the compute unified device architecture (CUDA) were adopted to accelerate the 3D stationary-phase QPSTM algorithm. Compared with the initial GPU code, the implementation of the key optimization steps, including thread optimization, shared-memory optimization, register optimization and special function units (SFU), greatly improved the efficiency. A numerical example employing real large-scale 3D seismic data showed that our scheme is nearly 80 times faster than the CPU-QPSTM algorithm. Our GPU/CPU heterogeneous parallel computing framework significantly reduces the computational cost and facilitates 3D high-resolution imaging of large-scale seismic data.
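The data-division idea can be illustrated with a simple tiling sketch: split the survey volume into sub-volumes small enough to fit one GPU's memory. The square-tile scheme and the memory budget below are assumptions for illustration; the paper's actual division method is tied to the migration aperture and is more elaborate:

```python
# Illustrative division of a large 3D volume into xy-tiles whose full-depth
# column of nz cells fits within a per-GPU cell budget. Sizes are invented.
def split_volume(nx, ny, nz, max_cells):
    """Return (x0, x1, y0, y1) tiles covering the nx-by-ny survey area."""
    # largest square xy-tile whose column of nz cells fits the budget
    tile = max(1, int((max_cells / nz) ** 0.5))
    tiles = []
    for x0 in range(0, nx, tile):
        for y0 in range(0, ny, tile):
            tiles.append((x0, min(x0 + tile, nx), y0, min(y0 + tile, ny)))
    return tiles

tiles = split_volume(nx=1000, ny=1000, nz=500, max_cells=8_000_000)
```

Each tile would then be migrated independently on one GPU, with the host overlapping transfers and kernels via the double-buffered streams described above.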

  20. MatSeis and the GNEM R&E regional seismic analysis tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chael, Eric Paul; Hart, Darren M.; Young, Christopher John

    2003-08-01

    To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The Matlab-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of some Knowledge Base products for which existing applications were either not available or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year. MatSeis was begun in 1996 and is now a fairly mature product. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g. filtering, three-component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of Matlab coding and the tremendous number of available functions make MatSeis/Matlab an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F statistic stream). 
Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999, and the tools vary in their level of maturity. All rely on MatSeis to provide necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.

  1. Accuracy of three-dimensional seismic ground response analysis in time domain using nonlinear numerical simulations

    NASA Astrophysics Data System (ADS)

    Liang, Fayun; Chen, Haibing; Huang, Maosong

    2017-07-01

    To provide appropriate use of nonlinear ground response analysis in engineering practice, a three-dimensional soil column with a distributed mass system and a time-domain numerical analysis were implemented on the OpenSees simulation platform. The standard mesh of the three-dimensional soil column was chosen so as to satisfy the specified maximum frequency. The layered soil column was divided into multiple sub-soils with different viscous damping matrices according to their shear velocities, as the soil properties differed significantly. It was necessary to use a combination of other one-dimensional or three-dimensional nonlinear seismic ground analysis programs to confirm the applicability of the nonlinear seismic ground motion response analysis procedures in soft soil or for strong earthquakes. The accuracy of the three-dimensional soil column finite element method was verified by dynamic centrifuge model testing under different peak earthquake accelerations. As a result, nonlinear seismic ground motion response analysis procedures were improved in this study, and the accuracy and efficiency of the three-dimensional seismic ground response analysis can be adapted to the requirements of engineering practice.
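The maximum-frequency mesh criterion mentioned above is commonly expressed as a points-per-wavelength rule. A quick sketch follows; the 8-10 points-per-wavelength figure is a standard rule of thumb in wave-propagation FE analysis (e.g. after Kuhlemeyer and Lysmer), not a value stated in this abstract:

```python
# Rule-of-thumb element-size check for wave-propagation FE meshes:
# the element must not exceed the shortest shear wavelength divided by
# ~8-10. Input values below are illustrative soft-soil assumptions.
def max_element_size(vs, f_max, points_per_wavelength=8):
    """Largest admissible element size [m] for shear velocity vs [m/s]."""
    wavelength = vs / f_max      # shortest wavelength of interest
    return wavelength / points_per_wavelength

h = max_element_size(vs=200.0, f_max=25.0)   # soft soil, 25 Hz target
```

For vs = 200 m/s and a 25 Hz target, the wavelength is 8 m and elements must be about 1 m or smaller, which is why soft layers dominate the mesh cost.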

  2. Magnitude Scaling of the early displacement for the 2007, Mw 7.8 Tocopilla sequence (Chile)

    NASA Astrophysics Data System (ADS)

    Lancieri, M.; Fuenzalida, A.; Ruiz, S.; Madariaga, R. I.

    2009-12-01

    We investigate the empirical relationships between the initial portion of the P- and S-phase and the final event magnitude, for the Tocopilla (Chile) event and its aftershocks. Such correlations, on which real-time magnitude estimation for seismic early warning is founded, have been widely studied on several data sets merging earthquakes generated in different tectonic settings and recorded with very different networks. The Tocopilla (Mw 7.8) earthquake, which occurred along the northern Chile seismic gap on 14 November 2007, provides, together with its aftershocks, a unique opportunity to study a data set that is homogeneous in terms of tectonic environment, focal mechanism, and recording network. The preliminary analysis required to build the seismic catalogue included the automatic identification of more than 570 aftershocks using an automatic phase detector and picker algorithm, and the subsequent location of the events with a non-linear, probabilistic code. The seismic moment (M0) was calculated by spectral modeling of P and S waves, assuming a Brune omega-square model; this analysis also yields values for the corner frequency and quality factor. The estimated moment magnitudes of the aftershock sequence range from 2.8 to 6.8. The correlation between the low-pass-filtered peak displacement (PD) and the final magnitude was investigated for 90 events with magnitude greater than 4. These include the main event, its largest aftershock (Mw 6.8, which occurred twenty-four hours after the main shock), and seven events with magnitude greater than 5.7. The recovered relationships confirm the observations of Zollo et al. [2006, 2007] of a clear correlation between distance-corrected PD and final magnitude in the magnitude range [4.0 - 7.4], when considering time windows of 4 s of P- or 2 s of S-wave. 
In contrast with previous studies, when examining time windows of 2 s of P-wave we surprisingly do not observe any saturation effect for magnitudes greater than 6.5, but rather a slope change in the regression curve. To explain the causative link between PD and the final magnitude, it has been argued that an earthquake rupture that initiates with a higher flow rate of elastic energy has an increased probability of propagating to longer distances. As a consequence, the stress drop and/or active slip surface must scale with seismic moment in the initial stage of seismic rupture. To test this hypothesis, we investigated the correlation between radiated energy (E), corner frequency (fc) and seismic moment (M0). A first analysis performed on the entire S-phase confirms that M0 is proportional to fc^-3, that the apparent stress drop does not depend on M0, and that E and M0 scale with a slope of 1.5. There thus appears to be no clear violation of self-similarity. The same relationships will be investigated on signal windows of 2 s of P- and 2 s of S-wave, as used for early warning.
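The Brune-model source parameters used in this kind of analysis follow from a handful of closed-form relations: source radius from corner frequency, stress drop from moment and radius, and moment magnitude from moment. A sketch with example input values (the 3500 m/s shear velocity and the sample event are assumptions, not data from the Tocopilla catalog):

```python
import math

# Standard Brune (1970) source relations, with example inputs.
def brune_radius(fc_hz, beta_ms=3500.0):
    """Source radius r = 0.372 * beta / fc (Brune model), in meters."""
    return 0.372 * beta_ms / fc_hz

def stress_drop(m0_nm, radius_m):
    """Stress drop = 7*M0 / (16*r^3), in Pa (circular crack)."""
    return 7.0 * m0_nm / (16.0 * radius_m ** 3)

def moment_magnitude(m0_nm):
    """Mw = (2/3) * (log10 M0 - 9.1), with M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

m0 = 1.0e16                       # example moment, N*m
r = brune_radius(fc_hz=2.0)       # ~651 m source radius
dsigma = stress_drop(m0, r)       # ~16 MPa stress drop
mw = moment_magnitude(m0)         # Mw 4.6
```

Self-similarity as tested in the abstract corresponds to a constant stress drop in these relations, i.e. M0 proportional to fc^-3.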

  3. The SISIFO project: Seismic Safety at High Schools

    NASA Astrophysics Data System (ADS)

    Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi; Dusi, Alberto; Grimaz, Stefano; Malisan, Petra; Saraò, Angela; Mucciarelli, Marco

    2014-05-01

    For many years, the Italian scientific community has addressed the reduction of earthquake risk using innovative educational techniques. Recent earthquakes in Italy and around the world have clearly demonstrated that seismic codes alone cannot guarantee effective mitigation of risk. After the tragic events of San Giuliano di Puglia (2002), where an earthquake killed 26 school children, special attention was paid in Italy to the seismic safety of schools, but mainly with respect to structural aspects. Little attention has been devoted to the possible, and even significant, damage to non-structural elements (collapse of ceilings, tipping of cabinets and shelving, obstruction of escape routes, etc.). Students and teachers trained on these aspects can provide very effective preventive vigilance. Since 2002, the EDURISK project (www.edurisk.it) has offered educational tools and training programs for schools at the primary and middle levels. More recently, a nationwide campaign aimed at adults (www.iononrischio.it) was launched with the extensive support of civil protection volunteers. A gap remained for high schools, and the SISIFO project was designed to fill this void, in particular for schools with technical/scientific curricula. SISIFO (https://sites.google.com/site/ogssisifo/) is a multidisciplinary initiative aimed at the diffusion of scientific culture for achieving seismic safety in schools; it is replicable and can be structured into training over the next several years. The students, helped by their teachers and by experts from scientific institutions, followed a specialized training course on earthquake safety. The trial began in North-East Italy, with a combination of hands-on activities for the measurement of earthquakes with low-cost instruments and lectures by experts in various disciplines, accompanied by specifically designed teaching materials in both paper and digital format. 
We intend to raise teachers' and students' knowledge of seismic hazard, the seismic response of foundation soils, and building dynamics, and to stimulate awareness of seismic safety in all its aspects: seismic hazard, seismic site response, the seismic behaviour of structural and non-structural elements, and functional issues (escape routes, emergency systems, etc.). Awareness of seismic safety in places of study, work and life aims at improving the capacity to recognize safety issues and possible solutions.

  4. Study of the Triggering Level of Dynamic Stress Induces Non-Volcanic Tremor in Longitudinal Valley in Eastern Taiwan

    NASA Astrophysics Data System (ADS)

    Sun, W. F.; Chang, W. Y.; Chen, H. Y.

    2015-12-01

    Taiwan is located at the margin of the Eurasian Plate and the Philippine Sea Plate, a subduction zone with complicated fault structures and dense seismicity, especially in the Longitudinal Valley (LV) of eastern Taiwan. Non-volcanic tremor (NVT) is a seismic signal with low amplitude and long duration. NVT typically occurs below the seismogenic zone, between the lower crust and the upper mantle, where body-wave arrival times are difficult to collect. This study therefore investigates the physical mechanisms of NVT in several steps. First, from the teleseismic earthquake data of the U.S. Geological Survey for 2005 to 2014, thirty-five potential teleseismic earthquakes are selected. Second, seismograms for these thirty-five events are collected from the Broadband Array in Taiwan for Seismology (BATS) and the Central Weather Bureau Seismic Network (CWBSN). Third, the Seismic Analysis Code is used to select seismograms from seven possible events that satisfied the conditions for triggering tremor during the passage of the surface wave. Fourth, a band-pass filter is applied to retain the 2-8 Hz range of the surface waveform. Finally, tremor signals are determined visually. The results show five certain NVT events and two potentially triggered events in the LV zone of eastern Taiwan. The hypocenters of these five certain events were then estimated using HYPO71. According to the estimated hypocenters, the sources of NVT lie beneath the southern region of the LV, close to the Chih-Shang fault; moreover, they fall within the high Vp/Vs-ratio region at depths of 30-40 km. 
Further analysis found that the amplitude of the surface wave is one of the key factors: a peak ground velocity > 0.02 cm/s, equivalent to 2-3 kPa of dynamic stress, might trigger tremors.
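The quoted equivalence between peak ground velocity and dynamic stress can be checked with the usual plane-wave approximation, stress = shear modulus times particle velocity over phase velocity. The modulus and phase velocity below are typical crustal assumptions, not values stated in the abstract:

```python
# Back-of-envelope dynamic stress carried by a surface wave:
# sigma = G * v / Vs (plane-wave approximation). G = 35 GPa and
# Vs = 3500 m/s are assumed typical crustal values.
def dynamic_stress_pa(pgv_m_s, shear_modulus_pa=35e9, phase_vel_m_s=3500.0):
    return shear_modulus_pa * pgv_m_s / phase_vel_m_s

sigma = dynamic_stress_pa(0.02e-2)   # PGV of 0.02 cm/s
```

With these assumptions a PGV of 0.02 cm/s corresponds to 2 kPa, consistent with the 2-3 kPa threshold quoted above.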

  5. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1×10⁻⁴, 4×10⁻⁵ and 1×10⁻⁵.
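The comparison against a target performance goal can be illustrated with a toy hazard-curve lookup: interpolate the mean hazard curve at the design ground motion and compare the resulting annual frequency with the goal. The hazard-curve points, the design motion, and the log-log interpolation choice below are all invented for illustration; they are not the SHPRM procedure or INL data:

```python
import math

# Toy risk comparison: log-log interpolate a mean hazard curve (ground
# motion in g, annual exceedance frequency) at a design ground motion and
# compare with a target performance goal. All numbers are hypothetical.
def annual_frequency(hazard_curve, gm):
    """Annual exceedance frequency at ground motion gm (log-log interp)."""
    for (x0, y0), (x1, y1) in zip(hazard_curve, hazard_curve[1:]):
        if x0 <= gm <= x1:
            f = (math.log(gm) - math.log(x0)) / (math.log(x1) - math.log(x0))
            return math.exp(math.log(y0) + f * (math.log(y1) - math.log(y0)))
    raise ValueError("gm outside curve range")

curve = [(0.1, 1e-2), (0.3, 1e-3), (0.6, 1e-4), (1.0, 1e-5)]
freq = annual_frequency(curve, 0.45)   # frequency at a 0.45 g design motion
meets_goal = freq <= 1e-4              # compare with a 1e-4 target goal
```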

  6. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1×10⁻⁴, 4×10⁻⁵ and 1×10⁻⁵.

  7. Web Services Provide Access to SCEC Scientific Research Application Software

    NASA Astrophysics Data System (ADS)

    Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.

    2003-12-01

    Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. 
For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the correct API interface from within C++ and/or C/Fortran). This poster presentation will provide descriptions of the following selected web services and their origin as scientific application codes: 3D community velocity models for Southern California, geocoordinate conversions (latitude/longitude to UTM), execution of GMT graphical scripts, data format conversions (Gocad to Matlab format), and implementation of Seismic Hazard Analysis application programs that calculate hazard curve and hazard map data sets.
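
    The "wrap an existing command-line code" pattern described above can be sketched with the Python standard library alone. The wrapped executable here is `tr`, a stand-in for a real scientific code; the port handling and the plain-text protocol are illustrative assumptions, not SCEC's actual servlet or SOAP/WSDL framework.

```python
import subprocess
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a legacy scientific executable that reads stdin, writes stdout.
WRAPPED_CODE = ["tr", "a-z", "A-Z"]

class WrapperHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)
        # Run the original code unmodified, feeding the request body to stdin.
        result = subprocess.run(WRAPPED_CODE, input=payload, capture_output=True)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(result.stdout)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("localhost", 0), WrapperHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://localhost:%d/" % server.server_address[1]
reply = urllib.request.urlopen(url, data=b"seismic").read()
server.shutdown()
```

    The client only needs the URL; it never installs or learns the wrapped code, which is the point of providing it as a service.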

  8. Seismic analysis of the frame structure reformed by cutting off column and jacking based on stiffness ratio

    NASA Astrophysics Data System (ADS)

    Zhao, J. K.; Xu, X. S.

    2017-11-01

    The cutting-off-column and jacking technology is a method for increasing story height that is widely used and has attracted much attention in engineering. The stiffness changes after the columns are cut and the structure is jacked, which directly affects the overall seismic performance, so seismic strengthening measures are usually necessary to enhance the stiffness. A five-story frame structure jacking project in Jinan High-tech Zone was taken as an example, and three finite element models were established: the frame before lifting, after lifting, and after strengthening. Based on the stiffness ratio, dynamic time-history analysis was carried out to study the seismic performance under the EL-Centro seismic wave, the Taft seismic wave and the Tianjin artificial seismic wave. The research can provide some guidance for the design and construction of entire jack-lifting structures.

  9. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C.

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. In 2007, Brookhaven National Laboratory (BNL) entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this collaborative endeavor is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work. It was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, a further study was carried out to consider an additional degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined in a non-perfectly correlated manner. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 tasks, a simulation method was applied using optimum Latin Hypercube samples to characterize the deterioration behavior of the fragility capacity as a function of age-related degradations.
The results are summarized in Section 5 and Appendices G through I.
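
    The Latin Hypercube element of the procedure above can be sketched compactly: each variable's range is split into n strata and the strata are randomly permuted so every stratum is sampled exactly once. The three degradation variables and the multiplicative capacity model below are invented placeholders, not the CDFM inputs of the BNL/KAERI study.

```python
import random
random.seed(1)

def latin_hypercube(n, d):
    """n samples in d dimensions on [0, 1): each dimension is stratified
    into n equal bins and the bins are shuffled so each is hit exactly once."""
    cols = []
    for _ in range(d):
        bins = list(range(n))
        random.shuffle(bins)
        cols.append([(b + random.random()) / n for b in bins])
    return list(zip(*cols))

# Invented degradation variables (uniform on [0, 1)): shell corrosion,
# anchor-bolt section loss, anchorage concrete cracking.
samples = latin_hypercube(200, 3)
capacities = [1.0 * (1 - 0.3 * u) * (1 - 0.2 * v) * (1 - 0.1 * w)
              for u, v, w in samples]
mean_cap = sum(capacities) / len(capacities)  # remaining fraction of capacity
```

    The stratification guarantees the whole range of each degradation variable is covered even with a modest sample count, which is why the method suits fragility simulation.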

  10. Performance assessment of conventional and base-isolated nuclear power plants for earthquake and blast loadings

    NASA Astrophysics Data System (ADS)

    Huang, Yin-Nan

    Nuclear power plants (NPPs) and spent nuclear fuel (SNF) are required by code and regulations to be designed for a family of extreme events, including very rare earthquake shaking, loss of coolant accidents, and tornado-borne missile impacts. Blast loading due to malevolent attack became a design consideration for NPPs and SNF after the terrorist attacks of September 11, 2001. The studies presented in this dissertation assess the performance of sample conventional and base isolated NPP reactor buildings subjected to seismic effects and blast loadings. The response of the sample reactor building to tornado-borne missile impacts and internal events (e.g., loss of coolant accidents) will not change if the building is base isolated and so these hazards were not considered. The sample NPP reactor building studied in this dissertation is composed of containment and internal structures with a total weight of approximately 75,000 tons. Four configurations of the reactor building are studied, including one conventional fixed-base reactor building and three base-isolated reactor buildings using Friction Pendulum(TM), lead rubber and low damping rubber bearings. The seismic assessment of the sample reactor building is performed using a new procedure proposed in this dissertation that builds on the methodology presented in the draft ATC-58 Guidelines and the widely used Zion method, which uses fragility curves defined in terms of ground-motion parameters for NPP seismic probabilistic risk assessment. The new procedure improves the Zion method by using fragility curves that are defined in terms of structural response parameters since damage and failure of NPP components are more closely tied to structural response parameters than to ground motion parameters. Alternate ground motion scaling methods are studied to help establish an optimal procedure for scaling ground motions for the purpose of seismic performance assessment. 
The proposed performance assessment procedure is used to evaluate the vulnerability of the conventional and base-isolated NPP reactor buildings. The seismic performance assessment confirms the utility of seismic isolation at reducing spectral demands on secondary systems. Procedures to reduce the construction cost of secondary systems in isolated reactor buildings are presented. A blast assessment of the sample reactor building is performed for an assumed threat of 2000 kg of TNT explosive detonated on the surface at a closest distance of 10 m to the reactor building. The air and ground shock waves produced by the design threat are generated and used for performance assessment. The air blast loading on the sample reactor building is computed using the computational fluid dynamics code Air3D, and the ground shock time series is generated using an attenuation model for soil/rock response. Response-history analysis of the sample conventional and base-isolated reactor buildings under external blast loading is performed using the hydrocode LS-DYNA. The spectral demands on the secondary systems in the isolated reactor building due to air blast loading are greater than those for the conventional reactor building but much smaller than the spectral demands associated with Safe Shutdown Earthquake shaking. The isolators are extremely effective at filtering out the high-acceleration, high-frequency ground shock loading.
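
    Fragility curves of the kind the proposed procedure relies on, defined in terms of a structural response parameter rather than a ground-motion parameter, are conventionally lognormal. A generic sketch follows; the median capacity and dispersion are chosen purely for illustration and are not values from the dissertation.

```python
from math import erf, log, sqrt

def lognormal_fragility(demand, median, beta):
    """Failure probability given a structural response demand, modeled as a
    lognormal CDF with median capacity `median` and log-dispersion `beta`."""
    z = log(demand / median) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative secondary-system fragility: median floor spectral
# acceleration capacity of 2.0 g with dispersion 0.4.
p_at_median = lognormal_fragility(2.0, 2.0, 0.4)  # 0.5 by construction
```

    Conditioning on a structural response parameter (e.g. floor spectral acceleration) rather than a ground-motion parameter is what ties the fragility more closely to component damage, which is the improvement over the Zion method described above.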

  11. Derivation of energy-based base shear force coefficient considering hysteretic behavior and P-delta effects

    NASA Astrophysics Data System (ADS)

    Ucar, Taner; Merter, Onur

    2018-01-01

    A modified energy-balance equation accounting for P-delta effects and the hysteretic behavior of reinforced concrete members is derived. Reduced hysteretic properties of structural components due to combined stiffness and strength degradation and pinching effects, as well as hysteretic damping, are taken into account in a simple manner by utilizing plastic energy and seismic input energy modification factors. Assuming a pre-selected yield mechanism, the energy balance of the structure in the inelastic range is considered. P-delta effects are included in the derived equation by adding the external work of gravity loads to the work of the equivalent inertia forces and equating the total external work to the modified plastic energy. The earthquake energy input to the multi-degree-of-freedom (MDOF) system is approximated using modal energy decomposition. The energy-based base shear coefficients are verified by means of both pushover analysis and nonlinear time history (NLTH) analysis of several RC frames with different numbers of stories. The NLTH analyses of the frames are performed using the time histories of ten scaled ground motions compatible with the elastic design acceleration spectrum and fulfilling the duration- and amplitude-related requirements of the Turkish Seismic Design Code. The observed correlation between the energy-based base shear force coefficients and the average base shear force coefficients from the NLTH analyses provides reasonable confidence in estimating the nonlinear base shear force capacity of frames with the derived equation.

  12. The Effect Analysis of Strain Rate on Power Transmission Tower-Line System under Seismic Excitation

    PubMed Central

    Wang, Wenming

    2014-01-01

    The effect of strain rate on a power transmission tower-line system under seismic excitation is studied in this paper. A three-dimensional finite element model of a transmission tower-line system is created based on a real project. Using theoretical analysis and numerical simulation, incremental dynamic analysis of the power transmission tower-line system is conducted to investigate the effect of strain rate on the nonlinear responses of the transmission tower and line. The results show that considering strain rate generally decreases the maximum top displacements of the transmission tower but increases the maximum base shear forces, so it is necessary to include the strain rate effect in the seismic analysis of the transmission tower. The strain rate effect can be ignored in the seismic analysis of the conductors and ground lines, although the responses of the ground lines considering the strain rate effect are larger than those of the conductors. The results can provide a reference for the seismic design of transmission tower-line systems. PMID:25105157

  13. A Revised Earthquake Catalogue for South Iceland

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Zechar, J. Douglas; Vogfjörd, Kristín S.; Eberhard, David A. J.

    2016-01-01

    In 1991, a new seismic monitoring network named SIL was started in Iceland with a digital seismic system and automatic operation. The system is equipped with software that reports the automatic location and magnitude of earthquakes, usually within 1-2 min of their occurrence. Normally, automatic locations are manually checked and re-estimated with corrected phase picks, but locations remain subject to random errors and systematic biases. In this article, we assess the quality of the catalogue and produce a revised catalogue for South Iceland, the area with the highest seismic risk in Iceland. We explore the effects of filtering events using some common recommendations based on network geometry and station spacing and, as an alternative, filtering based on a multivariate analysis that identifies outliers in the hypocentre error distribution. We identify and remove quarry blasts, and we re-estimate the magnitude of many events. This revised catalogue, which we consider to be filtered, cleaned, and corrected, should be valuable for building future seismicity models and for assessing seismic hazard and risk. We present a comparative seismicity analysis using the original and revised catalogues: we report characteristics of South Iceland seismicity in terms of b value and magnitude of completeness. Our work demonstrates the importance of carefully checking an earthquake catalogue before proceeding with seismicity analysis.
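
    The b value reported in such comparative analyses is commonly estimated with Aki's maximum-likelihood formula. The sketch below applies it to a synthetic Gutenberg-Richter catalogue; the completeness magnitude and the catalogue itself are invented, not the SIL data.

```python
import math
import random
random.seed(0)

def b_value_mle(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= mc;
    dm is the magnitude binning width (0 for continuous magnitudes)."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Synthetic catalogue: above mc, Gutenberg-Richter with b = 1 means the
# magnitude excess is exponentially distributed with rate b * ln(10).
mc, b_true = 1.0, 1.0
mags = [mc + random.expovariate(b_true * math.log(10)) for _ in range(20000)]
b_hat = b_value_mle(mags, mc)
```

    Because the estimator depends directly on the mean magnitude above mc, a mis-estimated completeness magnitude biases b, which is one reason catalogue cleaning of the kind described above matters.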

  14. Reply to “Comment on “Should Memphis build for California's earthquakes?” From A.D. Frankel”

    NASA Astrophysics Data System (ADS)

    Stein, Seth; Tomasello, Joseph; Newman, Andrew

    Carl Sagan observed that “extraordinary claims require extraordinary evidence.” In our view, A.D. Frankel's arguments (see accompanying Comment piece) do not reach the level required to demonstrate the counter-intuitive propositions that the earthquake hazard in the New Madrid Seismic Zone (NMSZ) is comparable to that in coastal California, and that buildings should be built to similar standards. This interchange is the latest in an ongoing debate beginning with Newman et al.'s [1999a] recommendation, based on analysis of Global Positioning System and earthquake data, that Frankel et al.'s [1996] estimate of California-level seismic hazard for the NMSZ should be reduced. Most points at issue, except for those related to the costs and benefits of the proposed new International Building Code 2000, have already been argued at length by both sides in the literature [e.g., Schweig et al., 1999; Newman et al., 1999b, 2001; Cramer, 2001]. Hence, rather than rehash these points, we will try here to provide readers not enmeshed in this morass with an overview of the primary differences between our view and that of Frankel.

  15. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS IN SUPPORT OF INCREASED LIQUID LEVEL IN 241-AP TANK FARMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MACKEY TC; ABBOTT FG; CARPENTER BG

    2007-02-16

    The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford. The "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Project" is in support of Tri-Party Agreement Milestone M-48-14.

  16. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, have shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility to efficiently compute synthetic seismograms in complex laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined at both the national and local scales, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer.
At the scenario scale, quick parametric studies can easily be performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to dedicated HPC clusters and Cloud computing. In this way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios. We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large-scale hazard maps of ground shaking at bedrock to the definition of local-scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.

  17. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed by some recent large-scale studies. These deficiencies result in an inability to treat correctly the dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of both the technical and the financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and generally robust design basis for applications such as the design of critical infrastructures, especially when combined with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors, which are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially when applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis: they are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.

  18. Seismic structures beneath Popocatepetl (Mexico) and Gorely (Kamchatka) volcanoes derived from passive tomography studies

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Pavel; Koulakov, Ivan

    2014-05-01

    A number of active volcanoes are observed in different parts of the world, and they attract great interest from scientists. Comparing their characteristics helps in understanding the origin and mechanisms of their activity. One of the most effective methods for studying the deep structure beneath volcanoes is passive-source seismic tomography. In this study we present results of tomographic inversions for two active volcanoes located in different parts of the world: Popocatepetl (Mexico) and Gorely (Kamchatka, Russia). Both volcanoes erupted actively in the past century, which explains the great interest in their detailed investigation. In both cases we performed the full data analysis, starting from picking the arrival times of local events. In the case of the Popocatepetl study, a temporary seismological network was deployed by GFZ for the period from December 1999 to July 2000. Note that during this period very few events were recorded inside the volcano. Most of the recorded earthquakes occurred in surrounding areas and are probably tectonic in nature. We performed a special analysis to justify the use of these data for studying the seismic structure beneath the network installed on the volcano. The tomographic inversion was performed using the LOTOS code of Koulakov (2009). Beneath the Popocatepetl volcano we found a zone of strong anti-correlation between P- and S-velocities, which leads to high values of the Vp/Vs ratio. Similar features were found for some other volcanoes in previous studies. We interpret these anomalies as zones of high fluid and melt content related to active magma sources. For Gorely volcano we used data from a temporary network deployed in summer 2013 by our team from IPGG, Novosibirsk. Fortunately, during the field work the volcano started to manifest strong seismic activity; in this period, 100-200 volcanic events occurred daily.
We collected continuous seismic records from 20 stations over 5-7 days, which allowed us to locate several hundred events and to build a preliminary seismic model beneath the Gorely volcano. We found a zone of low S-velocity located beneath the SE flank of the volcano, just between the Gorely and Mutnovsky volcanoes. This may serve as an argument that these volcanoes are fed from a single source. Although Popocatepetl and Gorely volcanoes differ considerably in their size and eruption characteristics, we found some similar features in their seismic structures, such as the anti-correlation of P- and S-anomalies and high Vp/Vs ratio patterns below the summits. These common patterns give us keys for understanding the general mechanism by which such volcanic systems work. This study was partly supported by projects #7.3 of BES RAS, IP SB RAS #20 and IP SB-FEB RAS #42.
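
    The high Vp/Vs signature discussed above follows directly from anti-correlated anomalies: an unchanged or slightly elevated P velocity combined with a strongly reduced S velocity inflates the ratio. A toy calculation with invented reference velocities and anomaly amplitudes:

```python
def vp_vs_ratio(vp_ref, vs_ref, dvp_pct, dvs_pct):
    """Vp/Vs from reference velocities and tomographic anomalies in percent."""
    vp = vp_ref * (1.0 + dvp_pct / 100.0)
    vs = vs_ref * (1.0 + dvs_pct / 100.0)
    return vp / vs

# Invented crustal reference: Vp = 6.0 km/s, Vs = 3.46 km/s (Vp/Vs ~ 1.73).
background = vp_vs_ratio(6.0, 3.46, 0.0, 0.0)
# Anti-correlated anomaly: P slightly fast, S markedly slow, as expected
# in a fluid- or melt-rich volume where shear velocity drops most.
anomalous = vp_vs_ratio(6.0, 3.46, 1.0, -6.0)
```

    Because fluids and partial melt reduce shear velocity much more than compressional velocity, elevated Vp/Vs patches of this kind are read as markers of active magma sources.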

  19. VERCE: a productive e-Infrastructure and e-Science environment for data-intensive seismology research

    NASA Astrophysics Data System (ADS)

    Vilotte, Jean-Pierre; Atkinson, Malcolm; Carpené, Michele; Casarotti, Emanuele; Frank, Anton; Igel, Heiner; Rietbrock, Andreas; Schwichtenberg, Horst; Spinuso, Alessandro

    2016-04-01

    Seismology pioneers global and open-data access -- with internationally approved data, metadata and exchange standards facilitated worldwide by the International Federation of Digital Seismograph Networks (FDSN) and, in Europe, the European Integrated Data Archives (EIDA). The growing wealth of data generated by dense observation and monitoring systems, together with recent advances in seismic wave simulation capabilities, is driving a change of paradigm. Data-intensive seismology research requires a new holistic approach combining scalable high-performance wave simulation codes and statistical data analysis methods, and integrating distributed data and computing resources. The European E-Infrastructure project "Virtual Earthquake and seismology Research Community e-science environment in Europe" (VERCE) pioneers the federation of autonomous organisations providing data and computing resources, together with a comprehensive, integrated and operational virtual research environment (VRE) and E-infrastructure devoted to the full path of data use in a research-driven context. VERCE delivers to a broad base of seismology researchers in Europe easy-to-use high-performance full waveform simulations and misfit calculations, together with a data-intensive framework for the collaborative development of innovative statistical data analysis methods, all of which were previously only accessible to a small number of well-resourced groups. It balances flexibility with new integrated capabilities to provide a fluent path from research innovation to production. As such, VERCE is a major contribution to the implementation phase of the "European Plate Observing System" (EPOS), the ESFRI initiative of the solid-Earth community. The VRE meets a range of seismic research needs by eliminating chores and technical difficulties to allow users to focus on their research questions.
It empowers researchers to harvest the new opportunities provided by the community's well-established and mature high-performance wave simulation codes. It enables active researchers to invent and refine scalable methods for innovative statistical analysis of seismic waveforms in a wide range of application contexts. The VRE paves the way towards a flexible shared framework for seismic waveform inversion, lowering the barriers to uptake for the next generation of researchers. The VRE can be accessed through the science gateway, which brings computational and data-intensive research into the same framework, integrating multiple data sources and services. It provides a context for task-oriented and data-streaming workflows, and maps user actions to the full gamut of the federated platform resources and procurement policies, activating the necessary behind-the-scenes automation and transformation. The platform manages and produces domain metadata, coupling them with the provenance information describing the relationships and dependencies that characterise the whole workflow process. This dynamic knowledge base can be explored for validation purposes via a graphical interface and a web API. Moreover, it fosters the assisted selection and re-use of the data within each phase of the scientific analysis. These phases can be identified as Simulation, Data Access, Preprocessing, and Misfit and data processing, and are presented to the users of the gateway as dedicated and interactive workspaces. By enabling researchers to share results and provenance information, VERCE steers open-science behaviour, allowing researchers to discover and build on prior work and thereby to progress faster.
A key asset is the agile strategy that VERCE deployed in a multi-organisational context, engaging seismologists, data scientists, ICT researchers, HPC and data resource providers, and system administrators in short-lived tasks, each with a goal that is a seismology priority, and intimately coupling research thinking with technical innovation. This shifts the focus from HPC production environments and community data services to user-focused scenarios, avoiding wasteful bouts of technology centricity in which technologists collect requirements and develop a system that is not used because the ideas of the planned users have moved on. As such, the technologies and concepts developed in VERCE are relevant to many other disciplines in computational and data-driven Earth Sciences and can provide the key technologies for a Europe-wide computational and data-intensive framework in the Earth Sciences.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, Arthur J.; Dreger, Douglas S.; Pitarka, Arben

    We performed three-dimensional (3D) anelastic ground motion simulations of the South Napa earthquake to investigate the performance of different finite rupture models and the effects of 3D structure on the observed wavefield. We considered rupture models reported by Dreger et al. (2015), Ji et al. (2015), Wei et al. (2015) and Melgar et al. (2015). We used the SW4 anelastic finite difference code developed at Lawrence Livermore National Laboratory (Petersson and Sjogreen, 2013) and distributed by the Computational Infrastructure for Geodynamics. This code can compute the seismic response for fully 3D sub-surface models, including surface topography and linear anelasticity. We used the 3D geologic/seismic model of the San Francisco Bay Area developed by the United States Geological Survey (Aagaard et al., 2008, 2010). Evaluation of earlier versions of this model indicated that the structure can reproduce the main features of observed waveforms from moderate earthquakes (Rodgers et al., 2008; Kim et al., 2010). Simulations were performed for a domain covering local distances (< 25 km) at a resolution providing simulated ground motions valid to 1 Hz.

  1. Geophysical surveying in the Sacramento Delta for earthquake hazard assessment and measurement of peat thickness

    NASA Astrophysics Data System (ADS)

    Craig, M. S.; Kundariya, N.; Hayashi, K.; Srinivas, A.; Burnham, M.; Oikawa, P.

    2017-12-01

    Near-surface geophysical surveys were conducted in the Sacramento-San Joaquin Delta for earthquake hazard assessment and to provide estimates of peat thickness for use in carbon models. Delta islands have experienced 3-8 meters of subsidence during the past century due to oxidation and compaction of peat. Projected sea level rise over the next century will contribute to an ongoing landward shift of the freshwater-saltwater interface and increase the risk of flooding due to levee failure or overtopping. Seismic shear wave velocity (VS) was measured in the upper 30 meters to determine the Uniform Building Code (UBC)/National Earthquake Hazard Reduction Program (NEHRP) site class. Both seismic and ground penetrating radar (GPR) methods were employed to estimate peat thickness. Seismic surface wave surveys were conducted at eight sites on three islands, and GPR surveys were conducted at two of the sites. Combined with sites surveyed in 2015, the new work brings the total number of sites surveyed in the Delta to twenty. Soil boreholes were made at several locations using a hand auger, and peat thickness ranged from 2.1 to 5.5 meters. Seismic surveys were conducted using the multichannel analysis of surface waves (MASW) method and the microtremor array method (MAM). On Bouldin Island, VS of the surficial peat layer was 32 m/s at a site with pure peat and 63 m/s at a site where the peat has higher clay and silt content. Velocities at these sites reached a similar value, about 125 m/s, at a depth of 10 m. GPR surveys were performed at two sites on Sherman Island using 100 MHz antennas and indicated the base of the peat layer at a depth of about 4 meters, consistent with nearby auger holes. The results of this work include VS depth profiles and UBC/NEHRP site classifications. Seismic and GPR methods may be used in a complementary fashion to estimate peat thickness.
The seismic surface wave method is a relatively robust method and more effective than GPR in many areas with high clay content or where surface sediments have been disturbed by human activities. GPR does however provide significantly higher resolution and better depth control in areas with suitable recording conditions.
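The UBC/NEHRP site class mentioned above follows from the time-averaged shear-wave velocity of the upper 30 m, Vs30 = 30 / Σ(d_i/v_i). A minimal sketch of that computation (the layer values in the test are illustrative, not measurements from this survey; the class boundaries are the standard simplified NEHRP ones):

```python
def vs30(thicknesses, velocities):
    """Time-averaged shear-wave velocity of the upper 30 m:
    Vs30 = 30 / sum(d_i / v_i), with layers truncated at 30 m depth."""
    total_depth, travel_time = 0.0, 0.0
    for d, v in zip(thicknesses, velocities):
        d = min(d, 30.0 - total_depth)
        if d <= 0:
            break
        travel_time += d / v
        total_depth += d
    return 30.0 / travel_time

def nehrp_site_class(vs30_value):
    """Simplified NEHRP site class boundaries (m/s)."""
    if vs30_value > 1500:
        return "A"
    if vs30_value > 760:
        return "B"
    if vs30_value > 360:
        return "C"
    if vs30_value > 180:
        return "D"
    return "E"
```

A slow surficial peat layer dominates the travel-time sum, which is why thin peat can pull an entire profile into class D or E.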

  2. (Multi)fractality of Earthquakes by use of Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Enescu, B.; Ito, K.; Struzik, Z. R.

    2002-12-01

The fractal character of earthquake occurrence, in time, space or energy, has by now been established beyond doubt and is in agreement with modern models of seismicity. Moreover, the cascade-like generation process of earthquakes, with one "main" shock followed by many aftershocks, each having its own aftershocks, may well be described through multifractal analysis, which is well suited for dealing with such multiplicative processes. The (multi)fractal character of seismicity has been analysed so far using traditional techniques, like the box-counting and correlation function algorithms. This work introduces a new approach for characterising the multifractal patterns of seismicity. The application of wavelet analysis, in particular the wavelet transform modulus maxima, to multifractal analysis was pioneered by Arneodo et al. (1991, 1995) and applied successfully in diverse fields, such as the study of turbulence, DNA sequences and heart rate dynamics. The wavelets act like a microscope, revealing details about the analysed data at different times and scales. We introduce and perform such an analysis on the occurrence times of earthquakes and show its advantages. In particular, we analyse shallow seismicity, characterised by a high aftershock "productivity", as well as intermediate and deep seismic activity, known for its scarcity of aftershocks. We also examine declustered (aftershocks removed) versions of seismic catalogues. Our preliminary results show some degree of multifractality for the undeclustered, shallow seismicity. On the other hand, at large scales, we detect a monofractal scaling behaviour, most clearly evident for the declustered, shallow seismic activity. Moreover, some of the declustered sequences show a long-range dependent (LRD) behaviour, characterised by a Hurst exponent H > 0.5, in contrast with the memory-less, Poissonian model. 
We demonstrate that the LRD is a genuine characteristic and not an effect of the time series probability distribution function. One of the most attractive features of wavelet analysis is its ability to determine a local Hurst exponent. We show that this feature, together with the possibility of extending the analysis to spatial patterns, may constitute a valuable approach in the search for anomalous (precursory?) patterns of seismic activity.
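The LRD test above hinges on estimating a Hurst exponent, with H ≈ 0.5 for a memoryless series and H > 0.5 indicating long-range dependence. As a simple illustration of the concept only, here is a classical rescaled-range (R/S) estimator, not the wavelet-based estimator the authors employ:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Rough rescaled-range (R/S) estimate of the Hurst exponent.
    H ~ 0.5 for a memoryless series, H > 0.5 for long-range dependence."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviation
            r = dev.max() - dev.min()              # range
            s = chunk.std()                        # standard deviation
            if s > 0:
                vals.append(r / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    # slope of log(R/S) vs log(window size) approximates H
    H, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return H
```

Applied to inter-event times of a declustered catalogue, an estimate persistently above 0.5 would argue against the Poissonian model, as the abstract describes.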

  3. AniTomo - New Anisotropic Teleseismic Body-Wave Tomography Code to Unravel Structure of the Upper Mantle: Impact of Inversion Settings on Inferences of the Output Model

    NASA Astrophysics Data System (ADS)

    Munzarova, H.; Plomerova, J.; Kissling, E. H.

    2015-12-01

Consideration of only isotropic wave propagation and neglect of anisotropy in tomography studies is a simplification obviously incongruous with the current understanding of mantle-lithosphere plate dynamics. Both fossil anisotropy in the mantle lithosphere and anisotropy due to present-day flow in the asthenosphere may significantly influence the propagation of seismic waves. We present a novel code for anisotropic teleseismic tomography (AniTomo) that allows relative P-wave travel-time residuals to be inverted simultaneously for coupled isotropic-anisotropic P-wave velocity models of the upper mantle. We have modified the frequently used isotropic teleseismic tomography code Telinv by assuming weak hexagonal anisotropy, with the symmetry axis oriented generally in 3D, to be, together with heterogeneities, a source of the observed P-wave travel-time residuals. Careful testing of the new code with synthetics, concentrating on the strengths and limitations of the inversion method, is a necessary step before AniTomo is applied to real datasets. We examine various aspects of anisotropic tomography, particularly the influence of ray coverage on the resolvability of individual model parameters and the influence of initial models on the result. Synthetic models are designed to schematically represent heterogeneous and anisotropic structures in the upper mantle. Several synthetic tests mimicking a real tectonic setting, e.g., the lithosphere subduction in the Northern Apennines in Italy (Munzarova et al., G-Cubed, 2013), allow us to make quantitative assessments of the well-known trade-off between the effects of seismic anisotropy and heterogeneities. Our results clearly document that significant distortions of imaged velocity heterogeneities may result from neglecting anisotropy.

  4. The 2017 Maple Creek Seismic Swarm in Yellowstone National Park

    NASA Astrophysics Data System (ADS)

    Pang, G.; Hale, J. M.; Farrell, J.; Burlacu, R.; Koper, K. D.; Smith, R. B.

    2017-12-01

The University of Utah Seismograph Stations (UUSS) performs near-real-time monitoring of seismicity in the region around Yellowstone National Park in partnership with the United States Geological Survey and the National Park Service. UUSS operates and maintains 29 seismic stations with network code WY (short-period, strong-motion, and broadband) and records data from five other seismic networks—IW, MB, PB, TA, and US—to enhance the location capabilities in the Yellowstone region. A seismic catalog is produced using a conventional STA/LTA detector and single-event location techniques (Hypoinverse). On June 12, 2017, a seismic swarm began in Yellowstone National Park about 5 km east of Hebgen Lake. The swarm is adjacent to the source region of the 1959 MW 7.3 Hebgen Lake earthquake, in an area corresponding to positive Coulomb stress change from that event. As of Aug. 1, 2017, the swarm consists of 1481 earthquakes with 1 earthquake above magnitude 4, 8 earthquakes in the magnitude 3 range, 115 earthquakes in the magnitude 2 range, 469 earthquakes in the magnitude 1 range, 856 earthquakes in the magnitude 0 range, 22 earthquakes with negative magnitudes, and 10 earthquakes with no magnitude. Earthquake depths are mostly between 3 and 10 km, and depth increases toward the northwest. Moment tensors for the 2 largest events (MW 3.6 and MW 4.4) show strike-slip faulting with T axes oriented NE-SW, consistent with the regional stress field. We are currently using waveform cross-correlation methods to measure differential travel times that are being used with the GrowClust program to generate high-accuracy relative relocations. Those locations will be used to identify structures in the seismicity and make inferences about the tectonic and magmatic processes causing the swarm.

  5. Improved phase arrival estimate and location for local earthquakes in South Korea

    NASA Astrophysics Data System (ADS)

    Morton, E. A.; Rowe, C. A.; Begnaud, M. L.

    2012-12-01

The Korean Institute of Geoscience and Mineral Resources (KIGAM) and the Korean Meteorological Agency (KMA) regularly report local (distance < ~1200 km) seismicity recorded with their networks; we obtain preliminary event location estimates as well as waveform data, but no phase arrivals are reported, so the data are not immediately useful for earthquake location. Our goal is to identify seismic events that are sufficiently well-located to provide accurate seismic travel-time information for events within the KIGAM and KMA networks that are also recorded by some regional stations. Toward that end, we are using a combination of manual phase identification and arrival-time picking, with waveform cross-correlation, to cluster events that have occurred in close proximity to one another, which allows for improved phase identification by comparing the highly correlating waveforms. We cross-correlate the known events with one another on 5 seismic stations and cluster events that correlate above a correlation coefficient threshold of 0.7, which reveals only a few clusters, each containing a small number of events. The small number of repeating events suggests that the online catalogs have had mining and quarry blasts removed before publication, as these can contribute significantly to repeating seismic sources in relatively aseismic regions such as South Korea. The dispersed source locations in our catalog, however, are ideal for seismic velocity modeling, providing superior sampling through the dense seismic station arrangement, which produces favorable event-to-station ray path coverage. Following careful manual phase picking on 104 events chosen to provide adequate ray coverage, we re-locate the events to obtain improved source coordinates. The re-located events are used with Thurber's Simul2000 pseudo-bending local tomography code to estimate the crustal structure of the Korean Peninsula, which is an important contribution to ongoing calibration for events of interest in the region.

  6. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, the author adapted these methods in 1994 for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the damaging earthquakes in Central Italy over the subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task, it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analysis at critical installations within Europe. In this paper, various salient European applications are given.
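The kernel-smoothing idea replaces polygonal source zones with a rate surface built directly from catalogue epicentres. A minimal sketch of a fixed-bandwidth 2D Gaussian kernel rate estimator; the bandwidth and coordinates are illustrative, and this is not the author's 1994 formulation, which uses magnitude-dependent kernels:

```python
import math

def kernel_rate(x, y, epicentres, bandwidth_km, years):
    """Gaussian-kernel estimate of background seismicity rate
    (events / year / km^2) at location (x, y), from catalogue
    epicentres given as (x_km, y_km) pairs over `years` of data."""
    h2 = bandwidth_km ** 2
    norm = 1.0 / (2.0 * math.pi * h2 * years)  # kernel normalisation / duration
    total = sum(math.exp(-((x - ex) ** 2 + (y - ey) ** 2) / (2.0 * h2))
                for ex, ey in epicentres)
    return norm * total
```

Because each event contributes a smooth bump rather than a uniform rate over a polygon, no zone boundaries need to be drawn, which is the point the abstract makes about poorly delineated European seismotectonic zones.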

  7. Understanding the seismic wave propagation inside and around an underground cavity from a 3D numerical survey

    NASA Astrophysics Data System (ADS)

    Esterhazy, Sofi; Schneider, Felix; Perugia, Ilaria; Bokelmann, Götz

    2017-04-01

Motivated by the need to detect an underground cavity, which might be caused by a nuclear explosion or weapon testing, within the procedure of an On-Site Inspection (OSI) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), we aim to provide a basic numerical study of the wave propagation around and inside such an underground cavity. One method allowed by the Comprehensive Nuclear-Test-Ban Treaty for investigating the geophysical properties of an underground cavity is referred to as "resonance seismometry" - a resonance method that uses passive or active seismic techniques, relying on seismic cavity vibrations. This method is in fact not yet precisely defined by the Treaty, and so far only very few experimental examples have been documented well enough to build a proper scientific groundwork. This motivates us to investigate the problem on a purely numerical level and to simulate these events based on recent advances in the numerical modeling of wave propagation problems. Our numerical study includes the full elastic wave field in three dimensions. We consider the effects of an incoming plane wave as well as of a point source located at the surface in the surroundings of the cavity. While the former can be considered a passive source, like a tele-seismic earthquake, the latter represents a man-made explosion or a vibroseis source as used in active seismic techniques. Further, we want to demonstrate the specific characteristics of the scattered wave field from P-waves and S-waves separately. For our simulations in 3D we use the discontinuous Galerkin spectral element code SPEED, developed by MOX (The Laboratory for Modeling and Scientific Computing, Department of Mathematics) and DICA (Department of Civil and Environmental Engineering) at the Politecnico di Milano. The computations are carried out on the Vienna Scientific Cluster (VSC). 
Accurate numerical modeling can facilitate the development of proper analysis techniques to detect the remnants of an underground nuclear test, help to set a rigorous scientific base for OSI, and contribute to bringing the Treaty into force.

  8. Seismic anisotropy from crust to core: a mineral and rock physics perspective

    NASA Astrophysics Data System (ADS)

    Mainprice, David

    2014-05-01

Since the early work of Hess and co-workers on the mantle in the 1960s and of Poupinet et al. in the 1980s on the inner core, we know that seismic anisotropy is a global phenomenon. Progress in seismology has led to a much more complete image of the Earth's interior in terms of heterogeneity and anisotropy. The interpretation of seismic anisotropy requires a multidisciplinary effort to unravel the geodynamic scenario recorded in today's seismological snapshot. Progress in mineral physics on the experimental measurement of elastic properties at extreme conditions is now complemented by ab initio atomic modelling for the full range of temperatures and pressures of the Earth's interior. The new data on the elastic constants of a wider range of minerals enable more realistic petrology for seismic anisotropy models. Experimental plastic deformation of polycrystalline samples at deep Earth conditions allows the direct study of crystal preferred orientation (CPO), and these studies are complemented by ab initio atomic modelling of dislocations and other defects that control plasticity. Finally, polycrystalline plasticity codes allow the simulation of CPO reported by experimentalists and the modelling of the more complex strain paths required for geodynamic models. The CPO of crustal and mantle rocks from the Earth's surface, or recovered as xenoliths, provides a geological verification of the CPOs present in the Earth. The systematic use of CPO measured by U-stage in field studies all over the world for the last 40 years has been intensified in the last 15 years by the use of electron back-scattered diffraction (EBSD) to study CPO and the associated digital microstructure. It is an appropriate time to analyse CPO databases of olivine and other minerals, which represent the work of our group, both present and former members, as well as collaborating colleagues. It is also interesting to examine the natural record as illustrated by our databases in the light of recent experimental results. 
Information on CPO, together with single crystal elastic constants and the equation of state, allows the modelling of seismic anisotropy due to plasticity at any PT condition, and the connection with geodynamic processes related to large-scale flow in the deep Earth.

  9. Automatic Earthquake Detection and Location by Waveform coherency in Alentejo (South Portugal) Using CatchPy

    NASA Astrophysics Data System (ADS)

    Custodio, S.; Matos, C.; Grigoli, F.; Cesca, S.; Heimann, S.; Rio, I.

    2015-12-01

Seismic data processing is currently undergoing a step change, benefitting from high-volume datasets and advanced computer power. In the last decade, a permanent seismic network of 30 broadband stations, complemented by dense temporary deployments, covered mainland Portugal. This outstanding regional coverage now enables the computation of a high-resolution image of the seismicity of Portugal, which contributes to fitting together the pieces of the regional seismo-tectonic puzzle. Although traditional manual inspections are valuable for refining automatic results, they are impracticable with the big data volumes now available. When conducted alone they are also less objective, since the criteria are defined by the analyst. In this work we present CatchPy, a scanning algorithm to detect earthquakes in continuous datasets. Our main goal is to implement an automatic earthquake detection and location routine in order to have a tool to quickly process large data sets while detecting low-magnitude earthquakes (i.e. lowering the detection threshold). CatchPy is designed to produce an event database that can easily be located using existing location codes (e.g. Grigoli et al. 2013, 2014). We use CatchPy to perform automatic detection and location of earthquakes that occurred in the Alentejo region (South Portugal), taking advantage of a dense seismic network deployed in the region for two years during the DOCTAR experiment. Results show that our automatic procedure is particularly suitable for small-aperture networks. The event detection is performed by continuously computing the short-term-average/long-term-average (STA/LTA) of two different characteristic functions (CFs). For the P phases we used a CF based on the vertical energy trace, while for S phases we used a CF based on the maximum eigenvalue of the instantaneous covariance matrix (Vidale 1991). 
Seismic event location is performed by waveform coherence analysis, scanning different hypocentral coordinates (Grigoli et al. 2013, 2014). The reliability of automatic detections, phase pickings and locations is tested through quantitative comparison with manual results. This work is supported by project QuakeLoc, reference PTDC/GEO-FIQ/3522/2012.
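The STA/LTA detection step can be sketched as follows. This is a generic illustration of the detector applied to a characteristic function, not the CatchPy implementation, and the window lengths in the example are arbitrary:

```python
import numpy as np

def sta_lta(cf, sta_len, lta_len):
    """Classic STA/LTA ratio of a characteristic function (e.g. a
    vertical-component energy trace), computed with running means.
    Values well above 1 flag candidate seismic arrivals."""
    cf = np.asarray(cf, dtype=float)
    # running window sums via a cumulative sum
    csum = np.cumsum(np.insert(cf, 0, 0.0))
    sta = (csum[sta_len:] - csum[:-sta_len]) / sta_len
    lta = (csum[lta_len:] - csum[:-lta_len]) / lta_len
    # align both series so each ratio compares windows ending
    # at the same sample
    n = min(len(sta), len(lta))
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)
```

A detection would then be declared wherever the returned ratio exceeds a chosen trigger threshold (commonly a few units).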

  10. A note on adding viscoelasticity to earthquake simulators

    USGS Publications Warehouse

    Pollitz, Fred

    2017-01-01

    Here, I describe how time‐dependent quasi‐static stress transfer can be implemented in an earthquake simulator code that is used to generate long synthetic seismicity catalogs. Most existing seismicity simulators use precomputed static stress interaction coefficients to rapidly implement static stress transfer in fault networks with typically tens of thousands of fault patches. The extension to quasi‐static deformation, which accounts for viscoelasticity of Earth’s ductile lower crust and mantle, involves the precomputation of additional interaction coefficients that represent time‐dependent stress transfer among the model fault patches, combined with defining and evolving additional state variables that track this stress transfer. The new approach is illustrated with application to a California‐wide synthetic fault network.

  11. Design and analysis of fractional order seismic transducer for displacement and acceleration measurements

    NASA Astrophysics Data System (ADS)

    Veeraian, Parthasarathi; Gandhi, Uma; Mangalanathan, Umapathy

    2018-04-01

Seismic transducers are widely used for the measurement of displacement, velocity, and acceleration. This paper presents the design of a seismic transducer in the fractional domain for the measurement of displacement and acceleration. The fractional order transfer functions for the seismic displacement and acceleration transducers are derived using the Grünwald-Letnikov derivative. Frequency response analyses of the fractional order seismic displacement transducer (FOSDT) and the fractional order seismic acceleration transducer (FOSAT) are carried out for different damping ratios and fractional orders, and the maximum dynamic measurement range is identified. The results demonstrate that the fractional order seismic transducer has an increased dynamic measurement range and less phase distortion than the conventional seismic transducer, even with a lower damping ratio. Time responses of the FOSDT and FOSAT are derived analytically in terms of the Mittag-Leffler function, and the effect of fractional behavior in the time domain is evaluated from the impulse and step responses. The fractional order system is found to have significantly reduced overshoot as compared to the conventional transducer. The fractional order seismic transducer design proposed in this paper is illustrated with a design example for the FOSDT and FOSAT. Finally, an electrical equivalent of the FOSDT and FOSAT is considered, and its frequency response is found to be in close agreement with that of the proposed fractional order seismic transducer.
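The Grünwald-Letnikov derivative used above lends itself to a direct numerical approximation: D^α x(t_n) ≈ h^(-α) Σ_k w_k x(t_(n-k)), with binomial weights computed recursively as w_0 = 1, w_k = w_(k-1)·(1 - (α+1)/k). A minimal sketch of that discretisation (the paper itself works analytically with transfer functions; this is only the underlying definition in code):

```python
import numpy as np

def gl_fractional_derivative(x, alpha, dt):
    """Grünwald-Letnikov approximation of the order-alpha derivative
    of a sampled signal x with sample interval dt."""
    n = len(x)
    # recursive binomial weights: w_0 = 1, w_k = w_{k-1} * (1 - (alpha+1)/k)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    # y[m] = dt^-alpha * sum_k w_k * x[m-k], i.e. a one-sided convolution
    return np.convolve(x, w)[:n] / dt ** alpha
```

For alpha = 1 the weights reduce to (1, -1, 0, ...), recovering the ordinary first difference, which is a convenient sanity check.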

  12. Modelling framework developed for managing and forecasting the El Hierro 2011-2014 unrest processes based on the analysis of the seismicity and deformation data rate.

    NASA Astrophysics Data System (ADS)

    Garcia, Alicia; Fernandez-Ros, Alberto; Berrocoso, Manuel; Marrero, Jose Manuel; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramon

    2014-05-01

In July 2011 at El Hierro (Canary Islands, Spain), volcanic unrest was detected, with significant deformations followed by increased seismicity. A submarine eruption started on 10 October 2011 and ceased on 5 March 2012, after the volcanic tremor signals had persistently weakened through February 2012. However, the seismic activity did not end with the eruption, as several other seismic crises have followed since. The seismic episodes presented a characteristic pattern: over a few days the number and magnitude of seismic events increased persistently, culminating in events severe enough to be felt all over the island. In all cases the seismic activity was preceded by significant deformations measured on the island's surface that continued during the whole episode. Analysis of the available GNSS-GPS and seismic data suggests that several magma injection processes occurred at depth from the beginning of the unrest. A model combining the geometry of the magma injection process and the variations in seismic energy released has allowed successful forecasting of the new-vent opening. The model presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as at El Hierro, may be more destructive than an eruption itself.

  13. Post-seismic velocity changes following the 2010 Mw 7.1 Darfield earthquake, New Zealand, revealed by ambient seismic field analysis

    NASA Astrophysics Data System (ADS)

    Heckels, R. EG; Savage, M. K.; Townend, J.

    2018-05-01

    Quantifying seismic velocity changes following large earthquakes can provide insights into fault healing and reloading processes. This study presents temporal velocity changes detected following the 2010 September Mw 7.1 Darfield event in Canterbury, New Zealand. We use continuous waveform data from several temporary seismic networks lying on and surrounding the Greendale Fault, with a maximum interstation distance of 156 km. Nine-component, day-long Green's functions were computed for frequencies between 0.1 and 1.0 Hz for continuous seismic records from immediately after the 2010 September 04 earthquake until 2011 January 10. Using the moving-window cross-spectral method, seismic velocity changes were calculated. Over the study period, an increase in seismic velocity of 0.14 ± 0.04 per cent was determined near the Greendale Fault, providing a new constraint on post-seismic relaxation rates in the region. A depth analysis further showed that velocity changes were confined to the uppermost 5 km of the subsurface. We attribute the observed changes to post-seismic relaxation via crack healing of the Greendale Fault and throughout the surrounding region.

  14. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  15. SHAKING TABLE TEST AND EFFECTIVE STRESS ANALYSIS ON SEISMIC PERFORMANCE WITH SEISMIC ISOLATION RUBBER TO THE INTERMEDIATE PART OF PILE FOUNDATION IN LIQUEFACTION

    NASA Astrophysics Data System (ADS)

    Uno, Kunihiko; Otsuka, Hisanori; Mitou, Masaaki

Pile foundations are heavily damaged during earthquakes at the boundary between ground types, liquefied and non-liquefied ground, and there is a possibility of collapse of the piles. In this study, we conduct a shaking table test and effective stress analysis of the influence of soil liquefaction and of the seismic inertial force exerted on the pile foundation. When the intermediate part of the pile, located at this boundary, is subjected to sectional forces, these forces can in certain instances exceed those at the pile head. Further, we develop a seismic resistance method for pile foundations in liquefiable ground using seismic isolation rubber, and show that the intermediate-part seismic isolation system is very effective.

  16. Multiple-Threshold Event Detection and Other Enhancements to the Virtual Seismologist (VS) Earthquake Early Warning Algorithm

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Caprio, M.; Cua, G. B.; Heaton, T. H.; Clinton, J. F.; Wiemer, S.

    2009-12-01

    The Virtual Seismologist (VS) algorithm is a Bayesian approach to earthquake early warning (EEW) being implemented by the Swiss Seismological Service at ETH Zurich. The application of Bayes’ theorem in earthquake early warning states that the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the on-going earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS algorithm was one of three EEW algorithms involved in the California Integrated Seismic Network (CISN) real-time EEW testing and performance evaluation effort. Its compelling real-time performance in California over the last three years has led to its inclusion in the new USGS-funded effort to develop key components of CISN ShakeAlert, a prototype EEW system that could potentially be implemented in California. A significant portion of VS code development was supported by the SAFER EEW project in Europe. We discuss recent enhancements to the VS EEW algorithm. We developed and continue to test a multiple-threshold event detection scheme, which uses different association / location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to be declared an event to reduce false alarms. 
This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and its requirement of at least 4 picks, as implemented by the Binder Earthworm phase associator) to a hybrid on-site/regional approach capable of providing a continuously evolving stream of EEW information starting from the first P detection. Offline analysis of Swiss and California waveform datasets indicates that the multiple-threshold approach is faster and more reliable for larger events than the earlier version of the VS codes. This multiple-threshold approach is well suited for implementation on a wide range of devices, from embedded processor systems installed at seismic stations, to small autonomous networks for local warnings, to large-scale regional networks such as the CISN. In addition, we quantify the influence of systematic use of prior information and Vs30-based corrections for site amplification on VS magnitude estimation performance, and describe how components of the VS algorithm will be integrated into non-EEW standard network processing procedures at CHNet, the national broadband / strong motion network in Switzerland. These enhancements to the VS codes will be transitioned from off-line to real-time testing at CHNet in the coming months, and will be incorporated into the development of key components of the CISN ShakeAlert prototype system in California.
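The amplitude-dependent declaration rule can be illustrated with a toy sketch. The thresholds, pick counts, and function names below are invented for illustration only and are not the VS algorithm's actual values or code:

```python
def picks_required(peak_amplitude, thresholds=((5.0, 1), (1.0, 4))):
    """Hypothetical multiple-threshold rule: the larger the peak
    amplitude of the strongest P pick, the fewer picks are needed
    to declare an event. `thresholds` is a sequence of
    (min_amplitude, min_picks) pairs ordered from high to low."""
    for amp, n_picks in thresholds:
        if peak_amplitude >= amp:
            return n_picks
    return 6  # very weak signals: demand extra confirmation

def declare_event(picks):
    """`picks` is a list of (station, peak_amplitude) tuples.
    Declare an event once the pick count reaches the requirement
    set by the strongest pick seen so far."""
    if not picks:
        return False
    strongest = max(amp for _, amp in picks)
    return len(picks) >= picks_required(strongest)
```

The trade-off the abstract describes falls out directly: a single high-amplitude pick declares immediately (maximizing warning time for damaging events), while low-amplitude picks must accumulate before declaration (suppressing false alarms).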

  17. Magma migration at the onset of the 2012-13 Tolbachik eruption revealed by Seismic Amplitude Ratio Analysis

    NASA Astrophysics Data System (ADS)

    Caudron, Corentin; Taisne, Benoit; Kugaenko, Yulia; Saltykov, Vadim

    2015-12-01

In contrast to the 1975-76 Tolbachik eruption, the 2012-13 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma movement prior to this important eruption. A clear seismic migration within the seismic swarm started 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava flows, was recorded (at 11:00 UTC, 27 November 2012). In order to get a first-order approximation of the magma location, we compare the calculated seismic intensity ratios with theoretical ones. As expected, the observations suggest that the seismicity migrated toward the eruption location. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano followed by a lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on 28 November and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.

  18. Earthquake hazard and risk assessment based on Unified Scaling Law for Earthquakes: Greater Caucasus and Crimea

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

    2018-05-01

We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE is the empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, and then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After rigorous verification against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure (e.g., those based on a census of population or a buildings inventory). The methodology of seismic hazard and risk assessment is illustrated by application to the territory of the Greater Caucasus and Crimea.
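The USLE relationship above translates directly into code. A minimal sketch; the coefficient values in the usage below are purely illustrative, not fitted values from the study:

```python
import math

def usle_rate(M, L, A, B, C):
    """Expected annual number of earthquakes of magnitude M within a
    seismically prone area of linear dimension L, from the Unified
    Scaling Law for Earthquakes:
        log10 N(M, L) = A + B * (5 - M) + C * log10(L)"""
    return 10.0 ** (A + B * (5.0 - M) + C * math.log10(L))
```

With B > 0 the rate falls off with magnitude (the Gutenberg-Richter behaviour), while C captures the fractal scaling of the rate with the size of the area considered.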

  19. High Resolution Vertical Seismic Profile from the Chicxulub IODP/ICDP Expedition 364 Borehole: Wave Speeds and Seismic Reflectivity.

    NASA Astrophysics Data System (ADS)

    Nixon, C.; Kofman, R.; Schmitt, D. R.; Lofi, J.; Gulick, S. P. S.; Christeson, G. L.; Saustrup, S., Sr.; Morgan, J. V.

    2017-12-01

    We acquired a closely-spaced vertical seismic profile (VSP) in the Chicxulub K-Pg Impact Crater drilling program borehole to calibrate the existing surface seismic profiles and provide complementary measurements of in situ seismic wave speeds. Downhole seismic records were obtained at spacings ranging from 1.25 m to 5 m along the borehole from 47.5 m to 1325 mwsf (meters wireline below sea floor) (Fig. 1a) using a Sercel Slimwave™ geophone chain (University of Alberta). The seismic source was a 30/30ci Sercel Mini GI airgun (University of Texas), fired a minimum of 5 times per station. Seismic data processing used a combination of a commercial processing package (Schlumberger's VISTA) and Matlab™ codes. The VSP displays detailed reflectivity (Fig. 1a), with the strongest reflection seen at 600 mwsf (280 ms one-way time), geologically corresponding to the sharp contact between the post-impact sediments and the target peak-ring rock, thus confirming the pre-drilling interpretations of the seismic profiles. A two-way time trace extracted from the separated up-going wavefield matches the major reflection both in travel time and character. In the granitic rocks that form the peak ring of the Chicxulub impact crater, we observe P-wave velocities of 4000-4500 m/s, significantly less than the expected values for granitoids (~6000 m/s) (Fig. 1b). The VSP-measured wave speeds are confirmed by downhole sonic logging and laboratory velocimetry measurements; these data provide additional evidence that the crustal material displaced by the impact experienced a significant amount of damage. Samples and data provided by IODP. Samples can be requested at http://web.iodp.tamu.edu/sdrm after 19 October 2017. Expedition 364 was jointly funded by ECORD, ICDP, and IODP with contributions and logistical support from the Yucatan State Government and UNAM.
The downhole seismic chain and wireline system is funded by grants to DRS from the Canada Foundation for Innovation and the Alberta Enterprise and Advanced Education Grants Program.

  20. Velocity structures of Geothermal sites: A comparative study between different tomography techniques on the EGS-Soultz-sous-Forêts Site (France)

    NASA Astrophysics Data System (ADS)

    Calo', M. C.; Dorbath, C.

    2009-12-01

    One major goal of monitoring seismicity accompanying hydraulic fracturing of a reservoir is to recover the seismic velocity field in and around the geothermal site. In many cases the seismicity induced by the hydraulic stimulations allows us to roughly describe the velocity anomalies close to the hypocentral locations, but only during the time period of the stimulation. Several studies have shown that 4D (time-dependent) seismic tomographies are very useful to illustrate and study the temporal variation of the seismic velocities conditioned by injected fluids. Nevertheless, in geothermal fields local earthquake tomography (LET) is often inadequate to study the seismic velocities during the inter-injection periods, due to the lack of seismicity. In July 2000, an injection test lasting 15 days performed at the Enhanced Geothermal System (EGS) site of Soultz-sous-Forêts (Alsace, France) produced about 7200 micro-earthquakes with duration magnitudes ranging from -0.9 to 2.5. The earthquakes were located using downhole and surface seismic stations. We present here a comparison between three tomographic studies: 1) the “traditional” seismic tomography of Cuenot et al. (2008); 2) a double-difference tomography using the TomoDD code of Zhang and Thurber (2003); and 3) the models obtained by applying the Weighted Average Model method (WAM, Calo’ et al., 2009). The velocity models were obtained using the same dataset recorded during the stimulation. The WAM technique produces a more reliable reconstruction of the structures around and above the cluster of earthquakes, as demonstrated by the distribution of the velocity standard deviations. Although the velocity distributions obtained by the three tomographic approaches are qualitatively similar, the WAM results correlate better with independent data (such as the fracturing directions measured in the boreholes and the location of the clustered seismicity) than those of the traditional and DD tomographies.
    In geothermal sites, the elastic characteristics of the volume at rest, i.e. during the inter-injection periods, are often poorly known. To overcome the limits of LET during these periods, we plan to perform a seismic noise tomography study.

  1. SEGY to ASCII: Conversion and Plotting Program

    USGS Publications Warehouse

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program to convert standard 4-byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded to any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
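    The heart of such a conversion is translating IBM single-precision samples to native floats. The report's C++ source is not reproduced here; the following is an independent sketch of the standard IBM System/360 format (one sign bit, a 7-bit base-16 exponent biased by 64, and a 24-bit fraction).

```python
def ibm32_to_float(raw: bytes) -> float:
    """Decode a 4-byte big-endian IBM single-precision float:
    value = (-1)^sign * 0.fraction * 16^(exponent - 64)."""
    u = int.from_bytes(raw, "big")
    sign = -1.0 if u >> 31 else 1.0
    exponent = (u >> 24) & 0x7F                    # 7-bit base-16 exponent, bias 64
    fraction = (u & 0x00FFFFFF) / float(1 << 24)   # 24-bit fraction in [0, 1)
    return sign * fraction * 16.0 ** (exponent - 64)

# 0x42640000 encodes 100.0 in IBM single precision:
# exponent 0x42 = 66, fraction 0x640000/2^24 = 0.390625, 0.390625 * 16^2 = 100.
```

    Applying this decoder to each 4-byte sample of a trace, after the 240-byte trace header, yields the amplitude values written out as ASCII xyz.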

  2. Geology’s “Super Graphics” and the Public: Missed Opportunities for Geoscience Education

    NASA Astrophysics Data System (ADS)

    Clary, R. M.; Wandersee, J. H.

    2009-12-01

    The geosciences are very visual, as demonstrated by the illustration density of maps, graphs, photographs, and diagrams in introductory textbooks. As geoscience students progress, they are further exposed to advanced graphics, such as phase diagrams and subsurface seismic data visualizations. Photographs provide information from distant sites, while multivariate graphics supply a wealth of data for viewers to access. When used effectively, geology graphics have exceptional educational potential. However, geological graphic data are often presented in specialized formats, and are not easily interpreted by an uninformed viewer. In the Howe-Russell Geoscience Complex at Louisiana State University, there is a very large graphic (~30 ft × 6 ft) exhibited in a side hall, immediately off the main entrance hall. The graphic, divided into two obvious parts, displays in its lower section seismic data procured in the Gulf of Mexico, from near offshore Louisiana to the end of the continental shelf. The upper section of the graphic reveals drilling block information along the seismic line. Using Tufte’s model of graphic excellence and Paivio’s dual-coding theory, we analyzed the graphic in terms of data density, complexity, legibility, format, and multivariate presentation. We also observed viewers at the site on 5 occasions, and recorded their interactions with the graphic. This graphic can best be described as a Tufte “super graphic.” Its data are high in density and multivariate in nature. Various data sources are combined in a large format to provide a powerful example of a multitude of information within a convenient and condensed presentation. However, our analysis revealed that the graphic misses an opportunity to educate the non-geologist. The information and seismic “language” of the graphic is specific to the geology community, and the information is not interpreted for the lay viewer. The absence of a title, descriptions, and symbol keys is detrimental.
Terms are not defined. The absence of color keys and annotations is more likely to lead to an appreciation of graphic beauty, without concomitant scientific understanding. We further concluded that in its current location, constraints of space and reflective lighting prohibit the viewer from simultaneously accessing all subsurface data in a “big picture” view. The viewer is not able to fully comprehend the macro/micro aspects of the graphic design within the limited viewing space. The graphic is an example of geoscience education possibility, a possibility that is currently undermined and unrealized by lack of interpretation. Our analysis subsequently informed the development of a model to maximize the graphic’s educational potential, which can be applied to similar geological super graphics for enhanced public scientific understanding. Our model includes interactive displays that apply the auditory-visual dual coding approach to learning. Notations and aural explanations for geological features should increase viewer understanding, and produce an effective informal educational display.

  3. Seismpol_ a visual-basic computer program for interactive and automatic earthquake waveform analysis

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio

    1997-11-01

    A Microsoft Visual-Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition method (CMD), so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation arising from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain, decomposing a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
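    The CMD approach described above rests on eigen-decomposition of the 3-component covariance matrix in a sliding window. A minimal NumPy sketch of one window's polarization attributes (an illustration of the standard technique, not the authors' Visual-Basic code):

```python
import numpy as np

def polarization(window):
    """window: array of shape (n_samples, 3) holding Z, N, E samples.
    Returns the rectilinearity and the principal polarization direction."""
    c = np.cov(window, rowvar=False)       # 3x3 covariance matrix of the window
    vals, vecs = np.linalg.eigh(c)         # eigenvalues in ascending order
    l1, l2, l3 = vals[2], vals[1], vals[0]
    rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)  # 1 = purely linear motion
    return rectilinearity, vecs[:, 2]      # eigenvector of the largest eigenvalue
```

    A rectilinearity near 1 flags body-wave particle motion, and the principal eigenvector's horizontal projection gives the source azimuth estimate mentioned in the abstract.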

  4. Path and site effects deduced from merged transfrontier internet macroseismic data of two recent M4 earthquakes in northwest Europe using a grid cell approach

    NASA Astrophysics Data System (ADS)

    Van Noten, Koen; Lecocq, Thomas; Sira, Christophe; Hinzen, Klaus-G.; Camelbeeck, Thierry

    2017-04-01

    The online collection of earthquake reports in Europe is strongly fragmented across numerous seismological agencies. This paper demonstrates how collecting and merging online institutional macroseismic data strongly improves the density of observations and the quality of intensity shaking maps. Instead of using ZIP code Community Internet Intensity Maps, we geocode individual response addresses for location improvement, assign intensities to grouped answers within 100 km2 grid cells, and generate intensity attenuation relations from the grid cell intensities. Grid cell intensity maps are less subjective and illustrate a more homogeneous intensity distribution than communal ZIP code intensity maps. Using grid cells for ground motion analysis offers an advanced method for exchanging transfrontier equal-area intensity data without sharing any personal information. The applicability of the method is demonstrated on the felt responses of two clearly felt earthquakes: the 8 September 2011 ML 4.3 (Mw 3.7) Goch (Germany) and the 22 May 2015 ML 4.2 (Mw 3.7) Ramsgate (UK) earthquakes. Both events resulted in a non-circular distribution of intensities which is not explained by geometrical amplitude attenuation alone but illustrates an important low-pass filtering due to the sedimentary cover above the Anglo-Brabant Massif and in the Lower Rhine Graben. Our study illustrates the effect of increasing bedrock depth on intensity attenuation and the importance of the WNW-ESE Caledonian structural axis of the Anglo-Brabant Massif for seismic wave propagation. Seismic waves are less attenuated - high Q - along the strike of a tectonic structure but are more strongly attenuated - low Q - perpendicular to this structure, particularly when they cross rheologically different seismotectonic units separated by crustal-rooted faults.
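    The grid-cell aggregation step can be sketched as follows. Coordinates are assumed to be already projected to kilometres (the paper geocodes individual response addresses first), and the mean is used as the cell aggregate purely for illustration.

```python
from collections import defaultdict

CELL_KM = 10.0  # 10 km x 10 km cells, i.e. 100 km^2 as in the paper

def grid_cell_intensities(responses):
    """responses: iterable of (x_km, y_km, intensity) in a projected frame.
    Groups individual felt reports into equal-area cells and returns
    {cell index: aggregated intensity}, with no personal data retained."""
    cells = defaultdict(list)
    for x, y, intensity in responses:
        cells[(int(x // CELL_KM), int(y // CELL_KM))].append(intensity)
    return {cell: sum(v) / len(v) for cell, v in cells.items()}
```

    Because only cell indices and aggregated intensities survive, such grids can be exchanged across borders without sharing any respondent addresses, which is the point the authors make.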

  5. What defines an Expert? - Uncertainty in the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Bond, C. E.

    2008-12-01

    Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice and expert witness testimonies. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences, expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based for example on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here, traditional indicators of expertise have been compared for their significance to effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of a seismic interpretation exercise, the techniques used to interpret the seismic data and the participants' prior experience have been combined and analysed to answer the question: who is, and what defines, an expert?

  6. Seismic risk assessment and application in the central United States

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic risk is a somewhat subjective, but important, concept in earthquake engineering and other related decision-making. Another important concept that is closely related to seismic risk is seismic hazard. Although seismic hazard and seismic risk have often been used interchangeably, they are fundamentally different: seismic hazard describes the natural phenomenon or physical property of an earthquake, whereas seismic risk describes the probability of loss or damage that could be caused by a seismic hazard. The distinction between seismic hazard and seismic risk is of practical significance because measures for seismic hazard mitigation may differ from those for seismic risk reduction. Seismic risk assessment is a complicated process and starts with seismic hazard assessment. Although probabilistic seismic hazard analysis (PSHA) is the most widely used method for seismic hazard assessment, recent studies have found that PSHA is not scientifically valid. Use of PSHA will lead to (1) artifact estimates of seismic risk, (2) misleading use of the annual probability of exceedance (i.e., the probability of exceedance in one year) as a frequency (per year), and (3) numerical creation of extremely high ground motion. An alternative approach, similar to those used for flood and wind hazard assessments, has been proposed. © 2011 ASCE.
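    The distinction the author draws between an annual probability of exceedance and a frequency (per year) is easy to illustrate under a Poisson occurrence model, where the two nearly coincide only for rare events:

```python
import math

def annual_prob_exceedance(rate_per_year: float) -> float:
    """Probability of at least one exceedance in one year, assuming
    Poisson occurrence with the given annual rate (frequency)."""
    return 1.0 - math.exp(-rate_per_year)

# For rare events probability ~ rate: 0.01/yr -> p ~ 0.00995.
# For frequent events they diverge: 2.0/yr -> p ~ 0.865, not 2 --
# a probability is bounded by 1, a frequency is not.
```

    This is a generic textbook relation offered to make the abstract's point concrete, not a calculation taken from the cited paper.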

  7. High lateral resolution exploration using surface waves from noise records

    NASA Astrophysics Data System (ADS)

    Chávez-García, Francisco José; Yokoi, Toshiaki

    2016-04-01

    Determination of the shear-wave velocity structure at shallow depths is a constant necessity in engineering or environmental projects. Given the sensitivity of Rayleigh waves to shear-wave velocity, subsoil structure exploration using surface waves is frequently used. Methods such as the spectral analysis of surface waves (SASW) or multi-channel analysis of surface waves (MASW) determine phase velocity dispersion from surface waves generated by an active source recorded on a line of geophones. Using MASW, it is important that the receiver array be as long as possible to increase the precision at low frequencies. However, this implies that possible lateral variations are discarded. Hayashi and Suzuki (2004) proposed a different way of stacking shot gathers to increase lateral resolution. They combined strategies used in MASW with the common mid-point (CMP) summation currently used in reflection seismology. In their common mid-point with cross-correlation method (CMPCC), they cross-correlate traces sharing CMP locations before determining phase velocity dispersion. Another recent approach to subsoil structure exploration is based on seismic interferometry. It has been shown that cross-correlation of a diffuse field, such as seismic noise, allows the estimation of the Green's Function between two receivers. Thus, a virtual-source seismic section may be constructed from the cross-correlation of seismic noise records obtained in a line of receivers. In this paper, we use the seismic interferometry method to process seismic noise records obtained in seismic refraction lines of 24 geophones, and analyse the results using CMPCC to increase the lateral resolution of the results. Cross-correlation of the noise records allows reconstructing seismic sections with virtual sources at each receiver location. The Rayleigh wave component of the Green's Functions is obtained with a high signal-to-noise ratio. 
Using CMPCC analysis of the virtual-source seismic lines, we are able to identify lateral variations of phase velocity inside the seismic line, and increase the lateral resolution compared with results of conventional analysis.
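    The interferometric step described above, cross-correlating noise records between receiver pairs to retrieve an empirical Green's function, can be sketched with NumPy. This is an illustration of the principle, not the authors' processing chain:

```python
import numpy as np

def noise_correlation(u1, u2, max_lag):
    """Normalized cross-correlation of two equal-length noise records,
    returned for lags in [-max_lag, max_lag] (samples). Stacking many
    such windows approximates the inter-receiver Green's function."""
    full = np.correlate(u1, u2, mode="full")
    mid = len(u1) - 1                       # index of zero lag
    lags = np.arange(-max_lag, max_lag + 1)
    cc = full[mid - max_lag: mid + max_lag + 1]
    norm = np.sqrt(np.dot(u1, u1) * np.dot(u2, u2))
    return lags, cc / norm
```

    If the wavefield reaches one receiver a few samples after the other, the correlation peaks at that lag, which is how the virtual-source travel times between geophones are read off.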

  8. Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory

    NASA Astrophysics Data System (ADS)

    Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi

    2018-03-01

    With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are getting bigger and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the available information which can be used for event location. MCM does not require phase picking or phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM leads to improved source energy focusing. We have tested and compared MCM to other migration-based methods in noise-free and noisy synthetic data. The tests and analysis show that MCM is noise resistant and can achieve more accurate results compared with other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and is able to obtain reasonable location results with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized giving it large potential to be developed as a real-time location method for very large datasets.
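    The essence of migrating waveform coherency can be sketched as follows: for one candidate source position and origin time, windows are extracted at the predicted arrival times and the correlation coefficients of all receiver pairs are stacked. Scanning this value over a grid of locations and origin times locates the event. This is a toy illustration of the principle, not the authors' implementation:

```python
import numpy as np

def mcm_coherency(data, traveltimes, t0_idx, win):
    """Stacked pairwise correlation coefficient for one candidate source.
    data: (n_receivers, n_samples); traveltimes: predicted arrival offsets
    in samples for the candidate location; t0_idx: candidate origin time."""
    n = len(traveltimes)
    segs = np.array([data[i, t0_idx + traveltimes[i]: t0_idx + traveltimes[i] + win]
                     for i in range(n)])
    segs = segs - segs.mean(axis=1, keepdims=True)
    norms = np.linalg.norm(segs, axis=1)
    coh, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            coh += np.dot(segs[i], segs[j]) / (norms[i] * norms[j])
            pairs += 1
    return coh / pairs                      # ~1 when windows are coherent
```

    Because no phase picks are needed, only predicted travel times, the scan is fully automatic, which is the property the abstract emphasizes.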

  9. Development of a dynamic coupled hydro-geomechanical code and its application to induced seismicity

    NASA Astrophysics Data System (ADS)

    Miah, Md Mamun

    This research describes the importance of hydro-geomechanical coupling in the geologic subsurface environment, with applications to fluid injection at geothermal plants, large-scale geological CO2 sequestration for climate mitigation, enhanced oil recovery, and hydraulic fracturing during well construction in the oil and gas industries. A sequential computational code is developed to capture the multiphysics interaction behavior by linking the flow simulation code TOUGH2 and the geomechanics modeling code PyLith. The numerical formulation of each code is discussed to demonstrate its modeling capabilities. The computational framework involves sequential coupling and solution of two sub-problems: fluid flow through fractured and porous media, and reservoir geomechanics. For each time step of the flow calculation, the pressure field is passed to the geomechanics code to compute the effective stress field and fault slip. A simplified permeability model is implemented in the code that accounts for the permeability of porous and saturated rocks subject to confining stresses. The accuracy of the TOUGH-PyLith coupled simulator is tested by simulating Terzaghi's 1D consolidation problem. The coupled poroelastic modeling capability is validated by benchmarking it against Mandel's problem. The code is used to simulate both quasi-static and dynamic earthquake nucleation and slip distribution on a fault under the combined effect of far-field tectonic loading and fluid injection, using an appropriate fault constitutive friction model. Results from the quasi-static induced earthquake simulations show a delayed response in earthquake nucleation. This is attributed to the increased total stress in the domain and to not accounting for pressure on the fault. However, this issue is resolved in the final chapter by simulating a single-event earthquake dynamic rupture. Simulation results show that fluid pressure has a positive effect on slip nucleation and subsequent crack propagation.
This is confirmed by running a sensitivity analysis that shows an increase in injection well distance results in delayed slip nucleation and rupture propagation on the fault.
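    The sequential (loosely coupled) scheme described, a flow solve whose updated pressure feeds an effective-stress update, can be caricatured in one dimension. This toy stands in for TOUGH2 and PyLith with an explicit diffusion step and a linear effective-stress law (σ' = σ − αp), purely to show the data flow between the two sub-problems:

```python
import numpy as np

def sequential_step(p, sigma_total, alpha, dt, D, dx):
    """One sequentially coupled step, toy 1D version:
    1) flow sub-problem: explicit pressure diffusion (TOUGH2's role);
    2) mechanics sub-problem: effective stress from the new pressure
       (PyLith's role), sigma_eff = sigma_total - alpha * p."""
    lap = np.zeros_like(p)
    lap[1:-1] = (p[2:] - 2.0 * p[1:-1] + p[:-2]) / dx**2
    p_new = p + dt * D * lap                 # pressure update
    sigma_eff = sigma_total - alpha * p_new  # Biot-type effective stress
    return p_new, sigma_eff
```

    In the real coupled simulator the mechanics solve also feeds back into porosity and permeability; the one-way hand-off above only illustrates the per-time-step exchange of the pressure field.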

  10. Seismic risk assessment of Navarre (Northern Spain)

    NASA Astrophysics Data System (ADS)

    Gaspar-Escribano, J. M.; Rivas-Medina, A.; García Rodríguez, M. J.; Benito, B.; Tsige, M.; Martínez-Díaz, J. J.; Murphy, P.

    2009-04-01

    The RISNA project, financed by the Emergency Agency of Navarre (Northern Spain), aims at assessing the seismic risk of the entire region. The final goal of the project is the definition of emergency plans for future earthquakes. With this purpose, four main topics are covered: seismic hazard characterization, geotechnical classification, vulnerability assessment, and damage estimation for structures and exposed population. A geographic information system is used to integrate, analyze and represent all information collected in the different phases of the study. Expected ground motions on rock conditions with a 90% probability of non-exceedance in an exposure time of 50 years are determined following a Probabilistic Seismic Hazard Assessment (PSHA) methodology that includes a logic tree with different ground motion and source zoning models. As the region under study is located on the boundary between Spain and France, an effort is required to collect and homogenise seismological data from different national and regional agencies. A new homogenised seismic catalogue, merging data from Spanish, French, Catalonian and international agencies and establishing correlations between different magnitude scales, is developed. In addition, a new seismic zoning model focused on the study area is proposed. Results show that the highest ground motions on rock conditions are expected in the northeastern part of the region, decreasing southwards. The seismic hazard can be described as low-to-moderate. A geotechnical classification of the entire region is developed based on surface geology, available borehole data and morphotectonic constraints. Frequency-dependent amplification factors, consistent with code values, are proposed. The northern and southern parts of the region are characterized by stiff and soft soils respectively, with the softest soils located along river valleys.
    Seismic hazard maps including soil effects are obtained by applying these factors to the seismic hazard maps on rock conditions (for the same probability level). Again, the highest hazard is found in the northeastern part of the region, and the lowest hazard is obtained along major river valleys. The vulnerability assessment of the Navarre building stock is accomplished using as proxies a combination of building age, location, number of floors and the implementation of building codes. Field surveys help constrain the extent of traditional and technological construction types. The vulnerability characterization is carried out following three methods: the European Macroseismic Scale (EMS 98), the RISK-UE vulnerability index, and the capacity spectrum method implemented in Hazus. Vulnerability distribution maps for each municipality of Navarre are provided, adapted to the EMS98 vulnerability classes. The vulnerability of Navarre is medium to high, except for recent urban, highly populated developments. For each vulnerability class and expected ground motion, the damage distribution is estimated by means of damage probability matrices. Several damage indexes, covering relative and absolute damage estimates, are used. The expected average damage is low. Whereas the largest numbers of damaged structures are found in big cities, the highest percentages are obtained in some municipalities of northeastern Navarre. Additionally, the expected percentages and numbers of persons affected by earthquake damage are calculated for each municipality. The expected numbers of affected people are low, reflecting the low expected damage degree.

  11. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismically prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes prompted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis.
    The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings. With modification of input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated for a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard-compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km, with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
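    Damage-state fragility functions of the kind developed here are conventionally lognormal in the intensity measure. A minimal sketch follows; the median and dispersion values in the test are hypothetical placeholders, not parameters derived in the thesis.

```python
import math

def fragility(im: float, median: float, beta: float) -> float:
    """Lognormal fragility: probability of reaching or exceeding a damage
    state given an intensity measure im (e.g. spectral acceleration, g).
    median: IM at 50% exceedance probability; beta: lognormal dispersion."""
    z = math.log(im / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
```

    Propagating the combined capacity-demand uncertainty, as the abstract describes, amounts to inflating beta, which flattens the fragility curve without moving its median.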

  12. Seismicity of the Wabash Valley, Ste. Genevieve, and Rough Creek Graben Seismic Zones from the Earthscope Ozarks-Illinois-Indiana-Kentucky (OIINK) FlexArray Experiment

    NASA Astrophysics Data System (ADS)

    Shirley, Matthew Richard

    I analyzed seismic data from the Ozarks-Illinois-Indiana-Kentucky (OIINK) seismic experiment that operated in eastern Missouri, southern Illinois, southern Indiana, and Kentucky from July 2012 through March 2015. A product of this analysis is a new catalog of earthquake locations and magnitudes for small-magnitude local events during this study period. The analysis included a pilot study involving detailed manual analysis of all events in a ten-day test period and determination of the best parameters for a suite of automated detection and location programs. I eliminated events that were not earthquakes (mostly quarry and surface mine blasts) from the output of the automated programs, and reprocessed the locations for the earthquakes with manually picked P- and S-wave arrivals. This catalog consists of earthquake locations, depths, and local magnitudes. The new catalog consists of 147 earthquake locations, including 19 located within the bounds of the OIINK array. Of these events, 16 were newly reported events, too small to be reported in the Center for Earthquake Research and Information (CERI) regional seismic network catalog. I compared the magnitudes reported by CERI for corresponding earthquakes to establish a magnitude calibration factor for all earthquakes recorded by the OIINK array. With the calibrated earthquake magnitudes, I incorporated the previous OIINK results from Yang et al. (2014) to create magnitude-frequency distributions for the seismic zones in the region alongside the magnitude-frequency distributions made from CERI data. These show that the Ste. Genevieve and Wabash Valley seismic zones experience seismic activity at an order of magnitude lower rate than the New Madrid seismic zone, and the Rough Creek Graben experiences seismic activity two orders of magnitude less frequently than New Madrid.
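    The magnitude-frequency distributions compared here follow the Gutenberg-Richter relation log10 N(≥M) = a − bM. A least-squares sketch of fitting it to a catalog, illustrative only (the thesis' processing is more involved, and a maximum-likelihood b-value estimate is usually preferred):

```python
import numpy as np

def gutenberg_richter_fit(magnitudes, m_min, dm=0.1):
    """Least-squares fit of log10 N(>=M) = a - b*M to the cumulative
    magnitude-frequency distribution of a catalog, above completeness m_min."""
    mags = np.asarray(magnitudes)
    mags = mags[mags >= m_min]
    bins = np.arange(m_min, mags.max(), dm)
    counts = np.array([(mags >= m).sum() for m in bins])
    keep = counts >= 10                    # fit only well-populated bins
    slope, intercept = np.polyfit(bins[keep], np.log10(counts[keep]), 1)
    return intercept, -slope               # a, b
```

    Comparing the fitted a-values between zones (at a common b) quantifies the order-of-magnitude rate differences the abstract reports.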

  13. High resolution seismic tomography imaging of Ireland with quarry blast data

    NASA Astrophysics Data System (ADS)

    Arroucau, P.; Lebedev, S.; Bean, C. J.; Grannell, J.

    2017-12-01

    Local earthquake tomography is a well-established tool to image geological structure at depth. That technique, however, is difficult to apply in slowly deforming regions, where local earthquakes are typically rare and of small magnitude, resulting in sparse data sampling. The natural earthquake seismicity of Ireland is very low. Seismicity due to quarry and mining blasts, on the other hand, is high and homogeneously distributed. As a consequence, and thanks to the dense and nearly uniform coverage achieved in the past ten years by temporary and permanent broadband seismological stations, quarry blasts offer an alternative approach for high-resolution seismic imaging of the crust and uppermost mantle beneath Ireland. We detected about 1,500 quarry blasts in Ireland and Northern Ireland between 2011 and 2014, for which we manually picked more than 15,000 P- and 20,000 S-wave first-arrival times. The anthropogenic, explosive origin of those events was unambiguously assessed based on location, occurrence time, and waveform characteristics. Here, we present a preliminary 3D tomographic model obtained from the inversion of 3,800 P-wave arrival times associated with a subset of 500 events observed in 2011, using the FMTOMO tomographic code. Forward modeling is performed with the Fast Marching Method (FMM) and the inverse problem is solved iteratively using a gradient-based subspace inversion scheme after careful selection of damping and smoothing regularization parameters. The results illuminate the geological structure of Ireland from deposit to crustal scale in unprecedented detail, as demonstrated by sensitivity analysis, source relocation with the 3D velocity model, and comparisons with surface geology.

  14. Evaluation of Sloped Bottom Tuned Liquid Damper for Reduction of Seismic Response of Tall Buildings

    NASA Astrophysics Data System (ADS)

    Patil, G. R.; Singh, K. D.

    2016-12-01

    Due to migration of people to urban areas, high land costs, and the use of lightweight materials, modern buildings tend to be taller, lighter, and more flexible. These buildings possess low damping, which increases the possibility of failure during earthquake ground motion and also affects serviceability under wind vibrations. Among the many techniques available today to reduce the response of structures under dynamic loading, the Tuned Liquid Damper (TLD) is a recent technique for mitigating seismic response, although TLDs have traditionally been used to mitigate wind-induced structural vibrations. A flat-bottom TLD gives energy back to the structure after an event of dynamic loading, a phenomenon termed beating, which degrades TLD performance. This study analyzes the effectiveness of a sloped-bottom TLD for reducing seismic vibrations of a structure. The concept of an equivalent flat-bottom TLD is used to analyze the sloped-bottom TLD. The finite element method (FEM) is used to model the structure and the liquid in the TLD. A MATLAB code is developed to study the response of the structure, the liquid sloshing in the tank, and the coupled fluid-structure interaction. A ten-storey, two-bay RC frame is analyzed for a few input ground motions, including a sinusoidal ground motion corresponding to resonance with the fundamental frequency of the frame. The inherent damping of the structure is not considered in the analysis. Observations from the study show that the sloped-bottom TLD uses a smaller amount of liquid than the flat-bottom TLD, and that the efficiency of the sloped-bottom TLD can be improved if it is properly tuned.
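    Tuning a TLD means matching the liquid's fundamental sloshing frequency to the structure's fundamental frequency. For a rectangular tank, linear wave theory gives that frequency in closed form; the helper below is a generic illustration of this tuning calculation, not the study's MATLAB code.

    ```python
    import math

    def sloshing_frequency(length_m, depth_m, g=9.81):
        """Fundamental sloshing frequency (Hz) of liquid in a rectangular
        tank from linear wave theory: f = sqrt(g*k*tanh(k*h)) / (2*pi),
        with first-mode wavenumber k = pi / L. Tuning a TLD matches this
        frequency to the structure's fundamental frequency."""
        k = math.pi / length_m                      # first-mode wavenumber
        return math.sqrt(g * k * math.tanh(k * depth_m)) / (2.0 * math.pi)
    ```

    Because the frequency depends on both tank length and water depth, a shallower fill lowers the sloshing frequency, which is how a given tank is tuned to a given structure.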

  15. Cost optimization of reinforced concrete cantilever retaining walls under seismic loading using a biogeography-based optimization algorithm with Levy flights

    NASA Astrophysics Data System (ADS)

    Aydogdu, Ibrahim

    2017-03-01

    In this article, a new version of a biogeography-based optimization algorithm with Levy flight distribution (LFBBO) is introduced and used for the optimum design of reinforced concrete cantilever retaining walls under seismic loading. The cost of the wall is taken as an objective function, which is minimized under the constraints implemented by the American Concrete Institute (ACI 318-05) design code and geometric limitations. The influence of peak ground acceleration (PGA) on optimal cost is also investigated. The solution of the problem is attained by the LFBBO algorithm, which is developed by adding Levy flight distribution to the mutation part of the biogeography-based optimization (BBO) algorithm. Five design examples, two of which are taken from the literature, are optimized in the study. The results are compared to test the performance of the LFBBO and BBO algorithms, and to determine the influence of the seismic load and PGA on the optimal cost of the wall.
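    The Levy-flight ingredient of an LFBBO-style scheme draws mutation steps from a heavy-tailed stable distribution, so that most steps are small but occasional long jumps escape local optima. A common way to generate such steps is Mantegna's algorithm; this is a generic sketch of that generator, not the article's implementation.

    ```python
    import math
    import numpy as np

    def levy_steps(n, beta=1.5, rng=None):
        """Levy-flight step sizes via Mantegna's algorithm: step = u / |v|^(1/beta)
        with u ~ N(0, sigma^2), v ~ N(0, 1) and sigma chosen so the steps follow
        a stable distribution with index beta (heavy tails)."""
        rng = np.random.default_rng(rng)
        num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
        den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
        sigma = (num / den) ** (1 / beta)
        u = rng.normal(0.0, sigma, n)
        v = rng.normal(0.0, 1.0, n)
        return u / np.abs(v) ** (1 / beta)
    ```

    In a BBO mutation, a design variable would be perturbed by one such step scaled to the variable's range.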

  16. An integrated approach to characterization of fractured reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Datta-Gupta, A.; Majer, E.; Vasco, D.

    1995-12-31

    This paper summarizes an integrated hydrologic and seismic characterization of a fractured limestone formation at the Conoco Borehole Test Facility (CBTF) in Kay County, Oklahoma. Transient responses from pressure interference tests were first inverted in order to identify the location and orientation of dominant fractures at the CBTF. Subsequently, high resolution (1000 to 10000 Hz) cross-well and single-well seismic surveys were conducted to verify the preferential flow paths indicated by hydrologic analysis. Seismic surveys were conducted before and after an air injection in order to increase the visibility of the fracture zone to seismic imaging. Both seismic and hydrologic analyses were found to yield consistent results in detecting the location of a major fracture zone.

  17. High precision gas hydrate imaging of small-scale and high-resolution marine sparker multichannel seismic data

    NASA Astrophysics Data System (ADS)

    Luo, D.; Cai, F.

    2017-12-01

    Small-scale, high-resolution marine multi-channel seismic surveys using large-energy sparkers are characterized by a high dominant source frequency, wide bandwidth, and high resolution. The technology was designed to improve the imaging quality of shallow sediments. In this study, a 20 kJ sparker and a 24-channel streamer cable with a 6.25 m group interval were used as the seismic source and receiver system, respectively. Key factors for seismic imaging of gas hydrate are enhancement of the signal-to-noise (S/N) ratio, amplitude compensation, and detailed velocity analysis. However, the data in this study have several limiting characteristics: (1) small maximum offsets, which are adverse to velocity analysis and multiple attenuation; (2) a lack of low-frequency information, with frequencies below 100 Hz effectively absent; and (3) a low S/N ratio due to low fold (only 12). These characteristics make it difficult to reach the targets of seismic imaging, so targeted processing methods are used to improve the imaging quality of gas hydrate. First, several noise-suppression technologies are combined on the pre-stack seismic data to suppress noise and improve the S/N ratio, including a spectrum-sharing noise-elimination method, median filtering, and an exogenous-interference suppression method. Second, a combination of three technologies, SRME, τ-p deconvolution, and high-precision Radon transformation, is used to remove multiples. Third, an accurate velocity field is used in amplitude energy compensation to highlight the Bottom Simulating Reflector (BSR, the indicator of gas hydrates) and gas migration pathways (such as gas chimneys and hot spots). Fourth, fine velocity analysis is used to improve the accuracy of the velocity model. Fifth, pre-stack deconvolution is used to compensate for low-frequency energy and suppress ghosts, so that formation reflection characteristics are highlighted. The results show that small-scale, high-resolution marine sparker multi-channel seismic surveys are more effective than conventional seismic acquisition in improving the resolution and quality of gas hydrate imaging.

  18. Using geologic maps and seismic refraction in pavement-deflection analysis

    DOT National Transportation Integrated Search

    1999-10-01

    The researchers examined the relationship between three data types -- geologic maps, pavement deflection, and seismic refraction data -- from diverse geologic settings to determine whether geologic maps and seismic data might be used to interpret def...

  19. The shallow elastic structure of the lunar crust: New insights from seismic wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-10-01

    Enigmatic lunar seismograms recorded during the Apollo 17 mission in 1972 have so far precluded the identification of shear-wave arrivals and hence the construction of a comprehensive elastic model of the shallow lunar subsurface. Here, for the first time, we extract shear-wave information from the Apollo active seismic data using a novel waveform analysis technique based on spatial seismic wavefield gradients. The star-like recording geometry of the active seismic experiment lends itself surprisingly well to computing spatial wavefield gradients and rotational ground motion as a function of time. These observables, which are new to seismic exploration in general, allowed us to identify shear waves in the complex lunar seismograms and to derive a new model of seismic compressional and shear-wave velocities in the shallow lunar crust, critical to understanding its lithology and constitution, and its impact on other geophysical investigations of the Moon's deep interior.
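    At its simplest, estimating spatial wavefield gradients from a small array amounts to a least-squares plane fit of the recorded amplitudes against station coordinates at each time sample; rotational motions are then combinations of these gradients. The sketch below is a generic illustration of that fitting step under those assumptions, not the authors' processing code.

    ```python
    import numpy as np

    def wavefield_gradients(coords, values):
        """Least-squares estimate of the horizontal spatial gradients
        (du/dx, du/dy) of one wavefield component u recorded at >= 3
        stations, by fitting the plane u = a + (du/dx)*x + (du/dy)*y.
        coords: list of (x, y) station positions in metres."""
        coords = np.asarray(coords, dtype=float)
        G = np.column_stack([np.ones(len(coords)), coords])  # [1, x, y]
        m, *_ = np.linalg.lstsq(G, np.asarray(values, dtype=float), rcond=None)
        return m[1], m[2]                                    # du/dx, du/dy
    ```

    Applying this at every time sample of each component yields gradient time series, from which quantities such as vertical rotation rate (a difference of cross-gradients) can be formed.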

  20. Seismic imaging of post-glacial sediments - test study before Spitsbergen expedition

    NASA Astrophysics Data System (ADS)

    Szalas, Joanna; Grzyb, Jaroslaw; Majdanski, Mariusz

    2017-04-01

    This work presents results of the analysis of reflection seismic data acquired in a test area in central Poland. For this experiment we used a total of 147 vertical-component seismic stations (DATA-CUBE and Reftek "Texan") with an accelerated weight drop (PEG-40) as the source. The profile was 350 metres long. The experiment is part of a pilot study for a future research project on Spitsbergen. The purpose of the study is to characterise the seismic response of post-glacial sediments in order to design the most adequate survey acquisition parameters and processing sequence for data from Spitsbergen. Multiple tests and comparisons have been performed to obtain the best possible quality of the seismic image. In this research we examine the influence of receiver interval size, front mute application, and surface-wave attenuation attempts. Although seismic imaging is the main technique, we plan to support this analysis with additional data from traveltime tomography, MASW, and other a priori information.

  1. Back analysis of fault-slip in burst prone environment

    NASA Astrophysics Data System (ADS)

    Sainoki, Atsushi; Mitri, Hani S.

    2016-11-01

    In deep underground mines, stress re-distribution induced by mining activities could cause fault-slip. Seismic waves arising from fault-slip occasionally induce rock ejection when hitting the boundary of mine openings, and as a result severe damage can be inflicted. In general, it is difficult to estimate fault-slip-induced ground motion in the vicinity of mine openings because of the complexity of the dynamic response of faults and the presence of geological structures. In this paper, a case study is conducted for a Canadian underground mine, herein called "Mine-A", which is known for its seismic activities. Using a microseismic database collected from the mine, a back analysis of fault-slip is carried out with mine-wide 3-dimensional numerical modeling to estimate the physical and mechanical properties of the causative fracture or shear zones. One large seismic event identified as fault-slip related has been selected for the back analysis. In the back analysis, the shear zone properties are estimated with respect to the moment magnitude of the seismic event and the peak particle velocity (PPV) recorded by a strong ground motion sensor. The estimated properties are then validated through comparison with peak ground acceleration recorded by accelerometers. Lastly, ground motion in active mining areas is estimated by conducting dynamic analysis with the estimated values. The present study implies that it would be possible to estimate the magnitude of seismic events that might occur in the near future by applying the estimated properties to the numerical model. Although the case study is conducted for a specific mine, the developed methodology can be equally applied to other mines suffering from fault-slip related seismic events.
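    For reference, the moment magnitude used to constrain such a back analysis relates to the scalar seismic moment through the standard IASPEI definition; this helper is illustrative and not part of the study's modeling workflow.

    ```python
    import math

    def moment_magnitude(m0_newton_metres):
        """Moment magnitude from scalar seismic moment (IASPEI standard):
        Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
        return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)
    ```

    Matching the modeled slip (and hence moment) of the shear zone to the observed Mw is one of the two constraints mentioned above, alongside the recorded PPV.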

  2. Induced seismicity in a salt mine environment evaluated by a coupled continuum-discrete modelling.

    NASA Astrophysics Data System (ADS)

    Mercerat, E.; Souley, M.; Driad, L.; Bernard, P.

    2005-12-01

    Within the framework of a research project launched to assess the feasibility of seismic monitoring of underground growing cavities, this work focuses on two complementary axes: the validation of seismic monitoring techniques in salt mine environments, and the numerical modelling of deformation and failure mechanisms together with their associated acoustic emissions, the induced microseismicity. The underground cavity under monitoring is located at Cerville (Lorraine, France) within a salt layer 180 m deep, and presents a rather regular cylindrical shape of 100 m diameter. The overburden is characterized by the presence of two competent layers with elasto-brittle behaviour located 50 m above the salt layer. When the salt exploitation restarts, the cavity will progressively grow, causing irreversible damage to the upper layers until its final collapse on a time scale of the order of one year. Numerical modelling of such a complex process requires a large-scale model that takes into account both the growing cavity within the salt layer and the mechanical behaviour of the overburden, where high deformation and fracturing are expected. To capture the elasto-brittle behaviour of the competent layers where most seismic damage is expected, we use the PFC code (Itasca Cons.). For the other layers (mainly composed of marls and salt), which present more ductile and/or viscoplastic behaviour, a continuum approach based on the FLAC code (Itasca Cons.) is employed. Numerous calibration processes were needed to estimate the microproperties used in PFC to reproduce the macroscopic behaviour observed in laboratory tests performed on samples extracted from the competent layers. Because the size of the PFC inclusion representing the brittle material is much larger than the core sample sizes, the scale effect of the microproperties is examined. The next stage is to perform calculations on the basis of the previous macroscopic and microproperty calibration results, and to compare them with the observed microseismicity in the rock mass.

  3. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data, and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational environments and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially for helping to analyze the EarthScope USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well-established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect, with geological models that are accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, the number of time steps to compute, and the number of stations. By enabling portal-based access to such a computational environment, coupled with a dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities. Our system can be used to promote an efficient and effective modeling environment to help scientists as well as educators in their daily activities and speed up the scientific discovery process.

  4. Bayesian identification of multiple seismic change points and varying seismic rates caused by induced seismicity

    NASA Astrophysics Data System (ADS)

    Montoya-Noguera, Silvana; Wang, Yu

    2017-04-01

    The Central and Eastern United States (CEUS) has experienced an abnormal increase in seismic activity, which is believed to be related to anthropogenic activities. The U.S. Geological Survey has acknowledged this situation and developed the CEUS 2016 1 year seismic hazard model using the catalog of 2015, assuming stationary seismicity in that period. However, due to the nonstationary nature of induced seismicity, it is essential to identify change points for accurate probabilistic seismic hazard analysis (PSHA). We present a Bayesian procedure to identify the most probable change points in seismicity and define their respective seismic rates. It uses prior distributions in agreement with conventional PSHA and updates them with recent data to identify seismicity changes. It can determine change points at a regional scale and may incorporate different types of information in an objective manner. It is first successfully tested with simulated data, and then it is used to evaluate Oklahoma's regional seismicity.
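    The core of any change-point procedure is comparing how well different split points explain the observed event counts. As a simplified, non-Bayesian analogue of the procedure described above (illustrative only, not the authors' method), a two-rate Poisson profile likelihood over candidate split points looks like this:

    ```python
    import numpy as np

    def poisson_change_point(counts):
        """Most probable single change point in a sequence of per-period
        event counts, via the profile log-likelihood of a two-rate Poisson
        model (constant rate before and after the split)."""
        counts = np.asarray(counts, dtype=float)
        n = len(counts)
        best_k, best_ll = None, -np.inf
        for k in range(1, n):                      # split before index k
            ll = 0.0
            for seg in (counts[:k], counts[k:]):
                r = max(seg.mean(), 1e-9)          # MLE rate, guarded for zeros
                ll += np.sum(seg * np.log(r) - r)  # Poisson log-lik (no factorial)
            if ll > best_ll:
                best_k, best_ll = k, ll
        return best_k, best_ll
    ```

    A Bayesian version, as in the paper, would place priors on the rates and the change-point location and integrate rather than maximize, but the comparison across split points is the same idea.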

  5. Systematic detection and classification of earthquake clusters in Italy

    NASA Astrophysics Data System (ADS)

    Poli, P.; Ben-Zion, Y.; Zaliapin, I. V.

    2017-12-01

    We perform a systematic analysis of spatio-temporal clustering of 2007-2017 earthquakes in Italy with magnitudes m>3. The study employs the nearest-neighbor approach of Zaliapin and Ben-Zion [2013a, 2013b] with basic data-driven parameters. The results indicate that seismicity in Italy (an extensional tectonic regime) is dominated by clustered events, with a smaller proportion of background events than in California. Evaluation of internal cluster properties allows separation of swarm-like from burst-like seismicity. This classification highlights a strong geographical coherence of cluster properties. Swarm-like seismicity is dominant in regions characterized by relatively slow deformation with possible elevated temperature and/or fluids (e.g. Alto Tiberina, Pollino), while burst-like seismicity is observed in crystalline tectonic regions (Alps and Calabrian Arc) and in Central Italy where moderate to large earthquakes are frequent (e.g. L'Aquila, Amatrice). To better assess the variation of seismicity style across Italy, we also perform a clustering analysis with region-specific parameters. This analysis highlights clear spatial changes of the threshold separating background and clustered seismicity, and permits better resolution of different clusters in specific geological regions. For example, a large proportion of repeaters is found in the Etna region, as expected for volcanically induced seismicity. A similar behavior is observed in the northern Apennines, where high pore pressure is associated with mantle degassing. The observed variations of earthquake properties highlight shortcomings of practices using large-scale average seismic properties, and point to connections between seismicity and local properties of the lithosphere. The observations help to improve the understanding of the physics governing the occurrence of earthquakes in different regions.

  6. A new code for automatic detection and analysis of the lineament patterns for geophysical and geological purposes (ADALGEO)

    NASA Astrophysics Data System (ADS)

    Soto-Pinto, C.; Arellano-Baeza, A.; Sánchez, G.

    2013-08-01

    We present a new numerical method for automatic detection and analysis of changes in lineament patterns caused by seismic and volcanic activities. The method is implemented as a series of modules: (i) normalization of the image contrast, (ii) extraction of small linear features (stripes) through convolution of the part of the image in the vicinity of each pixel with a circular mask or through Canny algorithm, and (iii) posterior detection of main lineaments using the Hough transform. We demonstrate that our code reliably detects changes in the lineament patterns related to the stress evolution in the Earth's crust: specifically, a significant number of new lineaments appear approximately one month before an earthquake, while one month after the earthquake the lineament configuration returns to its initial state. Application of our software to the deformations caused by volcanic activity yields the opposite results: the number of lineaments decreases with the onset of microseismicity. This discrepancy can be explained assuming that the plate tectonic earthquakes are caused by the compression and accumulation of stress in the Earth's crust due to subduction of tectonic plates, whereas in the case of volcanic activity we deal with the inflation of a volcano edifice due to elevation of pressure and magma intrusion and the resulting stretching of the surface.
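    Step (iii) of the pipeline, extracting dominant lineaments with the Hough transform, can be sketched in plain numpy. Edge extraction (step ii, e.g. Canny) is assumed already done; this is a generic illustration of the voting scheme, not the ADALGEO code.

    ```python
    import numpy as np

    def hough_lines(edges, n_theta=180):
        """Minimal Hough transform: each edge pixel votes for every line
        rho = x*cos(theta) + y*sin(theta) passing through it; accumulator
        peaks correspond to dominant lineaments."""
        h, w = edges.shape
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        diag = int(np.ceil(np.hypot(h, w)))
        acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
        ys, xs = np.nonzero(edges)
        for y, x in zip(ys, xs):
            rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
            acc[rhos + diag, np.arange(n_theta)] += 1
        return acc, thetas, diag

    # Example: a synthetic vertical lineament along image column 5
    img = np.zeros((20, 20), dtype=bool)
    img[:, 5] = True
    acc, thetas, diag = hough_lines(img)
    rho_idx, th_idx = np.unravel_index(acc.argmax(), acc.shape)
    # peak at rho = rho_idx - diag = 5 and theta ~ 0 (a near-vertical line)
    ```

    Counting accumulator peaks before and after an event is then a natural way to quantify changes in the number of lineaments, as the abstract describes.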

  7. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: Altai-Sayan Region

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2017-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for an empirical relationship log10N(M, L) = A + B·(5 - M) + C·log10L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum credible magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, then map the corresponding expected ground shaking parameters (e.g. peak ground acceleration, PGA, or macro-seismic intensity). After rigorous testing against the available seismic evidence from the past (usually the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure (e.g., those based on a census of population, a buildings inventory, etc.). This USLE-based methodology of seismic hazard and risk assessment is applied to the territory of the Altai-Sayan Region of Russia. The study was supported by Russian Science Foundation Grant No. 15-17-30020.
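    The USLE relationship quoted above can be evaluated directly once A, B, and C have been fitted for a region. The coefficients used in this sketch are placeholders for illustration, not values fitted for the Altai-Sayan Region.

    ```python
    import math

    def usle_annual_number(M, L, A, B, C):
        """Expected annual number of earthquakes of magnitude M within an
        area of linear dimension L (km), from the USLE:
            log10 N(M, L) = A + B*(5 - M) + C*log10(L)."""
        return 10 ** (A + B * (5 - M) + C * math.log10(L))
    ```

    With B > 0, larger magnitudes are exponentially rarer (the Gutenberg-Richter behaviour), while C captures how the event count scales with the fractal dimension of the source-covering area.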

  8. Analysis of Magnitude Correlations in a Self-Similar model of Seismicity

    NASA Astrophysics Data System (ADS)

    Zambrano, A.; Joern, D.

    2017-12-01

    A recent model of seismicity that incorporates a self-similar Omori-Utsu relation, which is used to describe the temporal evolution of earthquake triggering, has been shown to provide a more accurate description of seismicity in Southern California when compared to epidemic type aftershock sequence models. Forecasting of earthquakes is an active research area where one of the debated points is whether magnitude correlations of earthquakes exist within real world seismic data. Prior to this work, the analysis of magnitude correlations of the aforementioned self-similar model had not been addressed. Here we present statistical properties of the magnitude correlations for the self-similar model along with an analytical analysis of the branching ratio and criticality parameters.

  9. Assessing the seismic risk potential of South America

    USGS Publications Warehouse

    Jaiswal, Kishor; Petersen, Mark D.; Harmsen, Stephen; Smoczyk, Gregory M.

    2016-01-01

    We present here a simplified approach to quantifying regional seismic risk. The seismic risk for a given region can be inferred in terms of average annual loss (AAL), which represents the long-term value of earthquake losses in any one year caused by long-term seismic hazard. AAL is commonly measured in the form of earthquake shaking-induced deaths, direct economic impacts, or indirect losses due to loss of functionality. In the context of the South American subcontinent, the analysis makes use of readily available public data on seismicity, population exposure, and the hazard and vulnerability models for the region. The seismic hazard model was derived using available seismic catalogs, fault databases, and hazard methodologies that are analogous to the U.S. Geological Survey's national seismic hazard mapping process. The Prompt Assessment of Global Earthquakes for Response (PAGER) system's direct empirical vulnerability functions in terms of fatality and economic impact were used for performing exposure and risk analyses. The broad findings presented and the risk maps produced herein are preliminary, yet they do offer important insights into the underlying zones of high and low seismic risk in the South American subcontinent. A more detailed analysis of risk may be warranted by engaging local experts, especially in some of the high-risk zones identified through the present investigation.

  10. Pattern Informatics Approach to Earthquake Forecasting in 3D

    NASA Astrophysics Data System (ADS)

    Toya, Y.; Tiampo, K. F.; Rundle, J. B.; Chen, C.; Li, H.; Klein, W.

    2009-05-01

    Natural seismicity is correlated across multiple spatial and temporal scales, but correlations in seismicity prior to a large earthquake are locally subtle (e.g. seismic quiescence) and often prominent at broad scale (e.g. seismic activation), resulting in local and regional seismicity patterns, e.g. a Mogi's donut. Recognizing that patterns in seismicity rate reflect the regional dynamics of the directly unobservable crustal stresses, the Pattern Informatics (PI) approach was introduced by Tiampo et al., 2002 [Europhys. Lett., 60 (3), 481-487] and Rundle et al., 2002 [PNAS 99, suppl. 1, 2514-2521]. In this study, we expand the PI approach to forecasting earthquakes into the third, or vertical, dimension, and illustrate the further improvement in forecasting performance through case studies of both natural and synthetic data. The PI characterizes rapidly evolving spatio-temporal seismicity patterns as angular drifts of a unit state vector in a high-dimensional correlation space, and systematically identifies anomalous shifts in seismic activity with respect to the regional background. 3D PI analysis is particularly advantageous over 2D analysis in resolving vertically overlapped seismicity anomalies in a highly complex tectonic environment. Case studies will help to illustrate some important properties of the PI forecasting tool. [Submitted to: Concurrency and Computation: Practice and Experience, Wiley, Special Issue: ACES2008.]

  11. Impacts of potential seismic landslides on lifeline corridors.

    DOT National Transportation Integrated Search

    2015-02-01

    This report presents a fully probabilistic method for regional seismically induced landslide hazard analysis and : mapping. The method considers the most current predictions for strong ground motions and seismic sources : through use of the U.S.G.S. ...

  12. LANL seismic screening method for existing buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.

  13. Spatial Distribution of Seismic Anisotropy in the Crust in the Northeast Front Zone of Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Wang, Q.; SHI, Y.

    2017-12-01

    There are orogenic belts and strong deformation in the northeastern zone of the Tibetan Plateau, where the media in the crust and upper mantle are seismically anisotropic. This study uses seismic records from permanent seismic stations and portable seismic arrays, and adopts body-wave analysis techniques to obtain the spatial anisotropic distribution in the northeastern front zone of the Tibetan Plateau. With seismic records of small local earthquakes, we study shear-wave splitting in the upper crust. The polarization of the fast shear wave (PFS) can be obtained; the PFS is considered parallel to the strike of the cracks, as well as to the direction of maximum horizontal compressive stress. However, the results show strong influence from tectonics, such as faults, suggesting multiple influences including both stress and faulting. The spatial distribution of seismic anisotropy in the study zone shows short-range effects: the PFS at a station on the strike-slip fault is quite different from the PFS at a station just hundreds of meters away from the fault. With teleseismic waveforms, we obtained seismic anisotropy in the whole crust from receiver functions. The PFS directions from Pms receiver functions show consistency, generally WNW, and the time delay of the slow S phases is significant. With records of SKS, PKS, and SKKS phases, we detect seismic anisotropy in the upper mantle by splitting analysis. The fast directions of these phases also show consistency, generally WNW, similar to those of the receiver functions but with larger time delays. This suggests significant seismic anisotropy in the crust and that crustal deformation is coherent with that in the upper mantle. Seismic anisotropy in the upper crust, whole crust, and upper mantle is discussed in terms of both differences and tectonic implications. [Grateful for the support of NSFC Project 41474032.]

  14. MASW on the standard seismic prospective scale using full spread recording

    NASA Astrophysics Data System (ADS)

    Białas, Sebastian; Majdański, Mariusz; Trzeciak, Maciej; Gałczyński, Edward; Maksym, Andrzej

    2015-04-01

The Multichannel Analysis of Surface Waves (MASW) is a seismic survey method that uses the dispersion curve of surface waves to describe the stiffness of the near surface. It is used mainly at geotechnical engineering scale, with a total spread length of 5-450 m and spread offsets of 1-100 m, and a hammer is typically the seismic source in such surveys. The standard MASW procedure is: data acquisition, dispersion analysis, and inversion of the extracted dispersion curve to obtain the closest theoretical curve. The final result includes shear-wave velocity (Vs) values at different depths along the surveyed lines. The main goal of this work is to expand this engineering method to a larger scale, with a standard prospecting spread length of 20 km, using 4.5 Hz vertical-component geophones. Standard vibroseis and explosive sources were used, and acquisition was conducted on the full spread for every single shot. The seismic data used for this analysis were acquired in the Braniewo 2014 project in northern Poland. The results obtained with the standard MASW procedure show that the method can be used at a much larger scale as well; the different methodology of this analysis requires only a much stronger seismic source.
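The dispersion-analysis step of the MASW procedure can be sketched with a simple phase-shift transform (in the spirit of Park et al.'s method, which the abstract does not name explicitly): for each frequency, phase-corrected spectra are stacked over offsets for a range of trial phase velocities, and the stack peaks trace the dispersion curve. A hypothetical Python sketch under these assumptions, tested on a non-dispersive synthetic:

```python
import numpy as np

def dispersion_image(traces, offsets, dt, freqs, velocities):
    """Phase-shift dispersion imaging: for each frequency, stack the
    phase-only spectra over offsets for each trial phase velocity.
    Peaks in the returned image trace the dispersion curve."""
    spec = np.fft.rfft(traces, axis=1)
    f_axis = np.fft.rfftfreq(traces.shape[1], dt)
    image = np.zeros((len(freqs), len(velocities)))
    for i, f in enumerate(freqs):
        k = np.argmin(np.abs(f_axis - f))          # nearest spectral bin
        s = spec[:, k]
        s = s / np.maximum(np.abs(s), 1e-12)       # keep phase only
        for j, v in enumerate(velocities):
            phase = np.exp(1j * 2 * np.pi * f * offsets / v)
            image[i, j] = np.abs(np.sum(s * phase)) / len(offsets)
    return image

# Synthetic: non-dispersive 10 Hz wave at 400 m/s across 24 geophones
dt, v_true, f0 = 0.002, 400.0, 10.0
offsets = np.arange(24) * 10.0 + 10.0
t = np.arange(1024) * dt
traces = np.array([np.sin(2 * np.pi * f0 * (t - x / v_true)) for x in offsets])
vels = np.arange(200.0, 601.0, 10.0)
img = dispersion_image(traces, offsets, dt, [f0], vels)
v_pick = vels[np.argmax(img[0])]
```

For the synthetic shot gather the image peaks at the true phase velocity of 400 m/s; in a field application the same transform is evaluated over many frequencies and the picked curve is then inverted for Vs.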

  15. Seismic signatures of carbonate caves affected by near-surface absorptions

    NASA Astrophysics Data System (ADS)

    Rao, Ying; Wang, Yanghua

    2015-12-01

The near-surface absorption within a low-velocity zone generally has an exponential attenuation effect on seismic waves. But how does this absorption affect the seismic signatures of karstic caves in deep carbonate reservoirs? Seismic simulation and analysis reveal that, although this near-surface absorption attenuates the wave energy of a continuous reflection, it does not alter the basic kinematic shape of bead-string reflections, a special seismic characteristic associated with carbonate caves in the Tarim Basin, China. Therefore, the bead-strings in seismic profiles can be utilized, with great certainty, for interpreting the existence of caves within the deep carbonate reservoirs and for evaluating their pore spaces. Nevertheless, the difference between the central frequency and the peak frequency increases with increasing absorption. While the wave energy of bead-string reflections remains strong, owing to the interference of seismic multiples generated by the large impedance contrast between the infill materials of a cave and the surrounding carbonate rocks, the central frequency shifts linearly with respect to the near-surface absorption. These two features can be exploited simultaneously for a stable attenuation analysis of field seismic data.
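The two spectral attributes discussed here, the peak frequency and the central (centroid) frequency, are straightforward to compute from an amplitude spectrum. The sketch below uses a generic Ricker-like spectrum and a simple exp(-π f t*) attenuation model to show both frequencies shifting downward with absorption; the spectrum shape and the t* value are illustrative, not taken from the paper.

```python
import numpy as np

def peak_and_central_freq(amp, freqs):
    """Peak frequency = argmax of the amplitude spectrum;
    central frequency = amplitude-weighted spectral centroid."""
    peak = freqs[np.argmax(amp)]
    central = np.sum(freqs * amp) / np.sum(amp)
    return peak, central

# Ricker-like spectrum A(f) = f^2 exp(-(f/f0)^2), peak at f0
freqs = np.linspace(0.1, 100.0, 2000)
f0 = 30.0
amp = freqs ** 2 * np.exp(-(freqs / f0) ** 2)
p0, c0 = peak_and_central_freq(amp, freqs)

# Apply a t* = T/Q attenuation operator and recompute both attributes
t_star = 0.02
amp_att = amp * np.exp(-np.pi * freqs * t_star)
p1, c1 = peak_and_central_freq(amp_att, freqs)
```

Both attributes decrease under attenuation, and for this spectrum the centroid sits above the peak, so tracking their difference along with the centroid shift gives the two complementary measures the abstract describes.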

  16. Post-seismic relaxation following the 2009 April 6, L'Aquila (Italy), earthquake revealed by the mass position of a broad-band seismometer

    NASA Astrophysics Data System (ADS)

    Pino, Nicola Alessandro

    2012-06-01

Post-seismic relaxation is known to occur after large or moderate earthquakes, on time scales ranging from days to years or even decades. In general, long-term deformation following seismic events has been detected by means of standard geodetic measurements, while seismic instruments are typically used only to estimate short-timescale transient processes. Although inertial seismic sensors are also sensitive to rotation around their sensitive axes, recording very slow inclination of the ground surface at their standard output channels is practically impossible because of their design characteristics. However, modern force-balance broad-band seismometers provide the possibility to detect and measure slow surface inclination through analysis of the mass position signal. This output channel represents the integral of the broad-band velocity and is generally considered only for state-of-health diagnostics. In fact, analysis of mass position data recorded at the time of the 2009 April 6, L'Aquila (MW = 6.3) earthquake by a closely located STS-2 seismometer revealed a very low frequency signal starting right at the time of the seismic event. This waveform is visible only on the horizontal components and is not related to the usual drift coupled with temperature changes. The analysis suggests that the observed signal is to be ascribed to slowly developing ground inclination at the station site, caused by post-seismic relaxation following the main shock. The observed tilt reached 1.7 × 10-5 rad in about 2 months. This estimate is in very good agreement with the geodetic observations, which give comparable tilt magnitude and direction at the same site. This study represents the first seismic analysis of the mass position signal, suggesting useful applications for these usually neglected data.

  17. Geodynamic Evolution of Northeastern Tunisia During the Maastrichtian-Paleocene Time: Insights from Integrated Seismic Stratigraphic Analysis

    NASA Astrophysics Data System (ADS)

    Abidi, Oussama; Inoubli, Mohamed Hédi; Sebei, Kawthar; Amiri, Adnen; Boussiga, Haifa; Nasr, Imen Hamdi; Salem, Abdelhamid Ben; Elabed, Mahmoud

    2017-05-01

The Maastrichtian-Paleocene El Haria formation was studied and defined in Tunisia on the basis of outcrops and borehole data; few studies have addressed its three-dimensional extent. In this paper, the El Haria formation is reviewed in the context of a tectono-stratigraphic interval using an integrated seismic stratigraphic analysis based on borehole lithology logs, electrical well logging, well shots, vertical seismic profiles and post-stack surface data. The seismic analysis benefits from appropriate calibration with borehole data, conventional interpretation, velocity mapping, seismic attributes and post-stack model-based inversion. The applied methodology proved powerful for characterizing the marly Maastrichtian-Paleocene interval of the El Haria formation. Migrated seismic sections together with borehole measurements are used to detail the three-dimensional changes in thickness, facies and depositional environment in the Cap Bon and Gulf of Hammamet regions during Maastrichtian-Paleocene time. Furthermore, dating based on microfossil content reveals local and multiple internal hiatuses within the El Haria formation, which are related to the geodynamic evolution of the depositional floor since the Campanian stage. Interpreted seismic sections display concordances, unconformities, pinchouts, sedimentary gaps, incised valleys and syn-sedimentary normal faulting. Based on the seismic reflection geometry and terminations, seven sequences are delineated. These sequences are related to base-level changes arising from the combination of depositional floor paleo-topography, tectonic forces, subsidence and the developed accommodation space. These factors controlled the occurrence of the various parts of the Maastrichtian-Paleocene interval. Detailed examination of these deposits, together with analysis of the structural deformation at different time periods, allowed us to obtain a better understanding of the sediment architecture at depth and to delineate the geodynamic evolution of the region.

  18. Spectral amplification models for response spectrum addressing the directivity effect

    NASA Astrophysics Data System (ADS)

    Moghimi, Saed; Akkar, Sinan

    2017-04-01

Ground motions with forward directivity effects are known for their significantly large spectral ordinates at medium-to-long periods. The large spectral ordinates stem from the impulsive character of forward directivity ground motions. The quantification of these spectral amplifications requires the identification of the major seismological parameters that play a role in their generation. After running a suite of probabilistic seismic hazard analyses, Moghimi and Akkar (2016) showed that fault slip rate, fault characteristic magnitude and fault-site geometry, as well as mean annual exceedance rate, are important parameters that determine the level of spectral amplification due to directivity. These parameters are considered to develop two separate spectral amplification equations in this study. The proposed equations rely on the Shahi and Baker (SHB11; 2011) and Chiou and Spudich (CHS13; Spudich et al., 2013) narrow-band forward directivity models. The presented equations focus only on the estimation of the maximum spectral amplifications that occur at the ends of the fault segments; this way we eliminate the fault-site parameter from our equations for simplification. The proposed equations show different trends due to differences between the narrow-band directivity models of SHB11 and CHS13. The equations given in this study can form a basis for describing forward directivity effects in seismic design codes. REFERENCES Shahi, S., Baker, J.W. (2011), "An Empirically Calibrated Framework for Including the Effects of Near-Fault Directivity in Probabilistic Seismic Hazard Analysis", Bulletin of the Seismological Society of America, 101(2): 742-755. Spudich, P., Watson-Lamprey, J., Somerville, P., Bayless, J., Shahi, S. K., Baker, J. W., Rowshandel, B., and Chiou, B. (2013), "Final Report of the NGA-West2 Directivity Working Group", PEER Report 2013/09. Moghimi, S., Akkar, S. (2016), "Implications of Forward Directivity Effects on Design Ground Motions", Seismological Society of America, Annual Meeting, 2016, Reno, Nevada, 87:2B, p. 464.

  19. Mobility Effect on Poroelastic Seismic Signatures in Partially Saturated Rocks With Applications in Time-Lapse Monitoring of a Heavy Oil Reservoir

    NASA Astrophysics Data System (ADS)

    Zhao, Luanxiao; Yuan, Hemin; Yang, Jingkang; Han, De-hua; Geng, Jianhua; Zhou, Rui; Li, Hui; Yao, Qiuliang

    2017-11-01

Conventional seismic analysis in partially saturated rocks normally emphasizes estimating pore fluid content and saturation, typically ignoring the effect of mobility, which determines the ability of fluids to move in the porous rocks. Deformation caused by a seismic wave in heterogeneous partially saturated media can induce pore fluid pressure relaxation at the mesoscopic scale, thereby making fluid mobility inherently associated with poroelastic reflectivity. For two typical gas-brine reservoir models with given rock and fluid properties, numerical analysis suggests that variations of patchy fluid saturation, fluid compressibility contrast, and acoustic stiffness of the rock frame collectively affect the dependence of seismic reflection on mobility. In particular, the realistic compressibility contrast of fluid patches in shallow and deep reservoir environments plays an important role in determining the reflection sensitivity to mobility. We also use a time-lapse seismic data set from a Steam-Assisted Gravity Drainage heavy oil reservoir to demonstrate that mobility change coupled with patchy saturation possibly leads to seismic spectral energy shifting from the baseline to the monitor survey. Our workflow starts by performing seismic spectral analysis on the targeted reflectivity interface. Then, on the basis of mesoscopic fluid pressure diffusion between patches of steam and heavy oil, poroelastic reflectivity modeling is conducted to understand the shift of the central frequency toward low frequencies after steam injection. The presented results open the possibility of monitoring mobility changes of a partially saturated geological formation from dissipation-related seismic attributes.

  20. Leveraging EarthScope USArray with the Central and Eastern United States Seismic Network

    NASA Astrophysics Data System (ADS)

    Busby, R.; Sumy, D. F.; Woodward, R.; Frassetto, A.; Brudzinski, M.

    2015-12-01

Recent earthquakes, such as the 2011 M5.8 Mineral, Virginia earthquake, raised awareness of the comparative lack of knowledge about seismicity, site response to ground shaking, and the basic geologic underpinnings of this densely populated region. With this in mind, the National Science Foundation, United States Geological Survey, United States Nuclear Regulatory Commission, and Department of Energy supported the creation of the Central and Eastern United States Seismic Network (CEUSN). These agencies, along with the IRIS Consortium, which operates the network, recognized the unique opportunity to retain EarthScope Transportable Array (TA) seismic stations in this region beyond the standard deployment duration of two years per site. The CEUSN project supports 159 broadband TA stations, more than 30 of them with strong-motion sensors added, that are scheduled to operate through 2017. Stations were prioritized in regions of elevated seismic hazard that have not traditionally been heavily monitored, such as the Charlevoix and Central Virginia Seismic Zones, and in regions proximal to nuclear power plants and other critical facilities. The stations (network code N4) transmit data in real time, with broadband and strong-motion sensors sampling at 100 samples per second. More broadly, the CEUSN concept also builds on the existing backbone coverage of permanently operating seismometers in the CEUS, forming a network of over 300 broadband stations. This multi-agency collaboration is motivated by the opportunity to use one facility to address multiple missions and needs in a way that is rarely possible, and to produce data that enable both researchers and federal agencies to better understand seismic hazard potential and the associated seismic risks. In June 2015, the CEUSN Working Group (www.usarray.org/ceusn_working_group) was formed to review and advise IRIS Management on the performance of the CEUSN as it relates to the target scientific goals and objectives.
The map shows the 159 CEUSN stations (yellow) that will be operated and maintained by the IRIS Consortium through 2017. The CEUSN stations were selected for proximity to nuclear power plants (black squares) and other critical infrastructure, as well as to more evenly distribute seismic stations across the central and eastern United States.

  1. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.

  2. Seismic instrumentation of buildings

    USGS Publications Warehouse

    Çelebi, Mehmet

    2000-01-01

    The purpose of this report is to provide information on how and why we deploy seismic instruments in and around building structures. The recorded response data from buildings and other instrumented structures can be and are being primarily used to facilitate necessary studies to improve building codes and therefore reduce losses of life and property during damaging earthquakes. Other uses of such data can be in emergency response situations in large urban environments. The report discusses typical instrumentation schemes, existing instrumentation programs, the steps generally followed in instrumenting a structure, selection and type of instruments, installation and maintenance requirements and data retrieval and processing issues. In addition, a summary section on how recorded response data have been utilized is included. The benefits from instrumentation of structural systems are discussed.

  3. Qualification of safety-related electrical equipment in France. Methods, approach and test facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raimondo, E.; Capman, J.L.; Herovard, M.

    1985-05-01

Requirements for the qualification of electrical equipment used in French-built nuclear power plants are stated in a national code, the RCC-E, or Regles de Construction et de Conception des Materiels Electriques. Under the RCC-E, safety-related equipment is assigned to one of three categories, according to its location in the plant and its anticipated normal, accident and post-accident behavior. Qualification tests differ for each category, and procedures range in scope from the standard seismic test to the highly stringent VISA program, which specifies a predetermined sequence of aging, radiation, seismic and simulated-accident testing. A network of official French test facilities was developed specifically to meet RCC-E requirements.

  4. QuakeML - An XML Schema for Seismology

    NASA Astrophysics Data System (ADS)

    Wyss, A.; Schorlemmer, D.; Maraini, S.; Baer, M.; Wiemer, S.

    2004-12-01

We propose an extensible format definition for seismic data (QuakeML). Sharing data and seismic information efficiently is one of the most important issues for research and observational seismology in the future. The eXtensible Markup Language (XML) is playing an increasingly important role in the exchange of a variety of data. Given its extensible definition capabilities, its wide acceptance, and the large number of existing utilities and libraries for XML, a structured representation of the various types of seismological data should, in our opinion, be developed by defining a 'QuakeML' standard. Here we present the QuakeML definitions for parameter databases and further efforts, e.g. a central QuakeML catalog database and a web portal for exchanging codes and stylesheets.
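As a rough illustration of the idea, not the ratified QuakeML schema, an XML event record can be built and parsed with any XML toolkit; every element name below is hypothetical and chosen only to show the structured-representation concept:

```python
import xml.etree.ElementTree as ET

# Hypothetical QuakeML-style event record; element and attribute names
# are illustrative only, not the actual QuakeML schema definitions.
event = ET.Element("event", publicID="smi:local/event/1")
origin = ET.SubElement(event, "origin")
ET.SubElement(origin, "time").text = "2004-12-26T00:58:53Z"
ET.SubElement(origin, "latitude").text = "3.30"
ET.SubElement(origin, "longitude").text = "95.98"
mag = ET.SubElement(event, "magnitude")
ET.SubElement(mag, "mag").text = "9.1"
xml_text = ET.tostring(event, encoding="unicode")

# Any XML-aware client can parse the record back without custom code
parsed = ET.fromstring(xml_text)
mag_value = float(parsed.find("magnitude/mag").text)
```

The point of a schema-backed format is exactly this round trip: a producer serializes structured fields, and any consumer with a standard XML library can validate and read them back.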

  5. FWT2D: A massively parallel program for frequency-domain full-waveform tomography of wide-aperture seismic data—Part 1: Algorithm

    NASA Astrophysics Data System (ADS)

    Sourbier, Florent; Operto, Stéphane; Virieux, Jean; Amestoy, Patrick; L'Excellent, Jean-Yves

    2009-03-01

This is the first paper in a two-part series describing a massively parallel code that performs 2D frequency-domain full-waveform inversion of wide-aperture seismic data for imaging complex structures. Full-waveform inversion methods, namely quantitative seismic imaging methods based on the resolution of the full wave equation, are computationally expensive. Therefore, designing efficient algorithms that take advantage of parallel computing facilities is critical for the appraisal of these approaches when applied to representative case studies and for further improvements. Full-waveform modelling requires the resolution of a large sparse system of linear equations, which is performed with the massively parallel direct solver MUMPS for efficient multiple-shot simulations. The efficiency of the multiple-shot solution phase (forward/backward substitutions) is improved by using the BLAS3 library. The inverse problem relies on a classic local optimization approach implemented with a gradient method. The direct solver returns the multiple-shot wavefield solutions distributed over the processors according to a domain decomposition driven by the distribution of the LU factors. The domain decomposition of the wavefield solutions is used to compute in parallel the gradient of the objective function and the diagonal Hessian, the latter providing a suitable scaling of the gradient. The algorithm allows one to test different strategies for multiscale frequency inversion, ranging from successive mono-frequency inversion to simultaneous multifrequency inversion. These different inversion strategies will be illustrated in the companion paper, where the parallel efficiency and scalability of the code will also be quantified.

  6. Improvement of Epicentral Direction Estimation by P-wave Polarization Analysis

    NASA Astrophysics Data System (ADS)

    Oshima, Mitsutaka

    2016-04-01

Polarization analysis has been used to analyze the polarization characteristics of waves and has been developed in various fields, for example electromagnetics, optics, and seismology. In seismology, polarization analysis is used to discriminate seismic phases or to enhance a specific phase (e.g., Flinn, 1965) [1], by taking advantage of the differences in polarization characteristics between seismic phases. In earthquake early warning, polarization analysis is used to estimate the epicentral direction from a single station, based on the polarization direction of the P-wave portion of seismic records (e.g., Smart and Sproules, 1981 [2]; Noda et al., 2012 [3]). Improvement of the Estimation of Epicentral Direction by Polarization Analysis (EEDPA) therefore directly enhances the accuracy and promptness of earthquake early warning. In this study, the author tried to improve EEDPA using seismic records of events that occurred around Japan from 2003 to 2013. Events were selected that satisfy the following conditions: MJMA of 6.5 or larger (JMA: Japan Meteorological Agency), and seismic records available at no fewer than 3 stations within 300 km epicentral distance. Records obtained at stations with no information on seismometer orientation were excluded, so that a precise and quantitative evaluation of the accuracy of EEDPA becomes possible. In the analysis, polarization was calculated by the method of Vidale (1986) [4], which extended the method proposed by Montalbetti and Kanasewich (1970) [5] to use the analytic signal. As a result, the author found that, contrary to expectation, the accuracy of EEDPA improves by about 15% if velocity records rather than displacement records are used. The use of velocity records reduces the CPU time spent integrating seismic records and improves the promptness of EEDPA, although this analysis is still rough and further scrutiny is essential. At this stage, the author used seismic records obtained by simply integrating acceleration records, with no filtering applied; further study on the optimal type of filter and its application frequency band is necessary. The results of the aforementioned study will be shown in the poster presentation. [1] Flinn, E. A. (1965). Signal analysis using rectilinearity and direction of particle motion. Proceedings of the IEEE, 53(12), 1874-1876. [2] Smart, E., & Sproules, H. (1981). Regional phase processors (No. SDAC-TR-81-1). Teledyne Geotech, Alexandria, VA, Seismic Data Analysis Center. [3] Noda, S., Yamamoto, S., Sato, S., Iwata, N., Korenaga, M., & Ashiya, K. (2012). Improvement of back-azimuth estimation in real-time by using a single station record. Earth, Planets and Space, 64(3), 305-308. [4] Vidale, J. E. (1986). Complex polarization analysis of particle motion. Bulletin of the Seismological Society of America, 76(5), 1393-1405. [5] Montalbetti, J. F., & Kanasewich, E. R. (1970). Enhancement of teleseismic body phases with a polarization filter. Geophysical Journal International, 21(2), 119-129.
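The covariance-matrix polarization analysis of Montalbetti and Kanasewich, on which the Vidale method builds, can be sketched for the epicentral-direction problem as follows: the dominant eigenvector of the 3C covariance matrix in a P-wave window gives the polarization direction, and the sign of its vertical component resolves the 180° ambiguity for an upcoming compressional arrival. A simplified real-valued sketch (the analytic-signal extension is omitted), with illustrative synthetic geometry:

```python
import numpy as np

def pwave_backazimuth(z, n, e):
    """Estimate the back-azimuth from the dominant eigenvector of the
    3C covariance matrix of a P-wave window. z is positive up."""
    data = np.vstack([z, n, e])
    data = data - data.mean(axis=1, keepdims=True)
    cov = data @ data.T
    w, v = np.linalg.eigh(cov)
    pz, pn, pe = v[:, np.argmax(w)]        # dominant polarization vector
    if pz < 0:                             # force the upward-pointing sense
        pz, pn, pe = -pz, -pn, -pe
    # horizontal first motion points away from the source, so negate it
    return np.degrees(np.arctan2(-pe, -pn)) % 360.0

# Synthetic P wave arriving from back-azimuth 60 deg, incidence 30 deg
t = np.linspace(0, 1, 200)
w = np.sin(2 * np.pi * 5 * t) * np.exp(-((t - 0.5) / 0.1) ** 2)
baz, inc = np.radians(60.0), np.radians(30.0)
z = w * np.cos(inc)
n = -w * np.sin(inc) * np.cos(baz)
e = -w * np.sin(inc) * np.sin(baz)
est = pwave_backazimuth(z, n, e)
```

On noise-free rectilinear motion the estimate is exact; with real records the eigenvalue ratios additionally give the rectilinearity measure used to weight or reject windows.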

  7. Seismic slope-performance analysis: from hazard map to decision support system

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.; Ho, Carlton L.

    1999-01-01

In response to the growing recognition among engineers and decision-makers of the regional effects of earthquake-induced landslides, this paper presents a general approach to seismic landslide zonation based on the popular Newmark sliding-block analogy for modeling coherent landslides. Four existing models based on the sliding-block analogy are compared. The comparison shows that the models forecast notably different levels of slope performance. Considering this discrepancy, along with the limitations of static maps as a decision tool, a spatial decision support system (SDSS) for seismic landslide analysis is proposed that will support investigations over multiple scales for any number of earthquake scenarios and input conditions. Most importantly, the SDSS will allow the use of any seismic landslide analysis model and zonation approach. Developments associated with the SDSS will produce an object-oriented model for encapsulating spatial data, an object-oriented specification to allow construction of models from modular objects, and a direct-manipulation, dynamic user interface that adapts to the particular seismic landslide model configuration.
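Newmark's sliding-block analogy, the basis of the four compared models, reduces to a double integration: whenever the ground acceleration exceeds the critical (yield) acceleration, the block accumulates relative velocity, and displacement accrues until sliding stops. A minimal one-way-sliding sketch with an idealized acceleration pulse (a generic textbook scheme, not one of the four published models):

```python
import numpy as np

def newmark_displacement(acc, dt, a_crit):
    """Rigid sliding-block analysis: integrate the ground acceleration
    in excess of the critical acceleration to get sliding velocity,
    then integrate velocity to get permanent displacement.
    One-way (downslope) sliding is assumed."""
    vel, disp = 0.0, 0.0
    for a in acc:
        if vel > 0.0 or a > a_crit:
            vel += (a - a_crit) * dt   # accelerate while a > a_crit, decelerate after
            vel = max(vel, 0.0)        # sliding stops; the block never reverses
            disp += vel * dt
    return disp

# Single 0.3 g acceleration pulse lasting 0.5 s against a_crit = 0.1 g
g = 9.81
dt = 0.001
acc = np.where(np.arange(0.0, 2.0, dt) < 0.5, 0.3 * g, 0.0)
d = newmark_displacement(acc, dt, 0.1 * g)
```

For this pulse the closed-form answer is about 0.74 m (0.245 m accumulated during the pulse plus 0.49 m while decelerating at a_crit), which the simple Euler integration reproduces closely; the compared models differ mainly in how they regress such displacements against ground-motion parameters.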

  8. 6C polarization analysis - seismic direction finding in coherent noise, automated event identification, and wavefield separation

    NASA Astrophysics Data System (ADS)

    Schmelzbach, C.; Sollberger, D.; Greenhalgh, S.; Van Renterghem, C.; Robertsson, J. O. A.

    2017-12-01

Polarization analysis of standard three-component (3C) seismic data is an established tool for determining the propagation directions of seismic waves recorded by a single station. A major limitation of seismic direction-finding methods using 3C recordings, however, is that a correct propagation-direction determination is only possible if the wave mode is known. Furthermore, 3C polarization analysis techniques break down in the presence of coherent noise (i.e., when more than one event is present in the analysis time window). Recent advances in sensor technology (e.g., fibre-optic sensors, magnetohydrodynamic angular-rate sensors, and ring laser gyroscopes) have made it possible to accurately measure all three components of rotational ground motion exhibited by seismic waves, in addition to the conventionally recorded three components of translational motion. Here, we present an extension of the theory of single-station 3C polarization analysis to six-component (6C) recordings of collocated translational and rotational ground motions. We demonstrate that the information contained in rotation measurements can help to overcome some of the main limitations of standard 3C seismic direction finding, such as handling multiple arrivals simultaneously. We show that the 6C polarization of elastic waves measured at the Earth's free surface depends not only on the seismic wave type and propagation direction, but also on the local P- and S-wave velocities just beneath the recording station. Using an adaptation of the multiple signal classification (MUSIC) algorithm, we demonstrate how seismic events can be unambiguously identified and characterized in terms of their wave type. Furthermore, we show how the local velocities can be inferred from single-station 6C data, in addition to the direction angles (inclination and azimuth) of seismic arrivals.
A major benefit of our proposed 6C method is that it also allows the accurate recovery of the wave type, propagation directions, and phase velocities of multiple, interfering arrivals in one time window. We demonstrate how this property can be exploited to separate the wavefield into its elastic wave-modes and to isolate or suppress waves arriving from specific directions (directional filtering), both in a fully automated fashion.

  9. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

The evaluation of evidence from geological studies and historical catalogs indicates that, on some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrences of large earthquakes are then followed by quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to account for series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard of random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
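The three-part hazard function described above can be caricatured with simple exponential forms: a decaying term for the last cluster, a saturating rising term for the next, and a constant background rate. The functional forms and every parameter value below are purely illustrative, not the model proposed in the paper:

```python
import math

def hazard_rate(t, t_last, t_next, lam0, h_decay, h_rise, tau):
    """Toy three-part time-dependent hazard rate:
    - a decreasing term fading after the last large-earthquake cluster,
    - an increasing term building toward the next expected cluster,
    - a constant background rate for small-to-moderate events.
    All forms and parameters are illustrative only."""
    decay = h_decay * math.exp(-(t - t_last) / tau)
    rise = h_rise * (1.0 - math.exp(-max(t - t_last, 0.0) / (t_next - t_last)))
    return lam0 + decay + rise

# Evaluate shortly after, midway between, and shortly before clusters
h_early = hazard_rate(1.0, 0.0, 100.0, 0.01, 0.5, 0.2, 10.0)
h_mid = hazard_rate(50.0, 0.0, 100.0, 0.01, 0.5, 0.2, 10.0)
h_late = hazard_rate(99.0, 0.0, 100.0, 0.01, 0.5, 0.2, 10.0)
```

The toy rate reproduces the qualitative shape the abstract describes: high just after a cluster, dipping in the intervening quiescence, and climbing again as the next cluster approaches, while never falling below the constant background.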

  10. Gas hydrate characterization from a 3D seismic dataset in the deepwater eastern Gulf of Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McConnell, Daniel; Haneberg, William C.

Seismic stratigraphic features are delineated using principal component analysis of the band-limited data at potential gas hydrate sands, and compared and calibrated with spectral decomposition thickness to constrain thickness in the absence of well control. Layers in the abyssal fan sediments are thinner than can be resolved with 50 Hz seismic data and thus comprise composite thin-bed reflections. Amplitude-versus-frequency analysis is used to indicate gas and gas hydrate reflections. Synthetic seismic wedge models show that, with 50 Hz seismic data, a 40% saturation of a Plio-Pleistocene Gulf of Mexico sand in the hydrate stability zone with no subjacent gas can produce a phase change (negative to positive) with a strong correlation between amplitude and hydrate saturation. The synthetic seismic response is more complicated if the gas hydrate filled sediments overlie gassy sediments. Hydrate (or gas) saturation in thin beds enhances the amplitude response and can be used to estimate saturation. Gas hydrate saturation from rock physics, amplitude, and frequency analysis is compared to saturation derived from inversion at several interpreted gas hydrate accumulations in the eastern Gulf of Mexico.

  11. 3D seismic data de-noising and reconstruction using Multichannel Time Slice Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Rekapalli, Rajesh; Tiwari, R. K.; Sen, Mrinal K.; Vedanti, Nimisha

    2017-05-01

Noise and data gaps complicate seismic data processing and subsequently cause difficulties in geological interpretation. We discuss a recent development and application of Multichannel Time Slice Singular Spectrum Analysis (MTSSSA) for 3D seismic data de-noising in the time domain. In addition, L1-norm-based simultaneous data-gap filling of 3D seismic data using MTSSSA is also discussed. We discriminate noise from individual time slices of 3D volumes by analyzing the eigentriplets of the trajectory matrix. We first tested the efficacy of the method on 3D synthetic seismic data contaminated with noise, and then applied it to post-stack seismic reflection data acquired from the Sleipner CO2 storage site in Norway (pre- and post-CO2 injection). Our analysis suggests that the MTSSSA algorithm efficiently enhances the S/N for better identification of amplitude anomalies, along with simultaneous data-gap filling. The bright spots identified in the de-noised data indicate upward migration of CO2 towards the top of the Utsira formation. The reflections identified by applying MTSSSA to pre- and post-injection data correlate well with the geology of the Southern Viking Graben (SVG).
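MTSSSA builds on classical singular spectrum analysis: embed the data in a Hankel trajectory matrix, truncate its SVD to the leading eigentriplets, and average back along antidiagonals. The single-channel sketch below shows only this 1D building block, not the multichannel time-slice algorithm of the paper:

```python
import numpy as np

def ssa_denoise(x, window, rank):
    """Basic single-channel SSA: Hankel embedding, rank-truncated SVD,
    and diagonal (Hankel) averaging back to a series. The rank keeps
    only the leading eigentriplets, which carry the coherent signal."""
    n = len(x)
    k = n - window + 1
    traj = np.array([x[i:i + window] for i in range(k)]).T   # window x k
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    low = (u[:, :rank] * s[:rank]) @ vt[:rank]               # low-rank part
    out = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(k):                                       # Hankel averaging
        out[j:j + window] += low[:, j]
        cnt[j:j + window] += 1.0
    return out / cnt

rng = np.random.default_rng(0)
t = np.arange(500)
clean = np.sin(2 * np.pi * t / 50.0)
noisy = clean + 0.5 * rng.standard_normal(500)
den = ssa_denoise(noisy, window=60, rank=2)
```

A single sinusoid occupies a two-dimensional trajectory subspace, so rank 2 recovers it while most of the noise, spread across the remaining eigentriplets, is rejected; the multichannel variant applies the same truncation to matrices built from whole time slices.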

  12. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can range from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining socially acceptable, it must be managed so that possible impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the traffic-light concept. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effects can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels over different time scales can help to better manage the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate and the propagation-medium properties are not constant in time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground-motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and introduced directly into the hazard integral. 
    We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested on a different dataset by performing seismic hazard analysis in a time-evolving approach, which provides ground-motion values having fixed probabilities of exceedance. Those values can finally be compared with the observations using appropriate statistical tests.
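    The central idea, an occurrence model whose rate and b-value vary in time, can be sketched with a non-homogeneous Poisson exceedance probability; every number below is hypothetical, not a Geysers parameter:

```python
import numpy as np

def exceedance_prob(t, rate, b, m_min, m_thresh, window):
    """P(at least one event with M >= m_thresh in the forecast window),
    for a non-homogeneous Poisson process whose total rate above m_min
    and Gutenberg-Richter b-value both vary with time."""
    frac = 10.0 ** (-b * (m_thresh - m_min))   # G-R fraction above threshold
    lam = rate * frac                          # time-varying rate above m_thresh
    # expected count: integrate the rate over the window (rectangle rule)
    n_expected = np.sum(lam[window][:-1] * np.diff(t[window]))
    return 1.0 - np.exp(-n_expected)

# hypothetical 30-day stimulation: rate ramps up while the b-value drops
t = np.linspace(0.0, 30.0, 301)                # days
rate = 5.0 + 0.5 * t                           # events/day above m_min
b = 1.2 - 0.01 * t
window = slice(200, 301)                       # forecast over the last ~10 days
p3 = exceedance_prob(t, rate, b, m_min=1.0, m_thresh=3.0, window=window)
p2 = exceedance_prob(t, rate, b, m_min=1.0, m_thresh=2.0, window=window)
```

    A full hazard integral would additionally convolve this occurrence model with a ground-motion prediction equation; the sketch stops at the magnitude-exceedance level.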

  13. Seismic reflection response from cross-correlations of ambient vibrations on a non-conventional hydrocarbon reservoir

    NASA Astrophysics Data System (ADS)

    Huerta, F. V.; Granados, I.; Aguirre, J.; Carrera, R. Á.

    2017-12-01

    Nowadays, the hydrocarbon industry needs to optimize and reduce exploration costs across different types of reservoirs, motivating the specialized community to search for and develop alternative geophysical exploration methods. This study shows the reflection response obtained from a shale gas/oil deposit through seismic interferometry of ambient vibrations, in combination with wavelet analysis and conventional seismic reflection techniques (CMP and NMO). The method generates seismic responses from virtual sources through cross-correlation of records of Ambient Seismic Vibrations (ASV) collected at different receivers. The seismic response obtained is interpreted as the response that would be measured at one of the receivers if a virtual source were placed at the other. The ASV records were acquired in northern Mexico using semi-rectangular arrays of multi-component geophones with an instrumental response of 10 Hz. The in-line distance between geophones was 40 m, the cross-line distance was 280 m, the sampling interval was 2 ms, and the total duration of the records was 6 hours. The results show the reflection response of two lines in the in-line direction and two in the cross-line direction, for which the continuity of coherent events has been identified and interpreted as reflectors. There is confidence that the identified events correspond to reflections because time-frequency analysis performed with the wavelet transform allowed identification of the frequency band containing body waves. In addition, the CMP and NMO techniques emphasized and corrected the reflection response obtained from the correlation process in the frequency band of interest. 
    Processing and analysis of the ASV records through seismic interferometry, in combination with wavelet analysis and conventional seismic reflection techniques, yielded promising results: the seismic response of each analyzed source-receiver pair was recovered, allowing the reflection response of each seismic line to be obtained.
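    The virtual-source principle used here can be demonstrated with a toy cross-correlation: two receivers recording the same noise wavefield, one delayed relative to the other (the geometry and delay are invented for illustration):

```python
import numpy as np

def virtual_source_response(rec_a, rec_b, nlags):
    """Cross-correlate two ambient-vibration records; the result
    approximates (band-limited, up to an amplitude factor) the response
    that receiver B would record from a virtual source at receiver A."""
    c = np.correlate(rec_b, rec_a, mode="full")
    mid = len(c) // 2                      # index of zero lag
    return c[mid - nlags:mid + nlags + 1]

rng = np.random.default_rng(1)
noise = rng.standard_normal(20000)
delay = 25                       # travel time between receivers, in samples
rec_a = noise[delay:]            # receiver A sees the wavefield first
rec_b = noise[:-delay]           # receiver B sees it `delay` samples later
corr = virtual_source_response(rec_a, rec_b, nlags=100)
lag_of_peak = np.argmax(corr) - 100   # recovers the inter-receiver delay
```

    In the field case, long records, spectral whitening, and stacking over many windows are needed before reflections emerge; the sketch only shows why the correlation peak carries the inter-receiver travel time.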

  14. Seismic sample areas defined from incomplete catalogues: an application to the Italian territory

    NASA Astrophysics Data System (ADS)

    Mulargia, F.; Tinti, S.

    1985-11-01

    The comprehensive understanding of earthquake source physics under real conditions requires the study not of single faults as separate entities but rather of a seismically active region as a whole, accounting for the interaction among different structures. We define a "seismic sample area" as the most convenient region to be used as a natural laboratory for the study of seismic source physics. This coincides with the region where the average large-magnitude seismicity is highest. To this end, future time and space distributions of large earthquakes must be estimated. Using catalog seismicity as input, the rate of occurrence is not constant but appears generally biased by incompleteness in some parts of the catalog and by possible nonstationarities in seismic activity. We present a statistical procedure which is capable, under a few mild assumptions, of both detecting nonstationarities in seismicity and finding the incomplete parts of a seismic catalog. The procedure is based on Kolmogorov-Smirnov nonparametric statistics and can be applied without assuming a priori the parent distribution of the events. The efficiency of this procedure allows the analysis of small data sets. An application to the Italian territory is presented, using the most recent version of the ENEL seismic catalog. Seismic activity takes place in six well-defined areas, but only five of them have a number of events sufficient for analysis. Barring a few exceptions, seismicity is found to be stationary throughout the whole catalog span (1000-1980). The eastern Alps region stands out as the best "sample area", with the highest average probability of event occurrence per unit time and area. The final objective of this characterization is to stimulate a program of intensified research.
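    The flavor of such a Kolmogorov-Smirnov check can be sketched by testing occurrence times against a uniform distribution over the catalog span (the synthetic catalogs below are illustrative; the paper's actual procedure is more elaborate):

```python
import numpy as np
from scipy import stats

def ks_uniform_test(event_times, t0, t1, alpha=0.05):
    """Kolmogorov-Smirnov test of H0: occurrence times are uniform on
    [t0, t1] (a stationary rate over the catalog span). A rejection
    signals nonstationarity or catalog incompleteness.
    Returns the KS statistic, the p-value, and the rejection flag."""
    u = (np.asarray(event_times) - t0) / (t1 - t0)
    d, p = stats.kstest(u, "uniform")
    return d, p, p < alpha

rng = np.random.default_rng(2)
full = rng.uniform(1000.0, 1980.0, size=200)   # stationary synthetic catalog
incomplete = full[full > 1800.0]               # early events missing
_, p_full, rej_full = ks_uniform_test(full, 1000.0, 1980.0)
_, p_inc, rej_inc = ks_uniform_test(incomplete, 1000.0, 1980.0)
```

    Being nonparametric, the test needs no assumption about the parent distribution of event sizes, which is why it remains usable on the small regional datasets mentioned above.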

  15. Comparison of Earthquake Damage Patterns and Shallow-Depth Vs Structure Across the Napa Valley, Inferred From Multichannel Analysis of Surface Waves (MASW) and Multichannel Analysis of Love Waves (MALW) Modeling of Basin-Wide Seismic Profiles

    NASA Astrophysics Data System (ADS)

    Chan, J. H.; Catchings, R.; Strayer, L. M.; Goldman, M.; Criley, C.; Sickler, R. R.; Boatwright, J.

    2017-12-01

    We conducted an active-source seismic investigation across the Napa Valley (Napa Valley Seismic Investigation-16) in September 2016, consisting of two basin-wide seismic profiles: one 20 km long and N-S-trending (338°), the other 15 km long and E-W-trending (80°) (see Catchings et al., 2017). Data from the NVSI-16 investigation were recorded using a total of 666 vertical- and horizontal-component seismographs, spaced 100 m apart along both profiles. Seismic sources were generated by 36 buried explosions spaced 1 km apart. The two profiles intersected in downtown Napa, where a large number of buildings were red-tagged by the City following the 24 August 2014 Mw 6.0 South Napa earthquake. From the recorded Rayleigh and Love waves, we developed 2-dimensional S-wave velocity models to depths of about 0.5 km using the multichannel analysis of surface waves (MASW) method. Our MASW (Rayleigh) and MALW (Love) models show two prominent low-velocity (Vs = 350 to 1300 m/s) sub-basins that were previously identified from gravity studies (Langenheim et al., 2010). These basins trend NW and coincide with the locations of more than 1500 red- and yellow-tagged buildings within the City of Napa following the 2014 South Napa earthquake. The observed correlation between low Vs, deep basins, and the red- and yellow-tagged buildings in Napa suggests that similar large-scale seismic investigations would be valuable elsewhere: they provide insights into the likely locations of significant structural damage from future earthquakes that occur adjacent to or within sedimentary basins.
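    The heart of MASW, transforming a shot gather into a frequency versus phase-velocity image by phase-shift stacking, can be sketched as follows (the geometry, wavelet, and non-dispersive 400 m/s test wave are invented for the demo, not NVSI-16 values):

```python
import numpy as np

def dispersion_image(traces, dx, dt, freqs, vels):
    """Phase-shift slant stack: for each frequency bin, stack the
    phase-only spectra over offset for each trial phase velocity.
    Energy focuses at the true phase velocity of that frequency."""
    n_tr, n_t = traces.shape
    spec = np.fft.rfft(traces, axis=1)
    f_axis = np.fft.rfftfreq(n_t, dt)
    x = np.arange(n_tr) * dx
    img = np.zeros((len(freqs), len(vels)))
    for i, f in enumerate(freqs):
        k = np.argmin(np.abs(f_axis - f))               # nearest FFT bin
        s = spec[:, k] / (np.abs(spec[:, k]) + 1e-12)   # keep phase only
        for j, c in enumerate(vels):
            img[i, j] = np.abs(np.sum(s * np.exp(2j * np.pi * f_axis[k] * x / c)))
    return img

def ricker(tau, f0=25.0):
    a = (np.pi * f0 * tau) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# synthetic non-dispersive wave crossing 24 geophones at 400 m/s
dt, n_t, dx, c0 = 1e-3, 1024, 2.0, 400.0
t = np.arange(n_t) * dt
offsets = np.arange(24) * dx
traces = np.array([ricker(t - 0.2 - xi / c0) for xi in offsets])

vels = np.linspace(200.0, 800.0, 121)
img = dispersion_image(traces, dx, dt, np.array([15.0, 20.0, 25.0, 30.0]), vels)
picked = vels[np.argmax(img, axis=1)]   # should sit near 400 m/s at each frequency
```

    For real surface-wave data the picked curve varies with frequency (dispersion), and inverting that curve yields the Vs-depth models described above.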

  16. Site Effects estimation in the Po Plain area (Northern Italy): correlation between passive geophysical surveys and stratigraphic evidence

    NASA Astrophysics Data System (ADS)

    Mascandola, Claudia; Massa, Marco; Barani, Simone; Argnani, Andrea; Poggi, Valerio; Martelli, Luca; Albarello, Dario; Pergalani, Floriana; Compagnoni, Massimo; Lovati, Sara

    2017-04-01

    The recent case of the 2012, Mw 6.1, Emilia seismic sequence (Northern Italy) highlighted the importance of site-effects estimation in the Po Plain, the largest and deepest Italian sedimentary basin. This study, based on an extensive collection of geophysical and geological data across the entire area, allows a macrozonation of site effects, useful for scientific and applied purposes. In particular, site-response analysis can be performed in defined macrozones, where the geological-geotechnical and geophysical characteristics are homogeneous at the macroscale. The collection of the available stratigraphic discontinuities and passive geophysical surveys (single-station and array measurements) allowed the definition of a general macrozonation in terms of amplified frequencies and shear-wave velocity (Vs) gradients. The correlation between the obtained geophysical evidence and the known geological information is then crucial for defining the most important stratigraphic discontinuities responsible for local seismic amplification. In particular, ambient vibration data recorded by all permanent and temporary seismic stations installed in the target region were collected and analyzed with the Nakamura technique to determine the H/V spectral ratio. Moreover, all the available ambient-vibration arrays were collected and analyzed to assess the local Vs profile, considering the Rayleigh-wave fundamental mode. The Po Plain stratigraphy is defined by regional unconformities (aquifer limits) that have been extensively mapped throughout the basin and by regional geological and structural maps. In general, the H/V results show two ranges of amplified frequencies, both lower than 1 Hz: the former at frequencies lower than about 0.25 Hz and the latter between 0.4 and 1 Hz. The higher frequency range moves from about 0.4 Hz, in the eastern (Adriatic) part of the plain, to about 0.8-1.0 Hz in the central and western parts. 
    Based on the available seismic-array results, this amplification peak seems related to a velocity discontinuity, generally located between 100 m and 300 m depth, where Vs exceeds 800 m/s. This interface can be ascribed to the seismic bedrock according to the current seismic code (NTC 2008, Vs > 800 m/s, class A) and may be related to different stratigraphic discontinuities moving from east to west. In order to verify the supposed correspondence between geophysical and geological data, the H/V ratios were also inverted, using the Sanchez-Sesma method and the nearest array velocity profile as a reference for the inversion. Finally, an empirical relation between amplified frequencies and depths was calculated, allowing preliminary mapping, at regional scale, of the most important geological discontinuities for site-effects evaluation. An example of site-specific hazard analysis was performed at the INGV seismic station CTL8 in terms of displacement response spectra for periods up to 10 s. The results show that neglecting the effects of the deep discontinuities implies underestimating the hazard by up to about 49% for a mean return period of 475 years and about 57% for 2,475 years, with possible consequences for the design of very tall buildings and large bridges.
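    A minimal sketch of the Nakamura H/V computation (synthetic three-component noise with an invented 0.8 Hz horizontal resonance; real processing adds window selection, spectral smoothing, and averaging over many windows):

```python
import numpy as np
from scipy.signal import welch

def hv_curve(ns, ew, ud, fs, nperseg=4096):
    """H/V spectral ratio from Welch power spectral densities:
    quadratic-mean horizontal amplitude over vertical amplitude."""
    f, p_ns = welch(ns, fs, nperseg=nperseg)
    _, p_ew = welch(ew, fs, nperseg=nperseg)
    _, p_ud = welch(ud, fs, nperseg=nperseg)
    h = np.sqrt(0.5 * (p_ns + p_ew))
    v = np.sqrt(p_ud)
    return f, h / v

rng = np.random.default_rng(4)
fs, n = 20.0, 72000                       # one hour of noise at 20 Hz
t = np.arange(n) / fs
resonance = np.sin(2 * np.pi * 0.8 * t)   # site resonance on horizontals only
ns = rng.standard_normal(n) + resonance
ew = rng.standard_normal(n) + resonance
ud = rng.standard_normal(n)
f, hv = hv_curve(ns, ew, ud, fs)
band = (f >= 0.2) & (f <= 3.0)
f0 = f[band][np.argmax(hv[band])]         # fundamental-frequency estimate
```

    The H/V peak frequency is the quantity mapped across the Po Plain above; converting it to a discontinuity depth requires a Vs profile, as done with the array results.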

  17. Search for Anisotropy Changes Associated with Two Large Earthquakes in Japan and New Zealand

    NASA Astrophysics Data System (ADS)

    Savage, M. K.; Graham, K.; Aoki, Y.; Arnold, R.

    2017-12-01

    Seismic anisotropy is often considered an indicator of stress in the crust, because the closure of cracks under differential stress causes waves polarized parallel to the cracks to travel faster than those polarized in the orthogonal direction. Changes in shear-wave splitting have been suggested to result from stress changes at volcanoes and earthquakes. However, the effects of mineral or structural alignment, and the difficulty of distinguishing changes in anisotropy along an earthquake-station path from changes in the path itself, have made such findings controversial. Two large earthquakes in 2016 provide unique datasets to test the use of shear-wave splitting for measuring variations in stress, because clusters of closely spaced earthquakes occurred both before and after each mainshock. We use the automatic, objective splitting analysis code MFAST to speed processing and minimize unwitting observer bias when determining time variations. The sequence of earthquakes related to the M=7.2 Japanese Kumamoto earthquake of 14 April 2016 includes foreshocks, mainshocks and aftershocks. The sequence was recorded by the NIED permanent network, which had already contributed background seismic anisotropy measurements in a previous study of anisotropy and stress in Kyushu. Preliminary measurements of shear-wave splitting from earthquakes in 2016 show results at some stations that clearly differ from those of the earlier study. They also change between earthquakes recorded before and after the mainshock. Further work is under way to determine whether the changes are more likely due to changes in stress during the observation period, or to spatial changes in anisotropy combined with changes in earthquake locations. 
    Likewise, background seismicity as well as foreshocks and aftershocks of the 2013 Cook Strait earthquake sequence, which included two M=6.5 earthquakes in New Zealand, occurred in the same general region as aftershocks of the M=7.8 Kaikoura earthquake of 14 November 2016. Here again, preliminary analysis suggests that we may be observing changes in stress, but detailed analysis is needed to confirm this.

  18. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust that includes, for the first time, shear-wave velocity information. Until now, the shear-wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. These reverberations obscure later-arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical-component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which recorded seismic energy generated by several explosive packages on a small areal array of four vertical-component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance interpretation of the recorded wavefield and allow, for example, identification of S-wave arrivals based on their lower apparent phase velocities and the distinctly higher rotational motion they generate relative to compressional (P) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear-wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. 
    The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic material with high Poisson's ratios (>0.4) down to 60 m depth. Our new model can be used in future studies to better constrain the deep interior of the Moon. Given the rich information derived from this minimalistic recording configuration, our results demonstrate that wavefield gradient analysis should be seriously considered for future space missions that aim to explore the interior structure of extraterrestrial bodies by seismic methods. Additionally, we anticipate that the proposed shear-wave identification methodology can also be applied to the routinely recorded vertical-component data of land seismic exploration on Earth.
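    The key relation, that for a plane wave u(x,t) = s(t - x/c) the ratio of temporal to spatial derivative gives the apparent phase velocity, can be sketched on a synthetic line array (geometry and wavelet invented, not LSPE values):

```python
import numpy as np

def apparent_velocity(traces, dx, dt):
    """Least-squares apparent phase velocity from wavefield gradients:
    for u(x,t) = s(t - x/c), du/dx = -(1/c) du/dt, hence
    c = -sum(du/dt * du/dx) / sum((du/dx)^2)."""
    du_dt = np.gradient(traces, dt, axis=1)
    du_dx = np.gradient(traces, dx, axis=0)
    return -np.sum(du_dt * du_dx) / np.sum(du_dx ** 2)

def ricker(tau, f0=20.0):
    a = (np.pi * f0 * tau) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# plane wave crossing an 8-element line array at 800 m/s
dt, dx, c_true = 1e-3, 2.0, 800.0
t = np.arange(2048) * dt
x = np.arange(8) * dx
traces = np.array([ricker(t - 0.5 - xi / c_true) for xi in x])
c_est = apparent_velocity(traces, dx, dt)
```

    In the LSPE analysis this kind of estimate, computed in sliding time windows, separates slow S arrivals from fast P arrivals on a purely vertical-component array.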

  19. Ischia Island: Historical Seismicity and Dynamics

    NASA Astrophysics Data System (ADS)

    Carlino, S.; Cubellis, E.; Iannuzzi, R.; Luongo, G.; Obrizzo, F.

    2003-04-01

    The seismic energy release in volcanic areas is a complex process, and the island of Ischia provides a significant scenario of historical seismicity, characterized by the occurrence of earthquakes of low energy and high intensity. Information on the seismicity of the island spans about eight centuries, starting from 1228. With regard to effects, the most recent earthquake, of 1883, is extensively documented both in the literature and in unpublished sources. The earthquake caused 2333 deaths and destroyed the historical and environmental heritage of some areas of the island. The most severe damage occurred in Casamicciola. This event, the first great catastrophe after the unification of Italy in the 1860s (Imax = XI degree MCS), represents an important date in the prevention of natural disasters: it was after this earthquake that the first Seismic Safety Act in Italy was passed, by which lower-risk zones were identified for new settlements. Thanks to such detailed analysis, reliable modelling of the seismic source was also obtained. The historical data make it possible to identify the epicentral area of all known earthquakes as the northern slope of Monte Epomeo, while analysis of the effects of the earthquakes and of the geological structures allows us to evaluate the stress fields that generate them. In a volcanic area, interpretation of the mechanisms of release and propagation of seismic energy is made even more complex because the regional stress field is compounded by that generated by the migration of magmatic masses towards the surface, as well as by the rheological properties of rocks subject to a high geothermal gradient. Such structural and dynamic conditions make the island of Ischia a seismic area of considerable interest. 
    It appears necessary to evaluate the damage expected from a new event linked to renewed dynamics of the island, where high population density and high economic value (the island is a tourist destination and holiday resort) increase the seismic risk. A seismic hazard map of the island is proposed, based on a comparative analysis of various types of data: geology, tectonics, historical seismicity and the damage caused by the 28 July 1883 Casamicciola earthquake. The analysis was essentially based on a GIS-aided cross-correlation of these data. The GIS is thus able to support both in-depth analysis of the dynamic processes on the island and the extension of the assessment to other natural risks (volcanic, landslides, flooding, etc.).

  20. Analysis of the 2003-2004 microseismic sequence in the western part of the Corinth Rift

    NASA Astrophysics Data System (ADS)

    Godano, Maxime; Bernard, Pascal; Dublanchet, Pierre; Canitano, Alexandre; Marsan, David

    2013-04-01

    The Corinth rift is one of the most seismically active zones in Europe. The seismic activity follows a swarm organization, with alternation of intense crises and more quiescent periods. The seismicity mainly occurs under the Gulf of Corinth in a 3-4 km thick, north-dipping layer between 5 and 12 km depth. Several hypotheses have been proposed to explain this seismic layer; nevertheless, the relationships between seismicity, deep structures and faults mapped at the surface remain unclear. Moreover, fluids seem to play a key role in the occurrence of the seismic activity (Bourouis and Cornet 2009, Pacchiani and Lyon-Caen 2009). Recently, a detailed analysis of the microseismicity (multiplet identification, precise relocation, focal mechanism determination) between 2000 and 2007 in the western part of the Corinth rift has highlighted north-dipping (and some south-dipping) planar active microstructures in the seismic layer with normal-fault mechanisms (Lambotte et al., in preparation; Godano et al., in preparation). A multiplet (a group of earthquakes with similar waveforms) can be interpreted as repeated ruptures of the same asperity due to transient forcing, such as silent creep on a fault segment or fluid circulation. The detailed analysis of the multiplets in the Corinth rift is an opportunity to better understand the coupling between seismic and aseismic processes. In the present study we focus on the seismic crisis that occurred from October 2003 to July 2004 in the western part of the Gulf of Corinth. This crisis consists of 2431 relocated events with magnitudes ranging from 0.5 to 3.1 (b-value = 1.4). The joint analysis of (1) the position of the multiplets with respect to the faults mapped at the surface, (2) the geometry of the main multiplets and (3) the fault-plane solutions shows that the seismic crisis is probably related to the activation at depth of the Fassouleika and Aigion faults. 
    The spatio-temporal analysis of the microseismicity highlights an overall migration from south-east to north-west, characterized by the successive activation of the multiplets. We next perform a spectral analysis to determine source parameters for each multiplet, in order to estimate the size of the asperities and the cumulative coseismic slip. From the preceding observations and results, we finally attempt to reproduce the 2003-2004 microseismic sequence using a rate-and-state 3D asperity model (Dublanchet et al., submitted). The deformation measured during the crisis by the strainmeter installed on Trizonia island is used in the modeling to constrain the maximum slip amplitude.

  1. Sources of Error and the Statistical Formulation of Ms:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh-wave magnitude (denoted Ms) is a measure of later-arriving surface-wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger Ms magnitude than explosions. This article proposes a hypothesis test (screening analysis) using Ms and mb that expressly accounts for physical correction-model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
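    The logic of such a screening test can be sketched as a one-sided z-test on the Ms - mb difference; the mean offset and standard error below are illustrative placeholders, not the IDC's operational constants or the article's fitted values:

```python
import numpy as np
from scipy.stats import norm

def ms_mb_screen(ms, mb, mu0=-0.64, se=0.34, alpha=0.05):
    """One-sided test of H0: explosion characteristics. mu0 is the
    assumed mean Ms - mb for explosions; se is the standard error of
    the statistic, inflated to absorb physical correction-model
    inadequacy (both numbers are invented for illustration)."""
    z = ((ms - mb) - mu0) / se
    p = 1.0 - norm.cdf(z)       # large Ms - mb is earthquake-like
    return p, p < alpha         # rejecting H0 screens the event out

# hypothetical observations: surface-wave-rich vs body-wave-rich events
p_eq, screened_eq = ms_mb_screen(ms=4.9, mb=4.6)    # earthquake-like
p_ex, screened_ex = ms_mb_screen(ms=3.8, mb=4.6)    # explosion-like
```

    The article's contribution is precisely how se is constructed: folding model inadequacy into it widens the test and avoids over-screening when the path corrections are imperfect.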

  2. Seismo-Live: Training in Seismology with Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Krischer, Lion; Tape, Carl; Igel, Heiner

    2016-04-01

    Seismological training tends to occur within the isolation of a particular institution, with a limited set of tools (codes, libraries) that are often not transferable outside. Here, we propose to overcome these limitations with a community-driven library of Jupyter notebooks dedicated to training on any aspect of seismology, for purposes of education and outreach, on-site or archived tutorials for codes, classroom instruction, and research. A Jupyter notebook (jupyter.org) is an open-source interactive computational environment that combines code execution, rich text, mathematics, and plotting. It can be considered a platform that supports reproducible research, as all inputs and outputs may be stored. Text, external graphics and equations can be handled in Markdown (incl. LaTeX) format. Jupyter notebooks run in standard web browsers, can easily be exchanged in text format, or can be converted to other documents (e.g. PDF, slide shows). They provide an ideal format for practical training in seismology. A pilot platform was set up with a dedicated server so that the Jupyter notebooks can be run in any browser (PC, notepad, smartphone). We show the functionalities of the Seismo-Live platform with examples from computational seismology, seismic data access and processing using the ObsPy library, seismic inverse problems, and others. The current examples all use the Python programming language, but any free language can be used. Potentially, such community platforms could be integrated with the EPOS-IT infrastructure and extended to other fields of Earth sciences.

  3. A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community

    NASA Astrophysics Data System (ADS)

    Merchant, B. J.; Chael, E. P.; Young, C. J.

    2013-12-01

    Network simulations have long been used to assess the performance of monitoring networks in detecting events, for purposes such as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: a modern, multi-platform language; use of modern computing capabilities (e.g. multi-core processors); support for monitoring technologies other than seismic; and a well-validated default parameter set for the IMS stations. NetMOD is designed to be extensible through a plugin infrastructure, so new phenomenological models can be added. Development of the seismic detection plugin is being pursued first; seismic location, infrasound, and hydroacoustic detection plugins will follow. As an open-release package, NetMOD can hopefully provide a common tool that the monitoring community can use to assess monitoring networks and to verify assessments made by others.
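    The kind of computation such a tool performs can be sketched as a per-station detection model combined with a Poisson-binomial network rule; the distance-magnitude model and all numbers are invented, not NetMOD's or NetSim's parameters:

```python
import numpy as np
from scipy.stats import norm

def station_detect_prob(mag, dist_km, m50_1km=2.0, atten=1.0, sigma=0.3):
    """Per-station detection probability: the 50%-detection magnitude
    grows logarithmically with distance (an illustrative model)."""
    m50 = m50_1km + atten * np.log10(np.maximum(dist_km, 1.0))
    return norm.cdf((mag - m50) / sigma)

def network_detect_prob(p, k=3):
    """P(at least k stations detect), for independent stations with
    per-station probabilities p (Poisson-binomial via dynamic
    programming over the number of detecting stations)."""
    dp = np.zeros(len(p) + 1)
    dp[0] = 1.0
    for pi in p:
        dp[1:] = dp[1:] * (1.0 - pi) + dp[:-1] * pi
        dp[0] *= 1.0 - pi
    return dp[k:].sum()

dists = np.array([120.0, 300.0, 450.0, 800.0, 1500.0])   # km, hypothetical
p_net4 = network_detect_prob(station_detect_prob(4.0, dists), k=3)
p_net5 = network_detect_prob(station_detect_prob(5.0, dists), k=3)
```

    Evaluating this over a grid of event locations and magnitudes yields the detection-threshold maps that network-planning studies are built on.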

  4. Full waveform seismic modelling of Chalk Group rocks from the Danish North Sea - implications for velocity analysis

    NASA Astrophysics Data System (ADS)

    Montazeri, Mahboubeh; Moreau, Julien; Uldall, Anette; Nielsen, Lars

    2015-04-01

    This study aims at understanding seismic wave propagation in the finely layered Chalk Group, which constitutes the main reservoir for oil and gas production in the Danish North Sea. The starting point of our analysis is the Nana-1XP exploration well, which shows strong seismic contrasts inside the Chalk Group. For the purposes of seismic waveform modelling, we here assume a one-dimensional model with homogeneous and isotropic layers, designed to capture the main fluctuations in petrophysical properties observed in the well logs. The model is representative of the stratigraphic sequences of the area and illustrates the highly contrasting properties of the Chalk Group. A finite-difference (FD) full-waveform technique, using both the acoustic and the elastic wave equations, is applied to the model. Velocity analysis of seismic data is a crucial step for stacking, multiple suppression, migration, and depth conversion of the seismic record. Semblance analysis of the synthetic seismic records shows strong amplitude peaks outside the expected range for the time interval representing the Chalk Group, especially at its base. The various synthetic results illustrate the occurrence and impact of different types of waves, including multiples, converted waves and refracted waves. The interference of these wave types with the primary reflections can explain the strong anomalous amplitudes in the semblance plot. In particular, strongly contrasting thin beds play an important role in generating the anomalously high amplitude values. If these anomalous amplitudes are used to pick velocities, they impede proper stacking of the data and may result in sub-optimal migration and depth conversion. Consequently, this may lead to erroneous or sub-optimal seismic images of the Chalk Group and the underlying layers. Our results highlight the importance of detailed velocity analysis and proper picking of velocity functions in the Chalk Group intervals. 
    We show that applying standard front mutes in the mid- and far-offset ranges does not significantly improve the results of the standard semblance analysis. These synthetic modelling results can be used as starting points for defining optimized processing flows for the seismic data sets acquired in the study area, with the aim of improving the imaging of the Chalk Group.
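    The semblance measure discussed above can be sketched for a single reflection (one hyperbolic event with a true stacking velocity of 2500 m/s; all parameters are toy values):

```python
import numpy as np

def semblance(gather, offsets, dt, t0, v, half=5):
    """NMO semblance: coherence of amplitudes along the hyperbola
    t(x) = sqrt(t0^2 + (x/v)^2), over a short vertical window.
    Equals 1 for perfectly aligned identical amplitudes."""
    n_tr, n_t = gather.shape
    num = den = 0.0
    for w in range(-half, half + 1):
        amps = []
        for i, x in enumerate(offsets):
            k = int(round(np.sqrt(t0 ** 2 + (x / v) ** 2) / dt)) + w
            if 0 <= k < n_t:
                amps.append(gather[i, k])
        a = np.asarray(amps)
        num += a.sum() ** 2
        den += a.size * (a ** 2).sum()
    return num / den if den > 0.0 else 0.0

def ricker(tau, f0=25.0):
    a = (np.pi * f0 * tau) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# one reflection at t0 = 0.5 s, true stacking velocity 2500 m/s
dt, n_t = 2e-3, 600
t = np.arange(n_t) * dt
offsets = np.arange(0.0, 1050.0, 50.0)
gather = np.array([ricker(t - np.sqrt(0.25 + (x / 2500.0) ** 2))
                   for x in offsets])

scan = {v: semblance(gather, offsets, dt, 0.5, v)
        for v in (2000.0, 2500.0, 3000.0)}
```

    The anomalous peaks described in the abstract arise when multiples or converted waves happen to align along some trial hyperbola, producing spuriously high semblance at the wrong velocity.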

  5. Reflection imaging of the Moon's interior using deep-moonquake seismic interferometry

    NASA Astrophysics Data System (ADS)

    Nishitsuji, Yohei; Rowe, C. A.; Wapenaar, Kees; Draganov, Deyan

    2016-04-01

The internal structure of the Moon has been investigated over many years using a variety of seismic methods, such as travel-time analysis, receiver functions, and tomography. Here we propose to apply body-wave seismic interferometry to deep moonquakes in order to retrieve zero-offset reflection responses (and thus images) beneath the Apollo stations on the nearside of the Moon from virtual sources colocated with the stations. This method is called deep-moonquake seismic interferometry (DMSI). Our results show a laterally coherent acoustic boundary around 50 km depth beneath all four Apollo stations. We interpret this boundary as the lunar seismic Moho. This depth agrees with the Japan Aerospace Exploration Agency's (JAXA) SELenological and Engineering Explorer (SELENE) result and with previous travel-time analysis at the Apollo 12/14 sites. The deeper part of the image we obtain from DMSI shows laterally incoherent structures. We interpret such lateral inhomogeneity as representing a zone characterized by strong scattering and constant apparent seismic velocity at our resolution scale (0.2-2.0 Hz).
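The core operation of DMSI, retrieving a zero-offset reflection response from a transmitted coda by autocorrelation, can be illustrated with a synthetic record. The 20 s two-way time below assumes a hypothetical 50 km deep boundary and a 5 km/s average velocity, chosen only for illustration:

```python
import numpy as np

dt, n = 0.01, 8192
rng = np.random.default_rng(0)
src = rng.standard_normal(n)          # noise-like deep-moonquake coda

# surface recording: direct transmission plus a reverberation delayed
# by the two-way time of the boundary (50 km / 5 km/s -> 20 s)
delay = int(20.0 / dt)
trace = src.copy()
trace[delay:] += 0.5 * src[:-delay]

# autocorrelation ~ zero-offset reflection response from a virtual
# source co-located with the receiver
ac = np.correlate(trace, trace, mode="full")[n - 1:]
twt = (np.argmax(ac[1:]) + 1) * dt    # recovered two-way time
```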

  6. Teaching hands-on geophysics: examples from the Rū seismic network in New Zealand

    NASA Astrophysics Data System (ADS)

    van Wijk, Kasper; Simpson, Jonathan; Adam, Ludmila

    2017-03-01

Education in physics and the geosciences can be effectively illustrated by the analysis of earthquakes and the subsequent propagation of seismic waves in the Earth. Educational seismology has matured to a level where both the hardware and software are robust and user-friendly. This has resulted in the successful implementation of educational networks around the world. Seismic data recorded by students are of such quality that they can be used in classic earthquake location exercises, for example. Even ocean waves weakly coupled into the Earth's crust can now be recorded on educational seismometers. These signals are not just noise, but form the basis of more recent developments in seismology, such as seismic interferometry, where seismic waves generated by ocean waves, instead of earthquakes, can be used to infer information about the Earth's interior. Here, we introduce an earthquake location exercise and an analysis of ambient seismic noise, and present examples. Data are provided, and all needed software is freely available.
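A classic form of the earthquake-location exercise mentioned here is a grid search over trial epicentres; the station coordinates, velocity, and origin time below are hypothetical classroom values, not data from the Rū network:

```python
import numpy as np

# hypothetical station coordinates (km), epicentre, and origin time
stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [80.0, 90.0]])
true_epi = np.array([40.0, 30.0])
v, t0 = 6.0, 5.0                      # assumed P velocity (km/s), origin (s)
arrivals = t0 + np.linalg.norm(stations - true_epi, axis=1) / v

# grid search: at the right epicentre the residuals (arrival minus
# travel time) all equal the origin time, so their spread vanishes
best, best_xy = np.inf, None
for x in np.arange(0.0, 101.0):
    for y in np.arange(0.0, 101.0):
        tt = np.linalg.norm(stations - [x, y], axis=1) / v
        misfit = np.var(arrivals - tt)
        if misfit < best:
            best, best_xy = misfit, (x, y)
```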

  7. Surface-Source Downhole Seismic Analysis in R

    USGS Publications Warehouse

    Thompson, Eric M.

    2007-01-01

This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram.
The analysis of the travel-time data consists of two parts: the identification of layer interfaces, and the inversion for the velocity of each layer. The analyst usually picks layer interfaces by visual inspection of the travel-time data. I have also developed an algorithm that automatically finds boundaries, which can save a significant amount of time when analyzing a large number of sites. The results of the automatic routines should be reviewed to check that they are reasonable. The interactivity of these scripts allows the user to add and remove layers quickly, thus allowing rapid feedback on how the residuals are affected by each additional parameter in the inversion. In addition, the script allows many models to be compared at the same time.
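Although the published routines are written in R, the heart of the inversion (vertical travel times are linear in the per-layer slownesses, so the velocities follow from least squares) can be sketched briefly; the sketch below is in Python with a synthetic two-layer model and noise-free picks, not the author's code:

```python
import numpy as np

# hypothetical two-layer model: receivers every metre to 20 m depth,
# one picked interface at 10 m, 200 m/s over 400 m/s
depths = np.arange(1.0, 21.0)
interfaces = np.array([10.0])
s_true = np.array([1 / 200.0, 1 / 400.0])    # layer slownesses (s/m)

def path_matrix(depths, interfaces):
    """Vertical path length of each receiver in each layer."""
    tops = np.concatenate([[0.0], interfaces])
    bots = np.concatenate([interfaces, [np.inf]])
    d = np.clip(depths[:, None], None, bots[None, :]) - tops[None, :]
    return np.clip(d, 0.0, None)

D = path_matrix(depths, interfaces)
t_obs = D @ s_true                           # noise-free synthetic picks

# least-squares slowness per layer; the pick standard deviations
# described in the abstract would enter as a diagonal weighting
s_hat, *_ = np.linalg.lstsq(D, t_obs, rcond=None)
v_hat = 1.0 / s_hat
```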

  8. Gas Reservoir Identification Basing on Deep Learning of Seismic-print Characteristics

    NASA Astrophysics Data System (ADS)

    Cao, J.; Wu, S.; He, X.

    2016-12-01

Reservoir identification based on seismic data analysis is the core task in oil and gas geophysical exploration. The essence of reservoir identification is to identify the properties of the rock pore fluid. We developed a novel gas reservoir identification method named seismic-print analysis, by imitation of the vocal-print analysis techniques used in speaker identification. The term "seismic-print" refers to the characteristics of the seismic waveform that definitively identify the property of a geological objective, for instance, a natural gas reservoir. A seismic-print can be characterized by one or a few parameters, named seismic-print parameters. It has been shown that gas reservoirs are characterized by a negative first-order cepstrum coefficient anomaly and, concurrently, a positive second-order cepstrum coefficient anomaly. The method is valid for sandstone, carbonate, and shale gas reservoirs, and the accuracy rate may reach up to 90%. There are two main problems to deal with in the application of the seismic-print analysis method. One is to identify the "ripple" of a reservoir on the seismogram, and the other is to construct the mapping relationship between the seismic-print and the gas reservoirs. Deep learning, developed in recent years, has the ability to reveal complex non-linear relationships between attributes and data, and to automatically extract the features of an objective from the data. Thus, deep learning can be used to deal with these two problems. There are many algorithms for deep learning. They can be roughly divided into two categories: Deep Belief Networks (DBNs) and Convolutional Neural Networks (CNNs). A DBN is a probabilistic generative model, which can establish a joint distribution of the observed data and tags. A CNN is a feedforward neural network, which can be used to extract the 2-D structural features of the input data.
Both DBNs and CNNs can be used to deal with seismic data. We used an improved DBN to identify carbonate rocks from log data; the accuracy rate can reach up to 83%. When DBNs are applied to seismic waveform data, more information is obtained. The work was supported by NSFC under grants No. 41430323 and No. 41274128, and by the State Key Laboratory of Oil and Gas Reservoir Geology and Exploration.
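The cepstral attributes underlying the seismic-print idea can be reproduced on a toy trace. The real cepstrum is IFFT(log|FFT(x)|); the wavelet, echo lag, and coefficient values below are invented for illustration and are not tied to the study's data:

```python
import numpy as np

def real_cepstrum(trace):
    """Real cepstrum c = IFFT(log|FFT(x)|); the low-order coefficients
    are the kind of attribute used as a seismic-print."""
    spec = np.maximum(np.abs(np.fft.fft(trace)), 1e-12)  # avoid log(0)
    return np.fft.ifft(np.log(spec)).real

# toy trace: a wavelet plus an echo; an echo leaves a cepstral peak at
# its lag, the classic property that cepstral attributes exploit
t = np.arange(256) * 0.004
wavelet = np.exp(-((t - 0.2) ** 2) / 2e-4) * np.cos(2 * np.pi * 30 * (t - 0.2))
trace = wavelet + 0.6 * np.roll(wavelet, 25)

ceps = real_cepstrum(trace)
seismic_print = ceps[1:3]               # 1st- and 2nd-order coefficients
peak_lag = int(np.argmax(ceps[10:128])) + 10
```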

  9. Structure of the Suasselkä postglacial fault in northern Finland obtained by analysis of local events and ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Afonin, Nikita; Kozlovskaya, Elena; Kukkonen, Ilmo; Dafne/Finland Working Group

    2017-04-01

Understanding the inner structure of seismogenic faults and their ability to reactivate is particularly important for investigating the continental intraplate seismicity regime. In our study we address this problem using analysis of local seismic events and ambient seismic noise recorded by the temporary DAFNE array in the northern Fennoscandian Shield. The main purpose of the DAFNE/FINLAND passive seismic array experiment was to characterize the present-day seismicity of the Suasselkä postglacial fault (SPGF), which was proposed as one potential target for the DAFNE (Drilling Active Faults in Northern Europe) project. The DAFNE/FINLAND array comprised an area of about 20 to 100 km and consisted of eight short-period and four broadband three-component autonomous seismic stations installed in the close vicinity of the fault area. The array recorded continuous seismic data during September 2011-May 2013. Recordings of the array have been analysed in order to identify and locate natural earthquakes from the fault area and to discriminate them from blasts in the Kittilä gold mine. As a result, we found a number of natural seismic events originating from the fault area, which proves that the fault is still seismically active. In order to study the inner structure of the SPGF we used cross-correlation of the ambient seismic noise recorded by the array. Analysis of the azimuthal distribution of noise sources demonstrated that during the time interval under consideration the distribution of noise sources was close to uniform. The continuous data were processed in several steps, including single-station data analysis, instrument response removal and time-domain stacking. The data were used to estimate empirical Green's functions between pairs of stations in the frequency band of 0.1-1 Hz and to calculate the corresponding surface wave dispersion curves. The S-wave velocity models were obtained as a result of dispersion curve inversion.
The results suggest that the area of the SPGF corresponds to a narrow region of low S-wave velocities surrounded by rocks with high S-wave velocities. We interpret this low-velocity region as a non-healed mechanically weak fault damage zone (FDZ) formed due to the last major earthquake that occurred after the last glaciation.
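The noise cross-correlation step that yields the empirical Green's functions can be sketched with two synthetic records; the 1 km station spacing and 2 km/s surface-wave speed implying a 0.5 s delay are assumptions for this toy example, not DAFNE parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 100.0, 16384
noise = rng.standard_normal(n)        # ambient noise from distant sources

# station B records the same wavefield as station A delayed by 0.5 s
# (hypothetical 1 km spacing and 2 km/s surface-wave speed)
delay = int(0.5 * fs)
sta_a = noise
sta_b = np.concatenate([np.zeros(delay), noise[:-delay]])

# frequency-domain cross-correlation ~ empirical Green's function;
# the peak lag gives the inter-station surface-wave travel time
xc = np.fft.irfft(np.fft.rfft(sta_b) * np.conj(np.fft.rfft(sta_a)))
tt = np.argmax(xc) / fs
```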

  10. Seismo-acoustic analysis of the near quarry blasts using Plostina small aperture array

    NASA Astrophysics Data System (ADS)

    Ghica, Daniela; Stancu, Iulian; Ionescu, Constantin

    2013-04-01

Seismic and acoustic signals are important for recognizing different types of industrial blasting sources, in order to discriminate between them and natural earthquakes. We have analyzed the seismic events listed in the Romanian catalogue (Romplus) for the time interval between 2011 and 2012 that occurred in the Dobrogea region, in order to determine the detection of seismo-acoustic signals of quarry blasts by the Plostina array stations. Dobrogea is known as a seismic region characterized by crustal earthquakes with low magnitudes; at the same time, over 40 quarry mines are located in the area, being sources of blasts recorded by both the seismic and infrasound sensors of the Romanian Seismic Network. The Plostina seismo-acoustic array, deployed in the central part of Romania, consists of 7 seismic sites (3C broad-band instruments and accelerometers) collocated with 7 infrasound instruments. The array is particularly used for the seismic monitoring of local and regional events, as well as for the detection of infrasonic signals produced by various sources. Considering the characteristics of the infrasound sensors (frequency range, dynamic range, sensitivity), the array has proved its efficiency in observing the signals produced by explosions, mine explosions and quarry blasts. The quarry mines included in this study cover distances of up to two hundred kilometers from the station and routinely generate explosions that are detected as seismic and infrasonic signals with the Plostina array. The combined seismo-acoustic analysis uses two types of detectors for signal identification: one, applied for seismic signal identification, is based on array processing techniques (beamforming and frequency-wavenumber analysis), while the other, used for infrasound detection and characterization, is the automatic detector DFX-PMCC (Progressive Multi-Channel Correlation Method).
Infrasonic waves generated by quarry blasts have frequencies ranging from 0.05 Hz up to at least 6 Hz and amplitudes below 5 Pa. Seismic data analysis shows that the frequency range of the signals is above 2 Hz. Surface explosions such as quarry blasts are useful sources for checking detection and location efficiency, when seismic measurements are added. The process is crucial for discrimination purposes and for establishing a set of ground-truth infrasound events. Ground truth information plays a key role in the interpretation of infrasound signals, by including near-field observations from industrial blasts.

  11. Modeling the effects of structure on seismic anisotropy in the Chester gneiss dome, southeast Vermont

    NASA Astrophysics Data System (ADS)

    Saif, S.; Brownlee, S. J.

    2017-12-01

Compositional and structural heterogeneities in the continental crust contribute to the complex expression of crustal seismic anisotropy. Understanding deformation and flow in the crust using seismic anisotropy has thus proven difficult. Seismic anisotropy is affected by rock microstructure and mineralogy, and a number of studies have begun to characterize the full elastic tensors of crustal rocks in an attempt to increase our understanding of these intrinsic factors. However, there is still a large gap in length scale between laboratory characterization on the scale of centimeters and seismic wavelengths on the order of kilometers. To address this length-scale gap, we are developing a 3-D crustal model that will help us determine the effects of rotating laboratory-scale elastic tensors into field-scale structures. The Chester gneiss dome in southeast Vermont is our primary focus. The model combines over 2000 structural data points from field measurements and published USGS structural data with elastic tensors of Chester dome rocks derived from electron backscatter diffraction data. We created a uniformly spaced grid by averaging structural measurements within equally spaced grid boxes. The surface measurements are then projected into the third dimension using existing subsurface interpretations. At each point in the model, a measured elastic tensor for the specific rock type is rotated according to its unique structural input. The goal is to use this model to generate synthetic seismograms using existing numerical wave propagation codes. Once completed, the model input can be varied to examine the effects of different subsurface structure interpretations, as well as heterogeneity in rock composition and elastic tensors. Our goal is to be able to make predictions for how specific structures will appear in seismic data, and how that appearance changes with variations in rock composition.
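The central operation of such a model, rotating a laboratory elastic tensor into a field-measured structural orientation, is a fourth-order tensor rotation. The transversely isotropic stiffnesses below are illustrative round numbers, not measured Chester dome values:

```python
import numpy as np

def rotate_tensor(c, R):
    """Rotate a 3x3x3x3 stiffness tensor by rotation matrix R."""
    return np.einsum("ia,jb,kc,ld,abcd->ijkl", R, R, R, R, c)

def rot_axis(deg, axis):
    """Rotation matrix about the x (axis=0) or z (axis=2) axis."""
    a = np.radians(deg)
    if axis == 2:
        return np.array([[np.cos(a), -np.sin(a), 0],
                         [np.sin(a), np.cos(a), 0],
                         [0, 0, 1]])
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a), np.cos(a)]])

# hypothetical transversely isotropic stiffnesses (GPa), symmetry axis z
c11, c33, c44, c66, c13 = 90.0, 70.0, 25.0, 30.0, 20.0
c12 = c11 - 2 * c66
C = np.zeros((3, 3, 3, 3))
voigt = {(0, 0, 0, 0): c11, (1, 1, 1, 1): c11, (2, 2, 2, 2): c33,
         (0, 0, 1, 1): c12, (0, 0, 2, 2): c13, (1, 1, 2, 2): c13,
         (1, 2, 1, 2): c44, (0, 2, 0, 2): c44, (0, 1, 0, 1): c66}
for (i, j, k, l), v in voigt.items():
    for a, b in ((i, j), (j, i)):
        for c_, d in ((k, l), (l, k)):
            C[a, b, c_, d] = C[c_, d, a, b] = v

# rotating about the symmetry axis leaves the tensor unchanged, while
# tilting the foliation (rotation about x) changes it
C_z = rotate_tensor(C, rot_axis(37.0, 2))
C_x = rotate_tensor(C, rot_axis(37.0, 0))
```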

  12. High-resolution seismic-reflection profiles and sidescan-sonar records collected on Block Island Sound by U.S. Geological Survey, R/V ASTERIAS, cruise AST 81-2

    USGS Publications Warehouse

    Needell, S. W.; Lewis, R.S.

    1982-01-01

Cruise AST 81-2 was conducted aboard the R/V ASTERIAS during September 10-18, 1981, in Block Island Sound by the U.S. Geological Survey. It was funded in part by the Connecticut Geological and Natural History Survey. The purpose of the study was to define and map the geology and shallow structure, to determine the geologic framework and late Tertiary to Holocene history, and to identify and map any potential geologic hazards of Block Island Sound. The survey was conducted using an EG&G Uniboom seismic system and an EDO Western sidescan-sonar system. Seismic signals were band-passed between 400 and 4,000 Hz and were recorded at a quarter-second sweep rate. Sidescan sonographs were collected at a 100-m scan range to each side of the ship track. In all, 702 km of seismic-reflection profiles and 402 km of sidescan-sonar records were collected. Navigation was by Loran-C, and the ship position was recorded at 5-minute intervals. Seismic-reflection profiling is continuous and good in quality. Sidescan-sonar records are varied in quality; coverage was intermittent and eventually terminated owing to difficulties with the recorder. Original records can be seen and studied at the U.S. Geological Survey Data Library at Woods Hole, MA 02543. Microfilm copies of the seismic-reflection profiles and the sidescan sonographs can be purchased only from the National Geophysical and Solar-Terrestrial Data Center, NOAA/EDIS/NGSDC, Code D621, 325 Broadway, Boulder, CO 80303 (telephone 303-497-6338).

  13. Risk-targeted versus current seismic design maps for the conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
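The adjustment rests on the risk integral: the annual collapse rate is the fragility curve integrated against the slope of the hazard curve. The power-law hazard curves and lognormal fragility below are illustrative stand-ins, not the NEHRP inputs; they only show that two sites with the same mapped motion but different curve shapes carry different collapse risks:

```python
import numpy as np
from math import erf, log, sqrt

def collapse_rate(a, haz, median, beta=0.8):
    """Risk integral: annual collapse rate = integral of the collapse
    fragility F(a) times the negative slope of the hazard curve."""
    F = np.array([0.5 * (1 + erf(log(x / median) / (beta * sqrt(2))))
                  for x in a])                  # lognormal fragility CDF
    y = F * -np.gradient(haz, a)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(a)))

a = np.linspace(0.01, 3.0, 600)                 # ground motion grid (g)
lam_map = -np.log(1 - 0.02) / 50                # rate of 2%-in-50-yr motion
a_map = 1.0                                     # common mapped value (g)

# two sites share the mapped motion but differ in hazard-curve slope
haz_steep = lam_map * (a / a_map) ** -3.0
haz_shallow = lam_map * (a / a_map) ** -1.5

# same fragility (median anchored above the mapped motion) at both sites
r_steep = collapse_rate(a, haz_steep, median=2.0 * a_map)
r_shallow = collapse_rate(a, haz_shallow, median=2.0 * a_map)
```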

  14. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    NASA Astrophysics Data System (ADS)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people, and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists of evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for the Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to the slope critical acceleration ac.
These formulae were employed to evaluate the slope critical acceleration (Ac)x for which a prefixed probability exists that seismic shaking would result in a Dn value equal to a threshold x whose exceedance would cause landslide triggering. The obtained ac values represent the minimum slope resistance required to keep the probability of seismic landslide triggering within the prefixed value. In particular, we calculated the spatial distribution of (Ac)x for x thresholds of 10 and 2 cm, in order to represent triggering conditions for coherent slides (e.g., slumps, block slides, slow earth flows) and disrupted slides (e.g., rock falls, rock slides, rock avalanches), respectively. We then produced a probabilistic national map that shows the spatial distribution of (Ac)10 and (Ac)2 for a 10% probability of exceedance in 50 years, which is a significant level of hazard equal to that commonly used for building codes. The spatial distribution of the calculated (Ac)x values can be compared with the in situ actual ac values of specific slopes to estimate whether these slopes have a significant probability of failing under seismic action in the future. As an example of a possible application of this kind of time probabilistic map to hazard estimates, we compared the values obtained for the Manjil region with a GIS map providing the spatial distribution of estimated ac values in the same region. The spatial distribution of slopes characterized by ac < (Ac)10 was then compared with the spatial distribution of the major landslides of coherent type triggered by the Manjil earthquake. This comparison provides indications of the potential, problems and limits of the tested approach for the study area.
References:
Cornell, C.A., 1968: Engineering seismic risk analysis, Bull. Seism. Soc. Am., 58, 1583-1606.
Del Gaudio, V., Wasowski, J., and Pierri, P., 2003: An approach to time-probabilistic evaluation of seismically induced landslide hazard, Bull. Seism. Soc. Am., 93, 557-569.
Jibson, R.W., Harp, E.L., and Michael, J.A., 1998: A method for producing digital probabilistic seismic landslide hazard maps: an example from the Los Angeles, California, area, U.S. Geological Survey Open-File Report 98-113, Golden, Colorado, 17 pp.
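The Newmark-displacement regression and its inversion for the critical acceleration (Ac)x can be sketched as follows. The coefficients are those commonly cited from Jibson et al. (1998), used here for illustration only; the study itself uses region-specific formulae of the same form for Alborz - Central Iran and Zagros:

```python
import numpy as np

def newmark_disp(ia, ac):
    """Newmark displacement Dn (cm) from Arias intensity Ia (m/s) and
    critical acceleration ac (g), using the widely cited regression of
    Jibson et al. (1998):
      log10 Dn = 1.521 log10 Ia - 1.993 log10 ac - 1.546"""
    return 10 ** (1.521 * np.log10(ia) - 1.993 * np.log10(ac) - 1.546)

def critical_acceleration(ia, x):
    """Invert the regression: the slope strength (Ac)x at which the
    shaking Ia produces exactly the threshold displacement x (cm)."""
    return 10 ** ((1.521 * np.log10(ia) - 1.546 - np.log10(x)) / 1.993)

ia = 2.0                                  # Arias intensity level (m/s)
ac10 = critical_acceleration(ia, 10.0)    # coherent-slide threshold
ac2 = critical_acceleration(ia, 2.0)      # disrupted-slide threshold
```

As expected, the smaller 2 cm threshold demands a stronger slope (larger critical acceleration) than the 10 cm threshold.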

  15. Estimation of anisotropy parameters in organic-rich shale: Rock physics forward modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herawati, Ida, E-mail: ida.herawati@students.itb.ac.id; Winardhi, Sonny; Priyono, Awali

Anisotropy analysis has become an important step in the processing and interpretation of seismic data. One of the most important steps in anisotropy analysis is anisotropy parameter estimation, which can be carried out using well data, core data or seismic data. In seismic data, anisotropy parameter calculation is generally based on velocity moveout analysis; however, the accuracy depends on data quality, available offset, and velocity moveout picking. Anisotropy estimation using seismic data is needed to obtain wide coverage of the anisotropy of a particular layer. In an anisotropic reservoir, analysis of anisotropy parameters also helps us to better understand the reservoir characteristics. Anisotropy parameters, especially ε, are related to rock property and lithology determination. The current research aims to estimate anisotropy parameters from seismic data and integrate well data, with a case study in a potential shale gas reservoir. Due to the complexity of organic-rich shale reservoirs, extensive study from different disciplines is needed to understand the reservoir. Shale itself has intrinsic anisotropy caused by the lamination of its constituent minerals. In order to link rock physics with seismic response, it is necessary to build a forward model of organic-rich shale. This paper focuses on studying the relationship between reservoir properties such as clay content, porosity and total organic content and anisotropy. Organic content, which defines the prospectivity of shale gas, can be considered as solid background or solid inclusion or both. The forward modeling results show that the presence of organic matter increases anisotropy in shale. The relationships between total organic content and other seismic properties such as acoustic impedance and Vp/Vs are also presented.
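The parameter ε discussed here is one of the Thomsen (1986) anisotropy parameters, computable directly from the VTI stiffnesses. The stiffness values below are illustrative; the organic-rich case is simply given a lower bedding-normal stiffness c33 to mimic the reported trend:

```python
import numpy as np

def thomsen(c11, c33, c44, c66, c13):
    """Thomsen (1986) anisotropy parameters of a VTI medium."""
    eps = (c11 - c33) / (2 * c33)
    gam = (c66 - c44) / (2 * c44)
    dlt = ((c13 + c44) ** 2 - (c33 - c44) ** 2) / (2 * c33 * (c33 - c44))
    return eps, gam, dlt

# illustrative stiffnesses (GPa): soft, platy organic matter is mimicked
# by lowering the bedding-normal stiffness c33 more than c11, which
# raises epsilon, the trend the abstract reports from forward modelling
lean = thomsen(40.0, 32.0, 10.0, 13.0, 12.0)
rich = thomsen(36.0, 24.0, 8.0, 12.0, 10.0)
iso = thomsen(30.0, 30.0, 10.0, 10.0, 10.0)   # sanity check: all zero
```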

  16. Post-blasting seismicity in Rudna copper mine, Poland - source parameters analysis.

    NASA Astrophysics Data System (ADS)

    Caputa, Alicja; Rudziński, Łukasz; Talaga, Adam

    2017-04-01

A major hazard in Polish copper mines is high seismicity and the corresponding rockbursts. Many methods are used to reduce the seismic hazard; among them the most effective is preventive blasting in potentially hazardous mining panels. The method is expected to provoke small to moderate tremors (up to M2.0) and in this way reduce stress accumulation in the rock mass. This work presents an analysis of post-blasting events in the Rudna copper mine, Poland. Using full moment tensor (MT) inversion and seismic spectra analysis, we try to find characteristic features of post-blasting seismic sources. Source parameters estimated for post-blasting events are compared with the parameters of non-provoked mining events that occurred in the vicinity of the provoked sources. Our studies show that focal mechanisms of events which occurred after blasts have similar MT decompositions, namely they are characterized by a rather strong isotropic component compared with the isotropic component of non-provoked events. Source parameters obtained from spectral analysis also show that provoked seismicity has a distinct source physics; among other indicators, this is visible in the S-to-P wave energy ratio, which is higher for non-provoked events. The comparison of all our results reveals three possible groups of sources: (a) events occurring just after blasts, (b) events occurring from 5 min to 24 h after blasts, and (c) non-provoked seismicity (more than 24 h after blasting). Acknowledgements: This work was supported within statutory activities No. 3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
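The isotropic-versus-deviatoric comparison at the heart of such MT analysis can be sketched with the standard trace decomposition; the tensors and the norm-based measure below are illustrative, not the Rudna solutions:

```python
import numpy as np

def iso_fraction(M):
    """Norm-based share of the isotropic part in the standard
    isotropic + deviatoric trace decomposition of a moment tensor."""
    m_iso = np.trace(M) / 3.0
    M_dev = M - m_iso * np.eye(3)
    n_iso = abs(m_iso) * np.sqrt(3.0)   # Frobenius norm of m_iso * I
    n_dev = np.linalg.norm(M_dev)
    return n_iso / (n_iso + n_dev)

# pure double-couple source (no volume change) vs. the same source
# with an added volumetric component, as in a provoked collapse event
dc = np.array([[0.0, 1.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 0.0, 0.0]])
expl = dc + 0.5 * np.eye(3)

f_dc = iso_fraction(dc)
f_expl = iso_fraction(expl)
```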

  17. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity.
The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes are reliant on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.
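The closing point, reporting hazard as an annual rate of exceedance rather than a 50-year probability, is a Poisson conversion:

```python
import math

# Poisson conversion between an annual exceedance rate and the
# exceedance probability in a t-year window
def prob_in_window(rate, t=50.0):
    return 1.0 - math.exp(-rate * t)

def rate_from_prob(p, t=50.0):
    return -math.log(1.0 - p) / t

r2475 = rate_from_prob(0.02)      # the "2% in 50 years" design level
```

The 2%-in-50-years level corresponds to an annual rate of roughly 1/2475, the familiar 2475-year return period.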

  18. Shear-wave velocity profile and seismic input derived from ambient vibration array measurements: the case study of downtown L'Aquila

    NASA Astrophysics Data System (ADS)

    Di Giulio, Giuseppe; Gaudiosi, Iolanda; Cara, Fabrizio; Milana, Giuliano; Tallini, Marco

    2014-08-01

Downtown L'Aquila suffered severe damage (VIII-IX EMS98 intensity) during the 2009 April 6 Mw 6.3 earthquake. The city is settled on a flat-topped hill, with a shear-wave velocity profile characterized by a reversal of velocity at a depth of the order of 50-100 m, corresponding to the contact between calcareous breccia and lacustrine deposits. In the southern sector of downtown, a thin unit of superficial red soils causes a further shallow impedance contrast that may have influenced the damage distribution during the 2009 earthquake. In this paper, the main features of ambient seismic vibrations have been studied across the entire city centre by using array measurements. We deployed six 2-D arrays of seismic stations and one 1-D array of vertical geophones. The 2-D arrays recorded ambient noise, whereas the 1-D array recorded signals produced by active sources. Surface-wave dispersion curves have been measured by array methods and have been inverted through a neighbourhood algorithm, jointly with the H/V ambient noise spectral ratios related to Rayleigh-wave ellipticity. We obtained shear-wave velocity (Vs) profiles representative of the southern and northern sectors of downtown L'Aquila. The theoretical 1-D transfer functions for the estimated Vs profiles have been compared to the available empirical transfer functions computed from aftershock data analysis, revealing generally good agreement. The Vs profiles have then been used as input for a deconvolution analysis aimed at deriving the ground motion at bedrock level. The deconvolution has been performed by means of the EERA and STRATA codes, two tools commonly employed in the geotechnical engineering community for equivalent-linear site response studies. The waveform at the bedrock level has been obtained by deconvolving the 2009 main shock recorded at a strong-motion station installed in downtown L'Aquila.
Finally, this deconvolved waveform has been used as seismic input for computing synthetic time histories at a strong-motion target site located in the middle Aterno river valley. As the target site, we selected the strong-motion station AQV, 5 km away from downtown L'Aquila. For this site, the record of the 2009 L'Aquila main shock is available and the surface stratigraphy is adequately known, making it possible to propagate the deconvolved bedrock motion back to the surface and to compare recorded and synthetic waveforms.
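As a rough illustration of the 1-D transfer functions used in such comparisons, the sketch below computes the SH-wave amplification of a single visco-elastic soil layer over an elastic half-space. All layer parameters here are hypothetical and are not taken from the L'Aquila profiles; real site-response codes such as EERA or STRATA handle multi-layer profiles and strain-compatible properties.

```python
import numpy as np

def sh_transfer_function(freqs, h, vs_soil, rho_soil, vs_rock, rho_rock, damping=0.05):
    """Amplitude of the 1-D SH-wave transfer function for a single
    visco-elastic soil layer of thickness h over an elastic half-space."""
    vs_c = vs_soil * (1 + 1j * damping)            # complex (visco-elastic) velocity
    k = 2 * np.pi * freqs / vs_c                   # complex wavenumber in the layer
    impedance = (rho_soil * vs_c) / (rho_rock * vs_rock)
    return np.abs(1.0 / (np.cos(k * h) + 1j * impedance * np.sin(k * h)))

freqs = np.linspace(0.1, 10.0, 500)
amp = sh_transfer_function(freqs, h=60.0, vs_soil=400.0, rho_soil=1900.0,
                           vs_rock=1000.0, rho_rock=2200.0)
f_peak = freqs[np.argmax(amp)]
# Fundamental resonance should fall near f0 = Vs / (4 H) = 400 / 240 ≈ 1.67 Hz
```

The peak amplitude is controlled by the impedance contrast and the damping, which is why the abstract's comparison of theoretical and empirical transfer functions is a meaningful consistency check on the inverted Vs profiles.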

  19. Seismic catalog condensation with applications to multifractal analysis of South Californian seismicity

    NASA Astrophysics Data System (ADS)

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2014-05-01

Recent advances in instrumentation have increased station coverage and lowered event detection thresholds, resulting in a vast increase in the number of located events each year. The abundance of data comes as a double-edged sword: while it facilitates more robust statistics and provides better confidence intervals, it also paralyzes computations whose execution times grow exponentially with the number of data points. In this study, we present a novel method that assesses the relative importance of each data point and reduces the size of datasets while preserving their information content. For a given seismic catalog, the goal is to express the same spatial probability density distribution with fewer data points. To achieve this, we exploit the fact that seismic catalogs are not optimally encoded. This coding deficiency is the result of sequential data entry, where new events are added without taking previous ones into account. For instance, several events with identical parameters occurring at the same location could be grouped together rather than occupying as much memory as if they were distinct events. Following this reasoning, the proposed condensation methodology groups all events according to their overall variance. Starting from the group with the highest variance (worst location uncertainty), each event is sampled by a number of sample points, which are then used to determine which better-located events can express these probable locations with a higher likelihood. Based on these likelihood comparisons, weights are successively transferred from poorly located events to better located ones. As a result, a large portion of the events (~30%) ends up with zero weight (thus being fully represented by events whose weights increased), while the information content (i.e. the sum of all weights) is preserved. 
The resulting condensed catalog not only provides a more optimal encoding but is also regularized with respect to the local information quality. By investigating the locations of mass enrichment and depletion at different scales, we observe that the areas of increased mass are in good agreement with reported surface fault traces. We also conduct multifractal spatial analysis on condensed catalogs and investigate different spatial scaling regimes made clearer by reducing the effect of location uncertainty.
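The weight-transfer idea can be caricatured with a toy implementation. This is a deliberately simplified sketch: the actual method compares likelihoods of sampled probable locations, whereas the version below uses a crude distance rule (an event donates its weight to the best-located event inside its own uncertainty radius). The data are synthetic.

```python
import numpy as np

def condense_catalog(locs, sigmas):
    """Toy catalog condensation: process events from worst to best location
    uncertainty; each poorly located event transfers its weight to the
    best-located event lying within its uncertainty radius (if any)."""
    n = len(locs)
    weights = np.ones(n)
    order = np.argsort(sigmas)[::-1]              # worst located first
    for i in order:
        # candidate recipients: better-located events within i's uncertainty
        better = [j for j in range(n)
                  if sigmas[j] < sigmas[i]
                  and np.linalg.norm(locs[j] - locs[i]) <= sigmas[i]]
        if better:
            j = min(better, key=lambda j: sigmas[j])
            weights[j] += weights[i]              # transfer the weight ...
            weights[i] = 0.0                      # ... total stays preserved

    return weights

locs = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
sigmas = np.array([0.05, 1.0, 0.2])               # event 1 is poorly located
weights = condense_catalog(locs, sigmas)
```

In this tiny example the poorly located middle event ends up with zero weight, fully represented by its well-located neighbour, while the sum of weights (the information content, in the abstract's terms) is unchanged.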

  20. Ground motion amplification at rock sites: the competing role of topography and fractured rocks in the San Giovanni fault, central Italy

    NASA Astrophysics Data System (ADS)

    Pischiutta, M.; Cara, F.; Di Giulio, G.; Vassallo, M.; Cultrera, G.

    2017-12-01

Amplification at rock sites in areas of high topographic relief has been increasingly observed in recent years, with unexpected levels of damage after strong earthquakes. In regions affected by recent tectonic activity, topographic irregularities can include fault damage zones. In such conditions, seismic waves can be locally amplified by the combined effect of wave focusing along the topography and/or the presence of fractures/joints or locally weakened rocks. The role of topography versus geological complexity in controlling ground motion amplification at rock sites is a newly debated issue in the seismological community. The most crucial questions concern the real contribution of topography shape and fracturing, and how to parameterize such effects for inclusion in seismic design codes. In this framework, the EMERSITO INGV task force installed 7 seismic stations across the San Giovanni fault after the Amatrice mainshock of the 2016 sequence in Central Italy. This active normal fault is located in the area of the Montereale intermountain basin (Abruzzi region, Italy) and bounds the southwestern slope of Mt. Mozzano, a pronounced, roughly two-dimensional ridge up to 1450 m high. Moreover, this fault has recently been studied by several authors who performed detailed geological and geophysical surveys. Our stations recorded more than 100 earthquakes with magnitudes ranging from 2.5 to 3.9, as well as an M 4.4 earthquake with hypocenter in the Capitignano district, a few kilometres away. We have analyzed the recorded signals in detail, calculating traditional single-station spectral ratios (HVSRs) and reference-site spectral ratios (SSRs) using both ambient noise and earthquakes. In order to obtain a robust estimate of the site amplification effect at each station, we have investigated the influence of backazimuth and epicentral distance. We have also applied time-domain covariance matrix analysis and frequency-domain polarization analysis. 
We have found that, in spite of the complexity of the seismic data, the observed polarization pattern is generally oriented orthogonal to the ridge elongation, as well as to the fault strike, suggesting a high-angle relation between ground motion polarization and fracture systems.
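The time-domain covariance matrix analysis mentioned above can be sketched for the two horizontal components: the azimuth of the principal eigenvector of the 2x2 covariance matrix gives the dominant polarization direction. The example below is a synthetic illustration, not the EMERSITO processing chain.

```python
import numpy as np

def polarization_azimuth(east, north):
    """Azimuth (degrees clockwise from north, folded to [0, 180)) of the
    dominant horizontal polarization, from the principal eigenvector of
    the 2x2 covariance matrix of the demeaned horizontal components."""
    data = np.vstack([east - east.mean(), north - north.mean()])
    cov = data @ data.T / data.shape[1]
    evals, evecs = np.linalg.eigh(cov)
    e, n = evecs[:, np.argmax(evals)]             # largest-eigenvalue eigenvector
    return np.degrees(np.arctan2(e, n)) % 180.0   # axis, not vector: fold to [0,180)

# Synthetic ground motion linearly polarized at N45E
t = np.linspace(0.0, 10.0, 2000)
sig = np.sin(2 * np.pi * 2.0 * t)
az = polarization_azimuth(0.7071 * sig, 0.7071 * sig)
```

Folding to [0, 180) reflects that polarization defines an axis rather than a direction, which is what makes the comparison with ridge elongation and fault strike in the abstract meaningful.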

  1. Fragility Analysis Methodology for Degraded Structures and Passive Components in Nuclear Power Plants - Illustrated using a Condensate Storage Tank

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C.

    2010-06-30

The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) plant seismic risk analysis. In 2007, Brookhaven National Laboratory (BNL) entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this endeavor is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. In the Year 1 scope of work, BNL collected and reviewed degradation occurrences in US NPPs and identified important aging characteristics needed for the seismic capability evaluations. This information is presented in the Annual Report for the Year 1 Task, identified as BNL Report-81741-2008 and also designated as KAERI/RR-2931/2008. The report presents results of the statistical and trending analysis of these data and compares the results to prior aging studies. In addition, the report provides a description of current U.S. regulatory requirements, regulatory guidance documents, generic communications, industry standards and guidance, and past research related to aging degradation of SSCs. 
In the Year 2 scope of work, BNL carried out a research effort to identify and assess degradation models for the long-term behavior of dominant materials that are determined to be risk significant to NPPs. Multiple models have been identified for concrete, carbon and low-alloy steel, and stainless steel. These models are documented in the Annual Report for the Year 2 Task, identified as BNL Report-82249-2009 and also designated as KAERI/TR-3757/2009. This report describes the research effort performed by BNL for the Year 3 scope of work. The objective is for BNL to develop the seismic fragility capacity for a condensate storage tank under various degradation scenarios. The conservative deterministic failure margin method has been utilized for the undegraded case and has been modified to accommodate the degraded cases. A total of five seismic fragility analysis cases are described: (1) the undegraded case, (2) a degraded stainless steel tank shell, (3) degraded anchor bolts, (4) anchorage concrete cracking, and (5) a combination of the three degradation scenarios. Insights from these fragility analyses are also presented.
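Seismic fragility results of this kind are commonly summarized as a lognormal curve in ground motion level. The minimal sketch below uses hypothetical capacity numbers (not BNL's values) and represents a degradation scenario simply as a reduction of the median capacity, which is one plausible way such effects enter the curve.

```python
import math

def fragility(pga, am, beta):
    """Lognormal seismic fragility: P(failure | PGA), with median capacity
    am (in g) and composite logarithmic standard deviation beta."""
    z = math.log(pga / am) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical tank: undegraded median capacity 1.2 g; degradation (e.g.
# shell thinning) modeled here crudely as a 15% capacity reduction.
p_undeg = fragility(0.5, 1.2, 0.4)
p_deg = fragility(0.5, 1.2 * 0.85, 0.4)
```

By construction the failure probability at the median capacity is exactly 0.5, and any capacity reduction shifts the whole curve toward higher failure probability at a given PGA, which is the qualitative effect the degraded cases in the report quantify.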

  2. 5 years of continuous seismic monitoring of a mountain river in the Pyrenees

    NASA Astrophysics Data System (ADS)

    Diaz, Jordi; Sanchez-Pastor, Pilar S.; Gallart, Josep

    2017-04-01

The analysis of background seismic noise variations in the proximity of river channels has proven to be a useful tool for monitoring river flow, even for modest discharges. Nevertheless, this monitoring is usually carried out using temporary deployments of seismic stations. The CANF broad-band seismic station, acquiring data continuously since 2010 and located inside an old railway tunnel in the Central Pyrenees, about 400 m from the Aragón River channel, provides an excellent opportunity to enlarge this view and present a long-term monitoring of a mountain river. Seismic signals in the 2-10 Hz band clearly related to river discharges have been identified in the seismic records. Discharge increases due to rainfall, large storms resulting in floods, and snowmelt periods can be discriminated from the analysis of the seismic data. Up to now, two large rainfall events resulting in large discharge and damaging floods have been recorded, both sharing similar properties which can be used to implement automatic procedures to identify potentially damaging floods seismically. Another natural process that can be characterized using continuously acquired seismic data is mountain snowmelt, as this process results in characteristic discharge patterns which can be identified in the seismic data. The time occurrence and intensity of the snowmelt stages for each season can be identified, and the 5 seasons available so far compared to detect possible trends. The so-called fluvial seismology can also provide important clues to evaluate bedload transport in rivers, an important parameter for evaluating erosion rates in mountain environments. Analyzing both the amplitude and frequency variations of the seismic data and its hysteresis cycles, it seems possible to estimate the relative contribution of water flow and bedload transport to the seismic signal. 
The available results suggest that most of the river-generated seismic signal is related to bedload transport, while water turbulence is only significant above a discharge threshold. Since 2015 we have been operating 2 additional stations located beside the Cinca and Segre rivers, also in the Pyrenean range. First results confirm that the river-generated signal can also be identified at these sites, although wind-related signals are recorded in a close frequency band and hence further analysis is required to discriminate between the two processes. (Funding: MISTERIOS project, CGL2013-48601-C2-1-R)
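A common first step in this kind of river monitoring is tracking band-limited seismic amplitude over time. The sketch below is generic and uses synthetic data: it computes windowed RMS amplitude in the 2-10 Hz band mentioned in the abstract, which is the sort of time series one would then compare against gauged discharge.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def band_rms(trace, fs, fmin=2.0, fmax=10.0, win_s=60.0):
    """RMS amplitude of a seismic trace in the fmin-fmax band, computed in
    non-overlapping windows of win_s seconds (a simple river-noise proxy)."""
    sos = butter(4, [fmin, fmax], btype="bandpass", fs=fs, output="sos")
    filt = sosfiltfilt(sos, trace)                # zero-phase band-pass
    n = int(win_s * fs)
    nwin = len(filt) // n
    return np.sqrt(np.mean(filt[:nwin * n].reshape(nwin, n) ** 2, axis=1))

# Synthetic example: a 5 Hz "river" signal twice as strong in the second half,
# mimicking a discharge increase.
fs = 100.0
t = np.arange(0.0, 600.0, 1.0 / fs)
amp = np.where(t < 300.0, 1.0, 2.0)
rms = band_rms(amp * np.sin(2 * np.pi * 5.0 * t), fs)
```

In real data the same windowed-RMS series, compared across frequency bands and against discharge hysteresis cycles, is what allows separating water turbulence from bedload contributions.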

  3. Infrasonic and seismic signals from earthquakes and explosions observed with Plostina seismo-acoustic array

    NASA Astrophysics Data System (ADS)

    Ghica, D.; Ionescu, C.

    2012-04-01

The Plostina seismo-acoustic array has recently been deployed by the National Institute for Earth Physics in the central part of Romania, near the Vrancea epicentral area. The array has a 2.5 km aperture and consists of 7 seismic sites (PLOR) and 7 collocated infrasound instruments (IPLOR). The array is being used to assess the importance of collocated seismic and acoustic sensors for the purposes of (1) seismic monitoring of local and regional events, and (2) acoustic measurement, consisting of detection of infrasound events (explosions, mine and quarry blasts, earthquakes, aircraft, etc.). This paper focuses on the characterization of infrasonic and seismic signals from earthquakes and explosions (accidental and mining type). Two Vrancea earthquakes with magnitude above 5.0 were selected for this study: one occurred on 1 May 2011 (MD = 5.3, h = 146 km) and the other on 4 October 2011 (MD = 5.2, h = 142 km). The infrasonic signals from the earthquakes resemble the vertical component of the seismic signals. Because the mechanism of infrasonic wave formation is the coupling of seismic waves with the atmosphere, trace velocity values for such signals are compatible with the characteristics of the various seismic phases observed with the PLOR array. The study also evaluates and characterizes infrasound and seismic data recorded from the explosion caused by the military accident at the Evangelos Florakis Naval Base in Cyprus on 11 July 2011. Additionally, seismo-acoustic signals presumed to be related to strong mine and quarry blasts were investigated. Ground truth of mine observations provides validation of this interpretation. 
The combined seismo-acoustic analysis uses two types of detectors for signal identification: the automatic detector DFX-PMCC, applied for infrasound detection and characterization, and an array-processing-based detector (beamforming and frequency-wavenumber analysis) used for the seismic data. Spectrograms of the recorded infrasonic and seismic data were examined, showing that an earthquake produces acoustic signals with high energy in the 1 to 5 Hz frequency range, while, for the explosion, this range lies below 0.6 Hz. Using the combined analysis of seismic and acoustic data, the Plostina array can greatly enhance event detection and localization in the region. The analysis can also be particularly important in identifying sources of industrial explosions and, therefore, in monitoring the hazard created both by earthquakes and by anthropogenic sources of pollution (chemical factories, nuclear and power plants, refineries, mines).
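The spectral separation reported above (earthquake-coupled infrasound energetic at 1-5 Hz, explosion energy below 0.6 Hz) suggests a simple band-energy discriminant. The sketch below is purely illustrative with synthetic signals and hypothetical bands taken from the abstract; it is not the DFX-PMCC detector.

```python
import numpy as np

def band_energy_ratio(signal, fs, band_a=(1.0, 5.0), band_b=(0.05, 0.6)):
    """Ratio of spectral energy in band_a to band_b: a crude discriminant
    between earthquake-like (1-5 Hz) and explosion-like (<0.6 Hz) signals."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    ea = power[(freqs >= band_a[0]) & (freqs <= band_a[1])].sum()
    eb = power[(freqs >= band_b[0]) & (freqs <= band_b[1])].sum()
    return ea / eb

fs = 20.0
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(0)
noise = 0.01 * rng.standard_normal(len(t))        # keeps both bands non-empty
r_eq = band_energy_ratio(np.sin(2 * np.pi * 3.0 * t) + noise, fs)   # "earthquake"
r_ex = band_energy_ratio(np.sin(2 * np.pi * 0.3 * t) + noise, fs)   # "explosion"
```

A ratio well above 1 points to the earthquake-like class and well below 1 to the explosion-like class; any operational threshold would of course have to be calibrated on real detections.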

  4. Seismic data compression speeds exploration projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galibert, P.Y.

As part of an ongoing commitment to ensure industry-wide distribution of its revolutionary seismic data compression technology, Chevron Petroleum Technology Co. (CPTC) has entered into licensing agreements with Compagnie Generale de Geophysique (CGG) and other seismic contractors for use of its software in oil and gas exploration programs. CPTC expects use of the technology to be far-reaching for all of its industry partners involved in seismic data collection, processing, analysis and storage. Here, CGG--one of the world's leading seismic acquisition and processing companies--talks about its success in applying the new methodology to replace full on-board seismic processing. Chevron's technology is already being applied on large off-shore 3-D seismic surveys. Worldwide, CGG has acquired more than 80,000 km of seismic data using the data compression technology.

  5. Holistic Overview of the Contribution of Tectonic, Geomorphic, and Geologic Factors to the Seismic Hazard of the Kathmandu Valley, Nepal

    NASA Astrophysics Data System (ADS)

    Banda, S.; Chang, A.; Sanquini, A.; Hilley, G. E.

    2013-12-01

Nepal has been a seismically active region since the mid-Eocene collision of the Indian and Eurasian plates. It can be divided into four major tectonostratigraphic units. The Lesser Himalayan Zone, where Kathmandu Valley is located, is bounded to the south by the Main Boundary Thrust (MBT) and to the north by the Main Central Thrust (MCT). These faults, together with the Main Frontal Thrust (MFT), traverse the NW-SE length of Nepal and sole into the Main Himalayan Thrust (MHT). Slip along these structures during the Plio-Quaternary has ponded sediment in the interior of the orogen, producing the nearly circular Kathmandu Basin, which hosts a series of radially converging rivers that exit the basin to the south. The sediment that is ponded within the basin consists of alluvial, lacustrine and debris flow deposits that are ~500 m thick. The faults in the vicinity of the Kathmandu Valley currently serve as potential earthquake sources. Sources that might plausibly be generated by these faults are constrained by structural, paleoseismic, and geodetic observations. The continued collision between India and Tibet is reflected in a convergence rate of about 20 mm/yr, as measured by Global Positioning System (GPS) geodetic networks. Strain accumulates on the MHT and is released during large earthquakes. The 1934 (M8.2) earthquake, with epicenter about 175 km east of Kathmandu, produced MMI VIII-IX shaking intensity in the Kathmandu Valley. Seismic waves generated from faults in proximity to Kathmandu may be amplified or attenuated at particular locations due to specific site responses that reflect the geologic framework of the Kathmandu Valley. The ponded sediments within the Kathmandu Basin may contribute to basin effects, trapping seismic waves and prolonging ground motion, as well as increasing the amplitude of the waves as they travel from crystalline outer rocks into the soft lake-bed sediments. 
A hazard analysis suggests that a M8.0 earthquake originating in the currently seismically locked area to the west of Kathmandu would produce MMI VIII intensity in Kathmandu Valley, and a M5.8 earthquake on an active fault in the valley itself would result in MMI IX intensity close to the fault and MMI VII-VIII elsewhere in the valley. The government of Nepal initiated a seismic hazard analysis and scenario-based estimation of the impact of a major earthquake in Kathmandu Valley in support of the development of a National Building Code. Earthquake awareness, preparation and mitigation initiatives have been undertaken, including implementation of the School Earthquake Safety Program, a preparedness and risk mitigation program for raising awareness and strengthening vulnerable buildings. The effectiveness of this program has been well demonstrated, and it is a candidate for accelerated adoption.

  6. Study on the applicability of the microtremor HVSR method to support seismic microzonation in the town of Idrija (W Slovenia)

    NASA Astrophysics Data System (ADS)

    Gosar, Andrej

    2017-06-01

The town of Idrija is located in an area of increased seismic hazard in W Slovenia and is partly built on alluvial sediments or artificial mining and smelting deposits which can amplify seismic ground motion. There is a need to prepare a comprehensive seismic microzonation in the near future to support seismic hazard and risk assessment. To study the applicability of the microtremor horizontal-to-vertical spectral ratio (HVSR) method for this purpose, 70 free-field microtremor measurements were performed in a town area of 0.8 km2, with 50-200 m spacing between points. The HVSR analysis showed that it is possible to derive the sediments' resonance frequency at 48 points. For the remaining one third of the measurements, nearly flat HVSR curves were obtained, indicating a small or negligible impedance contrast with the seismological bedrock. The isofrequency (range 2.5-19.5 Hz) and HVSR peak amplitude (range 3-6, with a few larger values) maps were prepared using the natural neighbor interpolation algorithm and compared with the geological map and the map of artificial deposits. Surprisingly, no clear correlation was found between the distribution of resonance frequencies or peak amplitudes and the known extent of the supposed soft sediments or deposits. This can be explained by relatively well-compacted and rather stiff deposits and the complex geometry of sedimentary bodies. However, at several individual locations it was possible to correlate the shape and amplitude of the HVSR curve with the known geological structure, and prominent site effects were established in different places. Under the given conditions (very limited free space and a high level of noise) it would be difficult to perform active seismic refraction or MASW measurements to investigate the S-wave velocity profiles and the thickness of sediments in detail that would be representative enough for microzonation purposes. 
The importance of the microtremor method is therefore even greater, because it enables a direct estimation of the resonance frequency without knowing the internal structure and physical properties of the shallow subsurface. The results of this study can be directly used in analyses of the possible occurrence of soil-structure resonance of individual buildings, including important cultural heritage mining and other structures protected by UNESCO. Another application of the derived free-field isofrequency map is to support soil classification according to the recent trends in building codes and to calibrate Vs profiles obtained from the microtremor array or geophysical measurements.

  7. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

Oil and gas exploration and production usually relies on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.
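What a Bayesian treatment of residual moveout can look like is sketched below: a Metropolis sampler for a single parabolic moveout coefficient under Gaussian noise and a flat prior. This is an illustration of the general approach, not the authors' method; the model t(x) = t0 + a x², the noise level, and all parameter values are assumptions.

```python
import numpy as np

def metropolis_rmo(offsets, times, t0, n_iter=5000, step=0.005, seed=0):
    """Toy Bayesian residual moveout: sample the posterior of a parabolic
    moveout coefficient a in t(x) = t0 + a*x**2, assuming Gaussian noise
    with known sigma and a flat prior (random-walk Metropolis)."""
    rng = np.random.default_rng(seed)
    sigma = 0.01

    def loglike(a):
        resid = times - (t0 + a * offsets ** 2)
        return -0.5 * np.sum((resid / sigma) ** 2)

    a, samples = 0.0, []
    ll = loglike(a)
    for _ in range(n_iter):
        prop = a + step * rng.standard_normal()
        ll_prop = loglike(prop)
        if np.log(rng.random()) < ll_prop - ll:   # Metropolis acceptance
            a, ll = prop, ll_prop
        samples.append(a)
    return np.array(samples[n_iter // 2:])        # discard burn-in

# Synthetic gather: true a = 0.05 s/km^2, offsets in km, noisy picks
rng = np.random.default_rng(1)
x = np.linspace(0.1, 3.0, 30)
t = 1.0 + 0.05 * x ** 2 + 0.01 * rng.standard_normal(30)
post = metropolis_rmo(x, t, t0=1.0)
```

The spread of the posterior samples, rather than a single best-fit value, is precisely the uncertainty quantification the abstract argues for.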

  8. Structural Identification And Seismic Analysis Of An Existing Masonry Building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Del Monte, Emanuele; Galano, Luciano; Ortolani, Barbara

    2008-07-08

The paper presents the diagnostic investigation and the seismic analysis performed on an ancient masonry building in Florence. The building has historical interest and is subject to conservation restrictions. The investigation involves a preliminary phase concerning the review of historical documents and a second phase of in situ and laboratory testing to determine the mechanical characteristics of the masonry. This investigation was conceived in order to obtain the 'LC2 Knowledge Level' and to perform the non-linear pushover analysis according to the new Italian Standards for seismic upgrading of existing masonry buildings.

  9. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

A method for classifying moving objects having a seismic effect on the ground surface is proposed, based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes, obtained by applying Hilbert and Fourier transforms, are used as classification criteria. Examples illustrating the statistical properties of the spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest computational resources.
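A minimal sketch of the envelope-spectrum idea: extract the signal envelope with a Hilbert transform, then examine its Fourier amplitude spectrum. This is illustrative only (the paper's classifier uses the full vector of envelope spectrum components as features); here we simply read off the dominant modulation frequency of a synthetic, footstep-like amplitude-modulated signal.

```python
import numpy as np
from scipy.signal import hilbert

def modulation_frequency(signal, fs):
    """Dominant frequency of the signal envelope: Hilbert-transform envelope,
    DC removal, then the peak of the envelope's amplitude spectrum."""
    env = np.abs(hilbert(signal))                 # instantaneous amplitude
    env = env - env.mean()                        # remove DC before the FFT
    spec = np.abs(np.fft.rfft(env))
    freqs = np.fft.rfftfreq(len(env), 1.0 / fs)
    return freqs[np.argmax(spec)]

# Synthetic signal: 10 Hz carrier amplitude-modulated at 2 Hz (e.g. a
# periodic footfall cadence riding on higher-frequency ground vibration)
fs = 200.0
t = np.arange(0.0, 10.0, 1.0 / fs)
sig = (1.0 + 0.8 * np.sin(2 * np.pi * 2.0 * t)) * np.sin(2 * np.pi * 10.0 * t)
fm = modulation_frequency(sig, fs)
```

A person, a group, an animal, and a vehicle differ precisely in such envelope periodicities and spectral shapes, which is why the envelope spectrum components serve as compact, cheap-to-compute classification features.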

  10. Refining locations of the 2005 Mukacheve, West Ukraine, earthquakes based on similarity of their waveforms

    NASA Astrophysics Data System (ADS)

    Gnyp, Andriy

    2009-06-01

Based on the results of applying correlation analysis to records of the 2005 Mukacheve group of recurrent events and their subsequent relocation relative to the reference event of 7 July 2005, we conclude that all the events most likely occurred on the same rupture plane. Station terms have been estimated for seismic stations of the Transcarpathians, accounting for variation of seismic velocities beneath their locations as compared to the travel time tables used in the study. From a methodological standpoint, the potential and usefulness of correlation analysis of seismic records for a more detailed study of seismic processes, tectonics and geodynamics of the Carpathian region have been demonstrated.
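The basic measurement behind such correlation-based relocation is the relative delay between similar waveforms at a common station. The sketch below is generic (synthetic wavelet, hypothetical sampling rate): the delay is read off the peak of the full cross-correlation.

```python
import numpy as np

def cc_delay(a, b, fs):
    """Relative delay (seconds) of waveform a with respect to b, from the
    peak of their full cross-correlation (positive means a is delayed)."""
    cc = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    lag = np.argmax(cc) - (len(b) - 1)            # convert index to lag in samples
    return lag / fs

# Synthetic wavelet and a copy delayed by 7 samples (0.07 s at 100 Hz)
fs = 100.0
t = np.arange(0.0, 2.0, 1.0 / fs)
w = np.exp(-((t - 1.0) ** 2) / 0.01) * np.sin(2 * np.pi * 8.0 * t)
shifted = np.roll(w, 7)
delay = cc_delay(shifted, w, fs)
```

In relative relocation, such differential arrival times between event pairs (measurable to a fraction of a sample by interpolating the correlation peak) are what constrain whether events share a rupture plane.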

  11. Forecasting induced seismicity rate and Mmax using calibrated numerical models

    NASA Astrophysics Data System (ADS)

    Dempsey, D.; Suckale, J.

    2016-12-01

At Groningen, The Netherlands, several decades of induced seismicity from gas extraction have culminated in an M 3.6 event (mid 2012). From a public safety and commercial perspective, it is desirable to anticipate future seismicity outcomes at Groningen. One way to quantify earthquake risk is Probabilistic Seismic Hazard Analysis (PSHA), which requires an estimate of the future seismicity rate and its magnitude frequency distribution (MFD). This approach is effective at quantifying risk from tectonic events because the seismicity rate, once measured, is almost constant over timescales of interest. In contrast, rates of induced seismicity vary significantly over building lifetimes, largely in response to changes in injection or extraction. Thus, the key to extending PSHA to induced earthquakes is to estimate future changes of the seismicity rate in response to a proposed operating schedule. Numerical models can describe the physical link between fluid pressure, effective stress change, and the earthquake process (triggering and propagation). However, models with predictive potential for individual earthquakes face the difficulty of characterizing specific heterogeneity - stress, strength, roughness, etc. - at locations of interest. Modeling catalogs of earthquakes provides a means of averaging over this uncertainty, focusing instead on the collective features of the seismicity, e.g., its rate and MFD. The model we use incorporates fluid pressure and stress changes to describe nucleation and crack-like propagation of earthquakes on stochastically characterized 1D faults. This enables simulation of synthetic catalogs of induced seismicity from which the seismicity rate, location and MFD are extracted. A probability distribution for Mmax - the largest event in some specified time window - is also computed. 
Because the model captures the physics linking seismicity to changes in the reservoir, earthquake observations and operating information can be used to calibrate a model at a specific site (or, ideally, many models). This restricts analysis of future seismicity to likely parameter sets and provides physical justification for linking operational changes to subsequent seismicity. To illustrate these concepts, a recent study of prior and forecast seismicity at Groningen will be presented.
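The notion of a probability distribution for Mmax over a time window can be sketched with a Monte Carlo toy model: Poissonian event occurrence combined with a Gutenberg-Richter magnitude distribution. This is a generic illustration, not the calibrated physical model of the abstract; the rate, b-value, and window length below are arbitrary assumptions.

```python
import numpy as np

def sample_gr_magnitudes(rng, n, b=1.0, m_min=1.0):
    """Sample n magnitudes from an unbounded Gutenberg-Richter law: magnitudes
    above m_min are exponentially distributed with rate beta = b*ln(10)."""
    return m_min + rng.exponential(1.0 / (b * np.log(10.0)), size=n)

def mmax_distribution(rng, rate, window_years, n_catalogs=2000, **gr):
    """Monte Carlo distribution of the largest event in a time window, given
    a Poisson event rate (events/year) and a G-R magnitude distribution."""
    mmax = []
    for _ in range(n_catalogs):
        n = rng.poisson(rate * window_years)
        if n > 0:
            mmax.append(sample_gr_magnitudes(rng, n, **gr).max())
    return np.array(mmax)

rng = np.random.default_rng(42)
mm = mmax_distribution(rng, rate=20.0, window_years=10.0)
```

In the physics-based setting of the abstract, the synthetic catalogs play the role of the Monte Carlo draws here, so the Mmax distribution inherits the time-varying rate implied by the operating schedule rather than a constant Poisson rate.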

  12. Seismic imaging and hydrogeologic characterization of the Potomac Formation in northern New Castle County, Delaware

    NASA Astrophysics Data System (ADS)

    Zullo, Claudia Cristina

    Water supply demands of a growing population in the Coastal Plain of Delaware make detailed understanding of aquifers increasingly important. Previous studies indicate that the stratigraphy of the non-marine Potomac Formation, which includes the most important confined aquifers in the area, is complex and lithologically heterogeneous, making sands difficult to correlate. This study aimed to delineate the stratigraphic architecture of these sediments with a focus on the sand bodies that provide significant volumes of groundwater to northern Delaware. This project utilized an unconventional seismic system, a land streamer system, for collecting near-surface, high-resolution seismic reflection data on unpaved and paved public roadways. To calibrate the 20 km of seismic data to lithologies, a corehole and wireline geophysical logs were obtained. Six lithofacies (paleosols, lake, frequently flooded lake/abandoned channel, splay/levee, splay channel, fluvial channel) and their respective geophysical log patterns were identified and then correlated with the seismic data to relate seismic facies to these environments. Using seismic attribute analysis, seismic facies that correspond to four of the lithofacies were identified: fluvial channel seismic facies, paleosol seismic facies, splay/levee seismic facies, and a frequently flooded lake/abandoned channel and splay/levee combined seismic facies. Correlations for eleven horizons identified in the seismic sections and cross sections show local changes in thickness and erosional relief. The analysis of seismic facies sections provides a two-dimensional basis for detailed understanding of the stratigraphy of the Potomac Formation, and suggests an anastomosing fluvial style with poorly connected winding channel sands encased in fine-grained overbank sediments that produced a complex, labyrinth-style heterogeneity. 
The results indicate that the 2D lateral connectivity of the sand bodies of the Potomac Formation is limited to short distances, contrary to correlations in previous studies that have indicated connection of sands at distances of at least 3 km. The results highlight the importance of integrating multiple sources of geologic information for the interpretation of the stratigraphic architecture of non-marine sediments, and the value of roadway-based land-streamer seismic data for the interpretation of near-surface (less than 300-m-depth) aquifer sand characteristics in developed areas.

  13. Two types of seismicity accompanying hydraulic fracturing in Harrison County, Ohio - implications for seismic hazard and seismogenic mechanism

    NASA Astrophysics Data System (ADS)

    Kozlowska, M.; Brudzinski, M.; Friberg, P. A.; Skoumal, R.; Baxter, N. D.; Currie, B.

    2017-12-01

While induced seismicity in the United States has mainly been attributed to wastewater disposal, Eastern Ohio has provided cases of seismicity induced by both hydraulic fracturing (HF) and wastewater disposal. In this study, we investigate five cases of seismicity associated with HF in Harrison County, OH. Because of their temporal and spatial isolation from other injection activities, these cases provide an ideal setting for studying the relationships between high-pressure injection and earthquakes. Our analysis reveals two distinct groups of seismicity. Deeper earthquakes occur in the Precambrian crystalline basement, reach larger magnitudes (M>2), have lower b-values (<1), and continue for weeks following stimulation shutdown. Shallower earthquakes, on the other hand, occur in Paleozoic sedimentary rocks within 400 m below HF, are limited to smaller magnitudes (M<1), have higher b-values (>1.5), and lack post-stimulation activity. We seek a physical explanation for the observed difference in earthquake character and hypothesize that fault maturity is the main factor determining the sequences' b-values. Based on published results of laboratory experiments and fault modeling, we interpret the deep seismicity as slip on more mature faults in the older crystalline rocks and the shallow seismicity as slip on immature faults in the younger, lower-viscosity sedimentary rocks. This suggests that HF inducing seismicity on deeper, more mature faults poses higher seismic hazard. The analysis of water and gas production data from these wells suggests that wells inducing deeper seismicity produced more water than wells with shallow seismicity. This indicates more extensive hydrologic connections outside the target reservoir, which may explain why gas production drops more quickly for wells with deeper seismicity. 
Despite these indications that hydraulic pressure fluctuations induce seismicity, we also find only 2-3 hours between onset of stimulation of HF wells and seismicity that is too short for typical fluid pressure diffusion rates across distances of 1 km. We conclude that a combination of pore fluid pressure changes and poroelastic stress changes are responsible for inducing shear slip during HF.
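The b-value contrast at the heart of this abstract is conventionally estimated from an earthquake catalogue with the Aki (1965) maximum-likelihood formula. A minimal sketch, using synthetic magnitudes and omitting the magnitude-binning correction (the completeness magnitude is assumed known):

```python
import math

def b_value_mle(mags, mc):
    """Aki (1965) maximum-likelihood b-value estimate for magnitudes at or
    above the completeness magnitude mc (binning correction omitted)."""
    above = [m for m in mags if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - mc)

# Two synthetic catalogues: a "deep" set with relatively more large events
# (lower b-value) and a "shallow" set dominated by small events (higher b).
deep = [1.0, 1.2, 1.8, 2.1, 2.5, 1.5, 1.1, 2.0]
shallow = [1.0, 1.05, 1.1, 1.2, 1.3, 1.0, 1.15, 1.25]

b_deep = b_value_mle(deep, 1.0)        # below 1, as for the basement events
b_shallow = b_value_mle(shallow, 1.0)  # well above 1, as for the shallow events
```

The estimate depends only on the mean magnitude above completeness, which is why catalogues rich in larger events, like the deep basement sequences here, yield the lower b-values.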

  14. Microseismic monitoring of Chocolate Bayou, Texas: the Pleasant Bayou No. 2 geopressured/geothermal energy test well program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauk, F.J.; Kimball, B.; Davis, R.A.

    1984-01-01

    The Brazoria seismic network, instrumentation, design, and specifications are described. The data analysis procedures are presented. Seismicity is described in relation to the Pleasant Bayou production history. Seismicity originating near the chemical plant east of the geopressured/geothermal well is discussed. (MHR)

  15. Microseismic monitoring of Chocolate Bayou, Texas: The Pleasant Bayou no. 2 geopressured/geothermal energy test well program

    NASA Astrophysics Data System (ADS)

    Mauk, F. J.; Kimball, B.; Davis, R. A.

    The Brazoria seismic network, instrumentation, design, and specifications are described. The data analysis procedures are presented. Seismicity is described in relation to the Pleasant Bayou production history. Seismicity originating near the chemical plant east of the geopressured/geothermal well is discussed.

  16. 75 FR 36715 - Advisory Committee on Reactor Safeguards; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-28

    ... Seismic Input for Site Response and Soil Structure Interaction Analyses'' (Open)--The Committee will hold... Seismic Input for Site Response and Soil Structure Interaction Analyses.'' 9:30 a.m.-10:30 a.m.: Interim Staff Guidance (ISG) DC/COL-ISG-020, ``Implementation of Seismic Margin Analysis for New Reactors Based...

  17. Design and development of digital seismic amplifier recorder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan, E-mail: gunawanhandayani@gmail.com

    2015-04-16

A digital seismic recording is a technique for recording seismic data in digital systems. This method is more convenient because it is more accurate than other seismic recording methods. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve measurement accuracy by amplifying the input signal. We use seismic sensors (geophones) with a natural frequency of 4.5 Hz. The signal is amplified by 12 non-inverting amplifier stages, each using a 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of approximately 1,000 times. The amplified signal is then converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was also used to control the ADC. The results of the qualitative analysis showed that this signal conditioning produces a large output, so the data obtained are better than conventional data. This application can be used for geophysical methods with low input voltages, such as microtremor surveys.
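The roughly 1,000-fold amplification follows directly from the standard closed-loop gain of a non-inverting op-amp stage, G = 1 + Rf/Rin. A quick check with the resistor values quoted in the abstract:

```python
# Closed-loop gain of an ideal non-inverting op-amp stage: G = 1 + Rf / Rin.
def noninverting_gain(r_feedback_ohms, r_input_ohms):
    return 1.0 + r_feedback_ohms / r_input_ohms

# With Rf = 1 MOhm and Rin = 1 kOhm, as in the abstract:
gain = noninverting_gain(1e6, 1e3)  # 1001, i.e. approximately 1,000x
```

The exact ideal-op-amp value is 1001; component tolerances and the finite open-loop gain of a real 741 make "approximately 1,000 times" the honest statement.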

  18. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g at 2% POE and 0.1-0.2 g at 10% POE in the next 50 years was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. Among the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and effective mitigation plans should therefore be made for these areas. Although Bangkok was assessed as having a low seismic hazard in this PSHA, further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
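Under the Poisson occurrence assumption standard in PSHA, the 2% and 10% in-50-years POE levels quoted above map directly to mean return periods via P = 1 − exp(−t/T). A minimal sketch of that conversion (these are the familiar ~2,475- and ~475-year return periods):

```python
import math

def poe(annual_rate, years):
    """Poisson probability of at least one exceedance in `years` years."""
    return 1.0 - math.exp(-annual_rate * years)

def return_period(p, years):
    """Mean return period implied by probability of exceedance p in `years`."""
    return -years / math.log(1.0 - p)

rp10 = return_period(0.10, 50)  # ~475 years  (10% POE in 50 years)
rp2 = return_period(0.02, 50)   # ~2475 years (2% POE in 50 years)
```

The hazard curves produced by a full PSHA give the annual exceedance rate for each ground-motion level; these two functions are just the bookkeeping that turns those rates into the mapped POE values.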

  19. Numerical simulation of bubble plumes and an analysis of their seismic attributes

    NASA Astrophysics Data System (ADS)

    Li, Canping; Gou, Limin; You, Jiachun

    2017-04-01

To study the seismic response characteristics of bubble plumes, a model of a plume water body has been built in this article using an acoustic velocity model for bubble-bearing media and stochastic medium theory, based on an analysis of both the acoustic characteristics of a bubble-bearing water body and the actual features of a plume. The finite difference method is used for forward modelling, and the single-shot seismic record exhibits the characteristics of the scattered wavefield generated by a plume. A meaningful conclusion is obtained by extracting seismic attributes from the pre-stack shot gather record of a plume. The values of the amplitude-related seismic attributes increase greatly as the bubble content rises, while changes in bubble radius do not cause the seismic attributes to change, primarily because the bubble content has a strong impact on the plume's acoustic velocity, whereas the bubble radius has only a weak impact. This conclusion provides a theoretical reference for identifying hydrate plumes using seismic methods and contributes to further study of hydrate decomposition and migration, as well as of the distribution of methane bubbles in seawater.
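The paper's forward modelling is its own scheme, but the core finite-difference idea can be sketched in one dimension: a second-order acoustic update in which a low-velocity "plume" zone scatters the wavefield. All parameters below are illustrative assumptions, not values from the study:

```python
import numpy as np

# 1-D acoustic FD sketch, second order in space and time; boundaries are
# simply held fixed. A reduced-velocity zone stands in for the bubble plume.
nx, nt, dx, dt = 400, 900, 5.0, 1e-3
c = np.full(nx, 1500.0)        # seawater acoustic velocity, m/s
c[180:220] = 1200.0            # bubble-laden plume: lowered acoustic velocity
courant = c * dt / dx          # CFL number, must stay below 1 for stability

p_prev = np.zeros(nx)
p = np.zeros(nx)
src = 50                       # source grid index
f0, t0 = 25.0, 0.04            # Ricker wavelet peak frequency (Hz) and delay (s)
for it in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = p[2:] - 2 * p[1:-1] + p[:-2]          # discrete Laplacian
    p_next = 2 * p - p_prev + courant**2 * lap         # leapfrog time update
    arg = (np.pi * f0 * (it * dt - t0)) ** 2
    p_next[src] += (1 - 2 * arg) * np.exp(-arg)        # inject Ricker source
    p_prev, p = p, p_next
```

A single-shot record like the one analysed in the paper is obtained by saving `p` at receiver indices every time step; the velocity contrast at the plume produces the scattered arrivals whose amplitude attributes the study examines.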

  20. Waveform classification and statistical analysis of seismic precursors to the July 2008 Vulcanian Eruption of Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Rodgers, Mel; Smith, Patrick; Pyle, David; Mather, Tamsin

    2016-04-01

    Understanding the transition between quiescence and eruption at dome-forming volcanoes, such as Soufrière Hills Volcano (SHV), Montserrat, is important for monitoring volcanic activity during long-lived eruptions. Statistical analysis of seismic events (e.g. spectral analysis and identification of multiplets via cross-correlation) can be useful for characterising seismicity patterns and can be a powerful tool for analysing temporal changes in behaviour. Waveform classification is crucial for volcano monitoring, but consistent classification, both during real-time analysis and for retrospective analysis of previous volcanic activity, remains a challenge. Automated classification allows consistent re-classification of events. We present a machine learning (random forest) approach to rapidly classify waveforms that requires minimal training data. We analyse the seismic precursors to the July 2008 Vulcanian explosion at SHV and show systematic changes in frequency content and multiplet behaviour that had not previously been recognised. These precursory patterns of seismicity may be interpreted as changes in pressure conditions within the conduit during magma ascent and could be linked to magma flow rates. Frequency analysis of the different waveform classes supports the growing consensus that LP and Hybrid events should be considered end members of a continuum of low-frequency source processes. By using both supervised and unsupervised machine-learning methods we investigate the nature of waveform classification and assess current classification schemes.
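The multiplet identification mentioned above is commonly done by cross-correlating event waveforms and grouping events whose normalised correlation with a template exceeds a similarity threshold. A toy sketch with synthetic waveforms; the 0.8 threshold and the signals are illustrative, not the authors' values:

```python
import numpy as np

def ncc(a, b):
    """Maximum normalised cross-correlation between two equal-length traces
    (1.0 means identical up to amplitude scaling and a time shift)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))

t = np.linspace(0, 1, 200)
master = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)    # template event
repeat = np.roll(master, 7)                            # same source, time-shifted
noise = np.random.default_rng(0).standard_normal(200)  # unrelated event

def is_multiplet(trace, threshold=0.8):
    return ncc(master, trace) > threshold
```

Because `ncc` takes the maximum over all lags, repeats are matched even when their arrival times differ; tracking how multiplet membership changes through time is what reveals the precursory behaviour changes described in the abstract.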
